Why We Get Risk So Wrong: The Control Paradox in Decision-Making


In July 2010, I landed in Islamabad just minutes after AirBlue Flight 202 crashed into the nearby Margalla Hills, killing 152 people and leaving a scar in the mountainside that haunted me for the two years I was stationed in Pakistan. Every time I looked at that mark on the side of the mountain, I felt like I’d narrowly escaped death.

Of course, that was nonsense. I was on a different airline, in a different type of plane, with different pilots, among many other variables—including that commercial planes almost never crash. Yet my colleagues and I would drive around Pakistan in all kinds of weather, at all hours, barely giving a thought to the far greater risks we faced on those roads.

This same cognitive distortion plays out every day in boardrooms, crisis management centers, and executive suites across the world. We systematically misjudge risk—not because we lack intelligence or data, but because of two fundamental psychological forces: availability bias and our overwhelming need to feel in control.

The Perception Gap

A recent MSN poll of nearly 500,000 respondents asked which mode of transportation people believed to be safest. The results revealed a striking disconnect from reality:

Perceived Safety (% of respondents):

Car: 43%

Plane: 39%

Train: 13%

Bus: 2%

Other: 4%

Now compare this to the actual data on deaths per 100 million miles traveled from the National Safety Council:

Actual Deaths (per 100 million miles):

Car: 0.56

Bus: 0.02

Train: 0.03

Plane: 0.002

The numbers tell a clear story: automobile deaths dwarf all other modes of transportation—essentially the opposite of our collective perception. The National Safety Council estimates that in any given year, you have a 1 in 7,296 chance of dying in an automobile accident and a 1 in 905,176 chance of dying in a commercial plane crash. You’re more likely to be murdered (1 in 18,989) than die in a plane crash.
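The gap between the two lists above can be made concrete with a few lines of arithmetic. The sketch below (using only the poll percentages and National Safety Council rates quoted above) ranks the four modes both ways and computes the per-mile risk ratio between driving and flying:

```python
# Compare the poll's perceived-safety ranking with the NSC fatality rates above.
perceived = {"Car": 43, "Plane": 39, "Train": 13, "Bus": 2}    # % calling it safest
deaths_per_100m_miles = {"Car": 0.56, "Bus": 0.02, "Train": 0.03, "Plane": 0.002}

# Rank each mode: most-trusted first by perception, safest first by actual data.
by_perception = sorted(perceived, key=perceived.get, reverse=True)
by_safety = sorted(deaths_per_100m_miles, key=deaths_per_100m_miles.get)

print("Perceived safest:", by_perception)   # ['Car', 'Plane', 'Train', 'Bus']
print("Actually safest: ", by_safety)       # ['Plane', 'Bus', 'Train', 'Car']

# Relative risk per mile traveled: driving vs. flying.
ratio = deaths_per_100m_miles["Car"] / deaths_per_100m_miles["Plane"]
print(f"Driving is {ratio:.0f}x deadlier per mile than flying")  # 280x
```

The two rankings are almost exact mirror images: the mode people trust most, the car, is 280 times deadlier per mile than the mode they trust least after it.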

The Psychology Behind the Distortion

Two psychological mechanisms create this misperception. The first is availability bias. Any airline mishap becomes national news—a plane skidding off a runway or passengers injured by turbulence generates immediate headlines on our phones. We all have vivid images from 9/11 available for instant recall. This information becomes easily accessible when we assess relative safety. Meanwhile, though approximately 100 people die in automobile accidents in the U.S. each day, these tragedies rarely make news beyond local coverage. We have to work much harder to bring a fatal car crash to mind.

As Nobandegani et al. (2018) documented, we significantly overrepresent the possibility of extreme, catastrophic events while underrepresenting higher probability, more mundane events that are, by their nature, less immediately available for recall.

But availability bias alone doesn’t explain our fear of flying. The deeper force at work is our fundamental need for control and autonomy.

Since the 1970s, Edward Deci and Richard Ryan's research on Self-Determination Theory has established that autonomy, a person's ability to act on their own values and interests, is not merely a preference but a psychological necessity. The most striking finding from decades of autonomy research is this: when autonomy is diminished, mortality rates increase. Our need for choice and control is so powerful that it literally impacts our survival.

When we drive, we feel a strong sense of autonomy. We make our own choices, hold the steering wheel, control the pedals, and experience the illusion that we can influence outcomes when faced with danger. The statistics show this feeling of control doesn’t translate to actual safety, but the psychological impact is profound.

On an airplane, we experience none of this. From the moment we enter the airport, we lose control. We're told which lines to stand in, when to remove our shoes, and when to put our seats upright—all while the people actually flying the aircraft sit behind an impenetrable door, completely out of our sight and influence. We have zero control, and this generates anxiety that our cognitive systems interpret as evidence of danger.

The Organizational Parallel

These same forces shape organizational decision-making with costly consequences. Executives and crisis management teams routinely overreact to highly visible threats while systematically underestimating more probable but less dramatic risks.

Consider the corporate equivalent of the airplane crash: a data breach that makes headlines, a public relations crisis, a product recall. These events trigger immediate, massive resource allocation—crisis teams assembled, consultants hired, policies rewritten. Meanwhile, the statistical equivalents of daily car accidents—employee burnout leading to turnover, gradual erosion of institutional knowledge, slow degradation of quality control processes—receive minimal attention despite their cumulative impact dwarfing that of any single crisis.

The control dynamic manifests differently but just as powerfully in organizational settings. Research consistently shows that diminished autonomy doesn’t just harm individual wellbeing—it degrades decision quality, reduces creativity, and increases risk-taking behavior as people attempt to reassert control in whatever domains remain available to them.

During restructuring, mergers, or leadership transitions—precisely when organizations need the best thinking from their people—autonomy is typically at its lowest. Employees who feel they have no control over their situation become risk-averse in their official roles while potentially taking inappropriate risks elsewhere, seeking any avenue to restore their sense of agency.

Practical Applications

Understanding these cognitive patterns creates opportunities for better organizational outcomes.

First, when making strategic decisions, actively consider whether you're working from readily available information or from statistically sound data. Simply making decision-makers aware of availability bias measurably improves judgment: in one study, prompting physicians to consider diagnostic options beyond the most accessible ones improved their accuracy and decision quality.

Second, in risk assessment, distinguish between the visibility of a threat and its probability. The most dramatic risks aren’t necessarily the most consequential. Build systems that force attention to high-probability, low-visibility risks before they accumulate into crisis.
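One simple way to force that attention is to rank risks by expected loss (probability times impact) rather than by how headline-grabbing they are. The sketch below uses entirely hypothetical risks and numbers, purely to illustrate the mechanic:

```python
# Sketch (hypothetical risks and figures): rank risks by expected annual loss
# (probability x impact) instead of by visibility.
risks = [
    # (name, annual probability, impact in $, visibility)
    ("Public data breach",    0.02, 5_000_000, "high"),
    ("Key-employee turnover", 0.60,   400_000, "low"),
    ("Quality-control drift", 0.40,   750_000, "low"),
    ("Product recall",        0.01, 8_000_000, "high"),
]

# Sort by expected loss, largest first.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, p, impact, visibility in ranked:
    print(f"{name:24s} expected loss ${p * impact:>9,.0f}  ({visibility} visibility)")
```

With these illustrative numbers, the two low-visibility risks land at the top of the list and the two dramatic, headline-ready risks land at the bottom—exactly the inversion the availability bias hides.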

Third, recognize that organizational changes that reduce employee autonomy will predictably degrade performance—not because people are being difficult, but because you’re triggering fundamental psychological responses. When change is necessary, the question isn’t whether to preserve autonomy but how. Research demonstrates that small gestures have disproportionately large impact on people’s sense of agency. Allowing someone to choose where they sit rather than assigning them a seat, soliciting input on implementation details even when the overall direction is set, providing options within constraints—these aren’t cosmetic adjustments but strategic interventions that maintain the psychological conditions for effective performance.

Finally, when leading others through uncertain situations, understand that anxiety often signals perceived loss of control rather than actual danger. Your role isn’t just to make the right strategic choices but to create conditions where others can maintain enough autonomy to contribute their best thinking. The more uncertain the environment, the more critical this becomes.

The goal isn’t to eliminate these cognitive biases—they’re hardwired into how we think. The goal is to recognize when they’re operating and build decision processes that account for them. The executive who understands why they instinctively focus on the visible crisis while missing the accumulating mundane risks can build systems to compensate. The leader who grasps why their team becomes risk-averse during restructuring can preserve the autonomy necessary for continued effectiveness.

We can’t change how our minds assess risk and control. But we can change how we structure our organizations to work with these realities rather than against them.

___

Somoetic Intelligence Group specializes in science-based approaches to organizational communication, decision-making, and leadership development. For information about how we can help your organization improve decision quality and leadership effectiveness, visit somoetic.com.
