Words by Carley Convenience

Exploring Psychological Biases

Human decision-making is rarely purely rational. Instead, it is shaped by cognitive shortcuts known as psychological biases, which are systematic patterns of deviation from normatively rational judgment. These biases explain why individuals, organizations, and governments frequently prioritize short-term convenience over long-term safety, ethics, or optimal outcomes.

In high-stakes scenarios like the ongoing operation of Ronald Reagan Washington National Airport (DCA) despite well-documented risks, these biases reveal themselves powerfully. Even after the devastating January 29, 2025, midair collision that claimed 67 lives, the airport remains open with only incremental changes. This persistence highlights how deeply ingrained biases favor ease and familiarity over decisive, protective action.

Hyperbolic Discounting (Present Bias)

Hyperbolic discounting, also called present bias, describes the tendency to overvalue immediate rewards while steeply discounting future benefits or costs. People prefer smaller payoffs now over larger ones later, even when the delayed option is objectively better. This bias drives the choice of convenience because its benefits (saved time and reduced effort) are felt instantly, whereas safety risks manifest unpredictably in the future.

In the DCA context, lawmakers and officials experience immediate gains from the airport’s proximity: quick flights, efficient schedules, and minimal disruption. The potential catastrophe, however, feels distant and abstract. Research shows this bias influences everything from health choices (skipping exercise for instant gratification) to environmental policy (delaying action on climate threats). At DCA, present bias explains why expansions continued despite warnings. Convenience today outweighs hypothetical dangers tomorrow. As one analysis of intertemporal preferences notes, this inconsistency leads to regret when future costs materialize, as they tragically did in 2025.

Status Quo Bias

Status quo bias is the preference for maintaining current conditions, even when change would yield better results. Rooted in loss aversion (losses loom larger than equivalent gains) and fear of regret, this bias leads people to view any deviation from the norm as risky. Phrases like “if it ain’t broke, don’t fix it” capture this inertia, but it persists even when things are broken.

For DCA, the status quo (keeping a conveniently located but hazardous airport operational) prevails because closure would require upheaval: longer commutes for elites, redistributed flights, and political backlash. Psychological studies link this bias to emotional comfort in familiarity and perceived risk in change. Decision-makers avoid the “regret” of disrupting routines, overlooking accumulated evidence of danger (short runways, congested airspace, repeated near-misses). This bias compounds in institutions, where inertia allows problems to fester until crises force minimal adjustments, exactly the pattern seen after the 2025 crash.

Optimism Bias

Optimism bias leads individuals to believe they are less likely to experience negative events than others. This overconfidence downplays personal risks and fosters complacency. In aviation, it is particularly dangerous: pilots and regulators often underestimate hazards because “it hasn’t happened to me” or “we’ve managed before.”

At DCA, optimism bias manifests in the belief that restrictions and route changes sufficiently mitigate risks, despite inherent flaws like restricted airspace and high traffic. Studies on aviation safety highlight how this bias causes disregard for protocols or warnings. Pilots see themselves as more skilled; officials view the system as resilient. Before 2025, experts’ alarms were dismissed; after the crash, the same bias supports reopening fully, assuming another disaster is unlikely. This illusion erodes vigilance, turning “manageable” risks into accepted ones.

Interplay with Other Biases: Normalcy and Familiarity

These core biases interact with others. Normalcy bias assumes disasters that haven’t occurred recently won’t happen at all, leading to inadequate preparation. Familiarity bias makes repeated exposures (like routine near-misses at DCA) seem less threatening. Together, they create a “safety trap”: convenience feels safe because it is familiar, and risks fade into the background until catastrophe strikes.

In broader society, these biases explain privacy trade-offs (sharing data for app convenience), unsustainable consumption (disposable items for ease), and delayed policy reforms. In government, powerful stakeholders amplify them. Self-interest aligns with biased reasoning, protecting perks while externalizing risks to the public.

Overcoming These Biases

Awareness is the first step. Structured decision-making tools, such as mandatory risk audits independent of stakeholders, can counteract inertia. Framing changes as gains (safer travel for all) reduces loss aversion.

On this anniversary of the 2025 tragedy, reflecting on these biases underscores the need for accountability. True progress at DCA, or anywhere, requires overriding convenience with deliberate, bias-resistant choices.

Psychological biases are evolutionary shortcuts that once aided survival but now sabotage complex modern decisions. Prioritizing convenience over safety is not mere laziness; it is wired into our brains. Recognizing this does not excuse inaction. It demands we fight harder for the right path, honoring lives lost by preventing the next preventable tragedy.

Want the Details of this Particular Human Failing?

Here’s an experienced pilot’s analysis of the DCA crash.