Normalcy Bias:
The Ostrich Effect
Published by Ian Nicholson
Normalcy bias, also known as normality bias, is a cognitive bias that leads people to underestimate or dismiss the likelihood and potential effects of a crisis or disaster.
This bias often occurs because individuals believe that things will carry on as usual, despite any signs or warnings suggesting otherwise. As a result, people may not take the necessary precautions and preparations to protect themselves from potential threats.
The ostrich effect is another term commonly used to describe normalcy bias. This name is inspired by the myth that ostriches bury their heads in the sand when faced with danger, avoiding confronting the reality of the situation.
In reality, ostriches do not engage in this behaviour, but the metaphor serves to illustrate how normalcy bias can lead people to ignore or downplay potential threats.
Understanding normalcy bias is essential because it can have serious consequences during crises or disasters. This cognitive bias can hamper appropriate decision-making, leaving individuals ill-prepared for the challenges they face. For instance, people might choose not to evacuate in the face of an approaching hurricane, believing that the storm’s impact will be less severe than predicted.
In order to combat normalcy bias and make better decisions, it is important to recognize when this cognitive bias is at play. One approach to doing so is by consciously challenging our assumptions and considering alternative perspectives or outcomes.
By engaging in critical thinking and being open to the possibility of disruption, we can more accurately assess potential threats and take appropriate action to protect ourselves and others.
Normalcy Bias in Disasters
Normalcy bias often causes people to believe that things will carry on as usual, leading to inadequate or inappropriate preparations for a crisis, or to an underestimation of its severity.
During natural disasters like hurricanes, tsunamis, and earthquakes, normalcy bias can prevent individuals from taking necessary precautions, such as evacuating or preparing emergency supplies. This mindset is often accompanied by a sense of disbelief that the impending effects of the disaster could significantly impact their lives or environment. For example, they might think, “It can’t happen here.”
Amanda Ripley, author of The Unthinkable: Who Survives When Disaster Strikes – and Why, identifies common response patterns of people in disasters and explains that there are three stages of response: “denial, deliberation, and the decisive moment.”
The COVID-19 pandemic illustrated the effects of normalcy bias, as people may have initially underestimated the virus’s potential impact on their daily lives and society as a whole. In the early stages of the pandemic, some individuals were slow to adopt public health measures like wearing masks, washing hands, maintaining physical distance, and getting vaccinated, often due to a belief that the situation was not as severe as experts warned.
In addition to natural disasters and pandemics, normalcy bias can also influence people’s perceptions of emergencies related to war or civil unrest. Individuals may not fully grasp the reality of potential dangers until they are directly affected, which may lead to delayed evacuations or lack of proper safety measures.
Examples of Normalcy Bias
Normalcy bias can also appear in everyday contexts such as vehicle accidents. Car crashes happen all the time, yet the ordinary person rarely considers the possibility of being involved in one.
Those who espouse conspiracy theories or apocalyptic future scenarios point to the normality bias as a major reason why others dismiss their claims. Survivalists, for example, who believe that the United States is on the verge of totalitarian rule blame normalcy bias for the fact that most Americans do not share their concerns.
Lack of planning is another example. Making emergency or contingency plans is something that many individuals put off.
The normalcy bias may lead them to assume that disasters are unlikely to occur or that, if they do occur, the problem will be handled by someone else. This tendency toward normalcy can lead to inadequate crisis preparation, leaving individuals defenceless and unprepared when a true disaster happens.
One historical example of normalcy bias is the tragedy of Pompeii, where people failed to comprehend the impending danger of Mount Vesuvius’ eruption despite seeing initial signs that warned of the volcano’s activity. Many inhabitants continued with their daily routines, believing that the situation would return to normal until it was too late.
Influence of Optimism
Optimism plays a significant role in the development and manifestation of normalcy bias. When people are overly optimistic, they tend to underestimate the likelihood or impact of negative events, leading to a positive bias in their decision-making process.
This overly positive viewpoint can arise from optimism bias, which refers to the tendency for individuals to believe they are less likely than others to experience unfavourable events.
Optimism can contribute to a person’s belief that nothing can seriously disrupt their everyday life. This belief might be rooted in an overly optimistic outlook where individuals expect problems to resolve themselves without their active involvement or need for precautionary measures.
Consequently, this mindset increases the potential for normalcy bias to occur, as it leads people to downplay or completely dismiss the possibility of a crisis or disaster.
In worst-case scenario bias, the opposite of normalcy bias, small deviations from normalcy are interpreted as warning signs of impending disaster. It is important to strike a balance between optimism and realism. A confident and knowledgeable understanding of risks and potential outcomes can help individuals avoid falling victim to normalcy bias.
By acknowledging and preparing for the possibility of adverse events, without overreacting to every sign of a potential negative outcome, one can maintain a clear and neutral perspective while making important decisions in times of crisis.
Impacts of Normalcy Bias on Actions
When faced with threat warnings or potential risks, individuals affected by the normalcy bias may ignore valuable information, believing that negative outcomes are less likely to happen or may even choose to maintain the status quo. As a result, analysis paralysis, or negative panic, can occur as individuals become overwhelmed with the situation, delaying vital decisions or efforts to mitigate risks.
In many cases, normalcy bias can lead to an underestimation of the severity of a crisis. Individuals may not only downplay the potential damage but may also fail to recognize warning signs altogether. This skewed risk perception can hinder proper judgment, leading people to focus on maintaining an illusory sense of security and stability, rather than taking necessary precautions.
Denial
Denial is a common psychological defence mechanism that people use to cope with or avoid uncomfortable situations, emotions, or facts. Cognitive dissonance is another psychological phenomenon closely related to denial and normalcy bias.
When people experience cognitive dissonance, they feel a deep sense of discomfort due to holding two contradictory beliefs. To resolve this discomfort, individuals may attempt to rationalize their experiences by dismissing critical information to maintain their existing beliefs, which can contribute to or exacerbate normalcy bias.
Disbelief and denial can lead to the development of conspiracy theories. These theories often arise as alternative explanations for events, provided as a means to support existing beliefs without confronting the uncomfortable reality of a situation.
Conspiracy theories can contribute to the perpetuation of normalcy bias by creating a false sense of security, which, in turn, can hinder individuals from properly processing and responding to threats.
Role of Past Experiences
Past experiences often serve as a basis for individuals to assess the likelihood of future events. This is particularly evident in the way people handle crisis situations.
One factor that ties past experiences to normalcy bias is the recency bias. Recency bias refers to the tendency of people to give more importance to recent events, while downplaying the significance of events that happened in the distant past.
When individuals are faced with a potential crisis, they may not recognize the severity of the situation if nothing similar or severe has happened in the recent past. This contributes to normalcy bias, as individuals tend to underestimate the likelihood of a disaster and its potential consequences.
Anecdotal evidence also plays a role in reinforcing normalcy bias. People often rely on personal stories or experiences as a frame of reference when predicting the likelihood of certain events.
However, anecdotal evidence can be misleading — it may not be an accurate representation of the overall probability of a given event. When faced with threats or disasters, individuals may fail to recognize these events’ actual likelihood and consequences based on their interactions with others or their own past experiences.
Another major factor contributing to normalcy bias is our reliance on familiarity and the availability heuristic. People tend to rely on past experiences and familiar patterns to make judgments about the likelihood of future events.
When faced with a potential threat, individuals may subconsciously downplay its severity due to the rarity of such occurrences in their personal experience.
This relates to the availability heuristic, where people make decisions based on the ease with which relevant examples come to mind. Situations that seem improbable or unfamiliar may be discounted, leading people to underestimate risks or dismiss them entirely.
The framing effect also plays a role in normalcy bias: how the information is presented can influence an individual’s perception of its importance, potentially causing them to overlook crucial details.
Selective perception is another significant aspect of normalcy bias. It occurs when someone’s personal beliefs or preferences lead them to focus on certain information while disregarding other, equally important data. This can create a perception bias where the person makes decisions based on an incomplete understanding of the situation.
References:
- Davies, S. J. J. F., & Bertram, B. C. R. (2003). Ostrich. In C. Perrins (Ed.), Firefly Encyclopedia of Birds. Buffalo, NY: Firefly Books. ISBN 978-1-55297-777-4.
- Evans, D. (2012). Risk Intelligence: How to Live with Uncertainty. Free Press/Simon & Schuster. ISBN 9781451610901.
- McRaney, D. (2012). You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself. Gotham Books. ISBN 978-1-59240-736-1.
- Murata, A., Nakamura, T., & Karwowski, W. (2015). Influence of cognitive biases in distorting decision making and leading to critical unfavorable incidents. Safety, 1(1), 44–58.
- Omer, H., & Alon, N. (1994). The continuity principle: A unified approach to disaster and trauma. American Journal of Community Psychology, 22(2), 273–287. doi:10.1007/BF02506866.
- Ripley, A. (2008). The Unthinkable: Who Survives When Disaster Strikes – and Why. Potter/Ten Speed Press/Harmony. ISBN 978-0-307-44927-6.
- Ross, C. (2020). Covid-19 pandemic from a “normalcy bias” approach. Journal of Community & Public Health Nursing, 6(242).
Last Updated on February 9, 2024