Module 7.6 – Risk Assessment and Bias

As mentioned at the beginning of this module, we live in a world of uncertainty and can never eliminate risk entirely. However, people often perceive risk in dramatically different ways. This is not always a bad thing: divergent views can surface new information or reveal different perceptions of a threat's impact or likelihood. Yet divergences can also result from systematic errors or omissions in the way we analyse the facts.

The statistical term for such systematic error is bias. Bias is perfectly normal; all human beings exhibit biases of some sort. That does not mean we should simply accept it as inevitable: we must do our best to reduce or neutralise it, and the first step is to become aware of it. The following points describe some of the more common forms of bias that appear when people analyse risk.

Recency bias – Events similar to those that have occurred recently may seem more likely to recur than they really are. When a devastating earthquake strikes, for example, people often become hyper-aware of the threat of earthquakes relative to other threats, even though major seismic activity in the region may recur only in cycles of hundreds or thousands of years.

Media bias – Spectacular events are likely to receive substantial media coverage and may therefore seem more likely than they really are. This applies to most violent crime, including terrorism. In most countries, the chances of dying in a road accident are much higher than those of being killed maliciously, and death from disease is more likely still; however, these mundane dangers rarely appear in the news and so may not be appreciated in their true proportion to overall risk.

Control bias – Events that we can control may seem less risky than ones beyond our control. A common example is the difference in how we perceive flying and driving. Flying is statistically much safer than driving, but behind the wheel of a car we have the illusion of control, believing that if the situation demands it, we will be able to avoid an imminent accident. Media bias may also play a role here: large-scale plane crashes, however uncommon, make for dramatic news coverage, distorting the perceived likelihood of becoming a victim of such an event.

Acceptance bias – Risks that we willingly accept often seem less dangerous than ones that are thrust upon us. Smoking, investing in stocks, or applying for a dangerous field assignment may feel less risky than having to breathe second-hand smoke, having your money stolen, or being sent on a dangerous assignment without your consent.

Impact-likelihood blurring bias – Events with very high impact may seem more likely than they are, and events that are very likely may seem to have a higher impact than they actually do. The critical impact of murder, for example, may lead us to rank it as moderately likely or even likely when it is in fact very unlikely. Because the event feels “extreme,” we assume both impact and likelihood must be high, when in fact only one factor (impact) is. Likewise, repeated minor car accidents may lead some to inflate the impact, reasoning that “the effect of so many accidents must surely add up to a high impact!” The cumulative effect is real, but it is already captured in the risk analysis by the high likelihood ranking; counting it again on the impact side inflates the assessed risk.
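To make the double-counting concrete, here is a minimal sketch of a conventional likelihood × impact risk matrix in Python. The 1–5 ordinal scales and the specific rankings are illustrative assumptions, not values taken from this module.

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Combine ordinal likelihood (1-5) and impact (1-5) into a single score."""
    return likelihood * impact

# Murder: critical impact (5) but very unlikely (1).
print(risk_score(likelihood=1, impact=5))  # 5 - high impact alone does not mean high risk

# Frequent minor car accidents: very likely (5), minor impact per event (1).
print(risk_score(likelihood=5, impact=1))  # 5 - the cumulative effect is already in the likelihood

# The blurring bias double-counts, e.g. inflating the accident impact to 3
# "because so many accidents add up":
print(risk_score(likelihood=5, impact=3))  # 15 - overstates the real risk
```

Whether scores are multiplied or looked up in a matrix varies between methodologies; the point is simply that each factor should be ranked independently of the other.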

Confirmation bias – Once we hold a belief, we tend to be very reluctant to change it. This can lead us to filter out information that contradicts our usual way of thinking and to consider only facts that corroborate our preconceived notions. This bias often plays a role in prejudicial assessments of other ethnic, political, or religious groups: once we hold such a prejudice, we immediately seize on facts that support it while overlooking or disregarding those that contradict it. In short, we tend to see only the information that supports what we already believe to be true.

Biases are normal and inevitable for all of us, but when analysing risk they can cause significant errors. For this reason, it is important for risk managers to try to understand and overcome their own biases when making assessments. One of the best ways to do this is to conduct risk assessments with others, allowing the group process to test and expose biases when they occur.