Disasters such as the Space Shuttle Challenger explosion and the Chernobyl nuclear accident are said to have involved flawed decisions, driven by individual perceptions. Not surprisingly, individuals’ behaviors—and the perceptions that drive those behaviors—are key to determining the safety performance of an organization.

Executive Summary

The same part of the brain that leads employers to count a candidate out of the running for a job — because unconscious negative associations drive the decision — can also prevent the proper assessment of risk, consultant Angela Peacock believes. Here, she describes how disasters such as Deepwater Horizon came about. Building on the idea that decisions driven by individual beliefs, feelings and biases can lead to tragic outcomes, she makes the case for diversity and inclusion as a cornerstone of a risk-aware culture.

In the wake of the Deepwater Horizon disaster, for example, it was reported that rig staff had tested the concrete seal on the excavated well before removing the drilling column. The results indicated that the seal was not secure and removing the column might cause a catastrophic blowout. So, why were the signs ignored? A disaster analyst said that confirmation biases caused workers to explain away the results rather than investigating further.

Underpinning behavioral safety is a cycle linking beliefs, feelings, behaviors and results in a self-affirming loop. If we believe something is lower risk than is actually the case, our confidence and bravado will drive behaviors that may, with a measure of good fortune, carry us through unscathed. Our perception is then reinforced and our confidence grows.
