
Safeguard OSH Solutions - Thomson Reuters


Alert24 - Safeguard Update

Learning how not to think

New Zealand

The human brain is hard-wired to think in ways that prevent us from learning from accidents that happen to other people, a human factors scientist told the summit.

Brionny Hooper from Scion said research has revealed a number of biases in the way we perceive the actions and situations of others which, unless we learn to avoid them, will distort our understanding of why accidents occur.

"These glitches in the brain's programming actually lead us directly away from gaining any information from an incident," she said. "But if we understand how the way we think is preventing us from learning, we can overcome them."

She identified three errors in the way humans process information that impair incident investigations: hindsight bias, over-estimating the role of internal factors in determining other people's actions, and over-simplifying complex situations.

Hindsight bias uses understandings gained after an incident to judge actions and decisions made before it occurred, often resulting in unreasonable conclusions. "Hindsight bias tells us the Titanic should have carried more lifeboats, but up to the moment it sank, it was universally considered unsinkable. From that perspective, it was a really conservative decision to have any lifeboats on board."

To overcome hindsight bias, Hooper said, we must avoid thinking about what could or should have prevented the accident and focus on how the situation actually unfolded for those involved. "If they had known they were going to get hurt they wouldn't have done what they did, so you have to work out why their actions made sense to them."

A second processing bias is a tendency to attribute our own mistakes to external factors, such as faulty equipment, and those of others to internal or personal factors, such as laziness or lack of knowledge.

"When analysing other people's behaviour we over-estimate the impact of personality and abilities," she said. "To overcome this we must focus on the situation rather than the person, starting from the assumption that they were doing the best they could with the information and resources they had."

The third glitch is a tendency to regard a work environment as constant, and to attribute any variations to mistakes made by those involved.

"We're programmed to believe that a system is stable and safe, which means that when things go wrong we immediately put the blame on poor decision making or incorrect actions at an individual level. If instead we train ourselves to look at the situation from the perspective of the guys on the ground we will see the vulnerabilities, risks and pressures that they were contending with."

Hooper will explain these ideas in more depth at a series of regional workshops, organised by the Forest Industry Safety Council, later in the year.

People Mentioned:
Brionny Hooper
