
Go Figure: Why Safety Needs More Than Numbers

SafetyCulture News | 16 Nov 2018 | 3 minute read


Quantitative risk assessment techniques, which assign numerical or statistical ratings to risk and safety issues, are used to estimate risk or the probability of something going wrong. It’s the sort of approach behind the workplace safety signs spruiked at business entrances, such as “200 days without an incident”.

This technique is widely deployed to provide valuable data that can then be analysed, modelled and used to pinpoint potential problems. But does quantitative safety analysis create more risk by incentivising managers to sweep minor incidents under the carpet?

According to the Associate Dean for Research at the University of Iowa College of Public Health, Professor Corinne Peek-Asa, risk arises when valuable data is misused, mismanaged or not taken into consideration.

“Quantitative approaches can have the impact of reducing reporting,” Peek-Asa says.

In contrast, a paper by George Apostolakis published in the journal Risk Analysis points out the many benefits of quantitative risk analysis in preventing incidents. Along with recording simple data sets, such as the number of days without incident or the number of lost hours in a month, two of the more important yet less obvious benefits are:

  • increasing the probability that complex interactions between events/systems and operators will be identified; and
  • identifying the dominant or more likely accident scenarios so that the company’s resources are not wasted on unlikely incidents (see the sketch after this list).
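To make that second benefit concrete, here is a minimal sketch of ranking accident scenarios by expected annual loss (probability multiplied by consequence). The scenario names, probabilities and costs are invented for illustration and are not drawn from Apostolakis’s paper.

```python
# Minimal sketch: rank hypothetical accident scenarios by expected annual loss.
# All names and figures below are invented for illustration.

scenarios = [
    # (scenario, annual probability, estimated cost if it occurs)
    ("slip or trip", 0.60, 5_000),
    ("forklift collision", 0.20, 50_000),
    ("chemical spill", 0.05, 400_000),
    ("roof fall", 0.01, 2_000_000),
]

# Expected annual loss = probability x consequence; the "dominant" scenarios
# are those with the highest expected loss, not simply the most frequent ones.
ranked = sorted(
    ((name, prob * cost) for name, prob, cost in scenarios),
    key=lambda item: item[1],
    reverse=True,
)

for name, expected_loss in ranked:
    print(f"{name}: expected annual loss ${expected_loss:,.0f}")
```

Note how the rarest scenario here (the roof fall) ties with the chemical spill at the top of the ranking, while the most frequent one (slips and trips) comes last: exactly the kind of insight a raw incident count would miss.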

But as Peek-Asa would argue, that’s far from the end of the story. In fact, a purely quantitative approach to safety could introduce risk into an organisation.

Safety sign, Soudan Underground Mine State Park, Minnesota.

Consider, for example, how statistics could be overlooked or moulded to fit a company’s overall strategic goals, or massaged under the pressure of deadlines, performance expectations or limited budgets.

However, research has found that a lack of reporting, or misreporting, isn’t always done for Machiavellian purposes, or indeed a sign of staff purposefully neglecting their duty.

Professor Diane Vaughan, a sociologist, notes in her book The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA that organisations can fall victim to “the normalisation of deviance”.

This means staff become so accustomed to small safety problems that those problems no longer seem to pose any dire risk. Vaughan proposed this “normalisation” theory as a reason NASA failed to prevent the space shuttle Challenger disaster of 1986: the organisation knew of “seemingly harmless” faults but decided not to act upon them.

Challenging Challenger

Peek-Asa says that, in light of the problems with quantitative risk analysis, it shouldn’t simply be swapped out for qualitative analysis. “Generally, any program can have unintended consequences or a negative impact if not implemented well.”

For example, in the lead-up to the space shuttle Challenger disaster, NASA opted not to undertake proper quantitative statistical risk analysis, deeming it impractical. Instead it opted for a qualitative analysis, with serious consequences. In The Challenger disaster: A case of subjective engineering, Trudy Bell and Karl Esch argue that NASA’s resistance to probabilistic risk analysis (a type of quantitative analysis) resulted in subjective qualitative risk assessment that contributed to the Challenger disaster. They say the lesson learned is the need “to use probabilistic risk assessment more in evaluating and assigning priorities to risks in design”.
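To show what such an analysis involves, here is a generic fault-tree sketch of the probabilistic approach. The component probabilities are invented, failures are assumed to be independent, and this is in no way NASA’s actual model.

```python
# Generic fault-tree sketch of probabilistic risk analysis.
# Component probabilities are invented and failures are assumed independent.

def and_gate(probs):
    """Failure requires ALL basic events to occur: P = product(p_i).
    Models redundancy, e.g. both backup systems failing together."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def or_gate(probs):
    """Failure occurs if ANY basic event occurs: P = 1 - product(1 - p_i)."""
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

# A redundant pair fails only if both units fail (AND gate), while the system
# fails if either the pair fails or a single-point component fails (OR gate).
redundant_pair = and_gate([0.01, 0.01])            # both backups fail
system_failure = or_gate([redundant_pair, 0.001])  # plus a single point of failure

print(f"Redundant pair failure probability: {redundant_pair:.6f}")
print(f"Overall system failure probability: {system_failure:.6f}")
```

Even this toy model illustrates the lesson Bell and Esch draw: it forces engineers to state failure probabilities explicitly, and it reveals that the single-point component, not the redundant pair, dominates the overall risk.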

Actions speak louder

A combination of quantitative and qualitative techniques is useful, but an organisation still needs proper systems and processes in place to ensure those techniques are applied well. Apostolakis argues that data from quantitative methods is most useful when it informs decisions rather than forming their sole basis.

“Ideally, some of both are used as metrics,” says Peek-Asa. “Metrics are never as important as the activities underlying them.”

