
Friday, June 11, 2010

The Fallacy of Safety

Let me begin by affirming that I am a great admirer of Ms. Deborah A.P. Hersman, the Chairman of the National Transportation Safety Board (NTSB). It seems to me that she well understands the power of moral suasion at her agency's disposal, and she makes good use of the bully pulpit that her office provides.

Given my admiration of her, I was taken aback and disappointed when, in her opening remarks for the Board's recent forum on Professionalism in Aviation, Ms. Hersman said this:

"[T]he American people rightly demand not 99-plus percent safety, they demand 100% safety."

Of late, we are daily reminded that we engage as a society in high-risk endeavors. Whether we are speaking of an aviation mishap or a major casualty in some other technology-intensive field, we understand that some combination of carelessness, incapacity and neglect tipped the balance from "ops normal" to catastrophe. The systematic defenses that had been designed to save the day were overwhelmed. And people died.

Predictably, the cries for safety ring out across the land. Managements and regulators are faulted because they failed to ensure that the operations were safe. We must be able to assure the public that they will be safe! Tell me, please, what does "safe" mean?

The dictionary teaches that "safe" means "secure from liability to harm, injury, danger, or risk." So, to render an activity safe, must we reduce the "liability to harm, injury, danger, or risk" to essentially zero? Is that to be our goal?

"A ship in harbor is safe, but that is not what ships are built for."

-- John A. Shedd, educator, Salt from My Attic, 1928

If, in our economic activities, we are to venture out of the harbor and onto the stormy seas of real life, we have to realize that there will be risk, there will be danger, that we shall go in harm's way. We cannot be safe.

Let me repeat that: We cannot be safe! Ever! To live is to live with risk. To "demand 100% safety" (in Chairman Hersman's words) is to be at best naive and at worst a fool.

"Insisting on perfect safety is for people who don't have the balls to live in the real world."

-- Mary Shafer, NASA Dryden Flight Research Center

If we will wrap our minds around this simple, somewhat discomforting truth, perhaps we can abandon the fallacy of safety and start to think seriously about risk mitigation. To live in the real world, we need to understand the sources and magnitudes of our risks. We need to think through ways to avoid or to counteract them. And we need to so order our lives that adequate countermeasures will be at hand whenever danger looms.

Risk mitigation is always purchased. We must decide how much of it we wish to pay for. The calculus, whether explicit or instinctive, involves evaluation of the likelihood of mishap and the cost of consequences.

When the potential cost of failure is very high (say, a nuclear plant core meltdown), there is almost no limit to the risk-reduction budget. When the cost of failure is less extreme (as with aviation systems, where at worst a few hundred people are at risk), we will spend a lot to reduce risk, but not an unlimited amount. We just can't afford to reduce risk in aviation to the level where it must be in nuclear systems.
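To make that calculus concrete, here is a rough sketch in Python of the expected-loss arithmetic behind it. The probabilities, consequence costs, and dollar figures are entirely hypothetical placeholders of my own, not industry or NTSB numbers.

def expected_loss(p_mishap, cost_of_consequences):
    """Expected loss = likelihood of mishap x cost of its consequences."""
    return p_mishap * cost_of_consequences

# Hypothetical scenarios -- every number here is illustrative only.
scenarios = {
    "nuclear core meltdown": {"p_per_year": 1e-6, "cost": 500e9},  # assumed $500B consequence
    "airliner hull loss":    {"p_per_year": 1e-5, "cost": 2e9},    # assumed $2B consequence
}

for name, s in scenarios.items():
    loss = expected_loss(s["p_per_year"], s["cost"])
    print(f"{name}: expected annual loss of about ${loss:,.0f}")

# A rational risk-reduction budget is bounded by the expected loss it removes:
# the meltdown case justifies a far larger outlay even though, in this sketch,
# it is assumed to be less likely in any given year.

The point is not the particular numbers but the shape of the trade: as the cost of consequences grows, the defensible mitigation budget grows with it.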

Once the calculus of risk mitigation is completed and appropriate safeguards are in place, it's essential to avoid deviations that introduce unmanaged risk factors. There are always pressures, economic and political, to accept a risk "just this once". And then, if no mishap occurs, well, it must be all right to accept that risk routinely. That's called "the normalization of deviance," and we lost two space shuttles that way in spite of supposedly nuclear-quality risk management systems.

When the public asks of us, "Is aviation safe?" we can't honestly say, "Absolutely!" We have to answer: "We understand all of the risks and we take measures to deal with them. And we work every day on doing it better." And it has to be the truth.


Further reading: Prof. Sharon Beder, 'The Fallible Engineer', New Scientist, 2nd November 1991

3 comments:

Greg said...

Very well put. Nice post!

Christine Negroni said...

Excellent post, Frank, though I’ll argue with the notion that risk mitigation decisions are made on a sliding scale of potential damage. If that were the case, we’d have seen more risk mitigation on the BP oil platform than on airplanes. Public pressure is the wild card here. Complex systems are alien to most of us, but it’s all too easy to imagine oneself on a crashing jetliner. See my blog on this subject at http://christinenegroni.blogspot.com/2010/04/whats-safe-above-is-safe-below-applying.html

We cannot understand all the safety risks in even the most examined endeavors, because every new technological advance introduces a myriad of new opportunities for mayhem. Even the best minds can’t forecast the specifics.

Deb Hersman is right: to attempt the unachievable beats settling for anything less.

Frank Van Haste said...

Christine, I won't contend that "risk mitigation decisions are made on a sliding scale of potential damage," but rather that they should be. It's apparent that risk mitigation on the Deepwater Horizon was totally inadequate, given the potential damage, which was, I'd say, fully foreseeable.

As to the difficulty of understanding risks in complex systems, there is known technology for that task. It's called FMEA - Failure Modes & Effects Analysis. It ain't easy and it ain't cheap and of course it ain't perfect, but it works a whole lot better than relying on fallible gut instinct.
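For readers who haven't met the technique, here is a minimal sketch (in Python) of the scoring step of a classical FMEA. The failure modes and the 1-to-10 ratings below are invented for illustration; a real analysis is far more thorough.

# Sketch of FMEA's prioritization step: each failure mode is rated for
# severity, occurrence, and detectability (1-10 each), and the product is
# its Risk Priority Number (RPN). All entries below are invented examples.

failure_modes = [
    # (description, severity, occurrence, detection)
    ("pitot-static icing goes undetected", 9, 4, 6),
    ("fuel-quantity sensor drifts high",   6, 5, 4),
    ("gear-position indication is wrong",  7, 3, 3),
]

ranked = sorted(
    ((desc, sev * occ * det) for desc, sev, occ, det in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for desc, rpn in ranked:
    print(f"RPN {rpn:3d}  {desc}")

# The highest-RPN modes get mitigation attention first -- a disciplined,
# if imperfect, alternative to relying on gut instinct.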

Finally, we'll have to agree to disagree on Ms. Hersman's endorsement of dreaming the impossible dream. I continue to regard pursuit of that chimera as the enemy of a proper risk mitigation strategy.

Thanks so much for contributing your views, which always provoke another layer of thinking.

Regards,

Frank