“I’m a conservative, but I’m not a nut about it.”  George H. W. Bush

We’ve all heard it.  “You need to be conservative, just to be on the safe side.”  It’s an article of faith in the safety community.  Like all articles of faith and other religious beliefs, any deviation is heresy with the potential for excommunication.  At the risk of being apostate, let me say from the outset, you can be too conservative. At some point, you can be so bundled up in safety that it becomes impossible to do anything else.

The Spirit of St. Louis

In 1919, Raymond Orteig, a New York hotelier, offered a prize of $25,000 for the first non-stop flight between New York and Paris. Six serious teams attempted to win the prize before Charles Lindbergh succeeded in May 1927. The attempts resulted in six deaths and three life-changing injuries.

For his attempt, Lindbergh determined that the most important safety requirements were carrying enough fuel to complete the flight and staying awake for the 33½ hours it took. Despite protests, he didn’t take a radio, parachute, sextant, or night-flying equipment, although he did bring a life raft. Unlike competing aircraft, The Spirit of St. Louis was not a tri-motor; Lindbergh and his designer decided that a single engine of proven design would be more reliable, weigh less, and allow them to carry more fuel, which is what they were convinced really mattered.

By any modern measure, Lindbergh’s solo flight from New York to Paris was not conservative. Ignoring the enormous fatality rate, which would be enough to discourage any modern attempt and would never allow for modern regulatory approval, Lindbergh refused all the safety measures that were recognized and generally accepted as good engineering practices. He flew with the windows open to help him stay awake, knowing that the drag would reduce his fuel economy by 10 to 15%, and still he landed in Paris after the 3,600-mile flight with enough fuel to fly another 500 to 1,000 miles.

By Lindbergh’s standards, he was conservative. He had a safety factor of 1.25 in the parameter he believed mattered most: fuel. Had the safety factor been any larger, most are convinced that he would not have made it at all, instead ending up tangled in the telephone wires at the end of the runway at Roosevelt Field on Long Island.

Other crews in more conservative aircraft did not make it.

“An Abundance of Caution Does No Harm”

There is a Latin phrase, “abundans cautela non nocet,” that means “an abundance of caution does no harm.” It is sometimes rendered as “one can never be too careful.” It must be true because it’s Latin. Or is it?

No organization has infinite resources or infinite time. Every safety measure costs resources and time to implement, and costs resources and time to maintain. Resources and time spent in one place are resources and time that cannot be spent elsewhere. So, resources and time spent where they are not necessary take away from the resources and time that are available where they are necessary. Once they are spent, there is no getting them back. An abundance of caution where it is unnecessary harms the ability to be cautious where it is necessary.

Engineers Hate to be Wrong

My experience as an engineer and with other engineers has taught me two things about engineers:  We hate not knowing something. Worse, we hate being wrong. We want the world to be predictable, to be “well-behaved”.  When we are asked a question, we feel that we must answer it.

Have you ever worked on the estimate for a capital project? The first engineer who touches it comes up with an estimate. Aware that projects are unpredictable, the engineer adds a 10% contingency to the estimate to be conservative, to be on the safe side. After all, he’s an engineer, he’s supposed to know, and he certainly doesn’t want to be wrong. Besides, no one ever got into trouble for coming in under budget. The next engineer takes the first engineer’s estimate, does some work on it, and then adds another 10% contingency, to be on the safe side. Same rationale. As the estimate makes its way up the organization chart, the contingencies compound, so that after a half dozen people have touched the original project estimate, it’s nearly twice as much as it would have been. Then, if the project is typical, one of two things happens: the project still proceeds and comes in under budget, or the project is cancelled because it is too expensive, with no opportunity to discover whether the budget was incorrect. In either case, there is no evidence of being wrong, wrong being defined as being over budget.
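To see how fast that happens, here is a minimal sketch; the dollar figure and the half dozen reviewers are illustrative assumptions, not from any real project:

```python
# Sketch: how layered 10% contingencies compound on a project estimate.
# The base estimate and number of reviewers are illustrative assumptions.

base_estimate = 1_000_000  # hypothetical initial estimate, in dollars

estimate = base_estimate
for _ in range(6):   # a half dozen people each add 10% "to be safe"
    estimate *= 1.10

print(f"Final estimate: ${estimate:,.0f}")                 # ~$1,771,561
print(f"Growth factor:  {estimate / base_estimate:.2f}x")  # 1.1**6 ~ 1.77
```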

Project estimates use contingencies in the percent range. Risk estimates use contingencies in the range of orders of magnitude. Rather than increasing an estimate by 10%, we increase an estimate of risk by a factor of 10. If two estimates go into the risk assessment for a hazard, each conservative by a factor of 10, the total estimated risk is not 21% high; it’s a hundred times too high.

Consider another hazard with the same actual risk.  Instead of two estimates, though, it requires four estimates. When each estimate is conservative by a factor of 10, the total estimated risk is ten thousand times too high.
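The arithmetic behind both cases is plain multiplication; a minimal sketch, using the factor of 10 per estimate from the scenarios above:

```python
# Sketch: conservatism compounds multiplicatively in risk estimates.
# Each input estimate is assumed to be conservative by a factor of 10.

conservatism_per_estimate = 10.0

for n_estimates in (2, 4):
    overstatement = conservatism_per_estimate ** n_estimates
    print(f"{n_estimates} estimates -> risk overstated by {overstatement:,.0f}x")

# 2 estimates -> risk overstated by 100x
# 4 estimates -> risk overstated by 10,000x
```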

What’s the harm in that? To begin with, if we take the risk assessments seriously, we will misallocate finite resources to the second hazard, ignoring other hazards that need attention. Worse, when a risk assessment doesn’t make sense, it won’t be believed. If it contradicts the experience of people using it to make decisions, they will not dismiss their experience as unrepresentative.  They will dismiss the risk assessment as a foolish academic exercise that has no place in the real world.  Being too conservative undermines the ability to measure risk, and we can’t manage something that we can’t measure.

Unpredictability

We are conservative to deal with unpredictability.  Unpredictability, though, comes in two flavors, variability and uncertainty. How we deal with unpredictability depends on the flavor.

Variability is the unpredictability of the system, its operating range. It’s a characteristic of the system. We can measure it, but we can’t do anything about it without changing the system. We measure it by understanding the distribution of outcomes and often describe it in terms of mean and standard deviation. We are conservative when we plan for a range that captures most of the distribution. How conservative depends on how much of the distribution we wish to capture, but we can never capture the entire range. When there are two independent parameters, the variability of the system is less than the sum of the variabilities of the two parameters. Sometimes, though, in our zeal to be conservative, we add them together and count ourselves done. That’s just bad math.
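For independent parameters, standard deviations combine in quadrature, not by simple addition; a minimal sketch, with illustrative values:

```python
# Sketch: independent variabilities combine in quadrature, not by addition.
# sigma_1 and sigma_2 are illustrative standard deviations of two parameters.
import math

sigma_1 = 3.0
sigma_2 = 4.0

combined = math.sqrt(sigma_1**2 + sigma_2**2)  # correct for independent parameters
naive_sum = sigma_1 + sigma_2                  # the "bad math" shortcut

print(f"Combined (quadrature): {combined:.1f}")   # 5.0
print(f"Naive sum:             {naive_sum:.1f}")  # 7.0 -- overstates variability by 40%
```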

Uncertainty, on the other hand, results from our lack of knowledge about the system. The more we learn, the less uncertain we become. We can do something about uncertainty. Where variability is a characteristic of the system, uncertainty is a characteristic of the observer. Being conservative, then, becomes a substitute for understanding the system, substituting belief for knowledge. That said, some conservatism is appropriate: unlike estimates of variability, which can be verified directly by measurement, estimates of uncertainty cannot be verified.

Conservative, But Not Too Conservative

We are much quicker to credit decisions and decision-making processes when things work out, and just as quick to condemn them when they don’t, even when the same decision-making process is used. Lindbergh wasn’t conservative; he was lucky. We credit his good luck to him. Had he died trying to cross the Atlantic, he simply would have been one of at least seven fatalities in pursuit of the Orteig Prize, and we would attribute his bad luck to, well, bad luck. Importantly, Lindbergh’s decision affected only his own safety.


Photo credit:  Library of Congress, Copyright not renewed, 1927

As safety professionals, we don’t want to acknowledge the role of luck—good and bad—in what we do. It confirms what we already know, that there is uncertainty and that we cannot control everything.  That we cannot control everything does not mean, however, that we cannot control anything.  If we try too hard to control everything, though, we end up controlling nothing.

As engineers and managers, we are not just uncomfortable with unpredictability, we do not even want to admit that things are unpredictable. Instead, we claim to be conservative in our effort to compensate for both system variability and for our uncertainty. When we are too conservative or unevenly conservative, however, we misallocate finite resources and time, managing them poorly.

Be Realistic in Your Risk Analysis, Conservative in Your Risk Tolerance

The best way to avoid most of these problems is to be as realistic as you can be when you set out to analyze risk. It is not your worst case that you need to be concerned with; it is your likely case. Make the assumptions that are most probable and arrive at your best estimate of the risk. Do this whether one estimate or a dozen go into the risk assessment.

Then, with the risk assessment in hand, apply a conservative safety factor or compare to conservative risk tolerance criteria. Be conservative, but be conservative once, with the knowledge of what you have done.
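As a sketch of the difference this makes — the input values and the single factor of 10 are assumptions for illustration, not prescribed criteria:

```python
# Sketch: apply conservatism once, explicitly, rather than burying it in
# every input. All values here are illustrative assumptions.

inputs = [0.01, 0.5, 0.2]  # hypothetical best-estimate factors for a hazard

def risk(factors, conservatism_per_input=1.0):
    """Multiply the input factors, optionally inflating each one."""
    result = 1.0
    for f in factors:
        result *= f * conservatism_per_input
    return result

best_estimate = risk(inputs)                       # realistic analysis
layered = risk(inputs, conservatism_per_input=10)  # 10x buried in each input
explicit = best_estimate * 10                      # one visible safety factor

print(f"Best estimate:        {best_estimate:.1e}")  # 1.0e-03
print(f"Layered conservatism: {layered:.1e}")        # 1000x the best estimate
print(f"Explicit factor:      {explicit:.1e}")       # 10x, applied knowingly, once
```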

Author

  • Mike Schmidt

    With a career in the CPI that began in 1977 with Union Carbide, Mike was profoundly impacted by the 1984 tragedy in Bhopal and has been working on process safety ever since.