Chemicals & Materials Now!

From basic to specialty, and everything in between


Deepwater Horizon: Normalization of Deviance

Posted on October 14th, 2016 in Chemical Manufacturing Excellence


“Hope is not a strategy” – Mike Williams, as portrayed by Mark Wahlberg in Deepwater Horizon.

I just saw Deepwater Horizon with my wife. Before going, I stumbled across a movie review that called it a “poor man’s Towering Inferno.” I’m not sure what I was expecting from the movie, but I didn’t expect to be in tears at the end. After seeing the movie, the dismissive comment from the movie critic is infuriating—did he see the same movie I saw?

The movie obviously didn’t reach the critic, but it reached my wife. As we were leaving, she asked me to retire. “I know the work you do is to make plants safer, but you have to go to them to do it, and men like that company man from BP really exist. I want you to come home to me.”

The wrong lesson

One of the lessons that she took away from the movie is that the Deepwater Horizon tragedy was the result of a flawed, selfish decision made by an evil, greedy man.  It’s the wrong lesson.

We want to believe that a disaster is the result of a single, catastrophic decision.  In our narrative, we want to believe that decision to be based on a selfish, evil choice.  It is then easy to conclude that the best way to avoid disaster is to be good.  I’m not as reckless as that character in the movie, so my decisions won’t be irresponsible, and therefore disasters won’t happen.  Unfortunately, that’s not how it works.

If the management team aboard the Deepwater Horizon had known with certainty that the well was going to blow out, that flammable vapors would be sucked into air intakes and ignite, that the rig was going to be engulfed in flame by nightfall, and that eleven workers were going to die, there is no chance they would have proceeded.  Even the most reckless, irresponsible, negligent company will refuse to follow a course of action that they know will result in disaster.  Otherwise, they don’t stay in business.

On the other hand, if they had known with certainty that they had a good well and completion would go smoothly and uneventfully, because a thousand other wells had gone smoothly and uneventfully, they would have made the decision to proceed without giving it a second thought.  Because even the most responsible, careful, safety-conscious company would go ahead and follow a course of action that they know will result in success.  Otherwise, they don’t stay in business.

In the face of uncertainty

It’s not what companies do in the face of certainty that matters.  That’s the same for everyone.  It’s what they do in the face of uncertainty that matters.  The management team on the rig and throughout the organization decided to cut corners.  To be “aggressive.”  Not just on the day of the blowout, but in the days, weeks, and years leading up to the disaster.

The movie accurately illustrated several problems that led to the disaster.  Equipment didn’t work, so working with broken equipment was normal.  Because equipment didn’t work, discounting an instrument reading that didn’t give good news was easy to justify. Even in an emergency, pushing the emergency stop button (or as operators at some plants call it, “the get-fired button”) was taboo.

In the movies, deviating from doing things the right way typically results in swift, sure punishment.  In real life, deviating from doing things the right way typically results in…nothing.  Nothing other than getting things done faster or cheaper or both.  Because we usually get away with deviating from doing things the right way, deviations become the new normal.  The longer we get away with the deviations, the more we believe they are normal.  Then, when the deviations finally catch up with us, we are shocked that anything went wrong.  After all, it was so unexpected.  Because it was so unexpected, we are left grasping for a simple explanation—a single evil decision that led to the incident—and someone to blame.

So we find someone to blame, usually as far down in the organization as possible, and we convince ourselves that the problem is solved.


There is another name for deviating from how things have been done in the past.  Innovation.  Safety, like every arena of human endeavor, is a field ripe with opportunities for innovation.  So nothing about the concept of normalization of deviance should be taken to discourage safety innovation.  Innovation, however, differs from deviation.  Innovation is deliberate; normalization of deviance is something we just drift into.  Tasks and processes can be improved so that they are faster or cheaper and yet still be as safe.  A change that makes a task or process faster or cheaper at the expense of safety, though, is not an improvement.

There is a legitimate question to be asked about safety when contemplating something new:  how safe is safe enough?  When deciding to add new safety measures, it is appropriate to consider the cost of the measures and the benefit the measures provide.  Once the measures have been put in place, it is a different question entirely to remove those measures.  Making a task or process less safe is never progress, never an improvement.  Skipping steps designed to make a process safer, regardless of how many times those steps proved unnecessary in the past, makes the task or process less safe.  We stop at stop signs because that makes driving safer, no matter how many times we stop at a stop sign with no other traffic present.  Unless the step can be shown to be patently unnecessary, we continue to do it because we don’t get to pick the times it will be necessary.   That is the nature of uncertainty.

When we can show a step or safety measure to be patently unnecessary, or can replace it with something that can be shown equally or more effective while being faster or cheaper, then by all means, adopt the innovation.  The key to safety innovation, then, is explicitly showing that the new way is as safe or safer.  It is not a slow, unchallenged drift into less safe processes and procedures because “we haven’t had a problem yet.”


Eleven workers died in the Deepwater Horizon disaster.  The Gulf of Mexico will recover from that disaster, but those eleven workers are gone forever.  If nothing else, Deepwater Horizon reminds us of that tragedy and prompts us to consider the hubris that led to the disaster.

Safety is not a task to be done so we can move on to the next thing.  Safety takes constant reminders.  When our tasks and processes are more dangerous than we can tolerate, they will remind us.  When we have succeeded in reducing the risks of our tasks and processes to tolerable levels, the tasks and processes will no longer remind us, leaving it to us to remind ourselves.  The more successful that safety measures are at making a task or process safe, the more unnecessary those safety measures will seem.

In answer to my wife, I'm not going to retire any time soon.  Instead, I'm going to keep doing process safety work and teaching others how to do it.  There is no "done and move on."  The longer we go without an incident, the easier it is to fall off the wagon, to take shortcuts because things usually work out okay.  We all can make a difference by reminding each other why we do things safely, even when it's inconvenient or slower or more expensive.  If nothing else, the eleven fatalities on the Deepwater Horizon can serve as a reminder.  Let's make sure that their deaths were not in vain.


All opinions shared in this post are the author’s own.
