We’re ten years removed from the Deepwater Horizon. What happened on April 20, 2010 was tragic. Eleven fatalities; it’s a miracle anyone survived. The worst oil spill in US history. A financial and reputational debacle for the companies involved. Collateral damage across an entire industry. Criminal charges filed against two decision makers.
I know: “Tell me something I don’t know.”
Try this: nobody keeps track of these things, but I guarantee there’s never been an industrial event more sliced and diced than this one. See for yourself: type Macondo Well into your web browser. Nobody was about to let this crisis go to waste, so everyone got in on the act: politicians, regulators, academics, consultants, authors, and even producers, directors, screenwriters and actors. Anybody with a take gave theirs. It’s all there to be read, assuming you have the time and appetite for such things.
This treasure trove of information raises a question: What have you learned from “those guys” and “their failure”?
Besides, “I hope I’m never that guy!”
If you figured this out a decade ago, no need to read on. On the other hand, if you are thinking, “I learned something, but I’m not quite sure what it is,” I’ll spare you a deep dive into industrial history and tell you what I think you need to learn.
In five easy lessons, free of charge.
Regarding Investigations
For better or worse, the one thing about the past is that it’s in the past. So, why put more time and energy into rummaging through something that’s over and done with?
Before you brush off what might seem an obvious question with, “Everyone knows you need to understand what went wrong,” a word of caution. Unless you’re studying for a PhD, investigating your failures or reading reports of other people’s failures isn’t the point. The purpose of analyzing failure is to be able to do something different so it won’t happen again.
Thing is, if you don’t understand what went wrong in the first place, good luck doing the right thing to make sure it won’t happen again.
Because you’ll need it.
Lesson 1: Follow Your Rules
Every operation has rules, but you’d be hard-pressed to find an outfit with more rules than BP. Safety rules are a byproduct of experience, almost always bad experience, causing some leader to decree, “We need a rule to make sure that doesn’t happen again.”
What happens when the new rule goes out to prevent that from happening again? People complain, and some don’t bother to comply, a point of great frustration to leaders. But should the rules get in the way of getting the job done, sometimes the worst offenders are the leaders themselves. They rationalize noncompliance by saying, “But this situation is different,” or legalize it by signing a waiver.
Case in point: had the leaders at NASA followed their own rules, the seal problem on the Challenger’s solid rocket booster would have been fixed before the first launch, and the launch on January 28, 1986 would have been scrubbed because the ambient temperature was below the minimum requirement. Instead of compliance, there were study committees and waivers.
As for the Macondo Well, to ensure that the oil and gas downhole could never inadvertently find their way to the surface, a pressure test was required to prove the integrity of the casing and cement. Without that proof – a successful “negative pressure test” – the heavy drilling mud, the liquid rock keeping the hydrocarbons from coming up the well, had to stay put.
So, they did their test, and the results came back inconclusive: the absence of proof.
That being the case, the problem in the well causing the failed test should have been analyzed and fixed, following the familiar process known as troubleshooting. Of course, troubleshooting would have required spending time and money. Probably a lot of both.
Instead of troubleshooting, those in charge concluded there was something wrong with the test and nothing wrong with the well. First, a hypothesis was offered to explain why the first test had failed; then, contradictory data were ignored; finally, a different type of test was conducted, one that told them what they wanted to hear. Voila: problem solved.
More correctly: a much bigger problem created.
Lesson 1 is obvious: If you don’t want to be “that guy”, follow the rules. Even when you really don’t want to.
Lesson 2: Manage Your Risks
In the high-tech world of deep water exploration, BP’s risk management protocols rivaled those of NASA. But for all their computer modeling, risk quantification and decision trees, BP engineers missed the whole point of the exercise: to reduce risk!
Let’s uncomplicate this risk thing: risk is simply the probability that a hazard turns into an event. Like a well blowing out.
Starting from square one, drilling a well is risky. Drilling a twelve-thousand-foot well in five thousand feet of water is really risky. So much so that Exxon’s CEO said his company wouldn’t undertake something like that. But under the premise that higher risk can produce outsize rewards, BP pressed ahead with Macondo. Fair enough.
As to how that risk was managed in practice, one independent analysis (done by the University of California, Berkeley; see what I mean about everyone getting in on the act!) listed twenty-one separate decisions made over the life of the Macondo Well that increased risk. Among them: “not to use recommended casing centralizers,” “use nitrogen in cement mix to lighten the slurry density” and, yes, not to follow “the Minerals Management Service approved plan for negative testing.”
As if drilling a deep well in deep water wasn’t risky enough, the decision makers kept doing things to up their risk. That’s not managing risk, that’s taking risk. A lot of risk!
Lesson 2: don’t go doing things that increase risk.
N.B.: the surest way to increase risk is not to follow the rules!
Lesson 3: Reality is Reality
We call it Performance Visibility: the degree to which a leader knows reality for what it really is. You can’t manage what you don’t know.
On the matter of Performance Visibility, there’s an ironic twist in the Deepwater tragedy you might have heard about. Right in the middle of the well blowout, a safety award was being presented by BP to Transocean. Executives from both companies had helicoptered out to the rig for the ceremony; when the well blew, one of them was hurt so badly he later said, “I didn’t know through all of this whether I was going to make it.”
His name: Buddy Trahan. Twenty-three years in the business; hired out of high school to do maintenance work on rigs; at the time of the event, his job with Transocean was to oversee rig maintenance. From onshore.
But on this day, he was a visiting dignitary, there to accept a safety award. Some are of the opinion that these visiting executives missed their moment: they should have sniffed out the problem, intervened, and changed the course of events.
Picture that: at 10 am the dignitary is in a conference room, making a speech on the importance of safety performance and being handed the big safety award. At 11 am, that same leader is now out on the rig floor (wearing FRC – flame-resistant clothing – of course), analyzing well logs, and making command decisions about well testing protocols.
Only in the movies does a storyline go like that.
Even if you think that sort of thing is possible (because you know you are that good), is that any way to manage? At best, you can only solve one of those big problems at a time, and you actually have to be there at that perfect moment in time. What are the odds of that? And there is that full-time job you have, back at the home office.
Let’s get real. The lesson here is not to go sniffing out crises and playing hero. It is to understand just how valuable raising your Performance Visibility is to managing safety performance so you can manage these things long before they turn into a crisis.
If you’re relying on the injury frequency rate to give you Performance Visibility for all things safety, you are setting yourself up to be one of “those guys.”
Like Buddy Trahan, who admitted he “…had no idea what was fixing to happen.”
Lesson 4: Safety Is a Team Game
BP drilled a well that was a risky undertaking to begin with, then kept doubling down, doing more and more things that increased risk. But had it not been for that one final decision to remove the drilling mud, this story might well have had a happy ending. That’s why it’s called risk, not certainty.
Viewed in hindsight, the events involving testing the integrity of the well and proving it safe were clearly a case of a “decision in search of the facts to support it.” In psychology, there’s a name for that: Confirmation Bias.
Dr. Maria Gottschalk defines Confirmation Bias as “the tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities.” It’s the perfect description of the process followed to test the integrity of the Macondo Well.
Don’t be too harsh in judging the principals involved: as humans, we do things like this all the time. Confirmation Bias is just one of many mental shortcuts (known as Cognitive Biases) we take to make life simpler and easier. Anyone can fall victim to Confirmation Bias; worse, when they do, they won’t even know it’s happening to them.
Make a hazard – by definition, something capable of producing harm – the subject of the bias, and the potential consequences of Confirmation Bias are serious. You might wish that someone would see the error of their ways, but in practice the best defense against Confirmation Bias is someone else who’s not thinking that way. Understand this and you’ll appreciate that safety is always better when played as a team game. That’s Lesson 4.
Of course, when that someone sees something, they need to say something, which leads us to the final lesson.
Lesson 5: Don’t Grumble: Stop the Job!
Don’t think for a second that everybody on board the Deepwater was sucked into thinking the same thing about those test results; this was not a case of groupthink. No less a leader than Transocean’s Drilling Manager was of the opinion that moving ahead with replacing the drilling mud was the wrong thing to do. But the customer must be served.
Lost in the mountains of criticism heaped on BP for its safety culture is the fact that the Deepwater Horizon wasn’t BP’s drilling ship. The ship, captain and crew belonged to Transocean, a huge publicly traded company with its own corporate leaders, its own commitment to safety, its own culture, and its own customers.
BP was Transocean’s customer. You know all about customer relations, starting with the fact that the customer is always right. Except when they are not. The customer-supplier relationship must be understood in all of this, as it was likely the difference that made the difference.
In the aftermath of the blowout, those with direct involvement were summoned by Congress to explain what happened; some of the lesser players actually showed up and told their story. It played out like a scene from a movie.
On that fateful day, it seems there was an 11 am team meeting where the decision to withdraw the drilling fluid was communicated. One eyewitness described the meeting as a “skirmish” in which “‘the company man’ was basically saying, ‘This is how it’s gonna be,’” with the Drilling Manager in strong disagreement.
For the record, neither the company man nor the drilling manager showed up in front of Congress to explain their side of the story. There are laws that govern situations like that.
To continue the story, after a heated discussion, the room was cleared of all the participants save the company man and the drilling manager. You know, the customer and the supplier. Not hard to figure out who won that argument.
After the meeting, the testimony given to Congress was that “the Drilling Manager pretty much grumbled in his manner, ‘I guess that is what we have those pinchers for.’” Those “pinchers” were the ram arms on the blowout preventer, designed to close and seal the well in the event of a loss of well control.
No, the “pinchers” did not work as designed.
The final lesson is the most important lesson to be learned from that terrible tragedy: when the job isn’t safe, there’s always a choice. Go along to get along, grumble, and hope for the best, or do the right thing, stop the job, and take whatever heat might be coming your way.
………
Five lessons learned by “those guys” in the worst way possible. Had any one of these lessons been put into practice in real time, the tragedy known as the Deepwater Horizon would likely have been averted.
If you don’t want to be that guy, learn those lessons and be sure to practice them.
Paul Balmert
April 2020
A note of thanks to our own Erick Reyna for lending his three decades of experience in oil and gas exploration to this edition of the NEWS.