MANAGING SAFETY PERFORMANCE NEWS

What Now?

“Manage the cause – not the result.”

     ~ W. Edwards Deming

It’s a new year: goals set; plans made; execution underway. Game on!

Before you go all in on execution, it’s worth some reflection on those three steps: goals, plans, execution. Taken together, they very nicely sum up the job of every supervisor and manager on the planet.

For safety, what’s your goal for this year?

I’m sure it’s “zero.” I’ve yet to find a single leader anywhere on the planet who wants to see any harm come to any of their followers. As noble as that ambition is, if you’ve set zero as your goal for 2020, you may well have a practical problem – a practical problem, that is, if you manage an operation with a workforce numbering in the hundreds or thousands. Some managers do. The problem: few big organizations achieve a zero injury state.

Not that zero isn’t worth trying for. But as a management practice, setting an unachievable short-term goal isn’t a good one. One injury in January, and you’re done for the year. If you’re smart – you are – you can think your way around the problem: differentiate a “goal” from an “annual limit.” As in “Our long-term goal is zero. This year, we’re going to take the next step in that direction by reducing the number of injury cases by…” That way, you can have your cake and eat it, too.

As to what would represent a challenging but realistic improvement goal for safety performance for 2020, surely you’ve figured that out.

How will you measure your performance and improvement?

No doubt you’ve put a plan together. To state the obvious, you also have to measure the progress you’re making. Or not making. 

The number of injuries and/or the injury frequency rate (which allows for normalization of performance data) is probably the most commonly used safety metric on the planet. Some leaders are a bit embarrassed to admit to using that metric; as one CEO explained to me recently, “I know that’s not a leading indicator.” I suspect she’d heard from the experts who are highly critical of leaders who measure safety performance that way. 

I am not one of them.  The point of managing safety performance is to send people home alive and well at the end of the day; how could you not measure performance against that standard?
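
If you’re wondering what that “normalization” mentioned a moment ago looks like in practice, here’s a minimal sketch of the usual arithmetic, written in Python. It assumes the common OSHA-style convention of a 200,000-hour base (100 full-time workers, 40 hours a week, 50 weeks a year); the case counts and hours are invented purely for illustration.

# Sketch of the usual injury frequency rate arithmetic.
# 200,000 hours = 100 full-time workers x 40 hours/week x 50 weeks/year.
# The case counts and hours below are invented for illustration only.

def injury_frequency_rate(recordable_cases: int, hours_worked: float) -> float:
    """Recordable cases, normalized per 200,000 hours of exposure."""
    return recordable_cases * 200_000 / hours_worked

# Two operations with the same raw injury count but very different exposure:
print(injury_frequency_rate(recordable_cases=5, hours_worked=250_000))    # 4.0
print(injury_frequency_rate(recordable_cases=5, hours_worked=1_000_000))  # 1.0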

Just don’t make the mistake of thinking that one metric – injuries – tells you everything you need to know about safety performance. You don’t rely on one number to tell you everything you need to know about all the other functions that add up to your business, do you?

So, get more numbers for safety. Just like you have for everything else you manage.  

As to indicators…

Indicators Indicating…What?

Back in the day, when some genius first suggested the universe of safety metrics could be divided into two categories – leading and lagging indicators – everyone went along for the ride. It would have been better if somebody had paused, reflected, and asked, “How many other business functions use that logic for their metrics?”

I think you’d be hard pressed to find a single one. 

As to why not, it could be because those in charge understood the statistical science that governs relationships between metrics. A proven relationship between two metrics is the essence of an indicator.

In statistics, it’s known as correlation (not causation, which is an altogether different phenomenon). Correlation is something any student doing a graduate research project or a scientist conducting research understands perfectly. You don’t announce a relationship between metrics; you prove there’s a relationship. Proving correlation requires data – normally a lot of data – and applying statistical methods that date back more than a century.

I can speak from personal experience on the subject. Half a century ago, I took the required statistics courses as an undergrad; later, for a statistics class project in grad school, I set about establishing the statistical correlation among a collection of business metrics used where I was working. Passed the course; failed miserably at proving correlation among any of a long list of business performance metrics. There were no statistically valid indicators, leading or lagging.
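
For anyone curious what proving a correlation actually involves, here’s a rough sketch of the kind of test a researcher would run – a back-of-the-envelope version, not a rigorous study. It assumes Python with the scipy library, and the twelve months of figures for the two hypothetical metrics are invented purely for illustration.

# Sketch: is there a statistically meaningful correlation between two metrics?
# Pearson's r - the classic method, dating back to the 1890s - plus a p-value.
# The twelve monthly figures below are invented for illustration only.
from scipy import stats

metric_a = [42, 38, 51, 45, 60, 48, 55, 47, 62, 50, 58, 44]  # e.g., safety audits completed per month
metric_b = [ 3,  4,  2,  3,  1,  2,  2,  3,  1,  2,  1,  3]  # e.g., recordable injuries per month

r, p_value = stats.pearsonr(metric_a, metric_b)
print(f"Pearson r = {r:.2f}, p-value = {p_value:.3f}")

# Only if r is meaningfully large AND the p-value is small enough to rule out
# chance does the pair deserve to be called correlated - and even then,
# correlation says nothing about cause and effect.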

Seemingly undeterred by statistical science, OSHA is now promoting the use of leading indicators – and telling you there’s something wrong with the leading “lagging indicator” (sorry, I couldn’t resist that one) known as the recordable injury frequency rate. They write, “Today, EHS practitioners continue to rely on injury rates… and other lagging indicators despite the growing acceptance of the fact that these failure focused measures are ineffective in driving continuous improvement efforts.”

You’re probably too young to appreciate the irony in this. Where do you think the “recordable” in Recordable Injury Frequency Rate came from? 

Duh, OSHA.

The practice of counting, classifying and reporting injuries to a government agency was an important element of the original OSH Act, which dates to 1970. Five decades into the process, now they’re saying there’s “growing acceptance of the fact that these failure focused measures are ineffective”?

What now? Stop counting, classifying and reporting – to OSHA? Hardly.

But don’t miss the “growing acceptance of the fact” part of that statement. Facts require proof. Where’s the proof of that fact? Nobody should accept this as fact until they see the evidence that proves cause and effect – or statistical data demonstrating there is no correlation. 

Until I see hard evidence, I’m sticking with my fifty plus years of real world experience that has proven to me, time and time again, “If something doesn’t get measured, it doesn’t matter.” 

Performance Visibility

Reading that may lead you to think I’m one who doesn’t appreciate the value of predictive safety metrics. Hardly. But I am of the opinion that the process of managing safety performance would be better served if the term “leading indicator” (and its first cousin, “lagging”) were dropped from the vocabulary of safety leadership practices.

Replace them with a single term: predictive metric. That way, everyone will understand exactly what they are talking about: a number that does in fact reliably predict future performance.

Do that, and the pressure will be put squarely on any so-called predictive metric to perform as advertised. Put up – or shut up. That is the point of the metric, is it not?

Here’s a perfect illustration of that point, coming straight from OSHA’s expert advice on leading indicators. OSHA suggests – states would be the more accurate characterization – that one of the “good examples of a leading indicator is workers attending safety meetings every month.”

I suppose they are referring to monthly safety meetings like the ones we used to have in my old outfit. Work a twelve-hour shift and hang over for an hour, or come in an hour early and then work twelve more. In the opinion of many, it was “an hour of my life I won’t get back.” I tried to avoid them like the plague.

By OSHA’s logic, increasing the number of people going to safety meetings (or the number of safety meetings people have to attend) is highly correlated with a lower number of injuries. That is the essence of this leading indicator.

Had I known that to be true, every time safety performance went south and injuries were on the increase, all we would have needed to do was hold more safety meetings. Really?

Of course, you may well be thinking that in your operation, more safety meetings would actually make things – and people – safer. That makes another important point about predictive metrics: they really should be proven reliable for your operation, not someone else’s.
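
If you ever want to put a candidate metric to that test, here’s roughly what a back-of-the-envelope check might look like – a sketch only, again assuming Python with scipy, with every figure invented; the whole point is that you’d substitute your own operation’s history.

# Sketch: does this month's meeting attendance "lead" next month's injuries?
# That lag is what "leading" would have to mean for a predictive metric.
# All figures below are invented; substitute your own operation's history.
from scipy import stats

attendance = [62, 70, 55, 80, 75, 68, 90, 58, 72, 85, 66, 95]  # % attending safety meetings, by month
injuries   = [ 3,  2,  4,  1,  2,  3,  1,  4,  2,  1,  3,  0]  # recordable injuries, by month

# Pair each month's attendance with the following month's injury count.
r, p = stats.pearsonr(attendance[:-1], injuries[1:])
print(f"Lagged Pearson r = {r:.2f}, p = {p:.3f}")

# Unless r is strong and p is small across your own data, the metric hasn't
# earned the label "predictive" for your operation.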

Entirely lost in all of the chatter about leading and lagging indicators is the other vitally important function that safety metrics can – and should – provide: hard data about what the heck is really going on, for all of those work processes that are supposed to be executed to help keep people safe. 

For more than a decade, I’ve been referring to that measurement function as Performance Visibility. Performance Visibility measures the degree to which a leader knows reality for what it really is. How can a leader effectively manage something as important as safety performance if they don’t know what’s really going on? Well designed and executed metrics help increase Performance Visibility.

If there’s a legitimate criticism of the use of the injury rate – in my view, there is – it’s that raw numbers (or the rate) reveal precious little about the state of all the processes necessary to send people home safe. The big flaw in safety metrics is assuming the injury rate simultaneously measures both bottom-line performance and the causes of that performance. OSHA would have done you and your peers a service by taking that problem on.

I suppose that task falls to me. 

When it comes to metrics, the goal every leader has for safety – zero harm – is far better served by following the lead of W. Edwards Deming: “Manage the cause, not the result.” 

What Now?

With a new year comes a renewed sense of optimism. I’ll be optimistic: you’re now convinced to quit obsessing over leading and lagging indicators, and instead poised to search for more and better data. Data that will enable you to understand what’s really going on in your organization, and in particular for those work processes you believe essential to cause your people to go home alive and well. Or not. Knowing the state of those work processes, you can then improve what needs to be improved.

You can read that as, “improving Performance Visibility to be able to manage the cause.”

If you don’t like what the numbers are telling you, you’ll have to face up to the question, “What now?” You know doing the same thing over and over, expecting a different result, is the definition of insanity. 

On the other hand, it is entirely possible that you’re unconvinced by all that I’ve put in front of you. You are strongly entrenched in the camp of leading (and therefore lagging) indicators. Fair enough; if you’re reading this, at least you’ve heard me out.  

Here’s the thing: sooner or later, one of those leading indicators is going to warn you of an impending problem. If you’re certain a leading indicator is telling you it’s only a matter of time before the problem shows up, won’t you ask, “What now?”

And if you do, what now?

Let me predict: you’ll search for more and better data to increase your Performance Visibility, particularly for those key work processes you believe essential to cause your people to go home alive and well. Or not. Knowing the state of those work processes, you’ll improve what needs to be improved.

You should read that as, “improving Performance Visibility to be able to manage the cause.”

Paul Balmert
January 2020
