10 Health and Safety Myths – those times we smelled a rat
Today: #2 Lost Time Injury Rates – Dark Arts in the Boardroom. I’ve put together 10 of my favourite safety-time-wasters, irritations and myths. Expect one post about every three weeks. These are just my opinions and others may “feel” differently. But if anything does resonate with you, punch the air. Better still, if something leads you to question conventional safety doctrines, then join the party.
Lost Time Injury Rates – Dark Arts in the Boardroom
It’s not like Lost Time Injury Rates are a whizzo new idea, is it? In all my 28 years in health and safety, and long before, we have worshipped at that altar. Despite all the proud claims about Positive Performance Indicators (PPIs), or “Lead Indicators”, I have rarely seen an organisation that really cares (I mean REALLY cares) about anything other than those bloody Lost Time Injury Rates! Wait. I’m not about to totally trash “lag” indicators. In my opinion, we’d be mad to ignore them.
To use an analogy: If I’m overweight and want to slim down, I can plan a diet and exercise regime and set a target date. But unless I go and stand on the scales now and then, I really won’t know if my weight is going up, down or staying the same, will I? Sure, if I AM losing weight, it doesn’t prove a direct relationship between the exercise, the diet and the weight loss, but at least something’s moving in the right direction!
Some commentators suggest that lag indicators are like looking in the car’s rear view mirror. I disagree. I think it’s more like checking your engine temperature gauge. Quite different and absolutely essential.
Where it gets to be a problem with the diet analogy is if I were to:
- Re-calibrate the scales.
- Ignore an increase in my weight because it was New Year.
- Change my story to a 12-month-moving-average to mask a recent upward trend.
- Assume that now I’ve reached my 3-month target, I will continue to lose weight.
So what are the limitations with Lost Time Injury Rates?
As if we don’t know! Here are the stories we’ve all heard and to which no one really wants to listen. And Total Recordable Injury Frequency Rates (TRIFR) aren’t much better. So that’s not really an answer.
First, the superficial but annoying problems with Lost Time Injury Rates:
This is sad but it happens all the time: evasion of the LTI definition. An LTI is typically defined as “loss of one whole shift or more”. So keep dragging the poor sod in and give him or her a safety book to read for an hour. Job done! Reward them by paying for the taxi. And Murray, don’t change the Lost Time Injury Rates numbers on the Accident Board. Same again tomorrow. OK, spot ya!
Massaging the numbers:
Even sadder, because it’s corrupt – knowingly changing the numbers by smoke and mirrors. Usually, what’s at stake is a safety objective, a personal bonus or fear of looking bad against a corporate benchmark. So Lost Time Injury Rates are always downward. Always. Don’t ask how this happens. There are closed door conversations. Even the Board probably know there’s something wonky, but as long as the figures are improving, it’s all sweet. Why was the scale used this time different from last time? Oh! We wanted to celebrate some different successes.
I worked as a somewhat junior health and safety coordinator for a large corporate in the 1980s. Lost Time Injury Rates were the only show in town and they got reported to an overseas head office. Every month, the figures submitted were selectively edited at the personal whim of one middle manager and signed off by a member of the senior team. I think the record was a month in which 36 doctor-certified lost time events occurred and only 8 were arbitrarily reported. It was, thankfully, a time of change in management when it all came out into the open, and I was fortunate enough to be able to play my part in helping to work through a new approach. It was called reality.
Inconsistencies in Lost Time Injury Rates:
Examples: under-reporting, over-reporting, variations in reporting due to renewed emphasis or cycles in management interest. Differing benchmarks, rates and other metrics. Arbitrary interpretations, low rates of participation.
Secondly, the more fundamental problems with Lost Time Injury Rates:
Lost Time Injury Rates have little predictive value:
Even genuine long term downward trends (and there are precious few of those) cannot, by nature, be a reliable indicator of what’s going to happen next. If you stand at a roulette table and see 5 reds come up in a row, it’s absolutely no predictor of whether the very next spin will be a black or a red. A somewhat contrived analogy, granted, because safety outcomes aren’t entirely random. But in some circumstances, statistics flatter to deceive.
Even large organisations, with long term, large and (ostensibly) significant statistics get bitten on the bum. Just one example being the “7 year LTI free record” being celebrated by Transocean just as the Deepwater Horizon tragedy came along. But there was more. Independently of the event, and prior to it, workers had expressed fear of reporting mistakes, knew about routinely risky behaviour and had to tolerate unreliable and unsafe equipment. Some safety audits had also been “pencil whipped” to mask reality. This was apparently part of the findings of a confidential survey of workers conducted in the weeks prior to the explosion. Not that one event proves anything on its own. But put it all together and at the very least, there were strong indications of an unhealthy obsession with Lost Time Injury Rates and possibly a ticking time-bomb as a result.
Lofquist (2010) pointed out that favourable measurements can cause a “drift” in organisational focus which may not result in a “rare” accident until a long time later (a lag). This can lead to “accident cycles” of high (reactive) focus alternating with low (complacent) focus. “Lag indicators are dubious predictions of future performance, especially when probability is low but the hazard level is high”: Mengolini & Debarberis (2008).
4 finger cuts are 400% worse than 1 arm amputation in Lost Time Injury Rates land:
Well, it’s true isn’t it? Where’s the statistical significance in that? Let’s say 8 days lost for the cuts vs. probably a whole year or more for the amputation. Some international accident recording standards like ANSI “charge” the organisation a standard number of lost days for serious injuries and fatalities. But as long as you stick to good old Lost Time Injury Rates, the amputations and broken bones just click over as “1” each.
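To make the distortion concrete, here’s a minimal sketch in Python (all figures hypothetical, loosely following the example above – 2 days lost per cut and roughly a working year for the amputation):

```python
# Illustrative comparison: plain LTI counts vs days actually lost.
# The injuries and day counts below are hypothetical.
injuries = [
    {"type": "finger cut", "days_lost": 2},
    {"type": "finger cut", "days_lost": 2},
    {"type": "finger cut", "days_lost": 2},
    {"type": "finger cut", "days_lost": 2},
    {"type": "arm amputation", "days_lost": 240},
]

cuts = [i for i in injuries if i["type"] == "finger cut"]
amps = [i for i in injuries if i["type"] == "arm amputation"]

# As raw LTI counts, the four cuts register as 4 and the amputation as 1...
print(len(cuts), "LTIs from cuts vs", len(amps), "LTI from the amputation")

# ...but the days-lost picture is exactly the reverse.
print(sum(i["days_lost"] for i in cuts), "days lost to cuts vs",
      sum(i["days_lost"] for i in amps), "days lost to the amputation")
```

By the raw count, the cuts look four times “worse”; by days lost, the amputation is thirty times worse. Same data, opposite story.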
We are effectively measuring something that doesn’t happen:
Weick (1987) described safety as a “dynamic non-event” – the absence of harm. We may be counting LTIs, but our goal is that they reduce in frequency. Our comparisons therefore rely on less and less happening. Our “reward” is directly related to longer and longer gaps where nothing happens. This of course becomes increasingly meaningless. If nothing is happening, what do we do next? Add to that the fact that we don’t normally know what events we may have prevented from happening, nor how we might have achieved that. So we glorify long periods of silence. The longer the better. It’s our Happy Place.
No wonder organisations get so obsessed. And so desperate in their zealousness for avoidance.
Random statistical variations can all but invalidate “trends”: Cadieux et al. (2006) state that where there are no accidents, it does not necessarily mean the workplace is safer than any other workplace that has had accidents at a particular time. In fact, for organisations with low accident rates, lag indicators are less useful because of the statistical fact of random variation (variation in the occurrence of low frequency events – my commentary added). In other words, don’t count your chickens – your luck may be running out.
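Cadieux et al.’s point is easy to demonstrate with a quick simulation – a sketch only, assuming injuries arrive as a Poisson process at a fixed, unchanging rate (the rate of 2 LTIs per year is a made-up number):

```python
import math
import random

# Two workplaces with an IDENTICAL underlying injury rate can still post
# very different yearly LTI counts, purely by chance.

def simulate_years(mean_rate, years, rng):
    """Draw a Poisson-distributed LTI count for each year (Knuth's method)."""
    counts = []
    for _ in range(years):
        threshold, k, p = math.exp(-mean_rate), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        counts.append(k)
    return counts

rng = random.Random(1)
site_a = simulate_years(2.0, 5, rng)  # 5 years at 2 LTIs/year on average
site_b = simulate_years(2.0, 5, rng)  # same true risk, different luck
print("Site A:", site_a)
print("Site B:", site_b)
# A zero-LTI year at one site and a three-LTI year at the other would
# say nothing about which site is actually safer.
```

Run it a few times with different seeds and the “trends” come and go despite the underlying risk never changing – which is exactly why a quiet year proves so little.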
Two authoritative summaries of Lost Time Injury Rates history closer to home
Issues in the Measurement and Reporting of Work Health & Safety Performance: A Review. Safe Work Australia (2013)
- “LTIs correlate poorly with human and financial consequences of work injuries and illnesses. i.e. not a valid or reliable measure of performance for either the consequences of OHS failure OR the success of OHS controls and initiatives.”
- “Growing anecdotal claims of individuals seeking to “manage the measure” rather than manage performance – deliberate manipulation and under reporting.”
- “LTIs vary widely in severity, from short absence to long absence, permanent disability and fatality.”
- “The uptake of the use of Positive Performance Indicators (PPIs) has been slow. And the jury is still out on their fitness for purpose.”
NZ Business Leaders’ Health & Safety Forum May 2016 – Benchmarking Report:
Key results in LTIFR & TRIFR. “After a 3 year reducing trend, 2015 saw a return to 2013 levels. Limitations on the data mean we are unable to draw firm conclusions on why this occurred. Potentially, it could reflect a change in the mix in the businesses taking part (or) better reporting of injuries.”
The above statement raises some obvious questions. Which are: “If the 2015 figures had ‘limitations on the data’, then which year(s) didn’t?” And: “How would we know?” “Were some years accurate?” “Why was 2015 assumed to be the anomaly?” Clearly, no one knows the answers. Maybe, just maybe, it’s also because the whole thing is too open to abuse, interpretation, statistical wobbles and capricious behaviours to have meant anything in the first place?
So what’s the answer?
I don’t personally believe PPIs (Lead Indicators) are the answer. Obviously, organisations need continuous improvement. That’s a logical planning process. But, to me, the concept of measuring what amount to “random good deeds” – audits, corrective actions and training – seems barking mad. It’s like saying “If a little is good, more must be better. Let’s throw some safety stuff around today; it’s been a while.”
Assuming we know where to put all this effort most effectively is, to say the least, somewhat fanciful. It seems far, far worse than “looking in the rear view mirror” for lag indicators. It’s like modifying your car for speed and performance while driving in dense fog. It’s like firing a gun randomly in a forest at midnight to kill a rabbit.
Just leave future planning to a more logical analysis based on all the facts available, including known future changes. Call me old-fashioned, but let’s take a cool look at what’s likely to have the biggest impact and quietly do it. Why jump around counting how many times we do it, if doing it once, effectively (or not at all), is as good if not better?
I suspect there is no absolute answer to measuring safety success. But I would certainly love to see the back of LTIs for all the above reasons. I’ve long held the belief that measuring lost work days as a rate per (n) employees – a Severity Rate, is better than LTIs. For the following reasons:
- You’d count all work related lost hours (to a reasonable level of accuracy) and aggregate them as standard 8 hour days.
- If a worker is on alternative duties or getting therapy for their injury, any hours off in a day are counted. So that does away with the “whole shift or more” nonsense.
- If a worker would benefit in the early stages of recovery from a day or more off, there’s no intense pressure to avoid counting those hours. In fact, those hours and the hours for rehab are an investment in limiting the eventual total number of days lost. That’s the end of the humiliation and ugliness of inventing demeaning work just to avoid LTIs.
- The metric would be “Days lost per 100 workers, per year” or something like that.
- This reflects the severity of injury and gives it weight. It may well focus the Board’s attention more productively.
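As a rough sketch of how the proposed metric might be computed (hypothetical figures throughout; this is one reading of the bullets above, not a standard formula):

```python
# Severity Rate sketch: aggregate all work-related lost hours into
# standard 8-hour days, then express them as days lost per 100 workers
# over the year. All figures below are hypothetical.

HOURS_PER_DAY = 8

def severity_rate(lost_hours, num_workers, per_workers=100):
    """Days lost per `per_workers` workers over the period covered."""
    days_lost = sum(lost_hours) / HOURS_PER_DAY
    return days_lost * per_workers / num_workers

# Partial absences count too: 3 hours off for physio is 3 hours, not
# zero - there is no "whole shift or more" threshold to game.
lost_hours = [24, 3, 8, 40, 3, 2]  # hours lost per episode this year
print(severity_rate(lost_hours, num_workers=250))  # -> 4.0 days per 100 workers
```

Note how the single 40-hour absence dominates the figure, which is the point: severity gets weight, and hiding a serious injury behind light duties no longer flatters the number.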
It’s not perfect and people will still cheat and resist. But there’s less incentive to do so and the “punishment generally fits the crime”. In other words, the more severe the injury, the worse you look.
My 10 Health & Safety Myths. Planned topics and dates.
- #1 Passion for Safety – Please no! 29 August 2019
- #2 Lost Time Injury Rates – Dark Arts in the Boardroom. 18 September 2019
- #3 Zero Harm – Stop Taking it Literally! 9 October 2019
- #4 We Have a Safety Culture – Yeah. Nah! 30 October 2019
- #5 Safety Audits – Smoke and Mirrors 20 November 2019
- #6 Safety Manuals – You’d Think it Would be Simple 11 December 2019
- #7 Policy Statements – You Are Committed to What? 5 February 2020
- #8 Hazard/Risk Registers – What Are They Really For? 26 February 2020
- #9 Accident Investigation – Tick & Flick 18 March 2020
- #10 Contractor Management – The Thin Paper Wall 8 April 2020
Simon Lawrence is Director of SafetyPro Limited.
Consulting for safety:
Call 0800 000 267 for a welcoming chat, or email email@example.com
Check out our SafetyBase software
- View a 4 minute video overview. Please like or share.
- Browse the SafetyBase website.
- Short cut to the all-important Pricing Page. No hidden costs.
- Download a PDF Fact Sheet to show to your Senior Leadership Team.
Call me, Simon, on 0800 000 267 or email firstname.lastname@example.org. You could be trying out this highly effective health and safety software system in minutes.