SHEQ culture researcher and consultant Corrie Pitzer reported his latest findings to the American Society of Safety Engineers (ASSE) in 2010, concluding that the culture in non-reporting organisations was not deficient or non-existent, but that a culture of institutionalised pretence and delusion had been cultivated. Pitzer describes seven corporate delusions:
1 ‘Risk control’ delusion is bred by sets of rules intended to defend workers and enable management, yet many managers ascribe improved results to non-regulated elements: awareness, education, team spirit, maturity, values.
2 ‘Compliance’ delusion is bred by data, yet trained workers, managers and governments often break rules.
3 ‘Consistency’ delusion lulls workers into following procedures against their judgement and their need to develop risk skills.
4 ‘Human error’ delusion is a stereotype that underestimates complex interactions between people, organisations, and dynamic environments.
5 ‘Predictability’ delusion assumes that analysis enables risk management, despite small databases and little hard data.
6 ‘Trend to zero’ delusion responds to social SHEQ demands, resulting in data ‘treatment’, with losses and harm incurred but not recorded.
7 ‘Invulnerability’ delusion leads to perceived safety and hidden risk taking.
Delusions permeate business and SHEQ culture, displacing values, accountability and leadership. Leaders use and share their values, challenge traditions, inspire and enable people to act, and accept accountability where defenses do not reach. An extract of Pitzer’s presentation follows below.
The Piper Alpha oil rig disaster in the North Sea on 6 July 1988 changed the safety thinking and focus of governments, SHEQ researchers, industries and organisations.
It resulted in new legislation, textbooks and a critical self-examination by the oil and gas industry. The inevitable investigation found a convergence of technical and systems problems, but in the organisation’s culture lurked less identifiable and less reportable ‘dark matter’.
Our human inclination is to ask what was ‘deficient’ in the corporate culture. Did they not care enough about their people? Was production more important? Did they coerce supervisors and employees to ignore safety precautions? Did they run a few minimal programmes and accept risk, ‘flying by the seat of their pants’?
Safety performance in many companies, and even whole industries, has stalled in recent years. Accident rates are at a ‘plateau’, and yet serious accidents and fatalities continue.
Several organisations with ‘exemplary’ safety records have suddenly suffered catastrophic or multi-fatality events: the Texas City refinery, NASA’s Challenger and Columbia, the Chernobyl nuclear reactor.
Could we find common features in these organisations’ cultures or mindsets? What characterises their decisions? My research and review since 1994, starting in the Australian mining industry and including extensive research into international events since 1997, reveals that unexpected disasters follow seven deadly delusions.
Not surprisingly, it is difficult to identify the true nature of a culture in the aftermath of death and destruction. No one dares conclude that the safety culture was ‘positive, focused and caring’, or that the company was doing ‘well’ in safety and was merely ‘unfortunate’; nor can everyone be blamed. Understandably, people look to investigation results to fix blame on some equipment, suppliers, systems, procedures or workers. But in reality there are usually many deficiencies in every organisation.
Piper Alpha was considered the company’s most productive and safest oil rig in the North Sea. The rig had won a safety competition six months before the disaster, praised in particular for its work permit system, yet that same system was found deficient by the disaster investigation.
Data versus trust
Why, then, did the rig manager later say of the deficiencies readily uncovered in the inquiry: “I knew everything was all right, because I never got a report that anything was wrong”?
Here is a strong indication of delusion, yet this is what most managers do every day: manage by exception. You have to trust your colleagues. But should you trust people if they do not trust you?
What if reporters of deficiencies are seen as spoilsports and become targets, like ‘shot messengers’? Companies like to boast about meeting their obligations.
As frontline operators could attest, many accidents and incidents are ‘hidden’ and never reported, well known as Incurred But Not Recorded, IBNR.
A series of such delusions was identified in the author’s research, starting with a 1997 paper on the incidence of mine disasters in Australia. Peculiarly, the more recent disasters in that industry (during the 1990s and 2000s) tended to occur at mine sites that could hardly be described as deficient in their management of safety.
They were operated by mining corporations with a tremendous and sincere focus on safety and in some cases, could be described as ‘best in the business’.
One such mine was the Northparkes gold and copper mine in New South Wales, Australia, owned and operated by North Mining, a company that for years was regarded as a safety leader in the resources industry. The mine was taken over by Rio Tinto in 2000.
On the afternoon of the day shift on 24 November 1999, four men were killed at the Northparkes E26 Lift One underground mine as a result of a massive collapse of rock and the subsequent devastating air blast.
Inquiries into this accident focused mainly on the risks and technicalities associated with block cave mining, but there was a unique opportunity, unreported until now, to also study the safety culture and safety management systems of that mine prior to the incident.
At that time, the Australian resources industry operated an industry award scheme, the Minex Awards. The award was given annually to the best mine in Australia from a safety perspective, and its strength was that it was conferred only after a very rigorous audit and analysis of participating mines’ safety systems, culture and performance by a visiting team of trained evaluators.
Northparkes Mine was given a ‘high commendation’ by the Minex panel prior to the disaster. The author was a member of the 1999 evaluation team and had first-hand insight into the quality and design of Northparkes’ safety management systems and culture.
It was a top performing mine in safety, with rigorous and robust safety systems, an outstanding management team and dedicated safety manager.
In 1999, the Australian Minerals Council commissioned an industry-wide safety culture survey, with 42 participating mines and plants selected across mining locations, types of mining and commodities. Northparkes Mine was a participant in this survey and analysis. The author’s company, Safemap, was responsible for the survey and subsequent report.
The safety culture survey placed Northparkes Mine third highest in the ranking of ‘positive responses’ by employees. Its safety culture was, in several respects, unequalled. The company’s response levels remain among the highest ever recorded in the Safemap database of 200 companies and 100 000 employees.
Mine management was not ‘deficient’; it was simply too good for its own good. Their huge focus on safety, achievement of lofty goals and celebration of safety successes led to a mindset that they were leaders in safety, protected by an extraordinary safety system. Their record and accolades had trapped them into some of the seven deadly delusions.
Risk control delusion
Management and SHEQ systems create a myriad of rules and procedures, supposed to defend workers and enable management. Rules are the basis of most legislation, supplemented by rules of industries, employers, management, suppliers, standards, audits, certifiers, and systems.
Many organisations have comprehensive safety management systems, based on commercially available packages like DNV risk management system, or develop their own systems. Organisations like BP, Shell, Exxon Mobil, BHP Billiton, all have well developed and integrated systems with sophisticated auditing of compliance to these rules.
These systems are largely successful, but tend to create complexity. Layer upon layer of risk controls prompt behavioral responses that expose organisations in unpredictable ways.
A key element of all of these systems is a clear and unabated focus: to ‘control’ risks in the workplace. This may be softened to ‘mitigate’ risks, in the sense of ‘lessening or moderating’ them. However, driven by a purist impulse, no self-respecting safety manager would be satisfied merely to ‘lessen’ risk.
Can risks be controlled? Purist safety managers say ‘yes’. But if we reduce the likelihood of loss incidents, reduce the exposure of people to risk, and limit the potential impact of loss incidents, then we are mitigating. We manage controls; we do not control risk.
The delusion following extensive mitigating measures is one of control, collective comfort, and confidence that antecedents are identified, controls are managed and that incidents are less likely. This mindset is a critical danger zone, as in the NASA mindset about the nature of O-ring risks.
O-rings were classified on the NASA risk matrix as ‘Catastrophic, Unlikely, Redundant’. The classification soothed the corporate conscience into a comfort zone. ‘Deviant’ engineer Roger Boisjoly tried to alert his managers to the looming danger, but his concerns were readily dismissed.
A cultural feature of Northparkes mine offers another example of the risk control comfort zone. A safety culture survey before the disaster showed a very high overall positive response rate, third highest in the database and even today still in the top tier of the Safemap norm range. Issues such as trust in management, safety policy and focus, and the balance between production and safety, like most of the 41 factors in the model, were highly positive, with one exception: risk concern.
The posed statement “I am worried about dangers in my workplace” drew one of the lowest response levels recorded in the database to date. A correlation analysis was run on the responses to this statement and to a converse one: “This company has good standards for safety”.
The result showed a significant inverse relationship (a correlation of 0.89), strongly indicating that the more respondents believed the employer had things under ‘control’, the less worried they were about danger.
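The shape of that analysis can be sketched in a few lines; the survey scores below are invented for illustration only (the Safemap data is not reproduced here):

```python
# Illustrative sketch only: invented per-site survey scores, not Safemap data.
# Pearson correlation between mean agreement with "I am worried about dangers
# in my workplace" and "This company has good standards for safety".
from statistics import mean, stdev

worry     = [4.1, 3.6, 3.2, 2.8, 2.5, 2.1, 1.8, 1.4]  # assumed site means
standards = [2.0, 2.6, 3.0, 3.3, 3.7, 4.1, 4.4, 4.8]  # assumed site means

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

r = pearson_r(worry, standards)
print(round(r, 2))  # strongly negative: more perceived control, less worry
```

With any data shaped like the survey result, r comes out strongly negative, which is what an inverse relationship of 0.89 describes.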
The finding is consistent with John Adams’ risk compensation theory, which holds that people have an inherent ‘risk thermostat’ that adjusts behavior according to the level of risk they perceive in their environment.
Where we think we see more risk, we act more cautiously. Where we think we see less risk, we act more riskily, or tolerate more risk. The control delusion is a danger zone that people enter willingly.
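The thermostat idea can be sketched as a simple feedback rule; the setpoint and gain below are illustrative assumptions, not measured quantities from Adams’ work:

```python
# Minimal sketch of a 'risk thermostat' (after John Adams): behavior adjusts
# until perceived risk matches an internal setpoint. All numbers are
# illustrative assumptions.
def adjusted_risk_taking(perceived_risk, setpoint=0.5, gain=0.8):
    """Return how much extra risk a person takes on (+) or sheds (-)."""
    return gain * (setpoint - perceived_risk)

# An environment made to look safer invites more risk-taking:
print(round(adjusted_risk_taking(0.2), 2))  # looks safe  -> takes more risk
print(round(adjusted_risk_taking(0.8), 2))  # looks risky -> acts cautiously
```

The sign of the adjustment is the whole point: lowering perceived risk without lowering actual risk pushes behavior the wrong way.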
The safety manager at Northparkes mine said in an interview that on reflection, the management team were very ‘naïve’ at the time to have placed much faith in their risk management systems.
Compliance delusion
Behavioral safety claimed dramatic improvements in safety performance in many organisations, and introduced a golden age of safety in the 1980s and 1990s, along with a golden age of compliance: with legislation, standards, best practice, continuous improvement, data gathering, reporting and multiple rules, up to rules for corporate governance and transparent social responsibility.
Yet the traditional complaint of safety managers stands: trained workers who know and understand rules also break them. Organisations, industries and governments are likewise prone to breaking the letter and spirit of rules.
Behavioral safety, with its glib formulae, made the human ‘mind’ more understandable to managers and safety managers, and provided a simple model for changing the behavior of millions of managers, supervisors and workers. It logged ‘improvements’ in results, yet it also increased IBNR, ‘hiding’ and obfuscation.
BBS made significant ‘progress’ toward ‘achieving’ the age old false goal of compliance.
It seems obvious that if drivers adhere to rules, nothing could go wrong. But plenty still does go wrong. Most accidents happen at or under the speed limit. Most pedestrians are killed at pedestrian crossings. Most rule breakers, like jaywalkers, are not killed, because they are alert.
The higher actual risk they face has been reduced to a lower virtual risk by their risk response (alertness). They deal with the higher risk more competently than the compliant pedestrian deals with the lower risk, and the net result is more safety!
Complying with a speed limit on a very slippery road may be very risky, and breaking the speed limit to overtake a car travelling under the limit may be safer than the alternative: overtaking slowly and staying exposed on the wrong side of the road for longer.
Likewise, crossing to the opposite lane of a dual carriageway to avoid an accident sounds obvious, but it shows that humans have to apply their risk judgement skills to stay alive, not merely comply with rules.
The examples above are simplistic and in very ordinary, everyday situations. The complexity and variation of the average workplace far exceeds this and it is in the first instance impossible to set rules for all possible situations and conditions in the workplace.
Even if it were, there would be an endless number of occasions in a person’s work where slight variations in the actual conditions and dynamics of the task require him or her to respond in a complex way, judged in the moment and with lightning-fast behaviors, without ‘thinking’.
The upside for the safety manager is that, for the most part, accidents are largely prevented in an ordered world. But a reality of most industries today, as of traffic accidents in developed countries, is that plateaus of performance have been reached which stubbornly persist.
James Reason (2000) reports that the aviation industry’s safety performance has been static for the past 20 years.
It is in these circumstances that the goal of compliance no longer serves the goal of improvement. In a highly ordered world with low incidence rates, the lurking impact of ‘risk compensation’ plays havoc with people’s normal behaviors, creates risk-taking behaviors that are not explained by simple ABC models, and feeds the delusion of risk control even further.
Consistency delusion
The delusion of consistency is closely related to the delusion of compliance.
James Reason published an insightful article in which he made the controversial statement that “following safety procedures has killed people”. He cites the Piper Alpha disaster as one such case: workers who strictly followed the safety procedure were the ones killed in the fire, while those who jumped into the sea, against procedures, survived.
This doesn’t imply that all safety procedures are wrong and shouldn’t be adhered to, but it does mean that human beings in a high-risk work environment should firstly apply their risk skills and risk judgment and that ‘consistency’ is not the be all and end all of safe behavior, as outlined with the pedestrian example above.
The range of human behaviors is infinite, complex and, most of the time, unpredictable. The search for behavioral models and understanding in the field of psychology has led to the development of very simplistic models in safety, the so-called behavioral safety approach.
It posits a very simple equation, Antecedent-Behavior-Consequence (ABC), and ‘reduces’ human behavior to that equation. The behavioral model is essentially a Pavlovian view of people, who learn behaviors through basic conditioning, like animals; the dog experiment with food and bells comes to mind.
While it would be an unhelpful overstatement to allege that behaviorism reduces the human to animal-like behaviors and that we are conditioned in most if not all we do, there is still a significant oversimplification that occurs in the name of behavioral psychology.
Human beings learn to deal with risks through a complex process of cognitive adaptation, often developing an intuition and competence that defies reasoned thinking.
This ‘capability’ allows them to deal with risk in a highly variable fashion, a readiness for any/many possibilities. But then our risk control logic says we should limit all variability and create consistency and compliance in the workplace. This logic seems flawed.
The safety models operating in high-risk industries are very much ‘engineering’ models that treat human variability as the enemy. Just as engineers seek to limit deviations from a standard operating procedure or process, the logic runs, people should perform mechanically, against specification, with minimal divergence, so that the work environment becomes more predictable.
However, the human ability to respond in variable ways, interpreting situations of risk that change dynamically from moment to moment, is the best defense we have – and by limiting that capability we erode the safety system’s most potent safeguard. For every mistake a human being makes, he or she has made many millions of safe and correct decisions.
‘Human error’ delusion
Human error is rooted in stereotypes that the safety profession holds, and linked to the other six deadly delusions. An old axiom in BBS is that the majority of accidents are due to human error, and that behavioral observation and coaching could eliminate error. This linear approach underestimates the complex interactions between people and complex, dynamic environments.
Behavioral safety also misses the point that human actions are only the small, visible end of the many complex social systems that ‘create’ human error.
The overall safety management system and the associated ‘culture’ of the organisation are intrinsically linked, combining in a dynamic way to create a social work environment that influences the human operator to take certain actions in a certain way.
Human operators quickly pick up signals from supervisors and managers about their preferences and priorities; they quickly notice what gets ignored or responded to, and they act accordingly.
They may fail to apply a certain safety procedure over a long period without there being any ‘consequences’ from the supervisor. When an accident eventually results from that failure, the clear culprit is the person who did not apply the procedure, while the unspoken social environment, and the behaviors of the supervisor, cannot be fingered so explicitly – and we have a clear case of “human error”!
The above is only one small example of the precipitating factors that ‘induce’ human error into the operational process. There are many such factors and dynamics that never become the focus of the accident analysis process, even where there is a strong emphasis on so-called root cause analysis.
What nobody can admit is that, until the accident occurred, the behaviors of the supervisor and the operator produced a more productive outcome for the organisation. Had the accident not occurred, the human error would actually have been a ‘good thing’!
By focusing on human errors, accident-prone organisations become obsessed with the sharp end of the failure process alone, and absolve themselves of guilt for the failures that lie behind it. The analogy of being too busy swatting mosquitoes comes to mind: this delusion leaves the leadership no time or willingness to drain the swamp, because ‘we have to kill the mosquitoes.’
Predictability delusion
Modern risk management approaches assume that risk has a probability, likelihood or chance, that risk analysis can measure that probability and identify many of the contributing factors, and that these can then be managed.
However, unlike the risk insurance industry, which is based on massive databases, there is little hard data available about events in most organisations, not sufficient to achieve risk quantification.
Risk assessments remain subjective guesses, often made by unqualified people with a vested interest in the result, and are easily manipulated to the ends of organisational politics.
Risk assessment creates the delusion that risks are quantified, as with the O-rings on the Challenger space shuttle.
The impact of this delusion is that management builds all operational systems on this notion of predictability, with little or no contingency or adaptive response in place – so that when deviations from the predicted path occur, the organisation is more exposed to adverse, often catastrophic, outcomes than it would otherwise have been.
This ‘unpredictable’ dynamic is most clearly shown by the phenomenon of ‘risk migration’.
Risk migration is the tendency of risks to be moved around, changed or morphed into other types of threats, or other locations of the same threat.
A dramatic example occurred in Vancouver, BC, Canada. A young gas station operator tried to stop a thief from filling his car with gas and fleeing. He jumped on the hood to stop him, but he was thrown from the speeding car and killed.
The government issued a new ruling, in the interest of ensuring each person has the right to a safe work environment, that all motorists have to pay first and then fill the car with gas.
It seems a very logical solution that will prevent this incident from re-occurring and one that can be readily predicted. But what would be the response of the would-be thief?
They still do not have the money to pay for the gas and they still want to fill up their cars. Do they now have no option but to become honest, law-abiding citizens? Probably not.
They will in all likelihood rob the attendant of one gas station and then go and fill up at another, and in this process, the new, unpredicted, event has become more violent, more serious and more likely.
Organisationally, the focus of risk management is also biased towards the least productive end of the process: most risk assessments spend the majority of their time on risk evaluation, using a risk matrix of some design and endlessly trying to quantify the risks.
Decisions about the likelihood of risk events are almost always purely subjective for most operational risks and add little value. Organisations are strongly recommended to spend more resources on risk identification and less on attempted quantification – for which we do not have adequate data.
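How sensitive such matrices are to a purely subjective likelihood rating can be shown on a generic 5×5 sketch; the scoring and band thresholds below are illustrative assumptions, not any particular standard:

```python
# Generic 5x5 risk matrix sketch (illustrative thresholds, not a standard):
# score = likelihood x consequence, both rated 1..5, banded Low/Medium/High.
def risk_band(likelihood, consequence):
    score = likelihood * consequence
    if score >= 15:
        return "High"
    if score >= 6:
        return "Medium"
    return "Low"

# Same hazard, two assessors who disagree by one subjective step on likelihood:
print(risk_band(2, 5))  # prints "Medium" (score 10)
print(risk_band(3, 5))  # prints "High"   (score 15)
```

One step of subjective disagreement moves the same hazard across a management-attention boundary, which is the quantification problem the text describes.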
Similarly, the traditional ‘hierarchy of risk control’ widely advocated by the risk management profession is outdated and no longer a very useful guideline for controlling risks. It is fundamentally a hazard control guide, and has little impact on the dynamics of human variability and error reduction.
This delusion precedes catastrophic events because deluded organisations are obsessed with predicted, known risks, based on their assessment of known causes and known potential outcomes.
This delusion lets them know more and more of what they already know, and less and less of what they don’t know: unknown events with unknown causes and unknown outcomes.
‘Trend to zero’ delusion
The adage of ‘lies, damned lies, and statistics’ holds true of business reporting, but is more deadly in SHEQ practice. There is significant demand for improving safety performance, as measured by graphs, resulting in all kinds of ‘treatment’ of data.
Workers are quickly ‘rehabilitated’ to return to work before a cut-off period, incidents are argued away as not work-related, and incentives drive reductions in accident reporting.
There are two aspects to this delusion. First, there is hardly an organisation today that does not boast an improving accident rate trend. Partly this reflects better and better risk control and mitigation; the ‘lie factors’ mentioned above are simply a reality of an outcome-based world.
There is also hardly an organisation that has not experienced the well-known “flat end” of the (inverted) hockey stick: significant and dramatic (sometimes dramatised) reductions have slowed as the technology and capabilities run out of puff. The behavioral safety era is one such out-of-puff flavour of the last decade.
The plateau-limited organisations are still seeking the trend towards zero and they are under pressure to maintain that. Often such organisations show the odd outlier event – a burst of serious accidents, or even the odd catastrophic event.
The first part of this delusion is the belief that the trends are sustainable, and that statistical performance provides a valid measurement of actual safety performance.
The problem with statistical performance is that it is a (fudged) account of minuscule events in the organisation. The ratio of deviations to conformances in any work situation is simply mind-boggling.
One deviant decision or action is literally a needle in a haystack of good, conforming decisions and actions, and even then it is very seldom that the deviant action actually results in an accident. In most modern organisations, well managed and defended, chance plays a significant role in the accident or event.
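The role of chance can be illustrated with a minimal simulation; the exposure counts and accident probability below are invented for illustration, not drawn from any real site:

```python
# Illustrative only: two sites with the SAME underlying accident probability
# can post noticeably different annual figures purely by chance.
import random

random.seed(1)  # fixed seed so the sketch is repeatable

def annual_accidents(exposures=50_000, p=1e-4):
    """One simulated year: each task-exposure independently goes wrong with prob p."""
    return sum(1 for _ in range(exposures) if random.random() < p)

site_a = [annual_accidents() for _ in range(5)]  # five years, site A
site_b = [annual_accidents() for _ in range(5)]  # five years, site B
print(site_a, site_b)  # identical risk, different-looking records
```

Both sites expect about five recorded accidents a year, yet their five-year records scatter widely, which is why a short run of statistics is a poor measure of underlying safety.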
Similarly, highly at-risk organisations can, by sheer luck, avoid disasters and adverse impacts. A personal experience of the author during a risk assessment exercise at a gold mine in Papua New Guinea was a dramatic case in point: during a visit to a mine pit, the team of risk observers identified six potential situations in which a fatal accident was highly likely (near misses), in the space of two hours!
The production superintendent reluctantly admitted these as potential events, but rated them “unlikely”. During the same week, another mine in Canada, owned by the same company, suffered a fatal accident – a mine where the safety defenses were extremely well developed and robust.
The natural response was to conclude that the Canadian mine was mismanaged, exposed and weak in its safety system integrity.
Clearly, given the description above, the use of statistical performance as a measurement of actual safety performance has to be questioned. There is no more dramatic example of this than the explosion at the BP Texas City refinery, an organisation that boasted exemplary safety statistics before the explosion killed 15 people.
The now famous Baker report starts with an emphatic announcement: “Process safety accidents can be prevented. On March 23, 2005, the BP Texas City refinery experienced a catastrophic process accident. It was one of the most serious USA workplace disasters of the past two decades, resulting in 15 deaths and more than 170 injuries.”
And then one of the key findings: BP mistakenly used improving personal safety performance (i.e., personal injury rates) as an indication of acceptable process safety performance at its five USA refineries; BP’s reliance on this data and inadequate process safety understanding created a false sense of confidence that it was properly addressing process safety risks at those refineries.
The second aspect of this delusion is ironically contained in the above dramatic announcement: “Process safety accidents can be prevented.”
If the statement is false, it would be an acknowledgement by the safety profession that it will fail in its endeavor and that people will be killed. Clearly, from a moral standpoint, there is no room for that in any self-respecting safety manager’s vocabulary.
If the statement is true, then it means that the end result of all (process) safety endeavors has to be the total elimination of fatal accidents from the workplace.
For that to be achieved, all ‘accidents’ will have to be eliminated.
Then all incidents have to be eliminated. Nor can we have any near miss incidents, because the difference between a near miss and an incident is largely fortuitous (one second, one meter).
To eliminate near misses, we have to eliminate all mistakes… not a single misjudgment, all workers always absolutely vigilant, “situationally aware” (the new jargon), so that there is no single event that could remotely have resulted in any type of mishap… not near, not close, not even remotely possible…
To achieve that, we will have to achieve a situation of zero hazards, zero risks and to achieve that, we all know, we must have no work…
Many, if not most organisations have fallen into the zero delusion. And long periods of (reported) zero accidents increasingly feed the belief that “zero” is possible, is happening and that in turn feeds the belief that risks are controlled, human error has been curtailed, compliance is achieved and behavior is consistent and predictable: all the delusions discussed so far are fed and fostered.
In the safety culture survey of the Australian mining industry mentioned above, one of the statements posed was: “It is possible to achieve zero accidents”. It was believed by fewer than 24% of employees, and in focus group challenges the author has seen the disbelief rate as high as 93%. Nobody believes it, but everybody professes it.
The biggest downside of this is that the “safety business” is shrouded in fallacy, skepticism and cynicism.
The difficult question is this: do we then have to accept that accidents are inevitable? The brutally honest answer is: “Yes, that is reality”, and it will always be so in the real world, but we dare not say this publicly.
There is also no evidence to support the view that workers who accept that accidents are inevitable become fatalistic, less aware of risk, less motivated, or have more accidents. That is another delusion that pervades the safety profession, with no basis in science.
On the contrary, James Reason postulates that employees in so-called HROs (high reliability organisations) are more aware, more committed because they accept failures will happen. They constantly fear an accident may happen, while we blindly believe it will not.
The safety profession is big business worldwide and has a long history of lies, facades and delusions. Organisations are caught up in a competitive cycle of publicly pronouncing how safe they are and citing impressive accident-free achievements.
Some organisations and consultancies have built up huge reputations with many “millions of hours accident free”, selling their expertise and magic tools and yet, most of these claims are simply false.
A significant drawback in safety is the many cycles of fads and buzz words that soak up resources for little real return, other than creating a lot of ‘busy-ness’. We have, as an industry, fired off many silver bullets in our relentless pursuit of the zero goals and we are increasingly aiming at the worker.
We are now fast turning to the next fads and silver bullets from the worlds of psychology, neurology and cognitive science, again aiming at the worker to ‘switch’ them on and make them situationally aware – while the answer to safety lies within our organisations and well within our grasp: true and credible safety leadership at all levels, especially our first line supervisors.
True leaders share and lead with their values, make decisions based on values, challenge traditions, inspire and enable people to act.
Value-based leadership creates the accountability and values for safety we so desperately seek and it creates the ‘safe culture’ that is able to reach into the deepest crevices of our organisations, where our systems and defenses cannot. Leadership is a process that an organisation can acquire, nurture and develop.
Safety in our industry is at or near a critical turning point. Continuing with the same safety management philosophies and behavioral fads will only take us quicker and further into the comfort zone unless we make fundamental shifts in our safety thinking. However, societal pressures, the safety profession and accident-chasing lawyers may not allow us to change direction.
Step changes in our near-zero safety performance are not likely anymore – and we will continue to experience random fluctuations in accident rates, and larger spikes if we have a disaster. And as we relentlessly push for incrementally better accident rates, so risk secrecy increases and so does the likelihood for catastrophic events.
Twenty-year vision
Today, safety and risk are seen as opposites, where we have to move from a condition of danger to a condition of safety, or absence of harm. In this approach, we analyse accidents and prevent them, we set rules and enforce them, we educate our people not to take any risks, and we ask them to identify hazards which we as management will eliminate or mitigate.
But risk has the potential for both positive and harmful outcomes. If we don’t dare, we will not achieve; if we don’t venture, we will not discover; and if we don’t innovate, we will recede. We therefore have to find the fine balance between risk rewards and potential harm. When the front-line worker starts up the dozer (machine) to move dirt (whatever), he deals with risk.
And how well he does that is the ultimate competitive edge of the business. The operation of that piece of equipment needs to be at the edge of risk all the time: an edge where the worker operates at the limits of capability in speed, efficiency and risk. It is the collective ability of all workers to take risks that ultimately determines the success of any business.
The new world that safety will be entering is one where risk-taking is harnessed, not vilified, to ensure that workers at all levels are optimising the way they work, innovating new and more valuable ways of doing the job.
Twenty years from now, it may be a scary new world where safety has become invisible: no safety slogans, no safety programmes and no fancy behavioural systems. It is a world where there is a balance between risks and rewards, and where the safety practitioner or consultant has disappeared from the organisational chart; like the dinosaur, he devoured his own world.
Safety in these terms is a process through which we seek opportunities in a responsible way, develop skills in all our people to deal with risks competently, and confidently explore new and better ways to engineer and build things. It is part of the strategic planning and thinking and is integrated in every activity so that eventually it will become a seamless and automatic consideration before we make any decision.
In this world, operations managers and supervisors are truly and entirely accountable for safety. They manage and coach people to develop competence and recognise them accordingly, gaining the trust and support of all. Supervisors are true leaders, because they are trusted.
Employees will grow from being safe workers to being safety champions who are trusted by their leaders to make their own decisions. It is a world with fewer but extremely resilient safety systems, with little regulation and lots of reliability: not compliance but competent precision.
This organisation deals with safety as its most complex business process, where the various sciences of engineering, risk management and organisational psychology interact.
It is preoccupied with, and ruthlessly honest about, its smallest failures, and it has the flexibility to constantly change and adapt its operating procedures. It never states a safety goal, because safety will never be ‘achieved’; by definition, the safety journey never ends.
In this context, safety is not defined by the absence of harm (a zero), but by the peaks of energy, motivation and competence: not zeros but zeniths.
• The above article is a summary of a presentation to the American Society of Safety Engineers, ASSE, 2010.
-Appleton, Brian; Technical Assessor of the Piper Alpha Disaster Inquiry; video recording during ICI training course, 1990
-Adams, John; Risk; Routledge, Abingdon, UK, 1995
-Pitzer, Corrie; Emperor Has No Clothes; National Safety Council Proceedings, Chicago, 2008
-Reason, James; Safety Paradoxes and Safety Culture; International Journal for Injury Control and Safety Promotion, Vol 7, Issue 1, March 2000, pp 3-14
-Wilde, Gerald; Target Risk; PDE Publications, Toronto, 2001