Posted on: March 2, 2011 Posted by: Diane Swarts

High reliability industries run on complex equipment, processes, and skills, requiring complex management and leadership functions, including ‘soft’ issues like culture.

Reliability, like safety, is best defined by an absence of unexpected loss, or drop in performance quality. Loss incidents are the death of reliability, and could contribute to closing down organisations, yet many operational risk managers work only on visible and ultimate incident triggers, like ‘massaging hippopotamus ears in a muddy pool’.

The concept of HROs allows us to see below murky corporate water, to identify and manage basic elements, enabling organisations to avoid a roller coaster ride of safety and health performance.

HROs, like nuclear, aviation, naval, petrochemical, emergency response, intensive care or investment teams, rely on ‘hard’ systems like engineering, communication, control, training, and planning, but rely equally on ‘soft’ cultural elements, like vigorous audits, error prediction tools, low risk tolerance, and an uncompromising mindset.

Many industrial disasters continue to offer evidence that most organisations delude themselves about their reliability; Montara, the Gulf of Mexico, and many recent disasters and near disasters continue the litany of Texas City refinery, Flixborough, Chernobyl, Longford, Challenger, and Columbia.

HRO characteristics

High reliability organisations are characterised by five habits: preoccupation with failures, reluctance to simplify, sensitivity to operations, commitment to resilience, and deference to expertise.

People who operate and manage high reliability organisations assume that each day could be a bad day and act accordingly. ‘Failure awareness’ is not an easy state to sustain, particularly when failures do not occur, occurred long ago, or happened at other organisations. These people are obsessed with identifying and correcting even small mishaps, mis-specifications and mis-communications at all levels in the organisation.

Organisations have to ignore most of what they see in order to get work done. The crucial issue is whether simplified diagnoses force them to ignore key sources of unexpected difficulties. Systems with high reliability resist the temptation to simplify or rationalise what they experience at work, preferring to consider multiple scenarios and to test likely possibilities.

HRO employees adopt systems to help them maintain situational awareness. Everyone in the organisation should be sensitive to operational needs and realities, and freely communicate operational failures and successes.

Management systems should try to anticipate trouble spots, but high reliability systems also pay close attention to their capability to improvise and respond to unexpected events, the goal being to bounce back as soon as realistically possible.

Reliable systems allow decision authority to ‘migrate’ to relevant experts. Hierarchies are loosened, especially during high tempo periods, to match functions with experience.

Culture of denial

“Mindful updating is facilitated by processes that focus on failures, simplifications, operations, resiliencies, and expertise.”

Culturally, one aspect that differentiates collectively mindful organisations from others is the common human culture of denial. Denial is a default culture, sustained by a series of comforting, but false, beliefs.

‘It could not happen here’ or ‘it could not happen again’ is a commonly held belief. Before incidents like the Gretley mine flooding in Australia, there was a general belief that ‘it could not happen here’. Managers would have said: “We know about these hazards. We know about the dangers of flooded old workings. It is not a problem here because we have it under control.”

They had maps from the New South Wales Mining Department which showed that they were at least 100m away from those old workings. They ‘knew’ that water seeping through the mine face could not indicate danger, because they believed it was a natural occurrence in a ‘wet’ mine, and that they were not at risk.

Denial is a common response to warning signs. Warning signs are usually ambiguous, inviting multiple interpretations and dangerously favourable conclusions.

Warnings ignored

Warning signs are usually intermittent, inviting people to believe that the potential problem has ‘gone away’.

‘Normalising evidence’ is another human tendency: looking for other ways to interpret data that enable people to ignore hazards and risks.

At Gretley mine, when safety officers went underground, they saw water coming out of the mine face. One of them noted this in his end-of-shift report, but the manager assumed that it was a ‘naturally wet mine’ and did not respond to the report at all, just as other managers had ignored similar reports, as the incident investigation later found.

‘Onus of proof’ is another aspect of cultural denial. Warnings are ignored until those raising the alarm prove the threat imminent, instead of triggering a response.

‘Groupthink’ is another human habit that entrenches a culture of denial. Large groups assume differences of opinion and allow debate, but small groups easily presume unanimity, and members fall in with a perceived leader, peer pressure, and group self-image.

High reliability organisations should ensure that somebody in each group is empowered to disagree, to call for evidence, and to initiate discussion or call on higher authority, without assuming authority.

Groupthink symptoms

Small groups are vulnerable to a particular set of dynamics that reduces the reliability of their work and makes them prone to loss incidents. Groupthink symptoms include:

• Illusion of invulnerability; excessive optimism that encourages taking extreme risks.
• Collective rationalisation; discounting warnings and sticking to assumptions.
• Belief in inherent morality; faith in the rightness of an assumed cause, ignoring ethical or moral consequences of decisions that appear to misalign with the cause.
• Stereotyped views of others; a perceived ‘enemy’ makes effective responses to conflict seem unnecessary.
• Direct pressure on dissenters; pressure to suppress arguments against assumed group views.
• Self-censorship; doubts and deviations from perceived group consensus are not expressed.
• Illusion of unanimity; majority views and judgments are assumed unanimous.
• Self-appointed ‘mind-guards’; members protect the group and leader from information that contradicts group cohesion.

Mindful culture

High reliability organisations overcome common human group impulses by cultivating a set of attitudes and habits, aligned to their goals and a value system.

HROs strive for an ‘informed culture’ that creates and sustains intelligent wariness, based on four co-existing subcultures:
• Reporting culture: reporting errors and near misses
• Just culture: a formalised, trusted way of drawing the line between acceptable and unacceptable acts, rather than blanket blame
• Flexible culture: ready to adapt to sudden and radical pressures
• Learning culture: converting lessons into reconfigurations of assumptions, frameworks, and action

Leaders change minds

Mindful leaders ‘pay attention in a different way’. They discount information that confirms hunches, is pleasant, feels certain, seems factual, is explicit, and invites agreement, and concentrate instead on things that disconfirm, are unpleasant, uncertain, merely possible, implicit and contested.

Reliable organisations require a particular set of qualities from their leaders. Mindful leaders are influential because they have integrity, inspire and engage others, and are visionary, challenging, competent, involved, caring and supportive.

• Characteristics of organisational success and failure were examined at a High Reliability Organisations (HRO) seminar hosted by Saacosh and Global Prospectus at Montecasino in Johannesburg in January 2011. The article is based on seminar material.

AUTHOR PHOTO: Francois Smith, managing member of Saacosh.

