In this four-part series we are looking at a concept called drift: what it is, how it occurs and, ideally, how to prevent it. Drift is an important concept for any leader to understand because it is a way organizations can experience failure or adverse outcomes without anything being broken or anyone making a bad decision.
A Culture of Drift
Drift occurs when "local decisions that made sense at the time cumulatively become a set of socially organised circumstances and norms that make the system more likely to produce a harmful outcome". Think of it this way: a culture forms that is not consciously aware that its practices and behaviour, perhaps even its beliefs, are compromising health and safety outcomes. Drift is a slow, incremental process with multiple steps that occur over an extended period. Each step is usually small enough to go unnoticed, with no significant problems until it's too late, which is why deviations don't seem worth reporting and everything seems normal. The problem is that this behaviour will eventually lead to an injury, an illness or even a fatality; it's just a matter of when.
In Part 1, I introduced what drift is and its primary trigger, the Local Rationality Principle. There are three ingredients to this principle. In Part 2 I explained the first: every organization is dealing with conflicting goals, which cause people to make trade-offs at the expense of health and safety. Over time, these trade-offs cause us to drift toward an unacceptable safety boundary where incidents occur.
The Normalisation of Deviance
The second ingredient to the Local Rationality Principle is the normalisation of deviance.
The concept was coined by American sociologist Diane Vaughan in her work on where conflicts, mistakes and disasters find their roots. She earned acclaim for her investigation of the 1986 Space Shuttle Challenger disaster, which killed all seven crew members on board.
To the Presidential Commission that investigated the incident, there was a linear relationship between scarcity, competition, production pressure and managerial wrongdoing: scarce economic funding led to competition between space centres, which created production pressure to get the launch off the ground (the team was already behind the launch schedule), which ultimately led to managerial wrongdoing.
The Inadequacy of Finding A Root Cause
All of this sounds pretty logical to the untrained eye, and it's certainly how we have traditionally approached understanding incidents and failure. However, because of the web of relationships and feedback loops in complex systems, relationships are NON-linear; things don't happen in a nice sequential fashion. This means the famous Swiss Cheese model, long the traditional explanation of how incidents occur, no longer fits today's organizations. Furthermore, looking for broken parts or bad decisions will not help you prevent future incidents, because with drift nothing needs to be broken and no one needs to make a bad decision for an organization to experience failure. That's why drift is so relevant to your governance role: if you're to ask the right questions, you need foundational knowledge of these concepts.
In your governance role it's important to apply systems thinking by looking up and out, becoming aware of the systems and relationships at play, because these give you a better understanding of why people do what they do and WHY things happen the way they do. Looking down into the organization, by contrast, provides knowledge of the parts: how they work, or what happened in the event of an incident. It won't provide you with an understanding of WHY. Finding a root cause will only tell you what happened, because we go down into the organization to find what's broken. Nor will finding a root cause prevent future incidents: remember, in complex systems the relationships are non-linear, so things don't happen in a sequential order; it's often a lot messier than that.
Vaughan concluded that production pressures did, of course, play a huge role in middle management allowing rule violations and contributing to the silencing of those with bad news, but not in the way envisioned by the Presidential Commission, which ultimately exonerated top administrators, such as the board and the executive, and blamed middle management. Vaughan found that production pressures and resource limitations had gradually become institutionalised and taken for granted; these patterns of behaviour had become the worldview that every individual brought to organisational and operational decisions.
Remember, drift is a slow, incremental process that's difficult to see, so bad processes and deviant behaviour go uncovered because they've become ingrained within the organisation, accepted by both its structures and its culture. Signals of potential danger are acknowledged, then rationalised and normalised. This is the danger: once we normalise this behaviour it becomes our worldview, and we unconsciously apply that worldview to circumstances that shouldn't necessarily warrant the same mindset or approach.
Vaughan defined normalisation of deviance as:
“people within the organization become so accustomed to deviant behaviour that they don’t consider it as deviant, despite the fact that they far exceed their own rules for the elementary safety”.
The normalisation of deviance is important to understand when looking at how the board influences the structures and systems that shape people's perception of priorities. When we focus our attention on blaming individuals for poor behaviour or stupid decisions, we give too little consideration to how organizational structures and systems influence people's decision-making. This is where the board has influence: the board shapes people's perception of priorities, or what's called the organizational climate, through what it notices and comments on, measures, controls, rewards and otherwise systematically deals with, and climate influences culture over time.
The Normalisation of Work-Related Stress
Let me give you a health and wellbeing example. It's not that we're comfortable that work-related stress has been linked to more than 120,000 deaths per year in the United States, making the workplace the fifth leading cause of death in the U.S., or that in Australia, over a five-year period, 91% of claims involving a mental health condition were linked to work-related stress. We didn't consciously create structures and systems designed to make people feel stressed at work. Much of what we have created at work, and what has led to the normalisation of deviant behaviour that compromises our mental and physical health, has happened slowly and incrementally over the years, in tandem with our social and technological environment. This unnoticeable drift has led us to where we are now, with mental health at a critical tipping point.
So that's the second ingredient of the Local Rationality Principle that leads to drift: the normalisation of deviance.
Here are a few tips for you to think about in your governance role to help prevent drift or at least monitor for red flags.
Proactive Governance Measures
From a proactive perspective:
- Don’t measure the success of health and safety in your organization by the absence of negatives like injuries and illness. Safety should not be defined through instances where it is not present.
- Don’t let past success determine future success – this is a principle of complex systems – the past cannot predict the future because complex systems are inherently unpredictable.
Reactive Governance Measures
From a reactive perspective:
- Ask counter-factual questions, the simple 'what if' questions people should be asking when they do an investigation. Challenge people's worldview. Challenge their assumptions and beliefs about the system. If you're looking at an investigation report, ask yourself: 'Would the absence or modification of this "cause" have altered the course of events?'
- Essentially, you're questioning whether an identified root cause is actually something that needs further exploration: is it a "cause" or a symptom of something else? If removing or changing it wouldn't have altered the course of events, then it's likely a symptom.
- Your goal is to learn enough to realise that, given the conditions those involved in the incident faced and the information they had, you probably would have made the same decision.
- To challenge other people's worldview, you need to be open to challenging your own. What assumptions do you hold about the system that might be preventing you from understanding what happened? A common misconception I hear from board members is that safety is all about rules; I hope this video has given you some insight into why that's just not the case.
In Part 4 of this series I’ll walk you through one last ingredient to the Local Rationality Principle that causes drift to occur.
I’m Samantha
I help board members succeed in the boardroom and make a positive impact on the health, happiness and resilience of society through their effective leadership and governance of safety, health and well-being.