Learning from Success

Learning From Success: A Discussion Paper

June 2017

"If the lessons from incidents were as simple as our current approach to incident investigations suggests, why haven't we learnt the lessons already?"

Michael Tooma, Partner, Clyde & Co

Contents

Introduction ........................................................ 1
The limitations of only looking at failure .......................... 2
    The limitations of our traditional approach to looking at what goes wrong for learning ... 2
The benefits of a more positive approach ............................ 7
    What can we learn from positive psychology? ..................... 7
    The value of near miss investigation as an examination of success ... 8
    Why find out about what went right? ............................. 10
A positive methodology for learning from success .................... 12
    Near miss/near hit investigations ............................... 13
    Scenario 1 Case Study: Near miss/near hit involving Qantas Flight 32 ... 15
Serious Incidents ................................................... 19
Scenario 2 Case Study: Australian Transport Safety Bureau Report – Safe working irregularity following rail track maintenance ... 25
Learning from success during projects ............................... 29
Tips for exploring success .......................................... 30
Scenario 3 Case Study: The London Summer Olympics ................... 34

Introduction

The key to improving at anything is learning. But what are we learning from? So much of the time in safety, the conversation is about learning from what has gone wrong. An incident has occurred. What went wrong, we ask? How do we prevent that incident from recurring?

But if that is all we do, are we not missing out on many potentially fruitful lessons?

What happens if we turn this paradigm on its head? Instead of focusing all our attention on failures in safety performance, what if we started to learn from success? What would that look like? There is success in everything we do. Even in failure, there may be successes from which we can learn. While we have a tendency to cherry-pick what went wrong in the lead-up to an incident, the reality is not so simple. Undoubtedly, there will also be a number of things that proceeded according to plan and worked successfully. There is benefit in learning from that.

In this paper, we will explore:

- the limitations of only exploring past failure for 'lessons learnt'
- the benefits of a more positive approach
- a methodology for learning from success, and
- scenarios using the methodology.

We need to find the passion for learning from all that we do: our successes as well as our failures. We hope that this paper provides some inspiration for moving beyond simply looking at failure to derive our lessons in order to achieve better safety outcomes.

Contact: Alena Titterton, Partner
T: +61 439 077 702
E: [email protected]
LinkedIn: https://au.linkedin.com/in/alena-titterton-91888a56

Contact: Michael Tooma, Partner
T: +61 457 087 952
E: [email protected]
LinkedIn: https://au.linkedin.com/in/michael-tooma-0b222421


The limitations of only looking at failure

The limitations of our traditional approach to looking at what goes wrong for learning

The challenge for everyone involved in incident management and investigation is that we fall victim to the mindset of over-simplifying causes of disasters. Disaster reports are starting to sound the same. They all tell the tale of a company that preferred profit over safety. They make recommendations for greater safety leadership, the need for a safety culture and for a more effective regulator. We analyse things in the same way, make the same recommendations and somehow expect the result next time around to be different.

We do this at a smaller scale also. The imperative in the aftermath of an incident is to minimise the impact of the incident. That means reducing shutdown time associated with damaged equipment, regulatory notices or industrial action. This often leads to reactive and narrow-focused decision making on corrective actions – a new safe working procedure and training course, for example, is the most popular corrective action. The assumption is that if we identify the cause of the incident, we can simply develop a procedure for addressing it, train workers in the procedure and require them to follow it.

That thinking satisfies the regulators, they being more eager than most to move on to the next incident being investigated. It limits liability in that it is usually accompanied by a third feature – the implicit or explicit blame of the workers involved, either for needing a procedure or for failing to follow it. It also satisfies the conscience of managers who feel they have addressed the issue as soon as it came onto their radar. Crucially, it is cost effective. A procedure is relatively cheap compared to an engineering solution.

Of course, if that procedure works and will be followed, then the problem is truly solved and all objectives have been met. In a perfect world, that would be the case since, for example, it is in the best interest of the company to ensure that the procedure is comprehensive and effective. It is in the best interest of the workers to understand the procedure and follow it; after all, it is there for their health and safety. Unfortunately, the world is not that simple. There are a number of biases that prevent us from understanding what goes wrong. For example, investigations suffer from hindsight bias – we know exactly what happened and how it happened so it seems obvious to us where the operator went wrong. But if it seems so obvious to us, why would they have done what they have done? Surely no one goes to work with the intention of hurting themselves
or others around them. If they did something wrong, it must have seemed like a good idea at the time, right? Further, we engage in cherry picking of events. Things take on more significance with the benefit of hindsight than they did in real time. We focus on those things because they fit a certain narrative, an explanation for the wrong path taken. But identifying them in this way has little preventative value because they clearly did not seem obvious to workers, managers and/or leaders in real time.

The problem with focusing on what went wrong is that we never get the full story

In addition, people get defensive when you ask them about what went wrong. They either deliberately or subconsciously distort the truth to sound more favourable or to make their mate sound more favourable. While you can improve that somewhat by creating a just culture in the workplace, you can't eliminate it altogether. What you get, ultimately, is the version of the truth that people are willing to share with you, so you never fully capture the learning from any incident. These problems are further exacerbated by the fact that most of our investigation techniques are better designed for storytelling than for analysis. They are based on uncovering the linear timeline of events: "This happened and then that happened." But the order in which things happen often distorts the understanding of why they happened: conditions important to the incident may have existed within the system for many years. This limits our analysis of incidents.


So how do we factor that into incident investigations? By accepting that, faced with the same facts, people will not necessarily behave in the same way. The world we live in is far more complex than that. Dekker (2011) observes:

"Rational decision-making requires a massive amount of cognitive resources and plenty of time. It also requires a world that is, in principle, completely describable. Complexity denies the possibility of all of these. In complex systems (which our world increasingly consists of) humans could not or should not even behave like perfectly rational decision-makers. In a simple world, decision-makers can have perfect and exhaustive access to information for their decisions, as well as clearly defined preferences and goals about what they want to achieve. But in complex worlds, perfect rationality (that is, full knowledge of all relevant information, possible outcomes, and relevant goals) is out of reach ... In complex systems, decision-making calls for judgments under uncertainty, ambiguity and time pressure. In those settings, options that appear to work are better than perfect options that never get computed. Reasoning in complex systems is governed by people's local understanding, by their focus of attention, goals, and knowledge, rather than some (fundamentally unknowable) global ideal. People do not make decisions according to rational theory. What matters for them is that the decision (mostly) works in their situation." i

Dekker (2011) goes on to explain that this perfectly normal reaction to the rules being imposed on us at a local level can accumulate at an organisational level with harmful consequences. He explains:

"Local decisions that made sense at the time given the goals, knowledge and mindset of decision-makers, can cumulatively become a set of socially organized circumstances that make the system more likely to produce a harmful outcome. Locally sensible decisions about balancing safety and productivity – once made and successfully repeated – can eventually grow into unreflective, routine, taken-for-granted scripts that become part of the worldview that people all over the organization or system bring to their decision problems. Thus, the harmful outcome is not reducible to the acts or decisions by individuals in the system, but a routine by-product of the characteristics of the complex system itself." ii

These inconvenient truths of the complex reality we face pose a challenge to the conventional wisdom around incident investigation, which is typically concerned with uncovering "the truth" and, indeed, more so, "the root cause" of an incident. Consider the pioneering approach of James Reason in the "Swiss cheese" theory, a theory on which most modern incident investigation techniques are based.


The theory is that, like holes in Swiss cheese slices, all systems have deficiencies or inadequate defences. The causal trajectory leads to an incident when those deficiencies or failed defences in the system line up. It follows, then, that proactively increasing the defence layers reduces the likelihood of an incident. It also follows that, in attempting to analyse an incident, a better understanding of that trajectory will uncover the absent or failed defences which enabled the system failure. Logically, then, in the aftermath of an incident, those system deficiencies are identified and addressed through corrective actions, thus reducing the holes in the Swiss cheese slices and therefore reducing the likelihood of a recurrence of the incident.

But complex systems' incident trajectories are often unique. That is, addressing what went wrong in a particular incident will only help prevent that exact sequence from recurring. But the likelihood of the planets, or Swiss cheese slices, aligning in exactly the same way is very remote. It is more likely that the next incident will involve a different trajectory and different holes in the Swiss cheese slices. Addressing the specific sequence that caused the incident will not address potential paths that the incident trajectory could have taken but for certain events.

We often look at an incident sequence amazed, but relieved, that things were not much worse and that they could, in fact, have been, had it not been for some "lucky" event. But other than such a casual observation or a remark in an incident investigation report, little is done about those "other" non-causal events. That is, events that either prevented the incident from being of greater impact or the possible trajectory that did not occur – the road not travelled but which could have been travelled.

"most of the root causes of serious accidents in complex technologies are present within the system long before an obvious accident sequence can be identified". iii

That is, the holes are there, if only our investigation techniques could uncover all of them and not just those involved in the incident. Yet most investigation techniques are linear in their approach, seeking out the exact causal sequence – the truth of what happened – and then uncovering the root cause(s) which led to that factual sequence.


But is it appropriate in a complex world to maintain a linear view of incident causation? Isn't the road that is not travelled just as instructive to further incident prevention as the road that is actually travelled? Indeed, in many respects, the "lucky" control is more instructive for incident prevention than the failed or absent control. If we adopt that approach, building system resilience is not simply achieved by adding Swiss cheese layers as the orthodox view of the theory may suggest, but also by uncovering and plugging holes within each layer that are reasonably related to the incident although not causally connected to it.


The benefits of a more positive approach

What can we learn from positive psychology?

Health and safety is not the only discipline that has had to grapple with a focus on the negative and move toward a more positive approach. The psychology discipline, which is often a source of inspiration and empirical
