One year ago the domestic flight Lion Air Flight 610 crashed into the Java Sea thirteen minutes after takeoff, killing all 189 people aboard. The final investigation report was published on 25 October 2019. The media were keen on having their most pressing question answered: was it the aircraft, was it the airline, or was it the crew? But the question from which we can derive enduring insights is not who was accountable. The question should be: how and what can we learn from accidents in which responsibility is dispersed across multiple parties?
Figure 1: Swiss cheese model from James Reason
Dispersed Swiss cheese slices
Air crash investigators in Indonesia identified ‘9 holes in the Swiss Cheese’ (see figure 1). As in any accident, a chain of failures led to the disaster. To name a few:
Boeing’s responsibility – The aircraft (Boeing 737 Max) had an identified weak spot: the Manoeuvring Characteristics Augmentation System, or MCAS. A crucial sensor feeding this system was not properly tested. Moreover, the emergency procedure for correcting a malfunctioning MCAS could not be performed within a realistic time frame.
Airline’s responsibility – The aircraft should already have been grounded due to earlier identified cockpit issues. However, these were not properly recorded; in fact, the maintenance log was missing 31 pages in total. In addition, Lion Air allowed underperforming personnel to take an active role in flying aircraft.
Crew’s responsibility – The first officer was unable to perform the required procedural drill for emergencies, and it reportedly took him four minutes to find a manual to assist him. On top of that, he was not properly briefed on the solution that his captain eventually offered.
Figure 2: The top events of the three bowties that can be identified
All barrier failures appear in a perfect (chrono)logical order. However, the slices do not align within the scope of a single bowtie. In this accident scenario, we can identify at least three bowties, and the main problem is that their scopes and responsibilities are not guarded by one overarching party. The bowties and their corresponding accountabilities are managerially disconnected, so the barriers and their possible (in)effectiveness never become apparent in a single overview.
Regaining instead of chaining
This lack of alignment is a common denominator in the analysis of all (major) accidents. But while the media, victims, insurance companies and many other institutions are primarily concerned with pinpointing the problem to a specific organization, the real lesson to be learned is how to deal with this lack of control.
The answer does not lie in rescoping the analysis into one (too) massive chunk. Ultimately, the world remains as it is, and large corporations will not oversee how their part in a sequence of responsibilities affects an entity operating in another process down the road. Air crash investigation reports such as this one give a lot of insight into what happened, but the main lesson, as always, should be that all players involved are equally responsible. It wasn’t just the design, it wasn’t just the safety process, it wasn’t just the human error; it was all of the above.
Figure 3: Plan-do-check-act cycle
An important lesson from this report is that an accident such as this one could have been prevented at every stage if appropriate action had been taken. The value of learning from incidents is lost when a management control cycle ends at the ‘check’ phase. Each entity involved should have followed up on its own anomalies using the recommendations, and then acted accordingly (see figure 3 above). It is crucial to take action immediately after an incident or minor accident, to prevent potential catastrophes from happening. Even a ‘near miss’ and unreliable barriers should receive serious attention, in order to regain full control.
Would, could, should
In conclusion, no truly new insights emerge from analyzing an accident such as Lion Air Flight 610 and reading its final report. To bring risk management and safety cultures to a higher level and a brighter future, the gap between theory and practice within risk management must be diminished. The pilot should not have been allowed to fly that specific aircraft, Lion Air should not have allowed the aircraft to take off after the earlier problems with the MCAS, and Boeing should not have been allowed to market a poorly designed aircraft.
Plan, do, check, act… and do not miss out on continuously evaluating the implementations thereafter. Be accountable instead of guilty.