Accident Precursors and Hindsight
7/18/2016
After an unfortunate incident, it is common to discover through diligent investigation that there were indicators, missed signals, and dismissed alerts that preceded the event, and that if these “signals” had only been heeded, disaster could have been averted. One need only review some of history’s most famous catastrophes: the sinking of the Titanic, the 2000 Concorde crash, the two space shuttle disasters, and the Chernobyl incident in 1986. Each of these disasters was preceded by an indicator (sometimes multiple indicators) that could have alerted anyone in the immediate area that trouble was near, if only someone had taken notice AND taken the initiative to either warn others or perform a corrective action. For instance, an O-ring failure was determined to be the root cause of the Challenger incident in 1986; yet it was recognized on earlier shuttle flights that the same O-rings had partially failed. While these minor failures were recognized, corrective actions were not enacted. After the loss of the Columbia flight on February 1, 2003, investigators found that insulating foam had become detached from the external tank and pierced the orbiter’s thermal protection system. Although managers at the National Aeronautics and Space Administration (NASA) had observed debris strikes on numerous prior missions and had recognized them as a potentially serious issue, the Columbia Accident Investigation Board concluded that, over time, a degree of complacency about the importance of debris strikes caused the issue to go unaddressed.
Of course, with all the stories and movies about the Titanic, the poor decisions made by the captain, as well as the brashness (or complacency) of the owning corporation (the White Star Line), are well known. These include: 1) moving the ship too fast through treacherous, iceberg-laden waters; 2) not heeding iceberg warnings; 3) poor communication of the steering directions meant to avoid the now-infamous iceberg; and 4) a lack of an appropriate number of lifeboats. The Chernobyl incident likewise had a plethora of causes, including poor decision-making driven by time expediency, consulting the wrong subject-matter experts for oversight, and initiating testing procedures in the middle of a shift change (the oncoming shift was not completely briefed on the scheduled operations). All these disasters had numerous indicators; had a person in charge bothered to understand their potential implications, appropriate measures could have been applied to prevent these historic catastrophes.
Accident investigation research has shown that incidents have multiple precursors. And while a precursor can be a near miss or, itself, an unwanted event (such as an O-ring failure), precursors typically have less severe consequences than the subsequent event that triggers a formal investigation. As such, indicators and precursors can be inexpensive learning opportunities for understanding what could go wrong. By encouraging personnel to identify and report precursor events, it is possible to unearth numerous instances of potentially serious safety gaps. In contrast, failure to solicit, capture, and benefit from precursor information simply wastes a valuable resource that could be used to improve safety.
Subject-matter experts within the field of S&H have developed a variety of approaches to improve precursor-reporting programs, each with its own advantages and drawbacks; there is no one-size-fits-all program. Yet while effectively managing precursors is challenging, choosing not to use precursor information to improve safety is unacceptable in high-hazard industries. Because precursor events are learning opportunities to improve safety, there have been a number of cases where OSHA has acknowledged that failing to actively learn from these events may be perceived as negligence.
While it may be easy to acknowledge the benefits of incorporating precursor investigations within a safety program, defining a precursor can be surprisingly difficult! This is because the actual influence of a precursor may be judged insignificant or inconsequential, and therefore nothing is done. For instance, a driver may sneeze while going through an intersection, causing a momentary decrease in road awareness. Should that “less-than-one-second” loss of focus, with the sneeze as its precursor, actually be reported? When such small causes are included as factors, the organization’s reporting system is said to have a “low reporting threshold.” When this occurs, a process can be overwhelmed by false alarms or inconsequential events, especially if some corrective action or substantial analysis is required for every reported event. A low threshold can therefore lead to a perception that the reporting system is of little value. Such was the case when a large project insisted on broadcasting ALL field incidents: one day project personnel received an e-mail reporting that an employee had suffered a mosquito bite. The message explained that the employee was administered first aid and returned to work without any further incident. Obviously, everyone who read the message sensed that reporting at such a minimal level was unnecessary and even diluted the effectiveness of incident reporting.
Conversely, when the threshold for reporting a precursor is set too high, a different type of negative consequence emerges. True, a high threshold yields fewer reports of inconsequential events. However, there is a possibility that some risk-significant events will not get reported, raising the concern that potentially valuable data will be unintentionally ignored, thereby leaving an operation susceptible to unidentified precursors.
These competing tradeoffs can lead to type I (false positive) and type II (false negative) errors, which can result in too much investigation of issues that are not problematic and too little investigation of issues that are.
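To make the tradeoff concrete, here is a toy Python sketch. It is purely illustrative: the events, severity scores, and the notion of a single numeric reporting threshold are invented assumptions for this example, not drawn from any actual reporting program or standard.

# Hypothetical precursor events: (severity score 0-10, truly risk-significant?)
# Both values are invented for illustration only.
events = [
    (0.5, False),   # mosquito bite, first aid only
    (2.0, False),   # momentary loss of focus while driving
    (4.0, True),    # recurring partial O-ring failure
    (6.5, True),    # foam debris strike on prior mission
    (1.5, False),   # minor cut, no lost time
    (5.0, True),    # near miss with heavy equipment
]

def count_errors(threshold):
    """Report every event at or above the threshold, then tally
    type I errors (reported but insignificant) and
    type II errors (unreported but risk-significant)."""
    type_1 = sum(1 for sev, sig in events if sev >= threshold and not sig)
    type_2 = sum(1 for sev, sig in events if sev < threshold and sig)
    return type_1, type_2

for threshold in (1.0, 3.0, 6.0):
    t1, t2 = count_errors(threshold)
    print(f"threshold={threshold}: type I={t1} (noise), type II={t2} (missed)")

Running this, the low threshold (1.0) generates type I noise such as the mosquito-bite report, while the high threshold (6.0) silently drops risk-significant events such as the O-ring failures, which is exactly the tradeoff described above.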
This leaves us in a very interesting predicament. When do we report an incident? Do we ignore minor cuts and even near misses (after all, no harm, no foul, right?)? As we can see, there is no silver bullet that captures everything perfectly. However, when in doubt, it is always best to take the conservative approach and inform your supervisor (e.g., project manager, field team lead, etc.). This information will flow to management and your organization’s ES&H staff, who can then determine whether further investigation is warranted.
“He knows not his own strength that has not met adversity.”
Ben Jonson