The aviation industry could be called the godfather of the checklist world, as pre-flight checks are something that have been happening (we hope) for a long time. So, when a recent publication by NASA’s Aviation Safety Reporting System (ASRS) delved into the main causes of error in checklist usage, we listened.
Almost all businesses have checklist procedures in place for certain activities. Be it workshop inspections, equipment audits, or even something as simple as a pre-start check, checklists are prevalent across the workplace, and getting the process right could quite literally mean the difference between life and death.
While checklists are great at guiding us through our procedures, they are by no means impervious to human error. NASA grouped the most common errors into five categories.
1. Checklist Interrupted
Interruption and distraction ranked as one of the major causes of human error when it came to checklist completion. Distractions inherent in last minute checks can easily result in omissions.
Using the aviation industry as the pin-up model for checklist usage, here is a first-person example of how it can all go wrong when distractions are present, pulled from NASA’s original ASRS publication.
“During the accomplishment of the Before Pushback checklist, the Flight Attendant brought in the passenger count documentation at exactly the moment the First Officer read the ‘Takeoff Trim’ item.
I responded to the Flight Attendant interruption and subsequent verbal exchange and then the First Officer and I proceeded to the next item, ‘Cockpit Door’, without actually having reset the takeoff trim to the correct setting. During the takeoff, we received a Takeoff Warning horn as I advanced the throttles for takeoff.
At approximately 10 knots, I rejected the takeoff and accomplished the immediate action items while the First Officer notified the Tower of the rejected takeoff. After clearing the runway and finishing the checklist items, I discovered the takeoff trim was not set in the proper position and was out of the green band area.”
This is a reminder to be extra vigilant and aware of the impact of distractions when completing a checklist.
2. Checklist Item Overlooked
Even though you may habitually perform an inspection checklist on a regular basis, there is still a chance of absent-mindedly overlooking an item. Here is a first-person case example, taken from NASA’s original publication, of an MD-11 Captain experiencing just that.
“Pulled into the gate, set the parking brake, and shut down the Number 3 engine. We waited a short time for external power and when we got it, I connected to it then shut down the Number 1 engine.
I did the Shutdown checklist, debriefed, discussed the strange taxi routing, and left the aircraft. There was no crew bus so the First Officer went up to the cockpit to call for one and saw that the Number 2 fuel lever was still up. He shut off Number 2 and came back down to the ramp and informed me that the engine was still running when he went up to the cockpit.
I rarely taxi in on three engines and in this case did just that. I went through my normal shutdown habit pattern which is just shutting down one and three. I missed it on the shutdown checklist because I didn’t actually look at the levers because, in my mind, I was convinced I had shut them down.”
It is important to visually check every item on your checklist, because the visual check is what catches you when your habit pattern is broken.
3. Use of the Wrong Checklist
If your company has multiple checklists in place and they are poorly organised, there is a chance they can be mixed up. This example tells the story of a B757 crew and how using the wrong checklist can make a situation worse.
“On departure at approximately 300 feet AGL the First Officer’s Primary Flight Display (PFD) and Nav Display (ND) went blank. I assumed control of the aircraft and after reaching a safe altitude called for the First Officer to open his QRH and find the appropriate abnormal checklist for our situation (loss of right PFD and ND). The First Officer said he was ready to proceed and he read the first item on the checklist. I do not recall whether the First Officer read the title of the checklist aloud before he read the first item on the checklist.
The checklist called for us to check two circuit breakers supplying power to the Symbol Generator. Both circuit breakers were in. Next item on the list called for the Symbol Generator-1 Power circuit breaker to be pulled and then reset.
The circuit breaker was pulled and this resulted in the loss of the Captain’s PFD and ND. At this point it was determined that the First Officer was reading the checklist for loss of left PFD and ND and we immediately attempted to reset the Symbol Generator-1 power circuit breaker with no success. We then completed the QRH procedure for loss of right PFD and ND, but we did not regain the First Officer’s PFD or ND.
Upon reaching the gate, Maintenance met the aircraft and upon opening the E&E Compartment they discovered a great deal of water had accumulated in that compartment from an unknown source. It would appear that the accumulated moisture/water caused the loss of the First Officer’s PFD and ND and prevented the successful reset of the Symbol Generator-1 Power circuit breaker.”
It is important to always confirm that the correct checklist is being used. Normally this becomes obvious soon after the checklist is started, unless the activities are very similar, as in the example above. Regardless, a simple fix is to ensure your checklists have easily identifiable titles that clearly describe the activity at hand.
Also, from a systems point of view: if the steps of the checklist don’t make sense for the activity you are doing, stop. Backtrack and make sure you are using the correct checklist.
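As a hypothetical sketch (the function and names here are my own illustration, not from the article or from iAuditor), a digital checklist could enforce both safeguards: requiring the title to be confirmed before the first item appears, and requiring an explicit acknowledgement of every item so an interruption cannot cause a silent skip.

```python
def run_checklist(title, items, confirm):
    """Run a checklist. `confirm` is any callable taking a prompt
    string and returning True (acknowledged) or False (declined)."""
    # Guard against the wrong-checklist error: the title must be
    # explicitly confirmed before any items are shown.
    if not confirm(f"Confirm checklist: {title}?"):
        return False
    for item in items:
        # Each item needs its own acknowledgement, so a distraction
        # mid-checklist cannot silently advance past an item.
        while not confirm(f"{title} - {item}: done?"):
            print(f"Item not complete, re-checking: {item}")
    return True
```

In a real product the `confirm` callable would be a UI prompt; here it is injected so the flow can be exercised in a script or test.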
4. Failure to use a Checklist
Checklists exist for a reason. In his book “The Checklist Manifesto”, Dr. Atul Gawande tells the story of a doctor who took a leaf out of the aviation industry’s book in a bid to reduce human error: the checklist.
In 2001, Dr. Peter Pronovost, a critical care specialist at the Johns Hopkins Medical Centre in Baltimore, thought that by incorporating checklists into medical practice he could reduce human error, similar to the way airlines do their pre-flight checks.
Dr. Pronovost began by testing a checklist on just one activity. He had noted a high rate of infections in patients with central intravenous lines, which are notorious breeding grounds for pathogens.
Dr. Pronovost listed five steps staff needed to follow meticulously when inserting these intravenous lines to avoid infection: wash hands with soap; clean the patient’s skin with antiseptic; cover the patient’s entire body with sterile drapes; wear a mask, hat, gloves, and sterile gown; and, after the line is in, put a sterile dressing over the insertion site.
It was at the time, according to The Checklist Manifesto, an idea so simple it was considered crazy. These steps were common knowledge, the process taught this way for years. It seemed silly to make a checklist for something so obvious.
But after two years, the results were in. With Dr. Pronovost’s checklist meticulously followed every time, the rate of infection from central intravenous lines in the Johns Hopkins Intensive Care Unit went from 11% to zero. Dr. Pronovost estimated that over those two years the checklist had prevented 43 infections, avoided 8 deaths, and saved the hospital approximately $2 million.
Checklists have been proven to work. They are in place for a reason, and although certain tasks may be “common knowledge”, human error still creeps in every once in a while. If a checklist is in place for a task, follow it no matter how confident you are, and eliminate the chance for error.
5. Checklist Confusion
When developing iAuditor, SafetyCulture set out to build more intelligent checklists by incorporating features like smart fields. There is a danger, however, in making a checklist too fancy: it can cause confusion for employees.
Wording also needs to be clear, unambiguous, and well defined to avoid confusion or individual interpretation.
This flight crew attributes their confusion directly to the layout of the checklist and its ambiguous wording.
“The Quick Reference Handbook procedure for a L/R FADEC caution message is somewhat confusing. We had to read the procedure several times just to make sure that we were required to shut the engine down. The procedure calls for shutting down the engine “prior to landing” if all other indications are “normal,” but that is poorly defined. Doing the shutdown right away obviously isn’t required, but should you wait until short final or do it further out?
In the end we elected to shut the engine down as we made our descent and were probably still 20 miles or more from the field. This gave us time to review the procedure for single engine landing, make our PA announcement, talk to the flight attendants, coordinate with Approach, etc. Also, while the “NO” side of the checklist leads you to the Single Engine Approach and Landing Abnormal checklist, the “YES” side does not. And yet the “YES” side still requires that the engine be shut down. It would seem only logical that the Single Engine checklist be performed in that case as well.
Upon further review of the QRH, it has come to my attention that the procedure for a FADEC caution, when all other engine indications are normal, was not completed correctly. I misread one of the steps in the procedure that called for the Thrust Reverser to be turned off and instead read it as though the Thrust Lever should be shut off.”
Checklists, particularly those dealing with emergency situations, need to present a clear, unambiguous set of actions that will reach the end result efficiently and without trouble.
To view the NASA Aviation Safety Reporting System’s original publication, click here.
Author: Jarrod Boyd
The information contained in this article is general in nature and you should consider whether the information is appropriate to your specific needs. Legal and other matters referred to in this article are based on our interpretation of laws existing at the time and should not be relied on in place of professional advice. We are not responsible for the content of any site owned by a third party that may be linked to this article. SafetyCulture disclaims all liability (except for any liability which by law cannot be excluded) for any error, inaccuracy, or omission from the information contained in this article, any site linked to this article, and any loss or damage suffered by any person directly or indirectly through relying on this information.