I recently had the opportunity to attend a short training session on hoist and crane safety. Rigging and lifting is not my area of expertise, but like most industrial workers I have had some introductory training—just enough to ensure that I know to stay out of the cone of exposure and to recognize unsafe conditions. This training was a good refresher and taught me a few things to look for when observing a lifting evolution.
Part of the training covered recent crane accidents. One in particular caught my attention; here is a very high-level overview.
An outdoor gantry crane was being operated in high-wind conditions, but the operators weren’t sure just how high the winds were because the crane’s anemometer, its wind-speed indicator, was broken. Three of the crane’s four storm and parking brakes, the brakes that would normally lock the crane in position, were inoperable. A particularly strong gust of wind started the crane rolling down its rails. It picked up speed so quickly that the normal brake could not stop it. The crane slammed into the blocks at the end of the track. The crane operator, who was not wearing a seatbelt, was ejected from the cab and suffered fatal injuries.
The discussion about the event centered on the decisions of the crane operator. Why was he attempting a lift in high-wind conditions? Why was he not wearing his seatbelt? This is the typical response: we all think about how we could avoid a similar outcome by changing the things within our control, the risky decisions.
We do this all the time. We will watch a news report about a random homicide that could easily have been us, until we hear the piece of information that allays our fears. “Oh, it happened downtown at 3 a.m.? That couldn’t be me; I’d never be downtown at 3 a.m.” Someone walked into traffic while staring at their phone? “I always look where I’m walking; no way I’d do that.” All those people on the Titanic who drowned or froze to death? “Well, I would have gotten on one of those half-empty lifeboats or fashioned a raft out of life jackets and debris or insisted Rose share that door. I could have made it.” It helps us sleep at night to think OUR decisions are better and we would avoid danger. It’s not an uncommon reaction.
What really caught my attention, though, was the lack of discussion around the system breakdowns that helped set this event up. The first principle of Human Performance is “People are fallible; even the best make mistakes.” Is your work system resilient to these potential mistakes? For example, if the anemometer had been working, the crane operator would have had solid information and might not have made faulty assumptions. Even if he had decided to proceed in the face of risky weather conditions, a functional set of storm and parking brakes would have kept the crane from rolling out of control. Why wasn’t there an interlock that prevented crane operation with the seatbelt disengaged? If any of these error defenses had been in place, this tragedy likely could have been avoided.
The next time you hear about an industry event or lessons learned, after you consider how you could have made better choices, take some time to examine the organizational breakdowns and ask how the system failed to be resilient to human error.
Toolbox Talks offers quick insights and thoughts to use for your toolbox (tailboard) talks. Dave Sowers is a founding member of Knowledge Vine, a veteran-owned human performance training and consulting organization that strives to reduce the frequency and severity of human errors in the workplace. He has almost 30 years of experience in power generation and the utility industry. He is a veteran of the U.S. Navy Nuclear Power Program and holds a bachelor’s degree in resources management and master’s degrees in both management and emergency management and homeland security.