The Misunderstood Mind
Get checked out on the most powerful piece of equipment used on every job site.
- By Steve Casner
- Dec 15, 2017
They trip, fall, and crash their cars while looking down at their phones. They skip the PPE and get hurt using familiar tools. They make mental notes to do something critical but then forget. In my recent book called "Careful," there's a story about a kitchen worker who disregarded the procedure to unplug a meat slicer before cleaning it. As he scrubbed away at the circular blade, a loop on his apron tie caught the on/off switch and powered it up.
If you survey your workers, you'll hear them blame these mishaps on complacency, carelessness, or everyone's favorite: a conspicuous lack of intelligence. But psychologists offer a different explanation. Your workers aren't really that lazy or reckless. They're not stupid and they're not on a fast track to genetic extinction. When people engage in these risky behaviors, they truly believe that nothing all that terrible is going to happen to them. Sure, they'll admit that there are risks involved and they know that others occasionally succumb to them. But most workers firmly believe that they are going to be just fine—especially just this one time. If they really thought they were going to get hurt, they wouldn't do it.
When it comes to safety, most of us have some fundamental misunderstandings about how our own minds work. We remain confident in our ability to divide our attention between two activities. We believe that our experience will shield us from the occasional errors that make us human. We remain convinced that we'll remember to do important stuff, and we imagine ourselves to be foreseers of potential hazards. Meanwhile, psychologists have spent the past 70 years conducting experiments that show that few of us actually possess any of these superpowers.
Here's the setup for our typical experiment. We'll bring in a few dozen people and show them the job we're going to ask them to do. We might have them try to pay attention to a phone and a road at the same time. Or use a tool that they've handled many times before. Or maybe we'll ask them to remember to do something important at an appointed time. Or to rate the riskiness of an activity like climbing a ladder without a safety harness or trying to outrun a tornado in a pickup truck. If you ever have the chance to be in one of these experiments, do it. It's always fun stuff.
But before we get on with the business of measuring how well people perform at the task, we like to ask them how well they think they are going to perform, so that once they're finished we can compare the two results.
What we almost always find is that people aren't very good judges of their own capabilities. After we hear participants go on about their impressive abilities to remain vigilant, we find that even the best-trained among us aren't that good at keeping an eye on much of anything for very long, and when we try to keep watch over two things at once, we invariably miss stuff.
After listening to people tout the protective effects of training and experience, our studies show that experts screw up just as often as beginners, only in different ways and for different reasons. Practice doesn't make perfect. To err is human, and no amount of practice is going to make it stop. The way to never stab yourself in the hand with a sharp knife isn't through training and experience. It's through training and experience followed by putting on a nice pair of safety gloves. Because sooner or later, no matter how trained and experienced you are, that knife is going to head straight for your other hand.
Memory is one of our favorite things to test. We've watched workers as skilled as airline pilots remain confident that they'll remember to do something important but then forget even after being reminded to do it seconds beforehand. And as we become more experienced (i.e., older), our memories don't improve much.
Experienced workers love to think that they can see danger coming 10 miles down the road. But our experiments show that stopping to think things through is the hallmark of the beginner who has no choice but to fumble through the instruction manual. A seasoned expert just fires that meat slicer right up like a boss. Our kitchen worker's years of experience are what allow him to run that slicer on autopilot. What chance did he have of avoiding the subtle hazard of that dangling apron tie? He had one: to follow the prescribed safety procedure. The problem was that he didn't fully understand why he needed to.
You'd think that once a few accidents happened, workers would catch on to these misunderstandings and correct them. They don't because workers don't view human error through the same lens that a psychologist does. When an accident happens, they don't hold up the safety manual and realize that whoever wrote it was right all along. They hold up a different book called "The Darwin Awards" and they blame everything on idiots.
Changing How Workers View Human Error
So how do we change the way our workers think? Research has given us clues about how to do that, too. The secret lies in what happens when we show our participants their experimental results together with their original predictions of how well they thought they would do. The best word I can think of to describe it is epiphany.

I remember a Learjet pilot I tested more than 20 years ago remarking how his sense of whereabouts had so markedly improved since GPS was introduced. Midway through the flight, I switched off the GPS and asked him to tell me where we were. He quickly realized that although his GPS knew where we were, he was a little unsure about it. At that moment, he looked like his entire belief system had just been upended. He confessed: "I guess I'm more dependent on my GPS than I thought I was."

We then had a short chat about what psychologists call the depth-of-processing effect. When information is simply handed to us, we don't tend to think about it much unless we have a pressing need for it. But when we have to calculate that information ourselves (barefoot in the snow, uphill both ways), that effortful processing tends to make the information stick in our memories for a while. Sure enough, pilots who used paper maps instead of GPS in that same experiment had no problem telling me where they were, even when I snatched their maps out of their hands. Navigating the old-fashioned way had installed that map in their heads.

How many pilots in that experiment mistook what a computer knew for what they themselves knew? All of them. Today, 20 years later, nobody falls for this one any more. Pilots today crack jokes about people who would mindlessly follow the color display on a GPS. They'd call them "dogs watching cartoons."
Retraining Our Brains
The bottom line here is that before workers can see the value in what your safety program provides them, they need to understand why they need it. And to do that, they need to understand the many misunderstandings most of us have about how our own minds work. Once that understanding is in place, your workers will begin to see your safety messages through a different lens. Improved safety metrics won't be a matter of "getting rid of some of these idiots out there." They'll be a matter of retraining our brains to stop and think when approaching something that looks like it could do some damage: that whatever equipment they are using could hurt somebody, and that the somebody could be them, no matter how sure they are that they'll do just fine.
Over the past 20 years in aviation, we have re-engineered our very attitude toward our own sometimes fallible human minds. Walk up to the cockpit and listen to the way pilots talk today. They proudly blurt out things like, "I make a mistake every two to three minutes!" Or: "Steve, I have an excellent memory. But it's short." Pilots today understand that their next brain freeze could happen any time, but they're all set up to survive it. Don't get me wrong, pilots still have egos just like in John Wayne movies, but they have been trained on the basics of how our mostly rational minds work and it's now another part of their proudly possessed skill set. We just saw 75 million flights make it to their destinations without a single fatality.
But while pilots are busy defying gravity and breaking safety records, any of us could use that same understanding of our own vulnerabilities to safely perform what the injury statistics suggest are even riskier tasks: walking across a shop floor, using a sharp knife, or trying to keep a car between two painted lines. This really isn't rocket science. It also may not be optional any more. As we fill our workplaces with new technologies that present us with even more subtle hazards, things could easily get worse. Now is probably the perfect time to up our understanding of the most powerful, high-tech, but misunderstood piece of equipment used on every job site.