
"Can't Rather than Don't" Makes Injuries Impossible

The "can't rather than don't" principle emphasizes using engineering controls to make safety incidents physically impossible, rather than relying on administrative controls that depend on worker vigilance.

Henry Ford was well ahead of his contemporaries in terms of workplace safety, and the principles he used are still applicable today. Among these are what are now called "engineering controls," which make safety incidents physically impossible, as opposed to "administrative controls," which rely on worker vigilance and compliance. The problem with administrative controls is that a worker has to do the job wrong only once to suffer a lost-work-time injury or worse. The best way to understand this principle is with a simple example.

In his 1920 article "How Henry Ford Saves Men and Money," Louis Resnick reported, "Even the simple little sewing machine—of which there are 150 in one department—did not escape the watchful eyes of the safety department. Every now and then, the needle of one of these high-speed machines would run through an operator's finger. Sometimes the needle would break after perforating a finger, and a minor operation would become necessary. When such accidents began to occur at the rate of three and four a day, the safety department looked into the matter and devised a little 75-cent guard which makes it impossible for the operator to get his finger in the way of the needle." 

The administrative control, which consisted effectively of telling machine operators not to put their fingers under the sewing needle, was clearly not adequate to prevent three or four puncture wounds a day. These were even more dangerous in 1920 because neither antibiotics nor the tetanus vaccine had been invented. The engineering control made it impossible to put a finger under the needle: the guard sits between the worker's index finger and the needle, while the fabric still fits underneath the guard.

In his 1931 book Ford: Men and Methods, Edwin P. Norwood wrote that warning signs that admonished people to be careful "find their chief value as cumulative evidence for the defendant in lawsuits" and are of "scant preventive value in shops. … In so far as is practicable, it is not a case of 'Don't' but the installation of devices that stand for 'Can't.'" Instead of reminding sewing machine operators to not put their fingers under moving needles, the installation of the guard in question made it impossible to do so.

To this Norwood added that, if a safety incident did occur, what is now called corrective and preventive action (CAPA) was taken to ensure that it never happened again. This did not, of course, consist of retraining those involved or telling people to "be careful," but rather of applying engineering controls to make the problem impossible. Today, if a worker files a near-miss report, or what Japan calls a hiyari hatto ("experience of an almost-accident situation"), the near miss would be treated as if the incident had actually occurred, and corrective and preventive action would be taken to preclude it. Similar activities would also be evaluated to determine whether they would benefit from the same actions.

Resnick provides another example that applies "can't rather than don't" to punch presses. The administrative controls in place at most companies required tap signaling, or red and green lights, that would supposedly tell the machine operator when nobody's hand was in the press so that he could apply the machine's power to the workpiece inside. As a worker has to get this wrong only once to cause a serious injury, thousands of fingers and hands were lost throughout the country every year. The safety department, led by Ford's safety director Robert A. Shaw, introduced a switch that required each worker present to press and hold down two buttons to close the press. These buttons were well out of the way of the moving parts, so it was physically impossible to close the press if somebody's hand was not on a button. "To trip a press equipped with this device, the operator must press two buttons, about a foot apart, so that it is necessary for him to use both hands. A two-man press so equipped will not trip until four buttons are pressed at one time; a three-man press will not trip until six buttons are pressed simultaneously."
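The interlock logic described above can be pictured in a few lines of code. This is a minimal sketch only, with illustrative class and method names rather than anything from a real machine controller; it shows why the "can't" property holds: the trip condition is false unless every button is held at the same instant.

```python
class TwoHandInterlock:
    """Sketch of a two-button-per-operator press interlock.

    The press can trip only while every operator holds down both
    of his or her buttons, so no hand can be inside the die when
    the press closes.
    """

    def __init__(self, operators: int):
        # Two buttons per operator, all initially released.
        self.buttons = [False] * (2 * operators)

    def hold(self, i: int) -> None:
        """Operator presses and holds button i."""
        self.buttons[i] = True

    def release(self, i: int) -> None:
        """Operator releases button i (e.g., to reach into the press)."""
        self.buttons[i] = False

    def can_trip(self) -> bool:
        # The engineering control: tripping is impossible unless
        # every button is held simultaneously.
        return all(self.buttons)
```

Releasing any single button, for any reason, makes `can_trip()` false, which is the software analogue of the hand-on-button requirement: the check does not depend on anyone remembering to look.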

The need for every worker present to push two buttons, one per hand, to close the press also underscores the principle behind lockout-tagout, which was, in fact, used by Ford more than 90 years ago and before OSHA even existed. Lockout-tagout, as it exists now, requires each worker to lock out the power to a machine or electrical device before anybody works on it. Each worker removes his or her lock upon leaving the job, but as long as even one person is still working, nobody can re-energize the equipment. In this application, the punch press is essentially locked out from doing anything until all the workers press the buttons simultaneously to show that nobody's hand is near the moving parts.
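The lockout-tagout rule described above has the same shape: energizing is permitted only when the set of personal locks is empty. The sketch below is illustrative only (the names are invented for this example, not taken from any safety system), but it captures the invariant that one remaining lock is enough to block re-energization.

```python
class LockoutTagout:
    """Sketch of lockout-tagout: each worker hangs a personal lock
    before starting work, and the equipment cannot be re-energized
    while any lock remains in place."""

    def __init__(self):
        self.locks: set[str] = set()
        self.energized = False

    def apply_lock(self, worker: str) -> None:
        # Power is cut before work begins, then the lock goes on.
        self.energized = False
        self.locks.add(worker)

    def remove_lock(self, worker: str) -> None:
        """Each worker removes only his or her own lock upon leaving."""
        self.locks.discard(worker)

    def energize(self) -> bool:
        # "Can't": re-energizing fails while even one lock remains.
        if self.locks:
            return False
        self.energized = True
        return True
```

Because each worker holds the only key to his or her own lock, no combination of other people's actions can make `energize()` succeed while that worker is still on the job.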

Although "The current lockout/tagout standard came into being in 1989 and was based on ANSI Z244.1-1982, American National Standard for Personal Protection—Lockout/Tagout of Energy Sources—Minimum Safety Requirements," Norwood cites "lock-and-key security." A worker who services a slat conveyor (which the reference compares to a large sausage grinder, and which is thus a hazard to anybody near the running gear were it to be set in motion) first switches off the power supply and then locks out access to the switch. "He alone has a key and so is the only person who can set that particular conveyor in motion." The machine cannot cause an injury without electrical power, and the maintenance worker ensures his or her own safety by locking out the power.

In contrast, a tuna fish processing plant worker was killed in 2015 when he went into an oven to service it, only to have others turn the oven on without realizing he was inside. Why didn't they have lockout-tagout? California's judicial system asked the same question, and the case resulted in penalties against the company and two individuals, as well as a damage award to the worker's family.

The "can't rather than don't" principle carries over into health care as well, where patients have died as a result of nurses attaching enteral feeding bags to intravenous lines. Everybody knows, of course, that introducing liquid food into blood vessels is likely to harm or kill the patient, and the nurses had doubtless been told to "be careful." The problem, again, is that if it is possible to do a job wrong and the job is done a few hundred times, somebody will eventually do it wrong, and it has to be done wrong only once to cause a fatality.

The first question is why the connectors for the enteral feeding system were even compatible with an intravenous line. The underlying principle is already familiar: it is impossible to put a polarized plug into an electrical outlet the wrong way or a USB plug into a USB port backward, and factories have long used incompatible connectors for different gases so that a hose simply will not fit the wrong gas supply. Nestlé Health Science (2012) applied the same principle with the SpikeRight PLUS enteral connector system, which cannot be connected to an IV line.
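The incompatible-connector idea has a direct software analogue: make the wrong connection unrepresentable by the interface itself. The sketch below is purely illustrative (the class names are invented for this example, not taken from any medical device or standard); the IV port rejects anything that is not an IV connector, so the mismatch is caught by the design rather than by the user's vigilance.

```python
class IVConnector:
    """A connector shaped to fit only IV ports."""
    pass


class EnteralConnector:
    """A feeding-bag connector deliberately shaped NOT to fit IV ports."""
    pass


class IVLine:
    """An IV port that accepts only IV connectors.

    This mirrors physically incompatible fittings: the wrong
    connector simply does not fit, regardless of how careful
    or careless the person making the connection is.
    """

    def connect(self, connector) -> bool:
        if not isinstance(connector, IVConnector):
            # The rejection lives in the interface ("can't"),
            # not in a warning label ("don't").
            raise TypeError("connector does not fit this port")
        return True
```

In a statically typed language the same idea can be pushed further, so that the wrong connection fails to compile at all; either way, the error becomes impossible rather than merely forbidden.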

In summary, the "can't rather than don't" principle underscores the superiority of engineering controls, which make dangerous mistakes physically impossible, over administrative controls that rely on worker compliance and vigilance. Regardless of how careful and vigilant people might be, if it is possible to do a job wrong, it will eventually be done wrong, with likely physical harm to workers or others.
