An unambiguous yet ominous chuck key — Commons

One of the promising undergraduate students in the lab I worked in at Wisconsin was machining a part one day on a mill. He had passed on the unsupervised, lab-specific machine shop out of concern for safety and was instead in the College's established student shop — a fancy facade of a facility with a carefully organized tool closet and a windowed observation office from which the head machinist, a disliked authoritarian with decades of experience, could watch the shop. The student was very sharp, but he left the chuck key in the mill head and turned it on. The key spun around, flew out, and took two of his fingers with it. As he held his bloodied hand, the head of the shop came running out and began yelling at him, "Why did you do that?!" This would surely be a mark on his safety record. The student, in shock, ran to the hallway outside, where other students applied paper towels to his hand and helped him to the hospital.

The problem here was not a lack of authority and control, or severity of consequences, but a lack of community connection and continuous improvement in the shop practices. A chuck key with an ejector spring prevents people from leaving it in the chuck, but is more expensive. The buddy system with a mentor can help spot some of these mistakes, whatever they may be. While these improvements may seem obvious to some, common sense isn’t so common.

The WSU administration, led by the Office of Research, is undertaking an effort to re-emphasize and improve safety at our institution. I was recently informed by my chair that "at least one significant incident occurs at a university laboratory every month." OSHA (the Occupational Safety & Health Administration) reports "that researchers are 11 times more likely to get hurt in an academic lab than in an industrial lab." What is it about the authoritarian-legalistic structure of academic bureaucracy that naturally leads to sub-par performance in such a critical area, and what can we do to improve?

Why organizations have a hard time with safety

I’ve written previously about how universities evolved tree-like hierarchies. Nearly all of the reward systems and feedback loops are geared toward promoting researchers into power-driven authority figures in their fields, which reinforces the extant authoritarian-legalistic structure. The problem with these structures is communication. There is very little duplex communication, i.e., real conversation: talk. There just isn’t time for an administrator to sit down and spend quality time actually working with someone in a lab to mentor them — let alone to know the people in their division. This results in a natural disconnection and un-grounding of administration from the people actually doing the work. I recently asked a friend who is an administrator: “When was the last time you actually received training by sitting down and doing the activity with someone, or with a group of administrators?” He couldn’t remember a workshop that wasn’t primarily the traditional one-way data dump.

Couple the difficulties in communication with declining resources, increasing performance pressures, and a 2-5 year graduation timer on all your primary lab personnel, and you have a recipe for a safety nightmare.

This means it’s all too common to hear safety bulletins from administrators along the lines of: “make a new resolution to make this year accident free,” or to add “safety to annual performance evaluations,” or to “please report even the minor accidents,” with the emphasis that “failure to report an incident… does result in consequences.” This is the easiest thing for an administrator in a power structure to do. Aside from invasive intrusions into labs, what else can they do? But this leads to other problems.

I once knew an administrator who still conducted research in their lab. One day, a post-doc accidentally mixed two substances in the fume hood, leading to an explosion that destroyed the hood. The administrator, under pressure to reduce accidents in their unit, said, “I’ll make the consequences severe enough to deter this from happening again!”

When framed like this, the lack of communication almost seems criminal. The sad reality is that these authoritative declarations coupled with punishments, within our communication-deficient authoritarian-legalistic structure, can lead to corruption and can actually be detrimental to the broader cause they intend to help. This command-and-control approach boils down to what is known as the deterrence hypothesis: the introduction of a penalty that leaves everything else unchanged will reduce the occurrence of the behavior subject to the penalty. I’ve previously written about the problems of applying the deterrence hypothesis to the grading of coursework. In this case, safety is connected to my performance evaluation — which is primarily used for raise allocations and promotion. In short, if an accident happens, my status and pay within the institution will suffer. So does this feedback mechanism promote better safety, or does it promote a lack of reporting? The most direct effect is a lack of reporting. This also presumes that the permanent, disabling damage of losing fingers in an accident is not deterrent enough — the approach assumes that faculty delegate all risks to students rather than doing the activity themselves.

In a famous study titled “A Fine is a Price,” researchers investigated the efficacy of the deterrence hypothesis at mitigating an undesirable behavior: picking a child up late from daycare. This is a low thing to do — abusing the personal time of a lower-paid caretaker charged with the health and well-being of your child. In many ways it parallels the minor accidents, cuts, and banged knuckles we’re being asked to report. To couple these to performance evaluations, a non-arbitrary metric must be created to decide how big the penalty, or price, should be. Contrary to expectations, the researchers found that adding the penalty actually increased the negative behavior it was intended to deter. They deduced that the penalty had become a price — if I’m late, I’ll pay the $20 and everything is OK — regardless of whether the caretaker had other plans. Perhaps the most troubling finding of the study was that once the penalty was systemized, the bad behavior continued even after the penalty was removed. Once you marginalize a person or put a fee on them, it’s tough to treat them as a person with rights and dignity again.

I’ve seen this play out many times in daycares, teams, and communities I’ve been involved with. Reliably, the diminishment of people and the disruption of personal connection lead to the demise and underperformance of the organization. When an authoritarian is presented with this evidence contrary to their belief, they reliably counter with, “Oh, I’ll make the penalty severe enough to deter the behavior.” What else can they do? This approach, in the absence of appropriate developmental scaffolding, leads to a depressed environment averse to uncertainty. Everyone becomes afraid to report safety issues, afraid to discuss safety, afraid to try new things and push the limits (isn’t trying new things and pushing the limits called research?) — often simply because trying new things is no longer the norm. When something is not the norm, it becomes an uncertainty risk and a threat.

I was once having a discussion with an administrator about a new makerspace on campus. This prompted the statement, “But we’ll never be able to control the safety!” To this I immediately responded: 3D printers are robotic hot glue guns with safety shrouds! Every campus in the US has a gym with a squat rack (people put hundreds of pounds on their backs daily, often with poor form), a climbing wall (someone could fall!), a pool (what if someone drowned!), and a hammer/discus/shot put/javelin throw (yikes!).

Arbitrary targeting of risk and blame is another characteristic of authoritarian-legalistic organizations, because they lack established heuristics, a.k.a. processes, to work through the safety scaffolding of new activities. The shot put and hammer throw are established activities our culture has normed to, where the risk of developing the established safety protocol was incurred centuries ago. There is less need for an administrator to CYA. Moreover, a command-and-control approach isn’t what makes them safe — it’s connections and discussions with people. The disincentive for using the squat rack incorrectly is chronic back pain, something I deal with on a daily basis. That risk didn’t stop me from squatting incorrectly! The problem was ineffective coaching and scaffolding. Telling the coaches to coach better won’t fix that. And we can’t always rely on starting a new facility fresh, with appropriate safety from the beginning.

One organization identified minor cuts from razor blades as being the leading cause of safety incidents within their organization. In response, they replaced all of the razor blades with plastic knives and trained people on how to use the plastic knives safely. Everyone quickly became frustrated with the plastic knives and started bringing in their own pocket knives to use for routine tasks.

It’s natural for an authority to look at the leading cause of issues and try to make it go away. The key is to help the group perform better while doing so — not worse. People inherently want to perform, and they take great pride in it. A safety solution that reduces their performance will be quickly circumvented. What this organization eventually did was purchase customized box openers with better razor blades, protected by a plastic guard so that they could not cut a person. The tools worked better than regular box openers and razor blades. This is known as poka-yoke, a.k.a. error-proofing, in Lean Manufacturing parlance. When done well, nobody will complain.
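For readers who think in code, the same error-proofing idea can be sketched as a software interlock: design the system so the unsafe state cannot be expressed, rather than penalizing it after the fact. The `Mill` class below is purely hypothetical — a toy echo of the chuck-key story, not any real machine-control API.

```python
# Hypothetical sketch of poka-yoke (error-proofing): the machine itself
# refuses to enter the unsafe state, so no vigilance or penalty is needed.
class Mill:
    def __init__(self):
        self._chuck_key_inserted = False

    def insert_chuck_key(self):
        self._chuck_key_inserted = True

    def remove_chuck_key(self):
        self._chuck_key_inserted = False

    def start(self):
        # Interlock: like a spring-loaded ejector key, this makes
        # "key in chuck, spindle running" an impossible combination.
        if self._chuck_key_inserted:
            raise RuntimeError("Remove the chuck key before starting.")
        return "spindle running"
```

The point of the sketch is that the check lives in the design, not in a rulebook: nobody has to remember the rule, and nobody has to be punished for forgetting it.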

After having a string of safety incidents in their unit, an administrator and safety board required every club and lab to have a “designated safety officer” or a designated authority to control safety for the group. After a few months in this position, one lab’s “safety officer” lamented to me, “Sometimes I need to be the bad guy because people don’t take safety seriously. But it gets tiring. They dislike me for it, blame me when stuff goes wrong, and they still don’t take safety upon themselves.”

This is directly analogous to the problem of quality control in Lean Manufacturing. In Lean, the question comes up of whether something you’ve manufactured meets the design specification. Do you hire a quality-control czar to stop production if product starts coming out that is out of spec or unsafe? Ever heard a story of someone frustrated with the quality cop coming over to tell them things were wrong, yet providing no explanation of what was wrong or how to fix it? Moreover, the only way to ensure 100% quality/safety is 100% inspection — not a sustainable or scalable approach. The Lean approach is to design quality/safety control into the production process — if the part can’t be made wrong or unsafe, it’s much easier to achieve 100% safety/quality. And if everyone is responsible for checking safety/quality during the production process, you’ve just made everyone in your group a safety officer and multiplied the odds of spotting a risk before it’s realized.
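That last point is simple probability. As a rough sketch — assuming, hypothetically, that each person independently notices a given hazard with probability p — the odds that at least one of n watchers catches it grow quickly with n:

```python
# Illustrative only: assumes each observer independently spots a given
# hazard with probability p. Real labs are messier, but the trend holds.
def detection_odds(p: float, n: int) -> float:
    """Probability that at least one of n observers spots the hazard."""
    return 1 - (1 - p) ** n

# One designated safety officer with a 30% chance of catching a hazard:
solo = detection_odds(0.30, 1)   # 0.30

# A ten-person lab where everyone watches for hazards:
team = detection_odds(0.30, 10)  # about 0.97
```

Even with a modest individual detection rate, making everyone responsible pushes the group's odds of catching a hazard toward certainty — the arithmetic behind "everyone is a safety officer."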

It boggles my mind that lab leads do not have safety procedures posted beside all key lab processes and equipment. It’s really simple — if something goes wrong, you change the procedure. Changing the procedure is orders of magnitude cheaper and easier than changing the equipment or personnel, which makes procedures much easier to continuously improve.

So we’ve now shown, through multiple examples, the safety shortcomings of traditional authoritarian-legalistic bureaucratic structures. How do we get beyond them to cultivate a sustaining community and culture of safety within such institutions?

Let’s talk about HYPER-Safe

If you want to get away from an authoritarian safety culture, you need a rule-based framework that helps your community perform better. We can all agree that good safety results in better performance through fewer stop-work events. We accomplish this in HYPER through our HYPER-Safe design process:

The HYPER-Safe process flow chart.

Once we implemented this flow chart, I stopped having to fight for safety. We started producing our designs on time and had fewer problems requiring us to re-engineer things due to safety oversights. Pretty soon, industry started sending folks to learn this system from us. Examples of applying the system are available on this website: https://hydrogen.wsu.edu/safety-101/.