Quality and Gresham’s Law: When Bad Practices Drive Out Good Ones — and Your Shop Floor Slowly Forgets How to Do Things Right

It starts with a shortcut. Not a dramatic one. Not one that makes the evening news. Just a small, reasonable-looking deviation that saves eleven seconds on a cycle that runs four hundred times a shift.

Nobody fills out the deviation form because there is no deviation form for something this small. Nobody updates the work instruction because the work instruction still describes the original method — the one that takes eleven seconds longer. The operator who discovered the shortcut mentions it to a colleague during break. The colleague tries it. It works. Within a week, the entire line has quietly adopted the new method without telling anyone.

This is not a story about rebellion. This is not a story about careless workers or negligent management. This is a story about one of the most powerful and least understood forces in manufacturing quality — a force that economists have known about for centuries but that most quality engineers have never heard of.

It’s called Gresham’s Law, and if you’ve been in manufacturing long enough, you’ve watched it destroy quality systems from the inside out without ever knowing its name.


The Original Law

In 16th-century England, Sir Thomas Gresham observed something that seemed paradoxical: when two types of coins circulated with the same face value but different metal content, the cheaper coins would eventually drive the more valuable ones out of circulation. People hoarded the good coins and spent the bad ones. Over time, the money supply became dominated by inferior currency — not because anyone planned it that way, but because rational individual behavior produced a collectively irrational outcome.

“Bad money drives out good.”

Gresham’s Law describes what happens when two systems compete under asymmetrical incentives. The system that’s easier, cheaper, or faster in the short term tends to displace the one that’s better in the long term — not through conspiracy, but through the accumulated weight of thousands of individual decisions that each make perfect sense in isolation.

Now look at your factory floor.


The Manufacturing Version

In your quality system, there are always two ways to do things. There’s the documented procedure — the one in the work instruction, the one the auditor sees, the one that produces consistent, conforming output. And there’s the real procedure — the one operators actually use, the one that gets the job done faster, the one that evolved organically through years of small adaptations and shortcuts that never made it into any document.

Gresham’s Law in manufacturing says this: the easier, faster, less rigorous method will always drive out the more careful, thorough, time-consuming one — unless you build explicit countermeasures to prevent it.

Not because operators are lazy. Not because management doesn’t care. But because the incentives are structurally asymmetrical.

The shortcut pays off immediately — eleven seconds saved, four hundred times a shift, every shift, every day. The cost of the shortcut — a slight increase in variation, a marginal rise in defect probability, a tiny erosion of process discipline — is diffuse, delayed, and nearly invisible. The benefit is concentrated and instant. The cost is spread across weeks and months and manifests in ways that are nearly impossible to trace back to their origin.
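The visible side of this asymmetry is easy to put numbers on, using only the figures above. The cost side is deliberately left out of the calculation, because it resists exactly this kind of arithmetic:

```python
# Figures from the scenario above: 11 seconds saved, 400 cycles per shift.
SECONDS_SAVED_PER_CYCLE = 11
CYCLES_PER_SHIFT = 400

minutes_saved_per_shift = SECONDS_SAVED_PER_CYCLE * CYCLES_PER_SHIFT / 60
print(f"Visible benefit: about {minutes_saved_per_shift:.0f} minutes per shift")
```

Roughly seventy-three minutes of concentrated, immediate benefit every single shift. The cost, by contrast, shows up as a fractional change in defect probability that no single shift will ever notice.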

This is the same dynamic that Gresham observed with coins. The bad method circulates freely because it’s “cheaper” in the moment. The good method gets hoarded in the work instruction — preserved on paper, honored during audits, and abandoned the moment the auditor leaves the building.


The Anatomy of Quality Displacement

Let me walk you through how this plays out in a real manufacturing environment, because understanding the mechanism is the first step to interrupting it.

Stage One: The Innocent Shortcut.

A new operator joins the line. During training, they’re taught the documented procedure — let’s say it involves measuring a critical dimension with a caliper after every fifth part, recording the value, and adjusting the machine if the reading drifts beyond a control limit. The operator does this faithfully for the first two weeks.

Then they notice something. The reading never changes. The process is stable — beautifully, boringly stable. After two hundred measurements, every single one falls within the same narrow band. The operator starts to wonder: do I really need to measure every fifth part? What if I measure every tenth? Nobody would know. The process is rock-solid.

They switch to every tenth part. Nothing bad happens. The process remains stable. The operator saves time. Production output increases slightly. Their supervisor notices the improved efficiency and praises them.

The shortcut has now been rewarded.

Stage Two: Social Transmission.

The operator shares their discovery with a colleague on the adjacent line. “You know, I stopped measuring every fifth and went to every tenth. Nothing changes. Process is dead stable. Saved myself probably twenty minutes a shift.”

The colleague tries it. Same result. Nothing bad happens. They share it with another colleague. Within a month, half the shop floor has quietly reduced their measurement frequency. Nobody updated any procedure. Nobody ran a risk assessment. Nobody asked whether the process stability they observed was caused by the frequent measurements — whether the measurements themselves were an act of process control that maintained the very stability they seemed unnecessary for.

This is the critical insight that Gresham understood: the good coin disappears because people can’t see the system-level effect of their individual choices. The operator can’t see that their frequent measurements were a stabilizing force, not merely a monitoring activity. Remove the measurements, and the process might still be stable — for a while. But the feedback loop has been broken, and the first sign of drift will go undetected.

Stage Three: Institutionalization.

Six months later, a new operator is trained. The trainer — the original shortcut discoverer — teaches the new hire to measure every tenth part, because that’s how it’s actually done. The work instruction still says every fifth, but nobody reads work instructions after training. The new operator learns the real method, not the documented one.

A year later, someone suggests reducing to every twentieth part. After all, the process has been stable for months. Why waste time measuring so frequently?

The cycle accelerates. Each generation of shortcuts builds on the previous one. The baseline of what’s considered “normal” shifts continuously downward. And because the original method is no longer practiced by anyone, there’s no living memory of what proper process discipline actually looks like.

Stage Four: The Invisible Collapse.

One Tuesday morning, a batch of raw material arrives with slightly different properties. Nothing dramatic — the supplier is still within specification, but the process mean shifts by a fraction. Under the original measurement protocol (every fifth part), this shift would have been detected within forty-five minutes and corrected before any defective product was produced.

Under the current protocol (every twentieth part, or maybe by now it’s every fiftieth, or maybe it’s “when I feel like it”), the shift goes undetected for two full shifts. By the time someone notices — usually because a downstream process starts rejecting parts, or a customer files a complaint — eight hundred nonconforming parts have been produced.
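The relationship between sampling frequency and detection delay can be sketched in a few lines of simulation. This is a toy model with assumed numbers — a normally distributed dimension, a sustained 1.5-sigma mean shift, and detection when a sampled part falls outside the 3-sigma control limits — not data from any real line:

```python
import random

def parts_until_detection(sample_every, shift_sigma=1.5, ucl_sigma=3.0,
                          seed=42, max_parts=100_000):
    """Part count at which a sustained mean shift is first detected.

    Toy model (illustrative assumptions, not from the article): each part's
    critical dimension is drawn from a normal distribution whose mean has
    already shifted by `shift_sigma` standard deviations. Every
    `sample_every`-th part is measured; the shift counts as detected when a
    measured value falls outside the +/- `ucl_sigma` control limits.
    """
    rng = random.Random(seed)  # same seed => both protocols see the same parts
    for part in range(1, max_parts + 1):
        value = rng.gauss(shift_sigma, 1.0)  # process sigma normalized to 1
        if part % sample_every == 0 and abs(value) > ucl_sigma:
            return part
    return None  # shift never detected within max_parts

detect_5 = parts_until_detection(sample_every=5)
detect_20 = parts_until_detection(sample_every=20)
print(f"Every 5th part:  shift detected after {detect_5} parts")
print(f"Every 20th part: shift detected after {detect_20} parts")
```

Because the every-20th sample points are a subset of the every-5th points, the sparser protocol can never detect the shift earlier — and on average it detects it roughly four times later. That is precisely the gap between forty-five minutes and two full shifts.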

The investigation that follows will be thorough. Root cause analysis. 8D report. Corrective actions. Someone will probably blame the raw material supplier. Someone else will question why the measurement frequency was reduced. The operator will say, “That’s how I was trained.” The trainer will say, “The process was stable, so we reduced frequency.”

And nobody will identify the real root cause: a structural incentive system that made the shortcut rational, rewarding, and socially transmissible while making the correct method feel like unnecessary overhead.

Gresham’s Law doesn’t produce dramatic failures. It produces slow, invisible erosion that periodically manifests as a crisis whose origins nobody can trace — because the original cause was a thousand tiny decisions, each of which made perfect sense at the time.


Why This Is Different From Normalization of Deviance

You might be thinking: “This sounds like normalization of deviance.” It’s related, but it’s not the same thing.

Normalization of deviance is about gradually accepting lower standards — redefining what “acceptable” means over time. Gresham’s Law is about the displacement of good practices by easier ones — not because standards change, but because the practices that uphold those standards are quietly abandoned while the standards themselves remain on paper.

In normalization of deviance, the target moves. In Gresham’s Law, the target stays the same — but everyone stops aiming at it while pretending they still do.

This distinction matters because the countermeasures are different. Fighting normalization of deviance requires recalibrating standards and expectations. Fighting Gresham’s Law requires redesigning the incentive structures that make shortcuts rational.


The Three Conditions for Gresham’s Law in Manufacturing

For bad practices to drive out good ones in a quality system, three conditions must be present. Understanding these conditions is the key to preventing them.

Condition One: Asymmetrical Feedback.

The shortcut provides immediate, tangible feedback (time saved, effort reduced, production increased). The correct method provides delayed, diffuse feedback (process stability maintained, defect rates kept low). Human beings are neurologically wired to respond more strongly to immediate feedback than to delayed feedback. The shortcut wins not because it’s better, but because its benefits are felt while its costs are invisible.

Condition Two: No Immediate Consequence for Deviation.

If every shortcut immediately produced a defect, Gresham’s Law couldn’t operate. The whole mechanism depends on the fact that most shortcuts work — most of the time. The process is forgiving. The defect rate is low. The measurement frequency can be reduced without immediate catastrophe. This forgiveness is precisely what makes the shortcut so dangerous: it teaches the operator that the shortcut is safe, reinforcing the behavior that will eventually cause a problem.

Condition Three: Social Proof and Transmission.

Gresham’s Law requires transmission. One operator’s shortcut must be observable by and transmissible to others. In a manufacturing environment where operators work in proximity, share breaks, and rotate between stations, this transmission happens naturally and rapidly. The shortcut becomes social currency — a piece of practical knowledge that experienced operators share as a favor.


Countermeasures: Making Good Practices Competitive

If Gresham’s Law operates through asymmetrical incentives, the countermeasure is to rebalance those incentives — to make the correct method at least as rewarding as the shortcut. Here’s how.

Make Compliance Visible

One of the reasons shortcuts spread is that they’re invisible. Nobody sees an operator skipping a measurement. But if compliance is visible — displayed on a dashboard, tracked in real time, celebrated publicly — then the social dynamics reverse. Instead of the shortcut being the smart, insider knowledge that experienced operators share, compliance becomes the socially reinforced behavior.

Digital measurement systems are excellent for this. When every measurement is logged automatically and compliance rates are displayed on a screen that everyone can see, the shortcut loses its social appeal. Operators can see who’s following the procedure and who’s cutting corners. More importantly, they can see that management is watching — not punitively, but with genuine interest in process integrity.

Make the Cost of Shortcuts Visible

The most powerful antidote to Gresham’s Law is to make the invisible costs of shortcuts visible. This means tracking not just defect rates, but process drift — the gradual shift in process parameters that precedes defects.

If operators can see that their measurement frequency directly correlates with process stability, the shortcut becomes less appealing. “When I measure every fifth part, my process Cpk is 1.67. When I measure every tenth, it drops to 1.33. When I skip measurements entirely, it drops to 1.10.” Suddenly, the cost of the shortcut is no longer invisible — it’s quantified and personal.
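The Cpk values above are illustrative, but the index itself is standard: the distance from the process mean to the nearer specification limit, in units of three standard deviations. A minimal sketch — the sample readings and spec limits here are invented for the example:

```python
import statistics

def cpk(values, lsl, usl):
    """Process capability index: Cpk = min(USL - mean, mean - LSL) / (3 * sigma)."""
    mean = statistics.fmean(values)
    sigma = statistics.stdev(values)  # sample standard deviation
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Hypothetical readings for the same nominal 10.0 mm dimension,
# with spec limits 9.5-10.5 mm:
disciplined = [10.00, 10.02, 9.98, 10.01, 9.99, 10.00]  # frequent checks, tight
drifting    = [9.80, 10.20, 9.90, 10.10, 10.15, 9.85]   # sparse checks, loose

print(f"Cpk, disciplined process: {cpk(disciplined, 9.5, 10.5):.2f}")
print(f"Cpk, drifting process:    {cpk(drifting, 9.5, 10.5):.2f}")
```

The point is not the exact numbers. It's that once Cpk is computed and displayed per station and per measurement protocol, the cost of the shortcut stops being invisible.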

Make the Correct Method Easier

The ultimate countermeasure is to make the correct method the easiest method. If measuring every fifth part requires walking across the station to retrieve a caliper, opening a logbook, recording a value by hand, and then returning the caliper, the shortcut is practically inevitable. The procedure is asking the operator to perform a high-friction activity four hundred times a shift.

But if the measurement is taken with an inline gauge that requires no operator intervention — or a simple button press that logs the value automatically — the shortcut disappears because there’s nothing to shortcut. The correct method is now the path of least resistance.

This is the principle of friction-aware process design: every time you design a quality control activity, ask yourself whether the correct method is the easiest method. If it isn’t, you’re creating the conditions for Gresham’s Law to operate.

Build Routine Variation into Auditing

Gresham’s Law thrives on predictability. When operators know exactly when audits happen and what auditors will check, they can perform the documented procedure during the audit and revert to the shortcut afterward. Layered process audits help, but they’re not enough if they become predictable themselves.

The most effective organizations build randomization into their audit schedules — not to catch people doing wrong, but to maintain a genuine baseline of what’s actually happening on the floor. This isn’t about surveillance; it’s about maintaining an honest feedback loop between documented procedures and actual practice.

Practice Procedure Archaeology

Every six months, take a work instruction and compare it to what’s actually happening on the floor. Not during an audit. Not with a clipboard. Just stand and watch. Talk to operators. Ask them to walk you through what they actually do, step by step.

You will find discrepancies. Every organization has them. The question is not whether they exist — it’s whether you have a system for finding and addressing them before they accumulate into a quality event.

I call this procedure archaeology: digging through the layers of practice that have accumulated over a documented procedure to find out what’s really happening. It’s uncomfortable work, because it reveals the gap between your quality system on paper and your quality system in practice. But that gap is where Gresham’s Law lives, and you can’t fight what you can’t see.


The Leadership Test

Here’s a simple test for whether Gresham’s Law is operating in your organization. Walk onto your shop floor and ask an operator: “Show me how you do this task.” Then compare what they show you to the work instruction.

If they match perfectly, you have exceptional process discipline — or an operator who knows you’re coming.

If they don’t match, you have a choice. You can blame the operator for not following the procedure. Or you can ask a better question: “What made the other way seem like a good idea?”

That question — not the blame — is the beginning of fighting Gresham’s Law. Because the answer will reveal the incentive asymmetry that’s driving bad practices into circulation. And once you see the asymmetry, you can redesign it.


The Honest Truth

Every factory I’ve ever worked in — and I’ve worked in hundreds — has Gresham’s Law operating somewhere in its quality system. Every single one. The documented procedure and the actual practice diverge over time because the incentives for divergence are built into the structure of manufacturing work.

The organizations that maintain world-class quality don’t eliminate this dynamic — they recognize it and build systems to counteract it. They make compliance visible, quantify the cost of shortcuts, design friction out of correct procedures, and practice relentless procedure archaeology.

They understand that the bad coin doesn’t drive out the good because people are bad. It drives out the good because the system makes it rational to spend the bad and hoard the good.

Your quality system is only as strong as the gap between what’s documented and what’s practiced. Close that gap, and you close the door on Gresham’s Law.

Leave it open, and you’ll wake up one Tuesday morning wondering how eight hundred defective parts escaped a process that was running “perfectly fine” the day before.


Peter Stasko is a Quality Architect with 25+ years of experience transforming manufacturing quality systems across automotive, aerospace, and industrial sectors. He specializes in making invisible quality dynamics visible — and fixable.