Quality and the Peltzman Effect: When Your Safety Nets Make People Take More Risks — and Your Quality System’s Best Protection Becomes Its Biggest Vulnerability


The Paradox That Costs You More Than Any Defect

In 1975, the economist Sam Peltzman published a paper that should have changed how every organization thinks about safety — and quality. He studied the effects of mandated automobile safety devices, seat belts among them, and discovered something unsettling: fatalities per accident went down, but accidents themselves went up. Drivers, subconsciously protected by their safety equipment, drove more aggressively. The safety net changed their behavior. The total number of traffic deaths barely moved.

This isn’t a story about seat belts. It’s a story about your shop floor, your inspection department, your automated vision systems, and every failsafe you’ve ever installed. Because the Peltzman Effect — risk compensation — is alive and well in your quality system. And it’s probably costing you more than any single defect ever could.

What Risk Compensation Looks Like in Manufacturing

Here’s how it works in practice. You install a sophisticated automated inspection system at the end of your production line. The system catches 99.7% of defects. Your quality manager is thrilled. Your customer is impressed. Your defect rate plummets.

But something else happens — something nobody measured, nobody tracked, and nobody expected. Your operators start getting sloppy. Not deliberately. Not maliciously. But the psychological safety net of that inspection system subtly shifts their internal risk calculus. “The scanner will catch it” becomes the background music of your production floor. Setup procedures get followed with less precision. In-process checks become perfunctory. The mental edge that comes from knowing you are the last line of defense — that edge dulls.

You haven’t eliminated defects. You’ve moved them. You’ve shifted the quality burden from a distributed, human-aware system to a single technological checkpoint. And when that checkpoint has a bad day — a calibration drift, a lighting change, a software update that reclassifies a defect as acceptable — the floodgates open. Not a trickle. A flood. Because the human vigilance that used to be your first and most adaptable defense has atrophied.

I’ve seen this play out across dozens of organizations. The pattern is always the same: a new quality control is introduced, initial metrics improve, then slowly — invisibly — the underlying process discipline erodes. The net effect is zero improvement, sometimes an outright regression, masked by the temporary boost from the new system.
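
The arithmetic behind that shift can be sketched in a few lines. The numbers below are hypothetical, and the model is deliberately crude (it treats the human check and the machine check as independent filters in series), but it shows why the dashboard looks better while the system gets more fragile:

```python
# A rough sketch (hypothetical numbers) of how defect escapes behave when
# human vigilance atrophies behind an automated inspection gate.

def escape_rate(process_defect_rate, human_catch, machine_catch):
    """Fraction of all units that ship defective, modeling the human
    check and the machine check as independent filters in series."""
    return process_defect_rate * (1 - human_catch) * (1 - machine_catch)

# Before the scanner: disciplined process, vigilant operators, no machine.
before = escape_rate(0.02, human_catch=0.90, machine_catch=0.0)

# After the scanner: process discipline erodes, operators stop looking,
# and everything rides on the machine's 99.7% catch rate.
after = escape_rate(0.05, human_catch=0.10, machine_catch=0.997)

# On the machine's bad day (calibration drift, lighting change),
# its catch rate collapses -- and there is no human backstop left.
bad_day = escape_rate(0.05, human_catch=0.10, machine_catch=0.60)

print(f"before scanner: {before * 1e6:.0f} PPM shipped")
print(f"after scanner:  {after * 1e6:.0f} PPM shipped")
print(f"bad day:        {bad_day * 1e6:.0f} PPM shipped")
```

On paper the scanner is a clear win. The sketch makes the fragility visible: the bad-day escape rate is several times worse than the pre-scanner baseline, because both the process and the human filter have degraded underneath it.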

The Architecture of Dangerous Comfort

The Peltzman Effect doesn’t just apply to automated inspection. It shows up everywhere in quality systems, and it wears disguises.

Excessive rework capacity is one of the most common carriers. When your organization builds a robust rework station — skilled technicians, clear procedures, fast turnaround — you create an economic safety net. The cost of producing a defect drops. And when the cost of failure drops, the urgency to prevent failure drops with it. I worked with an automotive supplier that had a rework cell so efficient that their first-pass yield was barely 80%. But their final delivery quality was 99.2%, because the rework team was extraordinary. The leadership didn’t see a problem. They saw a success story. What they didn’t see was the 20% of their capacity being consumed by work that should have been done right the first time, the material waste, the energy waste, and the slow normalization of failure that had crept into their culture.

Generous safety stock is another carrier. When your warehouse holds enough buffer inventory to cover three weeks of defects, the urgency to fix root causes evaporates. The defect becomes a logistics problem, not a quality problem. The organization learns to absorb failure instead of eliminating it.

Forgiving specification limits are a third carrier. The wide tolerances you negotiated with your customer “just to be safe” invite exactly the behavior you were trying to avoid. Operators learn to target the center of a wide specification with less precision, because the edges are so far away. Your process distribution spreads out. And when you finally land a customer who needs tighter tolerances, you discover that your process capability has been quietly degrading for years.

Where the Effect Strikes Hardest

The Peltzman Effect is most dangerous in systems where the safety net is most impressive. This creates a bitter irony: the better your quality controls, the more vulnerable you may become to the behavioral erosion they enable.

Automated optical inspection (AOI) in electronics manufacturing is a textbook case. AOI systems are extraordinary — they catch solder defects, component misalignments, and trace damage at speeds no human can match. But I’ve watched operators stop visually inspecting boards entirely because “the AOI already checked it.” The AOI becomes a single point of failure with no human backup. When the AOI’s reference library misses a new defect type — a new component, a new failure mode — it sails through undetected. Not because the operator couldn’t see it. Because the operator stopped looking.

Statistical process control (SPC) can trigger the same dynamic. When control charts are automated and generate alerts, operators stop developing their intuitive sense of the process. They wait for the alarm instead of feeling the drift. The statistical safety net replaces human pattern recognition — and human pattern recognition remains better than standard control-chart rules at detecting novel, complex, multi-variable shifts.
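
A toy example makes the “waiting for the alarm” problem concrete. The data below is synthetic and noise-free for clarity: a slow upward drift that a point-beyond-limits alarm sees very late, while a simple run rule (in the spirit of the Western Electric rules) catches it early — roughly the pattern an engaged operator feels long before any alarm fires:

```python
# Illustrative sketch: a slow mean drift versus two alarm strategies.
# Synthetic, noise-free data; the drift step and limits are hypothetical.

CENTER, SIGMA = 0.0, 1.0
UCL = CENTER + 3 * SIGMA                 # classic 3-sigma upper control limit

# Process mean drifts upward by 0.125 sigma per sample.
samples = [CENTER + 0.125 * SIGMA * i for i in range(40)]

def first_beyond_limits(xs):
    """Index of the first point outside the 3-sigma limit, or None."""
    for i, x in enumerate(xs):
        if x > UCL:
            return i
    return None

def first_run_above_center(xs, run=8):
    """Index where a run of `run` consecutive points above the centerline
    completes (a Western Electric-style run rule)."""
    streak = 0
    for i, x in enumerate(xs):
        streak = streak + 1 if x > CENTER else 0
        if streak == run:
            return i
    return None

print("3-sigma alarm at sample:", first_beyond_limits(samples))    # fires late
print("run-rule alarm at sample:", first_run_above_center(samples))  # fires early
```

The point is not that run rules are exotic — they are standard SPC practice — but that an operator who only reacts to the loudest alarm has, in effect, disabled every subtler signal the chart could have given.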

Final inspection gates are perhaps the most universal carrier. Every organization has one — that last checkpoint before product ships. And every operator upstream of that gate knows it exists. The knowledge that someone else will catch the defect changes behavior. Not because operators are lazy. Because they are human, and humans optimize their effort based on perceived risk.

Diagnosing the Effect in Your Organization

How do you know if the Peltzman Effect is active in your quality system? Look for these symptoms:

Declining first-pass yield with stable final quality. If your final defect rate looks good but your in-process defect rate is climbing, you’re not improving — you’re filtering harder. The underlying process is getting worse, masked by downstream controls.

Rework as a normalized cost center. If your annual budget includes a line item for rework and nobody questions whether it should exist, the Peltzman Effect has already taken root. Rework should be a temporary exception, not a permanent feature of your production landscape.

Inspection proliferation. If your answer to every quality problem is “add another inspection point,” you’re building a thicker safety net without addressing the behavioral erosion underneath it. More inspections can actually accelerate the effect, because each new checkpoint reinforces the message that someone else is responsible for catching defects.

Skills atrophy in experienced operators. Talk to your veteran operators. Ask them how their job has changed since you installed that new quality system. If they say they “don’t need to watch as carefully anymore,” the Peltzman Effect is already reshaping your workforce’s capabilities.

Escalating cost of quality despite stable defect rates. If your COPQ (Cost of Poor Quality) is rising even though your PPM rate is flat or falling, you’re spending more to achieve the same result. The safety net is getting heavier, and the process underneath it is getting weaker.
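
These last two symptoms are two views of the same arithmetic. A back-of-envelope sketch (all numbers hypothetical) shows how a shipping dashboard can stay flat while the process underneath it rots and the cost of filtering climbs:

```python
# Illustrative model: defects are caught by an inspection filter and
# reworked at a fixed cost; only the escapes reach the customer.
# All yields, catch rates, and costs below are hypothetical.

def shipped_ppm(fpy, catch):
    """Defects per million units that escape the inspection filter."""
    return (1 - fpy) * (1 - catch) * 1_000_000

def copq_per_unit(fpy, catch, rework_cost):
    """Rework spend per unit produced (caught defects get reworked)."""
    return (1 - fpy) * catch * rework_cost

REWORK_COST = 50.0   # hypothetical cost to rework one caught defect

# Year 1: healthy process, good inspection.
# Year 3: process discipline has eroded; a second inspection point was
#         added to compensate, so the filter is tighter -- and busier.
for label, fpy, catch in [("year 1", 0.95, 0.99), ("year 3", 0.85, 0.997)]:
    print(f"{label}: shipped {shipped_ppm(fpy, catch):.0f} PPM, "
          f"COPQ ${copq_per_unit(fpy, catch, REWORK_COST):.2f}/unit")
```

The customer-facing PPM barely moves between the two years, so nothing on the delivery dashboard raises a flag. The per-unit cost of achieving that number roughly triples. That gap is the Peltzman Effect, expressed in money.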

Building Systems That Don’t Invite Risk Compensation

The solution isn’t to remove safety nets. That would be reckless. The solution is to design quality systems that account for human behavioral response — systems that provide protection without providing permission.

Design for inherent quality, not inspected quality. The most powerful defense against the Peltzman Effect is a process that produces good output by its very nature, not one that relies on inspection to sort good from bad. Poka-yoke devices that prevent defects at the source don’t create risk compensation because they don’t give the operator a choice to make. The error is physically impossible. There’s no safety net to rely on because there’s no fall to take.

Maintain human engagement at every level. Don’t let automation replace human judgment — let it augment it. One effective model is the “dual verification” approach: automated systems perform their checks, and operators perform independent checks without seeing the automated results. This preserves human vigilance while adding technological capability. Yes, it costs more in the short term. But it prevents the skills atrophy that creates catastrophic single points of failure.

Make the safety net visible, not invisible. When people can see the safety net — when they understand its capabilities and its limitations — they make better decisions. An operator who knows that the AOI system catches 99.7% of defects but also knows the 0.3% failure modes it misses is a more vigilant operator than one who simply trusts the machine. Training should emphasize what controls don’t catch, not just what they do.

Tie quality metrics to the source, not the filter. Measure and reward first-pass yield, not final defect rate. Track in-process defect rates alongside delivery quality. When your dashboards show the health of the process — not just the effectiveness of the inspection — you create accountability at the point of origin.

Rotate inspection responsibilities. When operators periodically serve as inspectors and inspectors periodically run processes, the “someone else will catch it” mentality dissolves. Each person has lived on both sides of the safety net. Empathy replaces complacency.

Audit the safety net itself. Most organizations audit their processes. Few audit their quality controls for behavioral side effects. Add a periodic review that asks: “Has the introduction of this control changed upstream behavior? Are people less careful, less engaged, less skilled than they were before?” The answers will surprise you.
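
Tying metrics to the source, as suggested above, can be as simple as reporting rolled throughput yield (RTY) next to the final-gate pass rate. RTY multiplies first-pass yields across process steps, so it exposes every unit that needed rework — exactly what a final gate hides. The step names and yields here are hypothetical:

```python
# Minimal sketch of a source-side metric: rolled throughput yield (RTY)
# is the product of first-pass yields across steps. Hypothetical numbers.

from math import prod

step_fpy = {"machining": 0.96, "plating": 0.93, "assembly": 0.98}

rty = prod(step_fpy.values())   # chance a unit sails through untouched
final_gate_pass = 0.992         # what the shipping dashboard shows

print(f"rolled throughput yield: {rty:.1%}")   # the process truth
print(f"final-gate pass rate:    {final_gate_pass:.1%}")
```

A dashboard showing 99.2% tells leadership the system is healthy. An RTY in the high eighties tells them roughly one unit in eight needed intervention somewhere along the line — and that is the number that predicts what happens when the safety net has a bad day.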

A Personal Observation

I once consulted for a precision machining company that had invested heavily in coordinate measuring machines (CMMs). They had CMMs at final inspection, CMMs at in-process checkpoints, even portable CMMs on the shop floor. Their measurement capability was extraordinary. Their customers rated them as a preferred supplier.

But when I walked the floor, I noticed something odd. Operators were setting up jobs without using the setup fixtures that had been designed for them. When I asked why, the answer was casual: “The CMM will tell us if it’s off.” And it would. But the setup that took fifteen minutes with the fixture now took forty minutes of trial-and-error machining followed by CMM verification. The material waste from the trial cuts was never tracked. The machine time consumed by the iterative approach was invisible on any dashboard. The operator had outsourced his craftsmanship to a measuring machine.

The company wasn’t saving time or money with their CMMs. They were spending more — much more — but the cost was distributed across so many budget categories that nobody saw the total. The Peltzman Effect had turned a measurement investment into a behavioral subsidy for lower craftsmanship.

When we reintroduced mandatory fixture use for setup — not to replace the CMM, but to restore the process discipline — first-pass yield climbed from 91% to 97.8% in six weeks. Setup time dropped by 60%. The CMMs became verification tools instead of correction tools. And the operators rediscovered a pride in their craft that had been quietly disappearing.

The Deeper Lesson

The Peltzman Effect teaches us something fundamental about quality systems that most frameworks miss: quality is not a property of controls. It is a property of behavior. Controls shape behavior. Sometimes they shape it in the direction you intended. Sometimes they shape it in exactly the opposite direction.

Every quality professional knows that the best process is one that doesn’t need inspection. But what we often forget is that the best inspection system is one that doesn’t erode the process it was designed to protect. The art of quality engineering isn’t just building better nets — it’s building nets that teach people to walk the tightrope with confidence rather than carelessness.

Your safety nets are not the problem. The problem is the invisible behavioral subsidy they provide. Make the subsidy visible. Design around it. Account for it in your metrics, your training, and your audits.

Because the most dangerous quality system isn’t one that fails. It’s one that succeeds so well that everyone forgets why it was needed in the first place — and stops doing the things that made success possible.


Peter Stasko is a Quality Architect with 25+ years of experience helping organizations build quality systems that work with human nature instead of against it. He has led quality transformations across automotive, aerospace, electronics, and heavy industry on three continents.
