Quality and the Planning Fallacy: When Your Organization Systematically Underestimates How Long Quality Improvement Actually Takes — and Every Project Runs Late Because Optimism Isn’t a Strategy


You already know how this story ends. You just don’t believe it
applies to you.

The corrective action was supposed to take six weeks. The team had a
plan. Management had a deadline. The customer had expectations. And on
paper — on that beautiful, clean Gantt chart with its color-coded phases
and optimistic milestones — everything fit perfectly.

Twelve weeks later, you were still arguing about root cause.

The planning fallacy isn’t a quality problem. It’s a human problem
that happens to destroy quality projects with surgical precision. First
identified by psychologists Daniel Kahneman and Amos Tversky in 1979,
the planning fallacy describes our systematic tendency to underestimate
the time, cost, and risks of future actions — even when we have direct
experience with similar projects that ran over.

Read that again. Even when we have direct experience.

That’s what makes the planning fallacy so devastating in quality
management. It’s not ignorance. It’s not a lack of data. You’ve run
dozens of CAPA projects. You’ve watched 8D investigations stretch from
weeks into months. You’ve seen process validations consume three times
the budget anyone allocated. And yet, when the next project comes along,
you look at the calendar and think: “This one’s different. This one’s
simpler. Six weeks.”

It’s never six weeks.

Why Your Brain Betrays Your Timeline

The planning fallacy operates through two complementary cognitive
traps. The first is inside-view optimism — when you plan, you
construct a best-case scenario. You imagine each step going smoothly.
You assume approvals come on time, equipment arrives when scheduled, and
operators are available for training. You don’t deliberately ignore
risks; your brain simply doesn’t generate them during the planning
phase.

The second trap is outside-view blindness. You have access to
historical data — your previous projects, industry benchmarks, even your
own project archive. But when planning a new project, you treat each one
as unique. You focus on the specifics of this project rather
than looking at the base rate of all your projects. This is
what Kahneman calls the distinction between the “inside view” and the
“outside view.”

In quality management, the inside view sounds like this: “This FMEA
update only covers two failure modes. The team knows the process well.
We can finish it by Friday.”

The outside view sounds like this: “The last four FMEA updates we did
took an average of three weeks each, and every single one was supposed
to be done by Friday.”

One of these predictions is dramatically more accurate. And it’s not
the one your project schedule is built on.

Where the Planning Fallacy Strikes Quality Work Hardest

Not all quality activities are equally vulnerable. The planning
fallacy feeds on three conditions: novelty (or perceived novelty),
complexity, and interdependence. The more your project depends on other
people, other departments, or other systems, the worse your time
estimate will be.

CAPA and 8D Investigations. These are the planning
fallacy’s favorite hunting ground. You start with what looks like a
straightforward defect. The containment is simple. But then root cause
analysis reveals a systemic issue. Now you’re not fixing one process —
you’re redesigning three. The corrective action that was supposed to be
a procedure update becomes a capital project. And your six-week timeline
just became six months.

Process Validation. IQ/OQ/PQ protocols look linear
on paper. In reality, each phase depends on the previous one’s success,
and each deviation resets the clock. A single out-of-specification
result during OQ can add weeks while you investigate, correct, and
repeat. The planning fallacy makes you schedule the best case.
Validation delivers the real case.

FMEA Updates. “We just need to add the new product
variant to the existing FMEA.” Six participants, two days of workshops,
three rounds of review, and a customer-requested revision later, you’ve
consumed three weeks of engineering time. The FMEA itself was accurate.
The timeline was fantasy.

Supplier Quality Improvement. You’re not just
managing your own planning fallacy anymore — you’re inheriting your
supplier’s. That corrective action you requested? They promised 30 days.
Their internal approval process alone takes 45. And they haven’t even
started the actual corrective work yet.

Audit Remediation. Every finding looks simple when
you’re writing the response. “Update procedure, train personnel,
submit evidence.” But updating one procedure reveals that three
others reference it. Training requires scheduling around production. And
the evidence your auditor wants doesn’t exist in the format they’re
asking for. Welcome to scope creep, powered by optimism.

The Anatomy of an Overdue Quality Project

Let me walk you through a scenario that plays out in manufacturing
organizations every week.

A customer audit identifies a gap in your control plan. The finding
is legitimate but not catastrophic. Your quality engineer estimates two
weeks to close it: update the control plan, revise the work instruction,
train the operators, upload evidence. The quality manager approves the
timeline. The customer accepts the commitment.

Week one: The quality engineer starts updating the control plan and
discovers that the current revision was never fully implemented on the
floor. The documented process and the actual process have diverged. Now
you have two problems: the customer’s finding and an internal gap you
didn’t know existed.

Week two: You decide to fix both simultaneously. The control plan
update now requires input from process engineering, who are busy with a
new product launch. Your quality engineer waits three days for a
30-minute review meeting. Meanwhile, the work instruction revision
reveals that the specification it references was updated six months ago
but the work instruction wasn’t. Another unplanned update.

Week three: The revised documents go to the quality manager for
approval. He’s at a supplier audit. Approval takes four days instead of
one. Training is scheduled for the following week because the night
shift supervisor is on vacation.

Week four: Training happens, but two operators call in sick. They
need to be trained separately. The customer wants evidence of training
effectiveness — not just attendance records but competency verification.
Nobody planned for that.

Week five: You finally submit the evidence package. The customer
reviews it and asks for one additional clarification. Their review cycle
takes ten business days.

Seven weeks total. For a finding that was supposed to take two.

Nobody was incompetent. Nobody was lazy. The planning fallacy simply
did what it always does: it made the future look simpler than the
past.

The Reference Class Forecast: Your Antidote

The most powerful countermeasure to the planning fallacy is
deceptively simple. Kahneman calls it the reference class forecast, and
it works like this:

Step 1: Identify the reference class. What category
does this project belong to? Not “this specific CAPA” but “all CAPA
projects we’ve done in the last two years.”

Step 2: Gather the distribution. What was the actual
duration of projects in this class? Not the planned duration — the
actual duration. What was the fastest? The slowest? The
median?

Step 3: Use the base rate as your starting prediction. If your last
ten CAPA projects took between 4 and 14 weeks with a median of 8, your
starting estimate for a new CAPA project should be 8 weeks — regardless
of how simple this particular one looks.

Step 4: Adjust for specific factors. Only
after you’ve anchored to the base rate should you adjust up or
down based on the specifics of this project. And the adjustment should
be modest, because the base rate already contains information you don’t
have about the things that will go wrong.

This feels wrong. It feels like pessimism. It feels like you’re
sandbagging. But it’s not pessimism — it’s empiricism. You’re replacing
your gut feeling with data from your own history.
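In code, the anchoring step is nothing more than a median over your own history. Here is a minimal sketch; the function name, the helper, and the CAPA durations are illustrative assumptions, not data from any real project log:

```python
import statistics

def reference_class_forecast(actual_durations_weeks, adjustment_weeks=0):
    """Base-rate estimate for a new project, anchored to the median of
    ACTUAL (not planned) durations from the same project class.
    adjustment_weeks is the Step 4 tweak, which should stay modest."""
    base_rate = statistics.median(actual_durations_weeks)
    return base_rate + adjustment_weeks

# Steps 1-2: hypothetical actual durations (weeks) of the last ten CAPAs
capa_history = [4, 5, 6, 7, 8, 8, 9, 10, 12, 14]

# Step 3: the base rate is the starting prediction
print(reference_class_forecast(capa_history))  # → 8.0
```

The point of keeping the adjustment as an explicit, separate parameter is that it makes the anchor visible: the conversation starts at the base rate, and any deviation from it has to be argued for.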

Practical Tools for Quality Project Estimation

Beyond the reference class forecast, several practical techniques can
help your quality team produce more realistic timelines.

The Pre-Mortem. Before the project starts, gather
the team and ask: “Imagine it’s six months from now and this project has
failed spectacularly. What went wrong?” This simple exercise forces the
team into an outside-view mindset. They’ll generate risks and obstacles
that never appear in a forward-looking planning session. Capture every
answer and build time buffers for the most likely ones.

Three-Point Estimation. Instead of a single duration
estimate, ask for three: best case (everything goes perfectly), most
likely (realistic assessment), and worst case (significant problems
arise). Use the PERT formula — (Best + 4 × Most Likely + Worst) ÷ 6 — to
calculate a weighted estimate. This single technique can improve your
timeline accuracy by 30-40%.
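The PERT formula is simple enough to drop into a planning spreadsheet or script. A minimal sketch, with example durations of my own choosing:

```python
def pert_estimate(best, most_likely, worst):
    """Weighted three-point (PERT) estimate: (B + 4*M + W) / 6."""
    return (best + 4 * most_likely + worst) / 6

# e.g. a CAPA estimated at 4 weeks best case, 6 most likely, 14 worst
print(pert_estimate(4, 6, 14))  # → 7.0
```

Note how the worst case pulls the answer above the "most likely" value of 6: the formula builds in exactly the asymmetric risk that a single-point estimate ignores.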

The Buffer Rule. For any quality project that
involves more than two departments, multiply your initial estimate by
1.5. For projects involving suppliers or customers, multiply by 2.0.
This isn’t arbitrary padding — it’s an empirical correction for the
planning fallacy’s systematic bias.
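As code, the buffer rule is a single multiplication; the function name and the way I have encoded the thresholds are my own rendering of the rule above:

```python
def buffered_estimate(weeks, departments=1, external_parties=False):
    """Apply the buffer rule: x2.0 when suppliers or customers are
    involved, x1.5 when more than two departments are, else unchanged."""
    if external_parties:
        return weeks * 2.0
    if departments > 2:
        return weeks * 1.5
    return float(weeks)

print(buffered_estimate(6, departments=3))        # → 9.0
print(buffered_estimate(6, external_parties=True))  # → 12.0
```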

Milestone Decomposition. Break the project into
milestones no larger than one week. If you can’t describe what “done”
looks like for a one-week chunk, your scope is too vague to estimate.
The act of decomposition reveals dependencies and hidden work that
inflate the total timeline.

Historical Project Reviews. After every quality
project, capture the planned vs. actual timeline. Build a simple
database. Review it quarterly. Make it visible. The data is useless if
it lives in someone’s email. It needs to be on the wall — or at least in
a shared dashboard — so that the next planning conversation starts with
reality instead of optimism.
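A minimal version of that planned-vs-actual review, with made-up project names and durations, could be as simple as computing an overrun ratio per project and averaging it across the class:

```python
from statistics import mean

# Hypothetical planned-vs-actual log (durations in weeks);
# the project names and numbers are illustrative only.
projects = [
    {"name": "CAPA-101", "planned": 6, "actual": 11},
    {"name": "FMEA-update-A", "planned": 1, "actual": 3},
    {"name": "Audit-finding-7", "planned": 2, "actual": 6},
]

# Overrun ratio per project, then the average for the whole class
ratios = [p["actual"] / p["planned"] for p in projects]
print(round(mean(ratios), 2))  # → 2.61
```

Even three rows make the pattern hard to argue with: if the class historically runs at roughly 2.6x its plans, the next "two-week" estimate deserves a second look before it reaches the customer.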

The Organizational Dimension

The planning fallacy isn’t just an individual cognitive bias. It has
an organizational amplification mechanism that makes it worse in
companies than it is in individuals.

Deadline pressure from above. When management sets
the timeline before the technical team estimates the work, the estimate
becomes a negotiation rather than a prediction. The quality engineer who
knows the CAPA will take eight weeks learns to say four, because that’s
what the VP wants to hear. The planning fallacy meets executive
optimism, and the result is a timeline that bears no resemblance to
reality.

Escalation of commitment. Once a timeline is
committed to a customer or communicated to management, it becomes
psychologically sticky. When the project starts running late, the team
doesn’t revise the timeline — it works harder. Overtime increases.
Corners get cut. The quality of the quality improvement itself degrades.
You implement a corrective action faster but less thoroughly, and the
defect recurs.

The optimism cycle. Organizations that consistently
underestimate project durations don’t learn from the experience.
Instead, they explain each overrun with project-specific factors: “That
one was unusual. The supplier was slow. The auditor was picky.” They
never look at the pattern. The next estimate is just as optimistic as
the last one.

Breaking the Cycle

If you’re a quality leader, here are five things you can do starting
this week:

First, track planned vs. actual duration for every
quality project. Not just the big ones. All of them. You need data, not
impressions.

Second, institute a “base rate check” in every
project planning meeting. Before anyone proposes a timeline, someone
presents the historical data for similar projects. “The last five FMEA
updates took an average of X days.” Let the number sit in the room
before anyone suggests this one will be different.

Third, separate estimation from negotiation. The
technical team estimates the work. Management prioritizes and resources
it. These are different conversations. When you combine them, the
estimate always bends toward what management wants to hear.

Fourth, normalize timeline revision. When a project
hits an unexpected obstacle, the first response should be “let’s update
the timeline,” not “let’s work the weekend.” Accurate timelines aren’t a
sign of poor planning — they’re a sign of honest planning.

Fifth, celebrate accurate estimates over fast ones.
If a quality engineer estimates eight weeks and delivers in eight,
that’s a win. If they estimate four and deliver in six, that’s not
heroism — that’s a 50% estimation error that your customer had to
absorb.

The Deeper Lesson

The planning fallacy in quality work isn’t really about time
management. It’s about honesty. It’s about the willingness to look at
your own history and accept what it tells you. It’s about the discipline
to plan based on evidence rather than hope.

Every quality organization I’ve worked with has the data. They know
how long their projects actually take. They have the CAPA logs, the
validation reports, the audit responses. The information is there.
What’s missing is the willingness to use it — because using it means
admitting that quality improvement is harder, slower, and more complex
than anyone wants it to be.

But here’s the paradox: the organizations that plan honestly finish
faster. Not because their work goes more smoothly, but because they
allocate the right resources from the start. They don’t scramble. They
don’t shortcut. They don’t submit evidence packages that get rejected
and need to be redone. They do it right the first time — because they
gave themselves enough time to do it right the first time.

The planning fallacy tells you that the next project will be
different. Your data tells you it won’t. Listen to your data. It’s the
one voice in the room that doesn’t have an incentive to be
optimistic.


Peter Stasko is a Quality Architect with 25+ years of experience in
automotive, aerospace, and quality transformation. Certified PSCR and
Six Sigma Black Belt.
