The Safety Leak | Article 5: The Certainty Bias

I’ve realized that I have a dangerous subconscious bias: I tend to trust the person who sounds the most certain.

In engineering, where we crave hard data and definitive answers, the person who speaks with total confidence ("This will take exactly two weeks," "This is definitely a database issue") is the person I reward. I promote them. I give them the lead on projects.

But I’ve learned that Certainty is often just a mask for Hidden Risk. By rewarding confidence over accuracy, I’m accidentally leaking safety. I’m teaching my team that if they aren’t 100% sure, they shouldn’t speak.

The Confidence Tax

When I reward certainty, I am taxing the honest engineer.

Engineering is inherently messy. A senior dev who says, "I honestly don't know why this is failing yet, but I have three hypotheses," is being more rigorous and more accurate than the dev who says, "It's definitely the cache."

But if I show impatience with the “I don’t know,” I force my team to start guessing.

  • The Result: Engineers stop surfacing the “Unknown Unknowns.”
  • They commit to deadlines they can’t hit because “I’m not sure” feels like a sign of weakness.
  • They stop doing the deep discovery work because I’ve incentivized the appearance of progress over the reality of it.

The Fragility of the “Confident” Plan

When I build a roadmap based on certain-sounding estimates, I am building a fragile system. Because I haven’t allowed for the “Maybe,” I haven’t built in any buffers.

When the inevitable complexity arises, the “Certain” engineer doesn’t want to admit they were wrong (it would cost them their status). So, they hide the delay. They cut corners. They let the safety leak further.

I’ve found that the most catastrophic failures come from “Certain” plans that didn’t have a way to handle the truth when reality diverged from the whiteboard.

Why “I Don’t Know” Is High-Bandwidth Telemetry

I’ve had to re-train myself to see Intellectual Humility as a senior-level skill.

When an engineer says “I’m not sure,” that is a high-fidelity signal. It tells me exactly where the risk is. It tells me where I need to allocate more resources or more time. If I punish that signal, I am essentially cutting the wire to my own sensors.

How I Patch the Certainty Bias

I have to change the incentive structure so that accuracy is valued more than confidence.

  1. I reward “Probabilistic Thinking”: I stop asking for “The Date.” I ask for a range and a confidence level. “We’re 80% sure we can hit June 1st, but there’s a 20% risk if the legacy API doesn’t behave.” This makes it safe to talk about the “20%.”
  2. I model “Not Knowing”: I make it a point to say “I don’t know” in front of the whole team. I want them to see that my status as a leader isn’t tied to having all the answers.
  3. I celebrate the “Pivots”: When an engineer comes to me and says, “I thought it was X, but I was wrong—it’s actually Y,” I don’t focus on the mistake. I thank them for the course correction.

The Diagnostic

I look at the last few status updates or project pitches:

  • Did anyone say “I suspect,” “maybe,” or “I’m not sure”?
  • Do I push people for a “final” answer before they’ve done the research?
  • Is the person who is always “sure” actually the most successful person on the team?

If I only hear 100% certainty, I am not hearing the truth. I am just hearing what I’ve paid people to say.

