The Safety Leak | Article 0: The Logic of the Leak
I see it constantly in engineering organizations: I hire high-agency, experienced people and then place them in environments that filter reality out of the signal.
There is no “No Dissent” policy. I don’t put a slide in the onboarding deck that says “avoid disagreement.” Yet the behavior is consistent: risks surface too late to pivot, meetings sound “aligned” until the system crashes, and postmortems explain failures that multiple ICs privately predicted weeks in advance.
The talent exists. The environment is simply filtering out the telemetry.
The Internal Cost-Benefit Analysis
Every engineer runs a background process before they open their mouth in a meeting. It’s a rational cost-benefit calculation: “Is the cost of speaking up higher than what this failure will actually cost me?”
In most companies, I find the math looks like this:
- Speaking Up (High Cost): You create immediate social friction. You “derail” the flow. You challenge a lead with an ego. You now have to spend three hours building a proof-of-concept just to justify your concern. If you’re wrong, you’re “the person who cried wolf.” Even if you’re right, you’re “difficult.”
- Staying Silent (Low Cost): You look “aligned.” You’re easy to work with. If the project fails six months later, it’s a “systemic failure”—everyone is to blame, which means no one is to blame. You just move to the next sprint.
For a rational actor, silence is the efficient choice.
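
To make that math concrete, here is a toy sketch of the background process. The function, parameter names, and numbers are all mine (illustrative assumptions, not measurements), but the decision rule is the one above: the engineer weighs the immediate personal costs of speaking against their own diluted share of the failure.

```python
# A toy model of the engineer's internal cost-benefit check.
# All parameter names and numbers are illustrative assumptions.

def should_speak_up(
    social_friction: float,            # immediate cost: derailing the flow, challenging egos
    proof_burden: float,               # hours spent building a proof-of-concept to justify the concern
    wrong_penalty: float,              # reputation hit for "crying wolf" if the concern is unfounded
    p_concern_valid: float,            # the engineer's own confidence that the risk is real
    personal_share_of_failure: float,  # what a "systemic failure" costs this one engineer
) -> bool:
    """Return True when speaking up is the rational choice."""
    cost_of_speaking = (
        social_friction
        + proof_burden
        + (1 - p_concern_valid) * wrong_penalty
    )
    # Silence costs the engineer only their diluted share of the blame,
    # and only in the world where the concern turns out to be real.
    cost_of_silence = p_concern_valid * personal_share_of_failure
    return cost_of_speaking < cost_of_silence


# A typical environment: high friction, diffuse blame.
print(should_speak_up(
    social_friction=5.0,
    proof_burden=3.0,
    wrong_penalty=4.0,
    p_concern_valid=0.7,
    personal_share_of_failure=2.0,   # "everyone is to blame" dilutes this to almost nothing
))  # False: silence is the efficient choice
```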
The Logic of the “Leak”
When I talk about a “Safety Leak,” I’m not talking about “feelings.” I am talking about Information Theory.
If the social cost of sending a message (the truth) is higher than the sender’s personal share of the cost of failure, the message gets dropped. The systemic cost can be enormous; it simply never enters the sender’s math. This is a technical failure of the organization.
- I pay for expertise.
- The environment creates resistance.
- The telemetry (reality) drops to zero.
I’ve seen senior leaders mistake this lack of dissent for maturity or alignment. It isn’t. It’s Social Risk Management. Your team has stopped sending data because they’ve learned the system doesn’t handle “truth” packets efficiently.
Leadership as Environmental Design
If I’m leading engineers, I am not a “culture coach.” I am a system designer. My job is to change the math. I have to make it “cheaper” (socially and professionally) for an engineer to tell me a hard truth than to stay silent. If an intern finds a bug in my architecture, do I thank them for saving the system, or do I explain why they “don’t see the big picture”?
Every defensive reaction I have is a hole in the bucket. Once the safety leaks out, I am flying blind.
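
Returning to the hypothetical should_speak_up() sketch above, “changing the math” means turning the knobs a leader actually controls: the friction, the proof burden, and the penalty for being wrong. The numbers below are just as illustrative as before.

```python
# Same hypothetical should_speak_up() as above, with the
# leader-controlled parameters redesigned. Numbers are illustrative.
print(should_speak_up(
    social_friction=0.5,   # dissent is thanked, not treated as derailing
    proof_burden=0.5,      # a rough concern is enough; nobody demands a three-hour PoC
    wrong_penalty=0.0,     # false alarms are treated as cheap insurance
    p_concern_valid=0.7,
    personal_share_of_failure=2.0,   # unchanged: the leader can't alter what failure costs
))  # True: telling the hard truth is now the cheap option
```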
The Debugging Guide: 7 Common Safety Leaks
In this series, I will debug the seven recurring failure modes that break this “cost math” and filter out reality:
- The Blame Gradient – Treating systemic failures as personal failings.
- The Status Filter – Why seniority acts as a throttle on good ideas.
- The Consensus Theater – Meetings optimized for the feeling of agreement.
- The Feedback Risk – Why “radical candor” is often just an excuse for bad behavior.
- The Certainty Bias – Rewarding confident estimates over accurate ones.
- The Silence Acceptance – Assuming a quiet room means everyone is on board.
- The Repair Gap – Breaking the trust buffer and never fixing the leak.
A Reality Check
I assess my environment by looking at the telemetry of my last critical discussion (a sketch for tracking these numbers follows the list):
- The Reboot Count: How many times did a junior dev challenge a senior lead?
- The Humility Metric: How many times did I say “I don’t know” or “I was wrong”?
- The Dissent Latency: How long did it take for the “obvious flaw” to finally be mentioned?
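
For anyone who wants to count rather than estimate, here is a minimal sketch of what that telemetry could look like as a data structure. The class, field names, and the latency definition are hypothetical illustrations, not an existing tool.

```python
# Hypothetical tracker for the three metrics above.
# The structure, field names, and units are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class MeetingTelemetry:
    decision_locked_at_minute: float            # when the decision was effectively made
    flaw_raised_at_minute: float | None = None  # when the "obvious flaw" was finally named (None: never)
    junior_challenges: int = 0                  # times a junior dev pushed back on a senior lead
    leader_admissions: int = 0                  # times the leader said "I don't know" or "I was wrong"

    @property
    def reboot_count(self) -> int:
        return self.junior_challenges

    @property
    def humility_metric(self) -> int:
        return self.leader_admissions

    @property
    def dissent_latency(self) -> float | None:
        """Minutes between the decision locking in and the flaw surfacing.

        None means the flaw never surfaced in the room at all, which is
        the worst possible reading."""
        if self.flaw_raised_at_minute is None:
            return None
        return self.flaw_raised_at_minute - self.decision_locked_at_minute


# A leaking room: decision locked at minute 10, no challenges, no admissions, no flaw voiced.
m = MeetingTelemetry(decision_locked_at_minute=10.0)
print(m.reboot_count, m.humility_metric, m.dissent_latency)  # 0 0 None
```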
If the answers skew consistently toward the top of the org chart (no challenges from below, no admissions from above, flaws surfacing only after the decision is locked), my environment is leaking. And in a complex system, a safety leak is just a catastrophic failure waiting for a deadline.