The Critical Shift In What Differentiates Great Engineers


Hello Reader,

Some years ago, I watched a senior engineer named David present the most technically flawless migration plan I had ever seen. He had spent six weeks mapping every dependency across four microservices, documenting rollback scenarios for each phase, and building a custom monitoring dashboard to track the cutover in real time. His architecture review deck was twenty slides of precision. When he finished presenting to the leadership team, the room went quiet.

The kind where twelve people are thinking the same thing but nobody wants to say it.

A staff engineer named Priya spoke up. "David, the technical plan is solid. But I’ve been in three 1:1s this week where people on the platform team told me they’re exhausted. If we push this migration now, we’re asking a burned-out team to execute something that requires their best work."

David pulled up his risk matrix, showing that he had accounted for resource constraints and built buffer into the timeline.

"That’s not what I’m saying," Priya said. "I’m saying the team doesn’t trust that leadership cares about what the last six months cost them. Launching this now sends a message that the roadmap matters more than the people delivering it."

David’s plan launched on schedule, but three weeks in, two critical engineers quietly transferred to other teams. The remaining engineers, running on fumes, missed a configuration dependency that triggered a four-hour production outage. The migration that was supposed to take eight weeks stretched to fourteen. In the post-incident review, every root cause pointed to execution errors that healthy, engaged engineers would have caught in their sleep.

Priya’s read on the room had been the most accurate technical signal available, but it didn’t come from a dashboard or a dependency graph. It came from paying attention to how people felt.

The End of the Knowledge Advantage

For most of the last century, knowledge was power. Education was the golden ticket to upward mobility, and business valued intellectual horsepower above almost everything else. The ability to access information, analyze it, spot patterns, and produce recommendations created high-paying careers in engineering, finance, law, and medicine. If you could learn more, retain more, and synthesize faster than the person next to you, you won.

That era is ending.

AI has read more documentation than any engineer. More RFCs than any architect. More incident reports than any SRE. It can look at what has been done, written, and published across the known world and produce a synthesis that is comprehensive and fast. Being head smart is becoming a baseline rather than a differentiator for the top roles.

Yue Zhao, a product executive and leadership coach with over twenty years of experience leading product and engineering teams across growth-stage startups and FAANG companies, frames this shift through a lens she calls the three centers of wisdom.

The Head is the center we know best: logic, analysis, synthesis, and pattern recognition, operating through facts and reason. The Heart is the center of connection, creativity, and curiosity, operating through emotions rather than reason. And the Gut is the center of discernment and ethical judgment, the instinct that tells you something is right or wrong when you cannot fully articulate why.

Most high-performing engineers have spent their entire careers growing the wisdom of the Head. In school, we learned critical thinking and built our information database. In early careers, we got in the room by having the best analysis or the most thorough design document. But AI just became the most capable, smartest analyst in every room, and it is available to everyone.

The Myth of Pure Rationality

Engineering culture has spent decades reinforcing a clear hierarchy: data over opinion, logic over emotion, facts over feelings. We tell ourselves that the best decisions come from removing human bias from the equation. Don’t get emotional is not just advice in most engineering organizations, it is an identity marker that signals that you belong in technical spaces, that you are serious and can be trusted with complex systems.

Engineers who pride themselves on rationality genuinely believe they are protecting the quality of decisions by filtering out the noise of human emotion.

But it is a trap.

The trap is not that logic is wrong. The trap is that logic alone is incomplete. When you define rigor exclusively as analytical output, you systematically ignore an entire category of signal that is often more predictive than the data you are analyzing. The team’s morale after a brutal on-call rotation. The unspoken tension between two architects who disagree on a fundamental design choice. The gut feeling that a vendor’s timeline is unrealistic even though their project plan looks clean. These are not distractions from good engineering decisions. They are inputs that determine whether good engineering decisions actually survive contact with reality.

Why Feelings Are Engineering Data

The science behind this is more concrete than most engineers expect. Researchers at UCLA and other institutions have documented that the enteric nervous system, a network of over 100 million neurons lining the gastrointestinal tract, communicates directly with the brain through what neuroscientists call the gut-brain axis. This is not metaphor but bidirectional signaling that influences emotional regulation, pattern recognition, and decision-making at speeds faster than conscious analysis. When experienced engineers describe a gut feeling that something is wrong with a deployment plan, they are accessing a form of rapid, non-conscious pattern recognition performed by neural systems operating below the threshold of deliberate awareness.

Joshua Freedman and his colleagues at Six Seconds published a landmark study in Frontiers in Psychology in November 2025, analyzing emotional intelligence data from 28,000 adults across 166 countries between 2019 and 2024. They found a statistically significant 5.79% decline in global emotional intelligence scores over that period. More striking was the finding that individuals with higher emotional intelligence were more than ten times as likely to report strong outcomes across effectiveness, relationships, quality of life, and well-being combined. In a profession that is already prone to burnout and interpersonal friction, the decline in emotional intelligence is not an abstract concern. It is a measurable erosion of the capabilities that separate engineers who lead from engineers who simply execute.

The uncomfortable truth is that the engineers who actually reach senior and staff levels are not just the smartest. They are the ones who can rally a team through a brutal migration, de-escalate a conflict between two opinionated architects or hold the line on a decision when the data is incomplete and the pressure is real. AI can model empathy, but it cannot feel it. When a team is demoralized after a reorg, when strong personalities disagree on architecture, when someone needs to hear a hard truth about their performance, those moments require a human being who is skilled in reading and managing emotions.

AI has no ethics either. It can reflect the ethical frameworks humans have published, but the judgment of what is right in novel situations where existing rules do not quite fit, when someone has to stand for something important with incomplete information, is the Gut. And it is required at every level of technical leadership.

Five Patterns That Keep Engineers Stuck in the Head

The Data Shield

When a decision feels uncertain, the instinct is to ask for more data. One more analysis. One more benchmark. One more proof of concept. On the surface, this looks like rigor. In practice, it is often a way to avoid the discomfort of making a judgment call when the information is incomplete. Leadership sees someone who cannot commit. The engineer sees themselves as being thorough. The gap between those two perceptions compounds over years, and it quietly caps careers at the senior individual contributor level where analytical depth stops being the bottleneck.

The Emotion Dismissal

When a teammate raises a concern that is rooted in feeling rather than data, the reflex is to redirect. Let’s focus on what the metrics say. This feels like maintaining standards. What it actually does is train the people around you to stop surfacing the early warning signals that prevent the worst failures. The engineers who dismiss emotional input are often the same ones blindsided by attrition, missed deadlines, and team dysfunction that came out of nowhere. It did not come out of nowhere. The signals were there, they were just categorized as noise.

The Conflict Avoidance Loop

Many engineers avoid interpersonal tension by retreating into technical work. When two teams disagree on an API contract, the instinct is to write a more detailed design document rather than sit in a room and navigate the human disagreement underneath the technical one. The document becomes a proxy for a conversation that never happens. Three months later, the integration fails not because the spec was wrong, but because the two teams never actually aligned on priorities. The engineer who wrote the spec cannot understand why their technically correct solution did not survive implementation.

The Ethical Silence

When something feels wrong but the data does not clearly prove it, most engineers stay quiet. A vendor’s timeline feels unrealistic. A product decision seems to cut corners on reliability. A hiring process feels biased but there is no smoking gun. The gut registers discomfort, but the culture says you need evidence before you speak. So the engineer waits for proof that often arrives too late, in the form of an outage, a failed launch, or a discrimination complaint. The cost of waiting for certainty is that you only ever respond to problems. You never prevent them.

The Creativity Drought

AI can generate from what already exists. It cannot truly create. The creative leap that comes from caring deeply about a problem and having the courage to propose something with no precedent is not a capability AI possesses. But engineers who have spent decades training themselves to value only what can be proven and measured slowly lose access to their own creative instincts. They stop proposing bold ideas because bold ideas cannot be justified with existing data. They become optimizers rather than inventors, and they wonder why their work feels increasingly commoditized.

The Language of Leading With More Than Logic

The shift is not about abandoning analytical rigor. It is about expanding what counts as rigor to include the emotional and ethical dimensions that determine whether technically sound decisions actually produce the outcomes they promise.

When you notice tension in a room during an architecture review, name it instead of ignoring it. Saying "There’s clearly strong disagreement here, and I think it’s worth understanding what’s driving it before we commit to a direction" does not make you less technical. It makes you the person who prevented three months of rework because two teams were never actually aligned. When your gut tells you a timeline is unrealistic even though the project plan looks clean, say "My experience pattern-matches this to situations that went sideways, and I want to flag that before we commit" instead of waiting for the data to prove you right after the deadline has already been missed.

Stop treating curiosity as a distraction from productivity. Pay attention to what genuinely interests you, not just what you are good at. The engineers who bring creative energy to problems are the ones who generate solutions that AI cannot replicate, because those solutions come from caring about the problem in a way that no language model can simulate. Let yourself be moved by the work. Show your team that you lead in the face of uncertainty, not in the absence of it.

A Framework for Building Your Other Two Centers

Before high-stakes decisions, practice a deliberate pause. Not to gather more data, but to check in with your body. Notice what is happening physically. Tightness in your stomach signals something your conscious mind has not yet processed. A sense of calm and groundedness signals alignment. This is paying attention to a neural network that processes pattern recognition faster than your prefrontal cortex. Give it thirty seconds of attention before you override it with a spreadsheet.

During meetings and reviews, shift part of your attention from the content to the room. Before you respond to a technical question, ask yourself what the emotional state of the group is right now. Is the team engaged or checked out? Is someone holding back a concern? Is there unspoken frustration that will surface later as passive resistance? Reading these signals and managing them toward a productive outcome is not a soft skill. It is the skill that determines whether your technical decisions actually get implemented by humans who are committed to making them work.

After difficult moments, whether it is a production incident, a tough conversation, or a decision you are not sure about, notice how your body feels and resist the urge to immediately rationalize. Remember the physical sensation of discomfort when something was wrong, and the sensation of clarity when something was right. Over time, this builds a library of somatic data that makes your gut instinct more reliable, not less. You will feel uncomfortable doing this, especially if you have spent your career believing that feelings are noise. The discomfort is not evidence that you should stop. It is the cost of building a capability that AI cannot replicate and that your career increasingly depends on.

From Analysis to Influence

A few months after the failed migration, David’s organization needed to consolidate three legacy monitoring systems into a single observability platform. The technical complexity was comparable to the previous project, and David volunteered to lead it.

This time, he started differently. Before writing a single line of the migration plan, he spent two weeks in 1:1s with every engineer on the platform team. He asked what the last migration had cost them personally. He listened to the frustration, the exhaustion, and the erosion of trust. He did not try to fix it with a better timeline or more buffer.

When he presented the new migration plan to leadership, the technical architecture was just as rigorous as before. But he opened with something he never would have said six months earlier. “The biggest risk to this project is not technical. It’s that this team has been through a difficult year, and they need to know that we’re making this decision with their capacity and wellbeing as a real input, not just their availability on a resource spreadsheet.”

Priya, who was in the room, later told him it was the first time she had seen leadership sit up and actually listen during a migration review. The project finished in seven weeks, under the original estimate.

The technical work was identical in quality to the first migration. The only difference was that he stopped treating human signal as noise and started treating it as the most important data in the room.

The Real Measure of Engineering Leadership

The era of getting ahead by being the smartest person in the room is ending. Not because intelligence stopped mattering, but because intelligence became universally accessible. AI can analyze your system faster than you can. It can read more documentation, generate more options, and synthesize more data in an afternoon than you will process in a month.

Some engineers will influence through the elegance of their architectures. Some through the clarity of their communication. Some through their ability to stay calm when production is on fire. Some through the courage to say what everyone else is thinking but nobody will voice.

But all of them will influence by treating human signal, the emotions, the instincts, the ethical discomfort that cannot be reduced to a metric, as real data that deserves the same rigor they give to system telemetry.

When knowledge is universally accessible, the most important engineering skill may be knowing when to let feelings override logic.

That’s all for this week.

The Influential Engineer

Join 1k+ other forward-thinking professionals who receive the weekly newsletter, where I provide actionable strategies, insights and tools to escape the grind and build influential, future-proof careers. Sign up to get a FREE copy of my 5-Stage playbook to multiply your impact and build a career that AI can't replace.
