The conversation nobody wants to have about AI

AI is everywhere right now—conferences, boardrooms, family dinners, LinkedIn feeds. And yet, for all the noise, most of these conversations stay remarkably shallow.

We talk about productivity gains and cost reduction. We debate implementation timelines and which tools to pilot. We put it in roadmaps and call it transformation.

What we rarely do is say the quiet part out loud.

There is a genuine fear underneath a lot of these conversations. I see it in the questions people don’t ask in strategy meetings. I hear it in the careful language executives use when the topic gets too close to home. Nobody wants to be the person who sounds threatened by a technology trend.

But the discomfort is there. It points to something worth taking seriously.


The question we keep avoiding

Most AI discussions anchor on the same familiar ground: efficiency, cost, speed. These are reasonable things to care about. But they are also a way of staying comfortable.

The harder question — the one that actually keeps senior people up at night — is this:

What happens when expertise is no longer the differentiator it used to be?

Careers, hierarchies, and entire organizational structures have been built on the assumption that deep knowledge is scarce and hard to accumulate. AI is quietly dismantling that assumption. Not all at once, and not completely — but enough that the old model of value creation is starting to crack.

When a tool can lift a junior analyst’s output to near-senior quality, when institutional knowledge can be partially replicated at scale, when the gap between knowing and executing narrows — what exactly is a 20-year career worth?

That is not a rhetorical question. It is the real one.


What actually changes

I want to be careful here, because I am not arguing that experience becomes worthless. I don’t believe that.

What I am saying is that its role shifts.

The things that remain genuinely hard — and genuinely human — are not about what you know. They are about what you do with it when the situation is ambiguous, the stakes are high, and there is no clean answer. Defining the right problem before solving it. Holding judgment steady when data is incomplete. Connecting the technical reality to the business consequence and the human cost. Leading in a room where people are uncertain and looking for direction.

Those capabilities don’t get automated. But they do become more important — precisely because everything around them is getting easier.


Where organizations are getting it wrong

My concern is not about the pace of AI development. It’s about the gap between how fast things are changing and how slowly most organizations are actually adjusting.

In too many places, AI is still treated as a project. Something a specific team owns. A pilot running at the edge of the business while the core operating model stays untouched.

That is a comfortable illusion. And it is a dangerous one.

Because this is not an incremental shift; it is a structural one. The organizations that treat it as an innovation initiative will find themselves significantly behind those that embed it into how they fundamentally create value — and how they think about talent, governance, and decision-making.


What I think leaders should actually do

Three things, none of which are particularly glamorous:

Get AI out of the innovation team and into the strategy. If it only lives in a lab or a pilot, it’s not a transformation — it’s a decoration.

Redefine roles before you’re forced to. Waiting for disruption to decide for you is not a neutral position. It’s a high-risk one dressed up as patience.

Invest in judgment, not just capability. Skills can be trained, augmented, or eventually automated. The ability to think clearly under pressure, navigate genuine complexity, and lead with credibility in uncertain environments — that is where the real premium will sit.


A final thought

The biggest risk right now is not that AI moves faster than we expect.

It’s that we keep having the comfortable version of this conversation — productivity, pilots, transformation agendas — while the more fundamental shift happens underneath us.

The fear is real. Most people just aren’t saying it.

The leaders who name it and then act on it with clarity rather than panic will be the ones who shape what comes next.

Everyone else will be catching up.