When Experts Go Quiet

She’s been handling escalated customer complaints for eleven years. She knows the system, the workarounds, the customers who just need to feel heard before anything else can happen. She’s the person newer agents come to when the script doesn’t fit the situation. Last quarter, her organization rolled out an AI triage tool. It routes, summarizes, suggests responses. It’s faster than anything she’s done manually. And she hasn’t said a word about it in three team meetings.

Her silence isn’t obstruction. It’s a question she doesn’t yet feel safe asking out loud: does what I know still matter?

That question is sitting in your organization right now. In every person whose professional identity was built on knowing something — a process, a system, a customer, a machine — and who is now watching a tool arrive that knows it differently, and possibly faster. How leaders respond to that silence will determine whether AI disruption becomes an organizational asset or a quiet hemorrhage of the people who carry institutional knowledge no system has yet learned to replicate.

The Disruption Frame

Here’s what makes this moment different from previous technology cycles: AI doesn’t just automate tasks. It targets expertise itself. Process-driven organizations — manufacturing, logistics, financial services, healthcare operations, customer service — have been built around a simple value proposition: hire experienced people, develop deep specialists, reward those who know. AI inverts that logic. The knowledge that took years to codify is now precisely what these tools are best at absorbing and reproducing.

Rita McGrath saw this structural shift coming before AI made it visceral:

“We used to think of the competitive environment as one of punctuated equilibrium, where there were long periods of stability between disruptions. Now the disruptions are coming closer and closer together. The competitive environment is in perpetual motion.”

— Rita McGrath, Strategy+Business

The question for leaders isn't "How do we implement AI?" It's "How do we build organizations that can keep learning at the speed the environment now demands?" That's a culture question. And culture questions don't get solved with a rollout plan.

 

What’s Actually at Stake for People

The eleven-year veteran isn’t thinking about competitive advantage. She’s thinking about whether her judgment still counts. The manager promoted for deep process knowledge is privately wondering whether he can lead credibly through something he doesn’t yet understand. None of these people are wrong to feel what they feel. And none of them will tell you they’re feeling it unless you’ve made it safe to say so.

Early in my consulting work, a colleague told me the effective consultant has to be “abrasive and dumb.” I didn’t take to abrasive, but I understood dumb immediately: arriving with genuine curiosity rather than performed expertise, asking the question everyone else had stopped asking. My version of abrasive turned out to be asking the uncomfortable question and waiting out the silence rather than rescuing anyone from it. That silence, held with patience, is where honest answers live.

The person most resistant to a new idea is rarely the problem. They’re the signal. Treating the skeptic as a stakeholder before rollout, not an obstacle after it, changes everything. When they’ve been genuinely consulted, you start hearing them say “we talked about this, and here’s why they did it that way.” That sentence carries more adoption currency than any communication plan.

 

Three Principles for Leading Through It

1. Replace Certainty with Curiosity

Expertise culture rewards having the answer. In a period of genuine disruption, performed certainty is exactly the wrong signal. Liz Wiseman puts it plainly:

“Certainty is one of the weakest positions in life. Curiosity is one of the most powerful. Certainty prohibits learning, curiosity fuels change.”

— Liz Wiseman, Rookie Smarts

The leader who says “I don’t know how this is going to play out; here’s how we’re going to figure it out together” is doing more cultural work than any training program. They’re demonstrating that uncertainty is navigable and not-knowing is a starting point, not a failure state. Wiseman’s insight is equally pointed: rookie smarts — curious, unencumbered, willing to ask the obvious question — isn’t a stage to grow out of. In a disruption environment, it’s the most adaptive posture available.

2. Make Psychological Safety Load-Bearing, Not Decorative

Psychological safety gets named in values statements and undermined in team meetings. Amy Edmondson’s research is precise:

“High psychological safety means a high learning quotient — these are teams that are engaged, leaning in, honest, and working hard to create value for the company.”

— Amy Edmondson, The Fearless Organization

The practical translation for Ops VPs: will your team tell you when the AI tool is producing a bad output before it becomes a production problem? Will the veteran who spots a gap between what the model recommends and what experience says speak up, or go along? Those are psychological safety questions dressed in operational clothing. Teams high on accountability but low on safety operate in the anxiety zone — keeping problems to themselves until they become crises. That’s where adoption fails quietly.

3. Reframe Expertise as the Interpretive Layer

AI doesn’t replace expert judgment. It creates an urgent need for it — at a different level. The veteran who has handled escalations for eleven years doesn’t become obsolete when AI summarizes a complaint and suggests a response. She becomes the person who knows whether that suggestion is right for this customer, this relationship, this history. Her expertise moves up the stack, from executional to interpretive.

That reframe needs to be built into governance, not just inspiration. When AI outputs inform decisions, structure the human review: who validated it, what contextual knowledge they applied, where they departed from the recommendation and why. This isn’t bureaucracy; it’s how you protect the interpretive layer and generate data that makes the system smarter over time. As Wiseman puts it: “To generate a big impact, pair someone who wants to change the world with someone who already knows how the world works.”
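The review record described above can be sketched as a simple data structure. This is a minimal illustration, not any particular tool's schema; every class, field, and function name here is hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIOutputReview:
    """One human review of an AI recommendation (all fields illustrative)."""
    case_id: str
    reviewer: str                 # who validated the output
    ai_recommendation: str        # what the tool suggested
    context_applied: str          # contextual knowledge the reviewer brought
    accepted: bool                # did the reviewer follow the suggestion?
    departure_reason: str = ""    # required when accepted is False
    reviewed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def log_review(log: list, review: AIOutputReview) -> None:
    """Append a review, enforcing that departures carry a documented reason."""
    if not review.accepted and not review.departure_reason:
        raise ValueError("Departing from the AI recommendation requires a reason")
    log.append(review)

# Example: a veteran overrides the suggested response for a long-standing customer.
reviews: list[AIOutputReview] = []
log_review(reviews, AIOutputReview(
    case_id="ESC-1042",
    reviewer="j.alvarez",
    ai_recommendation="Send standard refund template",
    context_applied="Open contract dispute; a template response would escalate it",
    accepted=False,
    departure_reason="Routed to account manager per relationship history",
))
```

The point of the sketch is the required departure_reason: capturing why the expert diverged is what turns each override into training signal rather than lost context.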

 

The Organizational Muscle

None of this is a one-time initiative. McGrath’s deeper point — the one most change programs sidestep — is that adaptability itself has to become the durable organizational asset. The companies that navigate AI disruption well won’t be the ones that ran a clean implementation. They’ll be the ones that built the capacity to keep reconfiguring as the environment keeps shifting. Her framing of what that requires in terms of talent is direct:

“You will choose people who are educable rather than people who are deeply specialized.”

— Rita McGrath, Strategy+Business

For the people on your team who say (with genuine exhaustion) “they’re always changing things,” the answer isn’t to slow the pace. It’s to shift what’s stable. In a transient-advantage environment, the stable thing can’t be the role or the process. It has to become the identity: I am someone who figures out new things. That reframe, modeled and reinforced by leaders over time, is what turns disruption from an assault into a confirmation of who your people are.

 

Where to Start

Four actions for leaders navigating this now:

 

Have the honest conversation before the rollout.

Bring your credible skeptics into the design, not the announcement. Their questions improve the approach. Their eventual buy-in carries the room.

Ask what’s hard, not just what’s working.

The gap between those two conversations is where adoption actually lives — and where you’ll find out whether psychological safety is real or decorative.

Build expert validation into governance.

Document who reviewed AI outputs, what they considered, and where they departed from the recommendation. The expert’s judgment, properly captured, becomes part of the learning loop.

Protect learning time explicitly.

If your team is absorbing new tools on top of full workloads with no accommodation, you’re building burnout, not capability. Even an hour a week of protected learning signals that the organization is investing alongside its people.

 

Bearing Check

Back to the customer support veteran. She’s not waiting for a better tool or a clearer policy. She’s waiting for a signal that her judgment is still in the room, that eleven years of knowing this work counts for something, that asking a question won’t mark her as someone who couldn’t keep up.

That signal has to come from leadership, and it comes in the small moments: the leader who admits they’re figuring it out too, who asks the uncomfortable question and waits for the real answer, who says "we talked about this" and means it.

Where’s your leadership moment? It isn’t reserved for the town hall. It’s in the hallways and the conference rooms.

The organizations that come through AI disruption with their people intact won’t be the ones who moved fastest. They’ll be the ones where people trusted each other enough to learn in public.

 

What’s the hardest part of this shift for your team right now? I’d genuinely like to know.
