AI can be 95% right and 100% wrong
- Liza Engel

- Nov 17, 2025
Last week I was speaking to a room full of students. I was speaking in English, and most of them had French, Italian, or Swiss-German as their first language. Many had their laptops open. Were they watching Netflix, or were they translating and summarizing me?
I want to give them the benefit of the doubt and assume the latter is true.
So I wondered: how would AI summarize a live talk?
To find out, I ran one of my recent talks through an AI tool.
It nailed every fact - but missed the point entirely.
The data captured the what.
It erased the why.
That’s the danger of efficiency: we risk losing the emotional thread that makes communication matter.

When accuracy isn’t understanding
AI is extraordinary at precision; it can summarize, categorize and quantify language faster than we can read it.
But understanding? That’s different.
In the AI’s summary of the talk, it listed goals, metrics and key initiatives perfectly. But it missed the tone of the room: the nervous laughter, the pride in small wins, the moment of pause when someone said, “We’re not there yet, but we’re closer than we’ve ever been.”
Facts alone can’t carry meaning.
Leaders don’t just share information - they shape interpretation.
The leadership risk of over‑trusting accuracy
It’s easy to assume that if AI gets most things right, it must be trustworthy.
But “95% right” can still be “100% wrong” if the 5% it misses is the human meaning.
That missing 5% is often the part that changes minds, builds trust or inspires action.
We don’t remember information; we remember how it made us feel.
What this means for you
This isn’t just a communication insight - it’s a leadership imperative.
In a world increasingly navigated by algorithms and dashboards, leaders need to guard against the illusion that data equals understanding. If you delegate interpretation to AI, you risk missing the very signals that drive trust, morale and motivation.
The best leaders won’t just ask, “What does the data say?” They’ll ask, “What does this mean for the people involved?” and “What’s missing between the lines?”
Empathy is not a soft skill - it is a leadership differentiator. And no algorithm can replicate your voice.
Keep the machine for precision. Keep yourself for meaning.
AI is a superb analyst. It can save time, highlight patterns and surface what matters most in the data.
But it’s your job, as a communicator and a leader, to be the human at the table.
Next time you review an AI summary, ask yourself:
Does this reflect the intent and emotion behind the words?
Would the people who were there recognize themselves in this version?
What truth might live between the lines?
If any of those answers feel off, that’s your signal to step back in.
The responsible step
Go ahead and let AI take care of efficiency, but please don’t let it define understanding.
Next time you use AI to analyze, summarize, or report, treat it as a conversation partner, not the final authority.
Ask it for facts. Then ask yourself for meaning.
Leadership doesn’t reside in accuracy alone; it resides in empathy, context, and truth.