On wisdom, curiosity, and why human teachers matter more than ever
We live in the first moment in human history where getting an answer requires almost no effort. A student can ask any question — about quantum entanglement, the causes of the First World War, or how to structure a business plan — and receive a coherent, well-reasoned response in seconds. The age of information scarcity is over. We have entered the age of infinite answers.
And yet anxiety, confusion, and a sense of purposelessness among young people are rising, not falling. Students report feeling more uncertain about their futures than any generation before them, even as they carry supercomputers in their pockets. This is not a coincidence. It reveals something important: humans don’t just need answers. They need the struggle of finding them.
“The friction of not-knowing is where character, resilience, and genuine understanding are built. AI removes that friction. Wisdom requires it.”
This is the central paradox of our moment. And it is why, even as artificial intelligence transforms every domain of human knowledge work, the question of what humans still offer — to education, to each other, and to the future — has never been more urgent or more answerable.
What AI Actually Is — and Isn’t
Artificial intelligence, at its core, is a very sophisticated pattern-matching engine. It has ingested an extraordinary volume of human knowledge — books, papers, conversations, code — and learned to retrieve, recombine, and articulate that knowledge with impressive fluency. Ask it a question and it will give you the most statistically probable answer, drawn from the collective record of human thought.
But there is a profound difference between knowing about something and having lived through it. A large language model can describe grief with clinical accuracy. It has never lost anyone. It can explain the physics of a crumbling bridge. It has never stood on one, heart in its throat, responsible for the lives of those who will cross it tomorrow. It can generate a lecture on ethical leadership. It has never had to make an impossible decision under pressure and live with the consequences.
This is not a flaw that will be engineered away in the next model release. It is a structural feature of what AI is: a mirror of human knowledge, not a source of human experience. And experience — earned, embodied, sometimes painful — is precisely where wisdom lives.
The Five Things Humans Offer That AI Cannot
1. Wisdom Earned Through Failure
Knowledge can be transmitted. Wisdom cannot — it has to be forged. When a senior engineer tells a young graduate, “I’ve seen this exact overconfidence before, and here is what happens,” that carries a weight no AI response can replicate. The scar tissue of experience is real information. It changes the quality of attention a student brings to what they are about to attempt.
This is why mentorship is not a soft complement to technical education. It is, in many ways, the most important form of technical education. The difference between a graduate who knows the equations and an engineer who builds things that last is almost always a human being who showed them something books couldn’t.
2. Asking the Questions That Haven’t Been Asked Yet
AI is extraordinarily good at answering questions. It is far weaker at questioning the question itself — at sensing that the framing is wrong, that the problem being solved is not the real problem, that something important has been left out of the specification. The greatest scientists and engineers in history were not answer machines. They were people disturbed enough by something to keep asking why when everyone else had moved on.
Feynman’s genius was not that he knew more physics than anyone else. It was that he was constitutionally incapable of pretending to understand something he didn’t. That quality — intellectual honesty combined with relentless curiosity — is what humans model for each other, and what no training data can fully encode.
3. Genuine Curiosity as a Moral Act
Human curiosity is not just a cognitive function. It is a commitment. When a teacher is genuinely curious about a student — about what is confusing them, what lights them up, what they are quietly afraid of — that caring changes the quality of attention they bring. It creates the conditions in which a student will attempt something difficult. AI can simulate interest. Humans give it.
4. Contextual Judgment Under Uncertainty
AI gives you the most probable answer. Real wisdom is knowing when the improbable answer is correct. This requires something AI doesn’t have: skin in the game. A doctor who has made a wrong diagnosis and lived with it carries knowledge that no training data can fully encode. A project manager who has watched a team collapse under impossible deadlines knows things about human limits that no management textbook captures.
These forms of judgment — earned through consequence, sharpened by accountability — are precisely what our most complex and highest-stakes decisions require.
5. Wisdom as Witness
One of the most underappreciated functions of a teacher or mentor is simply witnessing. Being seen by someone who has lived longer, suffered more, and still chose to care — that is transformative in ways that have nothing to do with information transfer. A student who feels genuinely seen by a mentor will attempt things they would never try alone. AI can validate. It cannot truly witness.
The Guru Paradox — An Indian Perspective
India carries a 5,000-year tradition of the guru-shishya relationship — one built not on the transfer of information but on transformation. The guru did not merely teach mathematics or philosophy. They modeled a way of being: how to face uncertainty with equanimity, how to hold knowledge lightly, how to remain humble before the vastness of what is not yet known.
This tradition understood something that modern education largely forgot and is now being painfully reminded of: the medium of education is the relationship, not the content. What a student learns from a teacher is inseparable from who that teacher is, how they carry their expertise, what they do when they don’t know the answer.
“AI can be a tutor. It cannot be a guru. The distinction matters enormously — and India’s philosophical tradition gives it a head start in understanding why.”
As Indian universities grapple with the integration of AI into engineering and science education, this tradition is not a romantic historical footnote. It is a practical guide. The institutions that will produce the next generation of great Indian engineers and scientists will be those that use AI to handle what AI does well — information delivery, problem practice, feedback at scale — while protecting and investing in what only humans can provide.
The Coming Bifurcation
We are heading toward a world with two kinds of people. Those who use AI as a thinking partner — to challenge their ideas, stress-test their reasoning, explore possibilities they hadn’t considered — and those who use it as a thinking replacement. The first group will become sharper, more curious, more capable. The second will gradually hollow out their own cognitive capacity, outsourcing the very struggle that builds competence.
The OECD’s 2026 Digital Education Outlook found something alarming: students with access to general-purpose AI tools produce higher-quality outputs than their peers — but this advantage disappears, and sometimes reverses, when access is removed. They perform better without understanding more. The tool masks the gap rather than closing it.
This is the real crisis in education. Not that AI will replace teachers. But that students will let AI replace their own thinking — and no one will notice until it is too late.
The question for educators — in every discipline, at every level — is which group they are training their students to join. That is not a technical question. It is a philosophical one. And answering it well requires exactly the kind of human wisdom and moral imagination that no AI can supply.
A Provocation to Close
If AI can answer every question, what is a university for?
The honest answer is that a university was never really about answers. It was about forming people who know how to live with questions — with rigor, with ethics, with wonder. It was about placing young minds in contact with older minds that had been broken by difficulty and rebuilt into something stronger. It was about transmission, not of data, but of a certain quality of attention to the world.
That mission is not threatened by artificial intelligence. If anything, it is more urgent than ever. Because in a world of infinite answers, the ability to ask the right question — with all the humanity, judgment, and hard-won wisdom that requires — is the rarest and most valuable thing a person can possess.
“When information becomes infinitely abundant and cheap, what becomes scarce and precious is the quality of questions.”
AI will answer most questions faster and more accurately than most humans. That is simply true, and there is no use pretending otherwise. But the right questions — the meaningful questions, the ones that shape what we build, how we live, and who we become — still need a human being standing behind them. Someone who has something at stake. Someone who has lived long enough to know what matters.
That is not a small thing. That may be the thing.
Agendra is the Managing Director at Esri India and writes on technology, education, and leadership.