There's something unsettling about watching my friends learn their first Hello World with ChatGPT. They hover over the keyboard with excitement, crafting prompts that feel more like negotiating with an oracle than engaging with ideas. The nature of human learning is being rewritten in real-time, and we're all just trying to figure out what it means to know something when statistical models can know so much more.
I've been thinking about this as I watch the numbers roll in. 86% of education organizations now use generative AI, the highest adoption rate of any industry. As for impact, Khan Academy's Khanmigo shows 20% higher learning gains for students who use it consistently. Microsoft reports its AI tools save teachers 40% of their time while boosting student grades by 10%. The data points toward one conclusion: AI is making education more efficient, more personalized, and more effective.
But efficiency was never the deepest question about learning. The question that keeps me wondering is whether we're optimizing our way out of what makes us human.
Where did the “aha” moment go?
Think of the most transformative learning experiences you've had, the ones you believe shaped who you are. I’d wager they all share a common theme: struggle. I recall the night I finally understood calculus, not because I bought a better textbook, but because I worked through almost every problem I could find until it dawned on me. I remember the night I cracked neural networks, not because I used machine learning (ML) libraries to build one, but because I decided to build one from scratch, iterating until it worked.
None of this was efficient. But something happens in that productive frustration: the “aha” moment, the eureka, the feeling that you’ve just solved one of life's great mysteries. It is in those moments that I felt genuine understanding take root in my mind, knowledge and wisdom that have carried me a long way since.
Now consider what happens when we use AI to optimize the struggle away. Sebastian Thrun, one of the pioneers of online education and autonomous vehicles, paints a compelling vision: "Eventually it's going to be a big piece of artificial intelligence that sits there, watches you learn, and helps you pick the right learning venue or task, so you're more effective and have more pleasure." More effective. More pleasant. But what happens to the struggle? Where does the “aha” moment go?
Is learning faster actually learning at all? Daniel Schwartz from Stanford's Graduate School of Education warns that "a lot of AI is also going to automate really bad ways of teaching," but the deeper question is whether it might also optimize away essential human experiences that contribute to growth precisely because of their “inefficiency.”
What we talk about when we talk about learning
What does it mean to understand something? Can you truly know what you haven't struggled to learn? Is wisdom simply intelligence plus experience, or something different?
Data suggests we're creating a generation of students who can produce better work while understanding less about how they produced it. Studies show that when AI assistance is removed, students struggle to maintain their previous performance levels. This suggests dependency rather than skill development. It's like teaching someone to paint by holding their hand while they hold the brush. The result might be beautiful, but did they learn to paint?
Hubert Dreyfus, the philosopher who spent decades critiquing AI's limitations, argued that human expertise emerges from embodied engagement with the world – the "intentional arc" of skilled coping. When a chess master sees a board, they don't calculate moves. Rather, the board shows up as meaningful patterns developed through thousands of hours of play. This kind of knowledge is inseparable from the process of acquiring it.
AI education platforms are optimizing for the appearance of understanding while potentially undermining students’ actual development. They can identify when a student is confused and provide just the right hint, select the perfect practice problem, and generate personalized explanations. But in doing so, they might be preventing students from developing their intentional arc – their way of letting problems show up as opportunities for growth.
The species that learns to learn
What makes humans different from animals? For one, we’re the species that has learned to learn. We don’t just acquire information; we transform ourselves through the process of learning. We don’t just solve problems; we become the kind of beings that can solve those problems.
The research on uniquely human capacities reveals something intriguing about what we risk losing. MIT researchers developed the EPOCH framework – Empathy, Presence, Opinion, Creativity, and Hope – identifying what remains distinctly human even as AI capabilities expand. These aren't skills; they're ways of being in the world that emerge from the fundamentally human experience of meaning-making.
True creativity, the research shows, involves "creative empathy" – the ability to generate novel representations of others' mental states. It's not pattern matching or clever recombination, but the distinctly human capacity to imagine what it's like to be someone else and create something new from that understanding. When students taught empathy showed 78% increases in creativity scores, it revealed the interconnected nature of these supposedly separate capabilities.
This suggests something crucial about learning: the goal isn't just to acquire knowledge, but to become the kind of person who can create knowledge, empathize with others, and transform themselves in the process.
The cascade of obsolescence
The workforce data paints a picture of unprecedented transformation. 39% of core job skills will change by 2030, with potentially 400-800 million workers globally needing to transition to new occupations. But the numbers don't capture the psychological vertigo of watching your professional identity dissolve in real-time.
I think about the radiologists learning that AI can diagnose X-rays more accurately than they can, or the legal researchers discovering that AI can review documents faster and more thoroughly. It's not just that their jobs are changing – it's that the thing they spent decades becoming excellent at is being redefined as a computational problem.
Amazon's $1.2 billion investment in workforce development, retraining 250,000 employees across 14 countries, underscores the new reality that continuous reinvention isn't a career choice anymore; it's a survival requirement. But I wonder: Are we retraining people to be human in an AI world, or are we training them to be better AI assistants?
The generational divide is stark. 75% of Gen Z use generative AI at work, and they're building their professional identities around human-AI collaboration from the start. They're comfortable with AI assistance in ways that would have seemed impossible to previous generations. But they're also the first generation to grow up questioning whether their natural intelligence has any unique value.
When work has been humanity's primary source of meaning, purpose, and identity for centuries, what happens when that work can be done better by machines?
The paradox of personalized learning
The promise of AI in education is enthralling: finally, truly personalized learning for every student. DreamBox's Intelligent Adaptive Learning system adjusts to every student interaction within and between lessons. Coursera's AI coach provides real-time feedback tailored to individual learning styles. Khan Academy's Khanmigo offers Socratic dialogue customized to each student's conceptual gaps.
But personalization optimized by algorithms might be precisely the opposite of what human learning requires. Real learning often happens through collision with perspectives that challenge our existing frameworks. It's the unexpected connection, the comment from a classmate that shifts everything, the book recommendation that changes your life – none of which can be predicted by analyzing your learning patterns.
The research shows that students rate AI tutors as more helpful than human tutors, and the learning outcomes support this preference. But effectiveness is not the same as learning. What if the most important learning happens not when we get exactly what we need, but when we encounter exactly what we didn't know we needed?
There's something profound about learning in community with other humans – the way ideas emerge from dialogue, the way understanding develops through teaching others, the way wisdom grows through witnessing different approaches to the same problem. AI can optimize for individual learning efficiency, but it struggles with the essentially social nature of human understanding.
Preserving our humanity
Don’t get me wrong – I support the use of AI models (I build them!). However, we must consider carefully what the adoption of AI means for us – our lives, our future, our humanity. The technological trajectory seems clear: AI will become more capable, more embedded in public systems, more central to how humans live. The question was never whether this will happen, but how to nudge it in the direction that best preserves our humanity.
The answer might lie in treating AI as a mirror: one dimension of our world that forces us to examine the other dimensions that define us as human. AI is merely a statistical model, capable of mimicking how we go about our daily lives. It cannot feel emotions, cannot be aware of its physical presence in space and time, and cannot exercise the “intuition” that so many of us rely on.
The research on authentic learning points toward some principles: Learning that develops human agency rather than undermining it. Education that preserves space for struggle and discovery. AI assistance that enhances human capabilities without replacing the processes that develop those capabilities. Tools that make us more human, not less.
This might mean designing AI educational systems that deliberately introduce productive obstacles, that create space for genuine confusion and discovery, and that prioritize depth of growth over speed of knowledge acquisition. It means keeping human teachers at the center of the dimensions AI does not possess. It also means recognizing that the goal of education isn't to produce humans who can compete with machines, but humans who can most fully develop their humanity in a world full of capable machines.
What remains
In the end, I keep returning to the question: What does it mean to learn? Not just to acquire information or develop skills, but to undergo the kind of transformation that changes who you are?
The data suggests that AI can make us more efficient learners, but one may argue that efficiency might be antithetical to the deepest forms of human learning. Real learning – the kind that creates wisdom, empathy, and authentic understanding – might require precisely the inefficiencies that AI is designed to optimize away.
As we navigate this transition, perhaps our task isn't to preserve human learning against AI, but to integrate human learning in relationship with AI. To become the species that learns not just to know, but to be. To develop not just intelligence, but wisdom. To cultivate not just capability, but meaning.
Learning through AI is a signal of a new form of human-machine collaboration. The question we face is whether we can guide this collaboration toward what's most valuable about human learning, or whether we'll optimize our way into a more efficient but less human future.
The learning species is in transition. What we become depends on the choices we make today about what we're willing to preserve, what we're excited to enhance, and what we're wise enough to leave unchanged. The future of human learning isn't just about adapting to AI – it's about ensuring that adaptation serves our deepest human purposes rather than optimizing them away.