We Are the Last Generation to Have Learned Without AI. Here's What That Means.
The people building AI systems today are the last ones who learned everything the hard way — through friction, failure, and slow accumulation. What we carry is not just knowledge. It is a different kind of knowing.
I learned to code in 2013 by being confused, searching for help, being confused again, and then eventually understanding something. The understanding was slow. It was uncomfortable. It was also, I now think, irreplaceable.
My nephew is twelve. He has never written code without an AI suggesting the next line. He has never had to hold a problem in his head for hours, turning it over, waiting for the angle that makes it solvable. He does not know what that feels like. He will never know what that feels like.
I do not think this makes him worse. I do not think it makes him better. I think it makes him different in a way we do not yet have the language to describe.
What Was the Friction Actually Building?
When I was learning, the friction was the entire education.
I would get stuck on a problem for an afternoon. I would try six wrong approaches before finding one that worked. I would understand, eventually, not just the answer but the shape of the problem — why the wrong approaches were wrong, what constraints made the correct approach correct. This understanding was not in any documentation. It was built through the experience of the thing resisting me.
Cognitive science calls this desirable difficulty — the finding that learning is more durable when it requires effort. Problems that are easy to solve are easy to forget. Problems that cost you something tend to stay.
I am not romanticising the suffering. I would have happily had an AI in 2013. But I am saying that the cost produced something: a kind of structural understanding that is different from knowing the answer. The difference between having found your way through a city on foot and having been driven through it. You know the city differently. You cannot always explain how, but you do.
What Will the Next Generation Have Instead?
This is the question nobody is asking clearly: if the friction built something, and we remove the friction, what replaces what it built?
The honest answer is: we do not know yet.
What the next generation will have is something different. They will interact with AI systems from the beginning of their cognitive development. They will learn to frame problems, to evaluate outputs, to direct systems toward goals — skills that my generation is learning in our thirties and forties after learning the previous way first. They will likely be better at certain things than we are. They will likely be worse at others.
Research from the Stanford Human-Centered AI Institute and others is beginning to track how AI tools affect learning outcomes. Early results are mixed in ways that should make us careful: AI assistance improves performance on measured tasks while sometimes reducing the deep processing that makes knowledge transferable to unmeasured ones.
The question is what we are optimising for. If we optimise for task completion, AI-assisted learning is superior. If we optimise for the ability to operate in genuinely novel situations where no prior pattern applies, the picture is less clear.
The Loop We Are Already In
Here is something I think about regularly.
The AI systems being built today were trained on content produced by people who learned without AI. The code on GitHub, the explanations on Stack Overflow, the documentation written by people who understood systems through years of working with them — this is what current AI knows.
That data captured something: the thinking of humans who built their understanding through friction.
The data being generated now is different. Developers who use AI agents think differently. They reach for a prompt before they reach for a solution. They work in handoffs and reviews rather than line-by-line construction. This is not worse. But it is a different cognitive texture, and that texture is becoming the training data for the next generation of models.
We are not, in any simple sense, training AI on human thought anymore. We are training AI on human thought that has already been shaped by AI. The feedback loop is closed. Nobody fully knows what it produces.
What We Should Preserve
I have a position on this, though I hold it carefully because I could be wrong.
The thing worth preserving is not syntax knowledge. Let syntax go — I am happy to never memorise another API. The thing worth preserving is the experience of genuine difficulty. Of holding a hard problem until it yields. Of building understanding through resistance rather than through assistance.
This probably means designing learning environments where AI is simply not available — not as punishment, but as deliberate friction. The same way athletes train without the equipment they use in competition. Not because the equipment is bad but because the body needs to know what it is like without it.
It probably also means being honest with young people about what AI can and cannot do — specifically, that it can produce answers without building the understanding that makes answers useful in new contexts.
I do not know how to preserve this at scale. I am not sure it is possible at scale. But I think the attempt matters.
What the Last Generation Carries
My generation — the last to have learned without AI — carries something that will become scarce.
Not knowledge. Knowledge is abundant now, and search made it abundant long before AI did. What we carry is something harder to name: the feel of a system, the instinct that something is wrong before you can say why, the judgment built in years when there was no alternative to building it slowly.
This will matter most in the situations AI cannot handle — the genuinely novel problems, the edge cases outside any training distribution, the moments when what is needed is not a pattern from prior experience but the ability to reason from first principles in a context that has no precedent.
Those moments will still come. They always do. They are, arguably, the only moments that really matter.