Are we preparing students to be chefs or cooks?
Over 7 years ago, I first posed this question in a YouTube video, which you can watch below. I had recently lost my brother to cancer and was wrestling with all the ways he impacted my life and changed my mindset on learning and leading.
This was pre-pandemic, pre-AI, and so much has changed since then…
Yet, the message has never been more important.
This week, an article titled "Something Big Is Happening" went extremely viral (millions and millions of views).
Yes, some folks are calling it alarmist extremism about AI, and others are questioning whether it is content marketing disguised as a thought piece.
But hidden in this article was a passage that made me think about the difference between Cooks and Chefs, and what we are doing in schools. It's long, but worth reading and thinking about:
Here's the thing nobody outside of tech quite understands yet: the reason so many people in the industry are sounding the alarm right now is because this already happened to us. We're not making predictions. We're telling you what already occurred in our own jobs, and warning you that you're next.
For years, AI had been improving steadily. Big jumps here and there, but each big jump was spaced out enough that you could absorb them as they came. Then in 2025, new techniques for building these models unlocked a much faster pace of progress. And then it got even faster. And then faster again. Each new model wasn't just better than the last... it was better by a wider margin, and the time between new model releases was shorter. I was using AI more and more, going back and forth with it less and less, watching it handle things I used to think required my expertise.
Then, on February 5th, two major AI labs released new models on the same day: GPT-5.3 Codex from OpenAI, and Opus 4.6 from Anthropic (the makers of Claude, one of the main competitors to ChatGPT). And something clicked. Not like a light switch... more like the moment you realize the water has been rising around you and is now at your chest.
I am no longer needed for the actual technical work of my job. I describe what I want built, in plain English, and it just... appears. Not a rough draft I need to fix. The finished thing. I tell the AI what I want, walk away from my computer for four hours, and come back to find the work done. Done well, done better than I would have done it myself, with no corrections needed. A couple of months ago, I was going back and forth with the AI, guiding it, making edits. Now I just describe the outcome and leave.
Let me give you an example so you can understand what this actually looks like in practice. I'll tell the AI: "I want to build this app. Here's what it should do, here's roughly what it should look like. Figure out the user flow, the design, all of it." And it does. It writes tens of thousands of lines of code. Then, and this is the part that would have been unthinkable a year ago, it opens the app itself. It clicks through the buttons. It tests the features. It uses the app the way a person would. If it doesn't like how something looks or feels, it goes back and changes it, on its own. It iterates, like a developer would, fixing and refining until it's satisfied. Only once it has decided the app meets its own standards does it come back to me and say: "It's ready for you to test." And when I test it, it's usually perfect.
I'm not exaggerating. That is what my Monday looked like this week.
What does this mean for all of us in education? For those of us who care about kids and their learning experiences?
1. The "standard playbook" is breaking in real-time
The traditional path of good grades → good college → stable professional job is pointing students directly toward the roles most vulnerable to AI displacement. Within the next 1-3 years, we need to fundamentally rethink what we're preparing students for, and, just as importantly, what we are not helping them prepare for. The jobs that looked like safe bets (lawyer, accountant, engineer, doctor, analyst, even software engineer) are experiencing rapid transformation right now.
This doesn't mean our current educational curriculum and process don't matter, but the focus needs to shift dramatically toward teaching students to work fluently with AI tools, developing genuine curiosity and adaptability over credentials, and helping them pursue things they're actually passionate about (since the stable career ladder they're climbing may not exist when they graduate).
2. AI Fluency is becoming as fundamental as reading—and you have a narrow window
The article emphasizes that being early to understand and use these tools is the single biggest advantage right now. For K-12 education, this means we have maybe 1-3 years to build AI fluency into our curriculum before it becomes a baseline expectation rather than a competitive advantage.
Students need hands-on experience using AI for real work. This is not just learning about AI, but actively using current tools to research, write, analyze, and create.
The students who graduate comfortable iterating with AI, who know how to get useful output, and who understand what these tools can and can't do will have a massive advantage over peers who treated AI as off-limits or as just a way to cheat on homework.
This is happening now, not in 10 years. The disruption the article describes is already underway in professional fields, and it will reach your students faster than the typical education system can adapt.
Of course, we should push back. Using AI for everything is ridiculous. Human interaction, learning experiences, discussions, and feedback become even more important in a world with all of this technology. As I've argued over and over and over again: we need to be intentional about creating AI-Resistant learning experiences.
But to ignore it, to act like it's not that big of a deal, or to insist it can't do MY job, is just denying the change already happening all around us.
This is the CHEF moment. If anyone can be a Cook right now in almost any field, who will become the Chefs?