20 Comments
David Mendoza

I enjoyed the article and I appreciate the author taking the time to detail this counter to conventional wisdom.

However, I think conventional wisdom is right on this topic. The highest part of human cognition - the part that moves the needle for human civilization and is responsible for most (if not all) of our advancements - is extremely metabolically expensive. For that reason, humans tend to skirt around it whenever possible. The old adage that “water follows the path of least resistance” describes this phenomenon. You saw it in your Gilgamesh reading group.

So it seems that training the ability to exert cognitive force is important. It’s something we (on average) try to skip if we can, but civilization knows it’s important and so we codify it and mandate it in our formal education structures.

AI is the first real tool that can be used by almost anyone to completely bypass this metabolically expensive yet critical activity. Cheating on homework and tests was situational and a stopgap at best. But generative AI can completely relieve a human mind of its most potent cognitive burden.

I believe this is the root of the concern about AI in the classroom. Whether or not it’s a valid concern is something people can debate. But I think intuitively people sense a trade is being made with the devil.

Anna Dallara

Thanks for the thoughtful reply! You raise good points. Taking the time to think critically and respond thoughtfully to material is energetically expensive, and many people will bypass that effort if they can. Many teachers are seeing more cheating, and that’s a real issue.

I don’t think energy expenditure is the end of the story, however. I still think the real problems are with the underlying education system. Students are a unique mix of unmotivated, exhausted, and hyper-competitive, so of course they’re looking to save energy! AI didn’t cause these problems, but it’s making them more obvious.

I compare my Gilgamesh classmates to the Aristotle reading group I’m in right now. Aristotle is HARD, way harder than Gilgamesh. Yet every single person in this group completes the assigned reading, shows up with questions, and participates in the discussion. They’re willing to expend a tremendous amount of energy on top of their jobs and families because they all chose to be here and value what they’re learning. No one is cheating or using AI to cut corners. This is what education should be.

Of course, changing an entrenched system takes time. In the meantime, teachers are finding creative ways to encourage their students to expend the extra effort. Some are banning all technology from their classrooms and asking students to write by hand during class time. You can read lots of great teaching experiments here on Substack.

Sean Clark

Absolutely bang on! I have been in K-12 public education for nearly 15 years as a teacher and administrator, and I have come to exactly the same conclusions since getting serious about exploring AI. This is a post I wish I had written!

Anna Dallara

Thank you! It's so interesting that we came to the same conclusions as a career educator (you) and an outside observer (me).

Darrell Mac

Another issue is that old habits die hard. This subject won't be just about education; it's going to leak into all domains, especially the workforce. Making humans pivot takes work. Early adopters go first, and everyone else kind of drags their feet. AI agents and agentic systems will be taking actions in 2025 and closing that capability gap VERY soon (see the release of the o3 model). Students and workers will have to embody RAPID comprehension and RAPID adaptation to survive, and that will only be done through AI augmentation. There is no other way. We can split hairs and debate, but that machine will continue to make exponential leaps.

As Dave Shapiro said: Adapt or be made obsolete.

Anna Dallara

It's human nature to resist change. My biggest fears concerning AI aren't about the technology itself but our inability to handle the disruption it will cause. I think the pressure AI will put on our broken systems will be a once-in-a-lifetime opportunity to fix those systems for the better, but I worry we aren't up to the challenge. I think you're right that the next year will start bringing about big changes, for better or worse.

Darrell Mac

Correct. Better we adapt early than adapt late. Moving late in this space, you might not catch up. I've been looking at a lot of chatter about OpenAI's o3 model. The second-order effect will be other companies trying to match that AI, and China trying to match it with their open-source AI research. This is the Fast & Furious part 15 lmao

Griff Wigley

Anna, check out how Khan Academy is using AI. It seems to address the concerns that you raise.

https://www.khanmigo.ai/teachers

HJO Oogink

Everyone needs to know and respect that AI is the assistant; nowadays, that is the only way it works well.

Stephen Fitzpatrick

This is a great post. I tried to get at some of the same ideas, but you articulate them far better than I did - assumptions #2 and #3 really resonated. A lot of the criticism assumes there are an infinite number of teachers out there who can look at and review dozens of drafts of student work. For motivated students, AI can be a force multiplier. I also have consistently found that those most vocal about AI's drawbacks are the least informed about what it can do. That said, AI obviously will not be a panacea and may even make things worse in the short term, but laying the problems of our current education system at the feet of AI is an easy scapegoat.

https://fitzyhistory.substack.com/p/can-technology-transform-schools

Mike Kentz

There are many educators you would enjoy reading on this subject. We are all pushing this conversation forward. Myself, Nick Potkalitsky, Doan Winkel, Jason Gulya, Lance Cummings, and many more. Doan in particular reframes the discussion in a very similar way to how you have here. Nice connecting!

Robert Hiett

Good article. I just wrote one related to AI detectors, and how they are doing a disservice to everyone. I think if AI is used as a tutor, then students can learn responsibly.

My question about education and AI is this: how many schools are providing an AI platform, are teachers and professors being trained on responsible AI, and are students being trained in responsible and ethical AI use? The data I see indicate the answers are largely no, no, and no.

Too many leaders are not learning AI themselves, they let staff use free or personal AI tools so they don't have to pay for them, and no one is getting any training. So I am not surprised when education cracks down on students using AI, but it is the teachers and the students who pay the price.

Amy Letter

I was recently invited to speak with a group of Accounting majors and learned from them that they are actually quite put out that one of their business school professors uses AI to respond to their essays. After some conversation about it we put our finger on the problem: the social contract. The accounting students felt their professor was breaking the social contract, that he was in essence refusing to “hear” them while simultaneously expecting them to listen to him. They were resentful and felt like he had a lot of nerve to expect them to listen to his lectures if he couldn’t take the time to read their work.

I don’t think they expect their essay feedback to be perfect, and maybe it even IS of better technical quality or value than if the professor did it himself. But that didn’t seem to matter to them. They wanted human connection.

Lauren S. Brown

Thank you for this post. Like the commenter below (Sean Clark), I wish I had written it. Like him, I am in public education, 20+ years teaching history. My world was blown up by Covid before ChatGPT. And like with Covid, the incredible power of AI has me going back to the fundamental questions about education: what actually is it? what is it good for? how do I get students to want it?

I agree that all of the assumptions you identify are fallacies. #1 Students, like all of us, are sometimes lazy. But like all human beings, they are curious, too. I wrote about that here: https://laurenbrownoned.substack.com/p/curiosity-in-the-classroom

I winced at #2, recalling one of my seminars in grad school in which I was the only one who had done the reading. Why would anyone be studying U.S. history in grad school and not do the reading, I remember wondering.

And then #3--I followed a few of the links in your article, and one of them led me to a research article that I was interested in but don't have time to read. Thank you ChatGPT!

So to your conclusion, that AI is the scapegoat for problems that have existed for a long time. Yes, yes, yes! Teachers and administrators need to go back to the basics and ask themselves, if the primary function of school is to give kids an education, then what does that education consist of/look like? How do we ensure that it is accessible to all who want it? And how do we help the ones who think they don't want it realize that they should?

Esha Patnaik

A thoughtful piece. I agree with you that AI has helped highlight issues that already existed. It's just easier to blame these issues on AI rather than address the root of the problem, which is poorly designed and maintained educational systems.

At the end of the day, it's still a human somewhere who is using AI, whether it's a teacher, student, or policy maker. How they choose to use it marks the difference between using AI to enhance the learning experience and reducing it to a copy-paste hack job to earn grades.

Amy

You make some good points and I was mostly with you until this line: “Brake misses the fact that our education system was designed in the industrial era to train children to work in factories. Schools have always instilled “productivity” and “efficiency,” and they do so by design.”

This myth that schools were based solely on factory models to train factory workers is simply not true. The history of education in the US is far more complex, including the desire to create a unified citizenry based on commonly shared knowledge.

Lauren S. Brown

Same! Check out this thoughtful piece about the factory model of education: https://fivetwelvethirteen.substack.com/p/dont-talk-to-me-about-the-factory

Craig Van Slyke

Great essay. Many in education would rather AI went away. That's not going to happen. Like a lot of things I've seen over three decades in education, whether we like it or not, AI is here to stay and we have to deal with it. Personally, I see huge potential benefits, but can understand the skeptics. After all, educators are largely overworked and AI is yet another thing to deal with.

On a different note, I'm increasingly concerned about what I'm calling learned passivity. Students just seem to think it's better to wait for help (or just give up) than to try to help themselves. This isn't the same as learned helplessness, and it's not the same as being lazy. Frankly, I'm still trying to make sense of it, but I've talked to many educators who are seeing something similar. (I'm in higher ed, by the way.) Keep up the good work!

Kevin Rice

"If you are not willing to learn, no one can help you. If you are determined to learn, no one can stop you." Zig Ziglar

Roman Baranovic

This is an excellent article. I share your views and conclusions. I really don’t understand why people are so convinced that everything was perfect before and that AI will ruin it. You are absolutely right; students have been cheating since the inception of education. Why is that? Again, you hit the nail on the head—it's because they lack motivation to learn. This could be due to the reading material not aligning with their interests or the math exercises being too challenging. AI has the potential to be a game changer in this regard, provided it is used correctly to personalize learning and enhance motivation. One teacher cannot tailor their approach for a classroom of 30 students, but AI can effortlessly adapt to meet individual needs.
