Large language models brought a brief wave of excitement to education, followed by a quick sobering. It turned out that although they can produce text and code, they often hallucinate and, when used carelessly, weaken thinking, even to the point of what MIT research calls cognitive debt. Handled sensibly, however, they can transform the way we teach, learn, and collaborate with industry.
From Enthusiasm to Sobering Up
When the first media reports claimed that ChatGPT writes flawless essays and can pass the U.S. bar exam, it seemed that student papers and exams had lost their purpose. Reality set in shortly afterward: the models generate content that sounds convincing but is not always correct. Students often stopped asking questions and verifying, because "AI said so." Teachers, in turn, quickly noticed that such texts can be recognized, not proven beyond doubt, but detected by their tone and the superficiality of their arguments.
Researchers at MIT showed that groups who relied on ChatGPT for their tasks exhibited lower brain activity. Outsourcing thinking over the long term thus creates cognitive debt: the brain is not exercised and loses fitness. Tools that once relieved us of heavy or routine work now create the illusion that we can hand over thinking itself. That is tempting, but without one's own judgment and oversight it leads in the wrong direction.
How to Rebuild Teaching in the AI Era
Bans make no sense; the tools are here to stay. Instead, class time has to be redesigned: rather than two hours of monologue or mechanical drills, we should devote it to working with students on analysis, handling ambiguity, and solving complex problems. The goal is not to hand in a sheet of paper with text or code, but to understand why a solution works and what the alternatives are. Everything that is routine and can be delegated to machines should be done by machines.
For students, AI is the cheapest and most patient tutor available, provided they do not trust it blindly and continually verify its outputs. They can have a term, an equation, or an entire concept explained, a lecture summarized, or hundreds of documents searched. The teacher then need not fight generated notes, but can focus on leading discussion, providing feedback, and checking the quality of students' reasoning. In rapidly changing fields, writing course notes matters less than cultivating the ability to think.
Industry as a Co-creator of Talent
Companies often say they do not want juniors but fully formed seniors with precise domain expertise. Universities cannot deliver that on their own; it is impossible without partnerships. The competitive advantage will go to organizations that enter education from the first years of study, not merely with a logo on a sponsor list. What is needed are real teams and mentors who work with students on projects and teach them to think, not just hand tasks over to AI.
Such a partnership strengthens both business and the academic community: students grow into practical roles more quickly, and companies gain experts who know how to work with AI without surrendering their own minds to it. The task of universities is to concentrate excellence and cultivate thinking; the task of industry is to become its active co-author. Together we can harness the benefits of these tools while avoiding the trap of cognitive debt.