The arrival of generative artificial intelligence in schools has prompted understandable concern among educators and policymakers. Recent research warns that when students rely too heavily on AI, they may outsource the very intellectual work required to build knowledge and develop critical thinking. These concerns deserve serious attention. But leadership in schools requires more than responding to immediate risks. It requires asking deeper questions about the future of learning.
Two recent publications capture this moment well. A study from the MIT Media Lab, Your Brain on ChatGPT, uses electroencephalography to investigate how using a large language model affects brain activity during essay writing (Kosmyna et al., 2025). Separately, a report from the Australian Network for Quality Digital Education uses the concept of cognitive offloading to examine how AI may shift the cognitive effort involved in learning (Lodge & Loble, 2026). Both ask what happens to learning when powerful tools make intellectual tasks easier.
Reference Documents
Kosmyna, N., et al. (2025). Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task. MIT Media Lab. arXiv:2506.08872v2.
Lodge, J. M., & Loble, L. (2026). Artificial Intelligence, Cognitive Offloading and Implications for Education. University of Technology Sydney / Network for Quality Digital Education. doi:10.71741/4pyxmbnjaq.31302475.
However, these findings do not tell the whole story. They reveal what happens when AI is used poorly, not what happens when it is used well. For school leaders, the central question is not whether AI should be allowed or banned. It is what kinds of learning experiences we want to cultivate, and how AI might serve that vision.
What the Research Tells Us
The MIT study assigned participants to three groups: one writing with an LLM, one using a search engine, and one writing unaided. Over four sessions, the researchers tracked brain connectivity using EEG. Connectivity systematically scaled down with the amount of external support: the brain-only group exhibited the strongest, widest-ranging neural networks, while the LLM group showed the weakest overall coupling. By Session 3, LLM users were mostly copy-pasting with minimal editing, reported a diminished sense of ownership of their work, and struggled to recall what they had written only minutes earlier.
The Lodge and Loble report approaches the same territory through a different lens. Drawing on decades of cognitive science research, they distinguish between beneficial cognitive offloading, where tools free up mental resources for higher-order thinking, and detrimental offloading, where tools allow learners to bypass the effort required to build durable knowledge. The report introduces the idea of a "performance paradox": AI can make students appear competent in the moment while quietly undermining the development of genuine understanding.
The risk is not simply that students complete tasks with less effort. The deeper concern is that they may fail to engage in the intellectual struggle required to develop understanding.
These warnings point to a real challenge. If AI becomes an answer-generating machine that students use to complete assignments, then learning will suffer. Used that way, the technology encourages passivity rather than intellectual growth. But these findings describe a particular use of AI, not AI itself.
The Leadership Question
Ken Kahn's The Learner's Apprentice: AI and the Amplification of Human Creativity reframes the conversation. Rather than viewing AI as a replacement for thinking, Kahn proposes that AI should function as a learner's apprentice: a responsive partner that helps learners explore ideas, test hypotheses, and extend their creative capacities. This moves the discussion from anxiety about technology toward a vision of learning that prepares students for a world where human creativity and machine intelligence increasingly interact.
The goal of education is not to ensure that students complete tasks without technological assistance. The goal is to help students develop the knowledge, creativity, and judgement needed to thrive in a complex world. AI can either weaken that goal or strengthen it, depending entirely on how learning is designed.
From Task Completion to Creative Inquiry
One of the most important leadership challenges in the age of AI is moving schools away from a culture of task completion and toward a culture of inquiry and creation. Many traditional assignments focus on producing a specific output: an essay, a summary, a worksheet. These are precisely the tasks that generative AI can perform quickly and convincingly, and when students use AI to complete them, the concerns raised by the MIT and UTS research are entirely justified.
But when learning is organised around exploration, design, and problem-solving, AI becomes a different kind of tool. Instead of generating answers, it helps learners investigate questions. Students studying environmental science might use AI to simulate how changes in policy affect carbon emissions across regions. In literature, students might use AI to generate alternative interpretations of a text, critique those interpretations, and defend their own arguments in response. In technology or design courses, learners might collaborate with AI to prototype applications or design tools that address real-world problems. In each case, AI does not replace thinking. It expands the intellectual terrain students can explore.
AI can help learners explore conceptual spaces that would otherwise remain inaccessible.
Kahn emphasises that AI systems can generate possibilities, offer suggestions, and respond dynamically to learners' ideas. This creates opportunities for iterative thinking: testing ideas rapidly, receiving feedback, and refining work through multiple cycles of experimentation. For school leaders, this insight should shape how technology is introduced into classrooms. The focus should not be on efficiency or automation. It should be on expanding the intellectual possibilities available to learners.
Developing Judgement and Agency
AI systems can generate convincing text, persuasive arguments, and plausible explanations. But they are not infallible. They produce errors, biases, and misleading conclusions. Students must learn to interrogate AI outputs critically: to ask whether an explanation makes sense, whether a claim is supported by evidence, and whether alternative interpretations should be considered.
Kahn highlights that effective use of AI requires learners to remain the architects of their own thinking. AI can provide ideas and suggestions, but the learner must evaluate them. Lodge and Loble make a complementary point: the key distinction is between students who use AI to avoid cognitive effort and those who use it to redirect that effort toward higher-order thinking. The latter represents genuine augmentation. The former represents atrophy.
For school leaders, this means that AI integration must include deliberate opportunities for students to practise evaluation, critique, and reflection rather than accept AI-generated responses at face value.
What School Leaders Should Move Toward
The emergence of AI invites school leaders to rethink several aspects of learning.
Curriculum and assessment must evolve. Tasks that focus primarily on reproducing information or producing predictable outputs will increasingly be completed by machines. Learning experiences should instead emphasise problem-solving, design thinking, and intellectual exploration.
Teachers must be supported in redesigning learning. Professional learning should focus on how AI can deepen inquiry, stimulate creativity, and support metacognitive reflection, not simply on what students should or should not be allowed to do with it.
Schools should cultivate cultures of experimentation. AI is a rapidly evolving technology. Educators will learn most effectively by trying new approaches, reflecting on outcomes, and sharing insights across teams.
Leaders must communicate a clear vision. The conversation about AI should not be dominated by fear or restriction. Instead, it should emphasise the opportunity to develop new forms of learning that prepare students for the future.
Serving the Learner
At its best, education equips young people with the confidence and creativity to shape the world around them. AI will become part of the intellectual landscape in which they work, create, and collaborate. The responsibility of schools is not to shield learners from this technology. It is to help them use it wisely.
When AI is framed as a learner's apprentice rather than an answer generator, it becomes a tool for exploration rather than avoidance. It supports the development of curiosity, creativity, and intellectual independence.
AI will change learning. That change is already underway. Whether schools lead that transformation thoughtfully will determine whether human creativity, curiosity, and judgement remain at the heart of education.