The classroom is no longer analogue, and neither is the student

Generative AI has finally caught up with the complexities of education, reshaping how we teach, learn and lead. But the sector faces tough choices about equity, pedagogy, and the ethical infrastructure of learning in a post-ChatGPT world.

Education has always resisted easy automation. Unlike finance or logistics, it deals less in numbers and more in nuance. Language, ambiguity, emotion, and collaboration are its raw materials. For decades, artificial intelligence struggled to parse these complexities. Then came generative AI. Within months, the classroom moved from the periphery of digital transformation to its centre.

Nicolaas Matthijs, Chief Product Officer at Anthology, recalls the precise moment the shift became clear. “I still remember sitting at the table using ChatGPT for the first time,” he says. “That was when I realised this was designed for the kind of data education actually has: unstructured, natural-language-rich content. Our lives changed that week. We just did not know it yet.”

Moving from literacy to fluency

If education is to prepare students for a future shaped by AI, it must teach them not only how to use these systems but also how to question them. This goes beyond digital skills or prompt engineering. It is about critical engagement, ethical awareness, and experiential understanding of what AI can, and cannot, do.

“Institutions now must build AI literacy,” Matthijs says. “Students must experience it, understand its strengths and weaknesses, and develop a healthy scepticism. Only then can they use it responsibly in the workforce and society.” That scepticism, he argues, cannot be taught from theory alone. It must come from hands-on interaction, experiencing bias, witnessing hallucinations, seeing both the brilliance and the blind spots in real time.

Prompt engineering, he notes, is already evolving. While it may be a helpful skill today, it could eventually become a specialised function embedded into tools themselves. What will remain essential is the ability to assess the outputs of AI and place them in context. Systems thinking, algorithmic transparency, and awareness of the social impact of automation must all be woven into the learning process, even if the tools themselves continue to abstract away the technical detail.

Rewriting pedagogy, not replacing teachers

One of the most persistent fears in education is that AI will displace teachers. Yet Matthijs remains convinced that augmentation, not automation, will define the future. The educator’s role is changing, not disappearing.

“There is no sign that instructors are going anywhere,” Matthijs says. “But their day-to-day will evolve. AI will help manage course logistics, accelerate feedback, and support assessment. That frees up more time for mentoring, coaching, and direct student engagement, things AI cannot replicate.”

This shift also forces a rethink of how learning is assessed. As generative tools become increasingly embedded in everyday software, the line between acceptable and unacceptable AI use in assignments blurs. Some tasks will allow it; others will explicitly ban it. Institutions will need to strike a balance between authentic human expression and the responsible integration of AI. This may mean a return to in-class assessment in some cases, or more project-based and collaborative formats where critical thinking and creativity can flourish.

Matthijs likens it to the introduction of calculators. “In some maths assignments, you are not allowed to use them,” he continues. “In others, it is expected. AI is the same. It is not the goal. It is a tool that enables us to do things that were previously not possible.”

Equity is not a given

Among AI’s most promising claims is that it can level the educational playing field. Yet the gap between promise and practice remains wide, and may be widening. The same technologies that offer unprecedented personalisation can also deepen existing inequalities if access is restricted or outcomes are skewed.

“AI is already being used to make course content more accessible,” Matthijs explains. “We see formats optimised for mobile devices, real-time translation, and alternative media that support different learning needs. However, we must also confront the fact that many AI tools are currently offered as premium add-ons. That risks putting them out of reach for students who cannot afford them.”

Hyper-personalisation, often cited as AI’s great gift to education, is also not without trade-offs. Tailoring every aspect of a student’s learning experience may undermine the social fabric of education. Being part of a cohort, learning from peers, and experiencing shared challenges are not side effects; they are essential components of intellectual and emotional growth. “The opportunity is there, but so are the risks,” Matthijs warns. “We must ensure we do not fragment learning to the point where it loses its communal value.”

Engagement needs to be emotional, not just technical

The most overlooked potential of AI in education is its capacity to transform engagement. Students increasingly expect digital experiences to be responsive, interactive, and meaningful. This does not mean simply bolting AI onto existing content. It means rethinking what engagement looks like in an intelligent system.

One example Matthijs highlights is the rise of AI-powered conversation activities. Students interact with avatars of historical figures, fictional characters, or professional personas to deepen their understanding through dialogue. “We have seen this used in nursing programmes for patient roleplay, or in history classes with avatars like Madame Curie or Abraham Lincoln,” he explains. “The realism and interactivity spark a different level of curiosity and learning.”

These experiences may be mediated by machines, but their success depends on a deeper emotional and cognitive connection. Matthijs views this as a signal that education is not merely surviving AI, but adapting in creative and thoughtful ways that enhance its core purpose.

Pacing innovation for long-term trust

Despite his optimism, Matthijs is cautious about the pace of change. Generative AI evolves at breakneck speed, and education, unlike consumer tech, is bound by ethical, institutional and regulatory constraints. Building trust takes time, especially in a sector as socially embedded as education.

Anthology’s approach has been to lead with frameworks rather than features. “We deliberately focused on a trustworthy AI framework before deploying tools,” Matthijs explains. “We wanted to align our deployments with clear principles, educate users, and evolve gradually rather than overwhelm the system.”

This strategy is less about technical capability and more about psychological readiness. Teachers, students, and administrators must join in the journey. “There is always the temptation to race ahead,” Matthijs admits. “But if we move too fast, we risk leaving people behind. Education cannot afford that.”

Smaller models and larger visions

Underneath the visible applications of AI in the classroom is a deeper infrastructure question: Who owns the models? Matthijs is wary of the concentration of power among a handful of vendors. Training large language models from scratch is prohibitively expensive, making it likely that only the most prominent players will control the core technologies.

But there is hope in a growing shift toward smaller, purpose-built models. “Smaller models are easier to run, cheaper, and less environmentally taxing,” Matthijs adds. “And they can be more targeted for specific tasks. That gives institutions more control and flexibility.” This movement could restore some balance to a landscape dominated by hyperscalers.

Education is often slow to change. But the last two years suggest it is now changing faster than many had imagined. The integration of generative AI is not a trend or a phase; it is a structural transformation of what it means to teach and to learn.

Matthijs remains hopeful because, so far, the trajectory has been broadly thoughtful and inclusive. “The discourse has been measured,” he concludes. “Educators are not blindly adopting AI. They are asking the right questions: about impact, appropriateness, and value. That gives me confidence that we can steer this change in the right direction.”

What emerges is not a vision of AI replacing the teacher or dominating the classroom. It is a vision of human and machine learning in parallel, each enhancing the other, each revealing new ways of knowing. For once, education may not be catching up with technology. It may be shaping it.
