Credit: Pixabay/CC0 Public Domain
Since ChatGPT and other large language models burst into public consciousness, school boards are drafting policies, universities are hosting symposiums and tech companies are relentlessly promoting their latest AI-powered learning tools.
In the race to modernize education, artificial intelligence (AI) has become the new darling of policy innovation. While AI promises efficiency and personalization, it also introduces complexity, ethical dilemmas and new demands.
Teachers, who are at the heart of learning along with students, are watching this transformation with growing unease. For example, according to the Alberta Teachers’ Association, 80% to 90% of educators surveyed expressed concern about AI’s potential negative effects on education.
To understand comprehensive policy needs, we must first understand classrooms—and teachers’ current realities.
As a researcher with expertise in technology-enhanced teaching and learning at the intersections of assessment, leadership and policy, I interviewed teachers from across Canada, with Erik Sveinson, a Bachelor of Education student. We asked them about their experiences with generative AI (GenAI) in the classroom.
Their stories help contextualize the reality of AI in a K–12 context, and offer insights around harnessing AI’s potential without harming education as a human-centered endeavor.
AI policy and teaching wisdom
This qualitative study involved 10 teachers (grades 5 to 12) from Alberta, Saskatchewan, Ontario and British Columbia.
We recruited participants through professional learning networks, teacher associations and district contacts, seeking to ensure a variety of perspectives from varied grade levels, subjects and geographic locations.
We thematically coded interview data, and then cross-referenced this with insights from a review of existing research about GenAI use in K-12 classrooms. We highlighted convergences or tensions between theories about assessment, teaching approaches in technology-enhanced environments, student learning and educator practices.
Across interviews, teachers described a widening gap between policy expectations and the emotional realities of classroom practice.
What we heard
The following themes emerged from our interviews:
- The assessment crisis: Longstanding tools of assessment, such as the essay or the take-home project, have suddenly become vulnerable. Teachers are spending countless hours questioning the authenticity of student work. All teachers interviewed said they struggled with their current assessment practices and with how students may be using GenAI in their work. Maintaining confidence in the reliability of assessments has become difficult, and most teachers said they felt they had to consider the possibility of cheating more than ever, given advancing GenAI technology.
- Equity dilemmas: Teachers are on the front lines, seeing firsthand which students have unlimited access to the latest AI tools at home and which do not.
- Opportunities and challenges: Teachers perceive both opportunities and challenges with AI. Great teaching is about fostering critical thinking and human connection. Ninety percent of teachers interviewed faced complex challenges relating to equity and how best to support critical thinking in the classroom while building foundational knowledge. In particular, middle and high school teachers in core subject areas indicated students were using GenAI tools in their own time outside class, without ethical guidance.
‘One more thing piled on’
One teacher from central Alberta said, “AI is definitely helpful for my workflow, but right now it feels like one more thing piled onto an already impossible workload. The policy says, ‘embrace innovation,’ but where’s the guidance and support?”
Classrooms are dynamic ecosystems shaped by emotion, relationships and unpredictability. Teachers manage trauma, neurodiversity, language barriers and social inequities while delivering curriculum and meeting student achievement expectations.
Teachers say there’s little recognition of the cognitive load they already carry, or the time it takes to vet, adapt and ethically deploy AI tools. They say AI policies often treat educators as passive implementers of tech, rather than active agents of learning.
A high school teacher from eastern Canada shared, “AI doesn’t understand the emotional labor of teaching. It can’t see the trauma behind a student’s meltdown. As much as I appreciate professional learning, when it is all about what tools to use, it misses the mark.”
This perspective highlights a broader finding: Teachers are not resisting AI per se; they are resisting implementation that disregards their emotional expertise and contextual judgment. They want professional learning initiatives that honor the human and relational dimensions of their work.
Burnout, professional erosion
This disconnect is not just theoretical; it's emotional. Teachers are reporting burnout, anxiety and a sense of professional erosion. A 2024 study found that 76.9% of Canadian educators felt emotionally exhausted, and nearly half had considered leaving the profession. The introduction of AI, without proper training or support, is compounding that stress.
The Alberta Teachers' Association also reports a growing fear that, if AI is not implemented properly and with support for teachers new to the profession, it could deskill the profession.
A teacher in Vancouver shared, “I am a veteran teacher and understand the fundamentals of teaching. For beginner teachers, when algorithms write report cards or generate lesson plans, what happens to teacher autonomy and the art of teaching?”
Turning teaching into a checklist?
Overall, the interview responses suggest what’s missing from AI policy is a fundamental understanding of teaching as a human-centered profession. As policymakers rush to integrate AI into digitized classrooms, they’re missing a critical truth: Technology cannot fix what it may not understand.
Without clear guardrails and professional learning grounded in teacher and student-informed needs, AI risks becoming a tool of surveillance and standardization, rather than empowerment.
This tension between innovation and de-professionalization emerged across many teacher responses. Educators expressed optimism about AI’s potential to reduce workload, but also deep unease about how it could erode their professional judgment and relational roles with students.
A northern Ontario teacher said, “There is hope with new technology, but I worry that AI will turn teaching into a checklist. We’re not technicians, we’re mentors, guides and sometimes lifelines.”
Teachers fear that without educator-led frameworks, AI could shift schooling from a human-centered practice to a compliance-driven one.
Responsible AI policy
If we want to harness AI’s potential without harming education as a human-centered endeavor with students and teachers at the core, we must rethink approaches to AI innovation in education. That starts with listening to teachers.
First, teachers must be involved in the design, testing and evaluation of AI tools. Second, policies must prioritize ethics, transparency and equity. That includes regulating how student data is used, ensuring teachers can identify algorithmic bias and its ethical implications, and protecting teacher discretion.
Third, we need to slow down. The pace of AI innovation is dizzying, but education isn’t a startup. It’s a public good. Policies must be evidence-based and grounded in the lived experiences of those who teach.
This article is republished from The Conversation under a Creative Commons license. Read the original article.