Paper Over Pixels: How Canadian Teachers Are Fighting Back Against AI in Schools
In a quiet Brookes Westshore High School classroom in Victoria, British Columbia, paper and pens are back in style for instructor Gary Ward, thanks to the rapidly growing trend of students using artificial intelligence tools like ChatGPT to complete their homework.
“It was just a few students experimenting with it last year,” Ward told Business Insider. “Now every single student is doing it.”
The influx of AI tools has disrupted education systems worldwide. For Canadian teachers like Ward, it has posed a strange and urgent question: how do you fairly test a student’s knowledge when a chatbot may have written their homework?
The AI Invasion in Classrooms
Students are using AI software like ChatGPT, Claude, and Gemini to answer homework questions, write essays, and even solve challenging math problems. As convenient and efficient as these tools are, they also pose significant problems for academic integrity.
At Brookes Westshore and many other schools across Canada, teachers have noticed a dramatic shift in how students complete their work. Homework with perfect grammar, advanced vocabulary, and an impersonal tone has raised red flags. In some cases, entire classes have submitted essays that looked suspiciously similar in structure and tone, a clear sign of AI involvement.
The real problem? Teachers are losing the ability to assess what students actually know.
Fighting AI With AI
Interestingly, some teachers are opting to fight fire with fire. Instead of banning AI tools outright, they’re using them to outsmart students.
One strategy is to run questions through an AI model to see how it responds, then design tests the AI will struggle with. These can be prompts that require personal reflection, contextual understanding, or subtle interpretation, areas where AI still falls short or regurgitates generic, surface-level answers.
In effect, educators are now asking AI, “What questions can’t you answer well?” and designing their tests accordingly.
It’s an ironic twist: AI is being used to make it harder for students to cheat with AI.
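For teachers comfortable with a little scripting, the same vetting step can even be automated. The sketch below is only an illustration of the idea, assuming the official OpenAI Python client with an API key set in the environment; the model name and sample questions are hypothetical stand-ins, not anything Ward or his colleagues are reported to use.

    # Illustrative sketch only: assumes the OpenAI Python client ("pip install openai")
    # and an OPENAI_API_KEY in the environment. The question list is hypothetical.
    from openai import OpenAI

    client = OpenAI()

    draft_questions = [
        "Summarize the causes of Canadian Confederation in 1867.",          # generic: easy for AI
        "Relate Tuesday's class debate on Confederation to your own view.", # personal/contextual: harder for AI
    ]

    for q in draft_questions:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # any chat model would do; this choice is an assumption
            messages=[{"role": "user", "content": q}],
        )
        answer = response.choices[0].message.content
        # Print the model's attempt so the teacher can judge whether the
        # question is answered too easily to be worth putting on a test.
        print(f"Q: {q}\nAI answer:\n{answer}\n" + "-" * 60)

Skimming the printed answers makes it easy to keep only the prompts the model fumbles, typically the ones that demand personal reflection or classroom context.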
Making Cheating More Work Than It’s Worth
Another emerging trend is what might be called “complexification.” Instead of banning AI, some instructors design tasks that are more time-consuming and complicated to complete with AI than without it.
This may mean asking students to outline their thinking, add personal examples, or bring in small—but specific—details not present in canned internet answers. For example, an essay question might ask for a student’s personal response to a specific class discussion or reference a school-specific event.
Yes, a student can still try to use AI, but now they’ll have to go through multiple rounds of editing and fact-checking to make the response seem credible. It’s no longer the quick and easy shortcut they were hoping for.
The Return of the Paper Test
Despite these high-tech solutions, some schools are opting for a more traditional method: bringing back in-person, handwritten tests.
By collecting phones, seating students apart, and having them hand-write answers, teachers simply take digital tools out of the equation. It may be a backwards-looking strategy, but for many educators it is the most effective way to regain control of the testing process.
But paper tests have their disadvantages. They take longer to mark, are harder to administer on a large scale, and don’t reflect the technological aids students will be able to use in the real world.
Most educators therefore see it as a stopgap, not a long-term fix.
The Bigger Picture: Teaching Students to Use AI Responsibly
Ultimately, the goal isn’t just to stop students from cheating, but to teach them how to use AI responsibly.
Teachers like Ward believe students need to be taught to think critically, ask good questions, and verify the information they’re given, even when it comes from an AI. That means developing skills like judgment, creativity, and originality, qualities no chatbot can replicate.
Rather than ban AI entirely, schools must try to integrate it in a way that enhances learning rather than circumvents it. For example, students might use AI to brainstorm ideas, check grammar, or organize arguments, but not to write entire assignments.
This shift requires more than new policies. It demands a cultural change in how we think about education and technology.
Final Thoughts
The advent of AI in the classroom is a double-edged sword. On one hand, it gives students tools that can help them learn faster and better. On the other, it tempts them to take shortcuts and shun the hard work that real learning requires.
Canadian teachers are rising to the challenge with creativity, flexibility, and, yes, sometimes a box of pens and a ream of paper. But their ultimate objective reaches higher: to ensure students not only use AI, but also understand its limitations and its potential.
In a world where anyone can type a question into a chatbot and get an instant answer, the most valuable skill may no longer be finding the answer, but knowing what to ask.