AI in the University Classroom
Today’s university students have to ask themselves a stark question: do they merely want to get a degree, or do they also want to get an education?

By Peter Coffman
I’ve been teaching long enough to remember when AI wasn’t even a thing. Which is to say, I’ve been teaching for more than three years.
It sure is a thing now. For many – maybe most – of my first-year students, work on every assignment starts by consulting AI. It doesn’t matter whether I think that’s a good thing or not. It’s everywhere. AI is the giant elephant in every single classroom.
So, last fall I decided to talk about AI with each of the tutorial groups in my first-year survey course – over 200 students in all. I asked them whether they used it, how they used it, what they thought it was good for and whether they thought its usefulness had limits. Here’s some of what I learned.
The Good
Some students do use AI constructively. A couple told me that they copy their lecture notes into it, and ask it to quiz them on the information so that they can make sure they understand the content. This is a great way to study.
Most students also have a healthy wariness of AI’s limitations. “If it doesn’t know the answer, it just makes stuff up”, they told me. This is a new variation on a very old adage: “don’t believe everything you read” – especially if it’s not written by a sentient being. Critical thinking is part of what we teach, so I appreciate the discriminating lens through which some students view AI.
Many students use AI to clean up their grammar. Okay, I’d rather they learned how to write properly themselves, but that’s still a fairly benign (and certainly helpful) use of AI.
The Bad
Other uses of AI are less benign. Some students told me that if they find an assigned reading difficult, they put it into AI and ask for a summary. This can go downhill fast. Will the summary be accurate? Will relying on AI become a habit, a crutch, used as a substitute for having to read and think about a text? If you could choose between spending a couple of hours reading an article, or spending a couple of minutes reading an AI-generated summary of it, which would you choose?
Of course, technology saves labour; that’s been one of its most valuable contributions to human life, right? Well, yes and no. The point of getting students to read something is not simply so that they’ll know what it says. The point is to get them to work through the question it addresses in real time; to see how authors gather evidence and how they shape that evidence into arguments. And to get students to critique those arguments, rather than simply accepting their conclusions.
This is how students become clear and critical thinkers, and there are no shortcuts. You don’t become physically strong by going to the gym and watching others work out. And you don’t become intellectually strong by having a machine do your reading and interpreting for you.
The Ugly
There’s no sugarcoating it: some students get AI to do their work for them. I don’t think it’s a majority, but it’s more than just a few. AI can do their reading for them. AI can do their analysis for them. AI can do their writing for them. AI can generate discussion posts and complete online quizzes. Thanks to AI, most measures of student achievement completed outside the classroom or exam room are now unreliable.
Assuming we expect students to complete their own assignments (and why bother assigning them if we don’t?), this is cheating. It’s not completely undetectable, but it’s all but impossible to prosecute, because even when I’m sure it’s happening I can’t provide proof that would ‘stand up in court’, so to speak. So, perpetrators go unpunished.
This is obviously unfair to students who have made a conscious decision not to cheat, and who find themselves competing for grades with high-powered software. And unfortunately, those grades matter, because a lot of privileges (scholarships, awards, acceptance into grad school, other opportunities) are at least partly tethered to them.
What can we do about this cheating? Do we ditch more creative take-home assignments in favour of more in-class testing and heavily weighted final exams? This might address the problem, but many would argue that it’s a giant step backwards pedagogically. Do we emphasize face-to-face assessments, where we meet one-on-one with students to discuss their work and evaluate their understanding? Well, I’ve taught almost 400 students this year and class sizes just keep getting bigger, so that’s not going to happen. I don’t know what the answer is, but it’s probably the biggest challenge facing the present and next generation of teachers in the humanities (and likely elsewhere too, but I won’t presume to speak for them).
A New Problem, or an Old One?
AI is a very new thing, but it exacerbates what’s actually an old problem. For years, I’ve tried to push back against an increasingly transactional, even adversarial view of education. Here’s what it looks like. I have control over something that students want: grades (grades that lead to credits, which lead to degrees). I charge a price for those grades: hard work. Students, like many savvy consumers, want to get the best grades possible at the lowest price (i.e. the least work).
This disheartening model of education is fundamentally wrong because it views education as a commodity rather than a process. Still, it was possible to live with this problem as long as I controlled the ‘price’: if I made students work hard for good marks, they would get both the grades and the education they deserved.
But controlling the price only works if the ‘consumer’ actually has to pay it. AI massively tilts the playing field by flooding the ‘marketplace’ with counterfeit currency. Suddenly, students can ‘buy’ decent grades – and ultimately a degree – with fake, worthless ‘money’ provided by ChatGPT and other platforms.
To be clear, most students don’t have such a cynical view of their education. They come to university to learn, not just to collect credits. But that is the stark choice that students now have to make. Are they here to learn? Or are they just here to purchase a piece of paper called a ‘degree’? Because thanks to AI, both options are now wide open to them.
What’s the End Game?
I don’t pretend to know where AI is going to take university education, or even how to respond to it right now. I am bewildered. On good days, I still feel the excitement of connecting with students, and of our making each other’s world a bigger place. On bad days, I feel like a guy trying to give slow-cooking lessons to people who live in a cafeteria full of free junk food.
But I do know this for a fact: down the road – in work, career, and life – a genuine education is what will serve students well. That’s what will give them the vision, discipline, confidence, and intellectual resources to navigate challenges. A degree bereft of an education will be useless. Right now, it is 100% up to each student which path they take.
Peter Coffman, History & Theory of Architecture program
peter.coffman@carleton.ca
@petercoffman.bsky.social