Instead of banning it, let’s teach students to use it well
Earlier this month, I was at the airport on my way to a meeting on artificial intelligence in higher ed when I noticed an ad for another AI conference. Headlines about AI dominate my phone notifications. I flip on the TV, and there’s more news coverage about what the technology means for our future, our jobs, our lives.
Outside politics, almost every current conversation is about what AI will change, who or what it’ll replace, and what it’ll mean for people and for work. AI discourse is everywhere, all at once.
Yet amid all that AI noise, higher ed is still treating it as an open question: Should students use it?
That’s the wrong question. Because students have already answered it.
A new Lumina Foundation and Gallup report shows that more than half of college students are already using AI in their coursework on a daily or weekly basis. Only a small share says they never use it. This isn’t something we can control by allowing or banning it.
At the same time, more than half of students say their college discourages or even prohibits the use of AI. And more than half say at least some of their courses don’t have clear rules about what’s allowed.
We’ve created a system where students use AI, but they’re not always sure how to use it. And that’s where the real risk is.
Because students aren’t using AI to avoid learning. They’re using it to make sense of complex material, check their thinking, and work through problems.
At the same time, other students avoid it because they’ve been told it’s unethical or cheating. As a result, we have students experimenting without guidance and students holding back without clarity. And we’re not really helping either group; in fact, we are probably hurting them.
Even when institutions try to draw a hard line, it doesn’t hold. Students still use AI. Even at schools that prohibit it, about one in four students say they use it regularly. At schools that discourage it, nearly half still use it weekly.
Students will use AI. We need to ensure they understand its implications and prepare them for a world where it’s expected.
Because outside the classroom, AI isn’t optional. It’s already changing how work gets done. It’s replacing some tasks, reshaping others, and becoming part of what employers expect people to know how to navigate. Students see that clearly. Nearly half say they’ve considered changing their major because of AI’s impact on the job market.
Consider the signal we’re sending: In the real world, AI is a basic tool. In the classroom, we’re still debating whether to allow it.
If we treat AI as inherently bad or something students should avoid, we’re not protecting them; we’re putting them at a disadvantage.
At the same time, simply allowing AI without guidance isn’t the answer either. Right now, nearly three in ten students say they’re not getting enough training on how to use it. And students at schools that discourage or prohibit AI are more likely to say they feel unprepared. So students are left to figure it out on their own, in a space where the expectations aren’t clear but the stakes are getting higher.
Here’s where I think we need to shift the conversation: Let’s focus on teaching students how to evaluate and use AI thoughtfully. This includes when to use it and when not to, where it adds value and where it creates risk, and who it works for and who it might leave out. We need to help them see how it can reinforce bias, spread misinformation, or create a false sense of understanding. Because AI is not just a tool. It shapes decisions. It shapes information. It shapes opportunity. And without that understanding, students aren’t really in control of how they’re using it; they’re reacting to it.
Higher education has an opportunity here. Not to try to stay ahead of every new tool, but to lead on something more important. To help students develop judgment. To use AI in ways that are ethical, informed, and productive. To understand both its power and its limits.
Because students are already using AI, we need to prepare them to use it well, for the world they’re actually walking into, not the one we wish we could hold onto a little longer.