Here's a paradox that should concern anyone who cares about education: the tools designed to catch students using AI are teaching them to write worse.
AI detection software has become ubiquitous in classrooms, but it's creating perverse incentives. Students are being flagged as cheaters not for plagiarism, but for writing too well. Use a word like "devoid"? That's suspicious. Write with confidence and clarity? You might be a bot.
The result is predictable and depressing. Honest students are learning to dumb down their prose to avoid false accusations. Others are turning to AI tools defensively — not to cheat, but to understand how the detectors work so they can game them.
The Cobra Effect in Action
This is a textbook example of what economists call the cobra effect: an incentive scheme that ends up rewarding the very behavior it was meant to eliminate. Detection tools were supposed to discourage AI use. Instead, they're radicalizing students who would never have considered cheating.
One student, falsely accused despite a history of excellent writing, subscribed to multiple AI services just to understand the algorithms. Another began using AI to rewrite their own work to make it look less sophisticated. The surveillance regime is creating exactly the behavior it aims to prevent.
What Actually Works
Some educators are taking a different approach. Writing instructor Dadland Maye abandoned detection-first policies and instead taught students to use AI responsibly. The result? Classroom dynamics shifted from adversarial to educational. Students engaged in genuine conversations about technology's role in learning.
The difference is philosophical. Detection tools treat writing as "a performance to be managed" rather than a skill to develop. They punish excellence while teaching students that the goal is to fool the algorithm, not improve their craft.
The Technology Is Flawed
Here's what bothers me as someone who understands how these tools work: they're fundamentally unreliable. Detectors don't read for meaning; they score surface statistics, flagging vocabulary and phrasing patterns that happen to resemble machine output. That's why a word like "devoid" looks suspicious, and why they can't distinguish between a student who reads widely and one who used ChatGPT.
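To see how crude this pattern-matching can be, here is a deliberately naive sketch of the kind of heuristic the article critiques: a word-list scorer that flags text containing "AI-associated" vocabulary. This is a toy illustration, not any real product's algorithm; the word list and the scoring rule are invented for demonstration.

```python
# Toy illustration of a naive "AI detector" heuristic.
# The suspect-word list is hypothetical, chosen for demonstration only.
SUSPECT_WORDS = {"devoid", "delve", "tapestry", "moreover", "furthermore"}

def naive_ai_score(text: str) -> float:
    """Return the fraction of words that appear on the suspect list."""
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in SUSPECT_WORDS)
    return hits / len(words)

# A perfectly human sentence trips the same tripwire as machine output:
student = "The essay was devoid of structure; moreover, its argument collapsed."
print(naive_ai_score(student))
```

A scorer like this has no access to intent or meaning: a well-read student's vocabulary and a chatbot's output are indistinguishable to it, which is exactly the failure mode described above.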