22 quotes about cheating with AI & plagiarism detection

Students should know that this technology is rapidly evolving: future detectors may be able to retroactively identify auto-generated prose from the past. No one should present auto-generated writing as their own on the expectation that this deception is undiscoverable. Inside Higher Ed

Alex Lawrence, professor at Weber State University, described ChatGPT as “the greatest cheating tool ever invented.” Wall Street Journal

Some plagiarism detection and learning management systems have adapted surveillance techniques, but that leaves systems designed to ensure original work “locked in an arms race” with systems designed to cheat. Inside Higher Ed

Popular essay submission portal Turnitin is developing its own detector, while Hive claims that its service is more accurate than others on the market, including OpenAI’s own; some independent testers have agreed. Tech Radar

While faculty members will likely spend some time trying to identify a boundary line between AI assistance and AI cheating with respect to student writing, that may not be the best use of their time. That path leads to trying to micromanage students’ use of these models. Inside Higher Ed

You can have tools like Quillbot paraphrase the essays ChatGPT gives you so it doesn't look too obvious. Mashable

“If I’m a very intelligent AI and I want to bypass your detection, I could insert typos into my writing on purpose,” said Diyi Yang, assistant professor of computer science at Stanford University. Inside Higher Ed

But what about the cheaters, the students who let a chatbot do their writing for them? I say, who cares? In my normal class of about 28 students, I encounter one every few semesters whom I suspect of plagiarism. Let’s now say that the temptation to use chatbots for nefarious ends increases the number of cheaters to an (unrealistic) 20 percent. It makes no sense to me that I should deprive 22 students who can richly benefit from having to write papers only to prevent the other six from cheating (some of whom might have cheated even without the help of a chatbot). Washington Post 

If a teacher’s concern is that students will “cheat” with ChatGPT, the answer is to give assignments that are personal and focused on thinking. We don’t have to teach students to follow a writing algorithm any more; there’s an app for that. Forbes

What’s to stop a student from getting ChatGPT to write their work, then tweaking it slightly until it no longer gets flagged by a classifier? This does take some effort, but a student may still find this preferable to writing an entire assignment themselves. Tech Radar

If the concern is that students could cheat, it’s worth remembering that they could cheat six months ago and 60 years ago. Students taking a brand-new exam could already get answers to test questions in minutes from services like Chegg. Students could already plagiarize — or pay someone else to write their entire paper. With the entrance of ChatGPT, “what’s changed is the ease and the scope.” Chronicle of Higher Ed

If ChatGPT makes it easy to cheat on an assignment, teachers should throw out the assignment rather than ban the chatbot. MIT Tech Review

Professors can create conditions in which cheating is difficult, giving closed-book, closed-note, closed-internet exams in a controlled environment. They can create assignments in which cheating is difficult, by asking students to draw on what was said in class and to reflect on their own learning. They can make cheating less relevant, by letting students collaborate and use any resource at their disposal. Or they can diminish the forces that make cheating appealing: They can reduce pressure by having more-frequent, lower-stakes assessments. Chronicle of Higher Ed

Unlike accusations of plagiarism, AI cheating has no source document to reference as proof. “This leaves the door open for teacher bias to creep in.” Washington Post

Despite their positive attitude towards AI, many students (in a survey) say they feel anxious and lack clear guidance on how to use AI in the learning environments they are in. It is simply difficult to know where the boundary for cheating lies. Neuroscience News

While the AI-detection feature could be helpful in the immediate term, it could also lead to a surge in academic-misconduct cases, Eaton said. Colleges will have to figure out what to do with those reports at a moment when professors have yet to find consensus on how ChatGPT should be dealt with in their classrooms. Chronicle of Higher Ed

“Do you want to go to war with your students over AI tools?” said Ian Linkletter, who serves as emerging technology and open-education librarian at the British Columbia Institute of Technology. “Or do you want to give them clear guidance on what is and isn’t okay, and teach them how to use the tools in an ethical manner?” Washington Post

Even if detection software gets better at detecting AI-generated text, it still causes mental and emotional strain when a student is wrongly accused. “False positives carry real harm,” he said. “At the scale of a course, or at the scale of the university, even a one or 2% rate of false positives will negatively impact dozens or hundreds of innocent students.” Washington Post

On many campuses, high-course-load contingent faculty and graduate students bear much of the responsibility for the kinds of large-enrollment, introductory-level, general-education courses where cheating is rampant. How can large or even mid-sized colleges withstand the flood of nonsense quasi-plagiarism when academic-integrity first responders are so overburdened and undercompensated? Chronicle of Higher Ed

Bruce Schneier, a public interest technologist and lecturer at Harvard University’s Kennedy School of Government, said any attempts to crack down on the use of AI chatbots in classrooms are misguided, and history proves that educators must adapt to technology. Washington Post

Harsh punishments for cheating might preserve the status quo, but colleges generally give cheaters a slap on the wrist, and that won’t change. Unmonitored academic work will become optional, or a farce. The only thing that will really matter will be exams. And unless the exams are in-person, they’ll be a farce, too. Chronicle of Higher Ed

“I think we should just get used to the fact that we won’t be able to reliably tell if a document is either written by AI — or partially written by AI, or edited by AI — or by humans,” computer science professor Soheil Feizi said. “We should adapt our education system to not police the use of the AI models, but basically embrace it to help students to use it and learn from it.” Washington Post

Also:

21 quotes about cheating with AI & plagiarism detection

13 quotes worth reading about Generative AI policies & bans

20 quotes worth reading about students using AI

27 quotes about AI & writing assignments

22 examples of teaching with AI

27 thoughts on teaching with AI

13 thoughts on the problems of teaching with AI