What Teachers Should Do about AI

1. Learn about generative AI dangers related to bias, privacy, and the impact of AI on vulnerable students. Consider how chatting with AI systems affects students with depression, anxiety, and other mental health challenges. (handout: Dangers of AI)

2. Experiment with AI to see if it can enhance your teaching methods and plans. Consider how AI could be ethically used in education and where you draw the line in how you use it to do your work.

3. Talk with students about your expectations regarding the use of generative AI in class. College faculty should include a syllabus statement offering clear guidance on those expectations and have open, frank discussions with students about them. (handout: AI use cases)

4. Explain to students what counts as AI-enabled plagiarism and when AI use is appropriate, especially since the technology is being integrated into many commonly used tools (making a blanket ban nearly impossible). Note that the answer will change depending on the assignment, the subject, and the learning outcomes.

5. Avoid depending on AI detectors, given their limitations (false positives and legal issues). Rather than focusing on catching cheaters, faculty should focus on developing new pedagogy for the evolving technology (much as they did with the rise of the internet).

6. Get students to wrestle with generative AI alongside you.

7. Help students learn to fact check AI-generated writing outputs. They need a healthy skepticism.

8. Talk about AI transparency, providing examples.  

9. Develop pedagogical options for controlling the use of AI: pen and paper, Blue Books, oral exams, in-class presentations, writing tools (such as Google Docs) that track revision history, personalization, concept mapping, scaffolded assignments, etc. Decide which cognitive tasks students need to perform without AI assistance.

10. Develop new rubrics and assignment descriptions that take generative AI into account. Some assignments should be AI-free by design. Others should actively engage AI, teaching students to evaluate, direct, and improve its outputs.

11. Learn AI and double down on what makes you human. It's never all one-sided; avoid the extreme positions of all-in or all-out, and go down both roads. Learn to use AI skeptically, understanding both what it can do and its limits, knowing this is an ongoing chore. At the same time, focus resources on the other side of the equation: helping students set themselves apart, moving from simply being good at using AI to developing the skills that AI's limitations will make rare and valuable, including communication, creativity, and flexibility. Help students develop a healthy, ethical use of generative AI as you do so yourself.

12. Prepare students for their careers. They will enter a world where AI use is expected, and where the expectation is that AI will allow employees to do more work faster.

22 Articles about AI & Teaching

Adapting to a New World: Teachers on How A.I. Is Reshaping the Classroom - New York Times

In some classrooms, teachers ask: Can AI teach students to write better? – Washington Post

These Tools Say They Can Spot A.I. Fakes. Do They Really Work? – New York Times 

College students, professors are making their own AI rules. They don't always agree – KPBS  

Teens Are Using AI-Fueled ‘Slander Pages’ to Mock Their Teachers - Wired

China’s Parents Are Outsourcing the Homework Grind to A.I. - The New York Times 

What AI Is Teaching Us About Humanities Education – The Dispatch  

Will Agentic AI Break Higher Education? – Chronicle of Higher Ed

‘A.I. Literacy’ Is Trending in Schools. Here’s Why. - The New York Times

Agentic AI Can Complete Whole Courses for Students. Now What? – Inside Higher Ed

The Lesson of A.I. Literacy Class: Don’t Let the Chatbot Think for You - The New York Times

I’m Not Worried AI Helps My Students Cheat. I’m Worried How It Makes Them Feel - EdWeek

My students compared my writing against ChatGPT – and they all preferred the AI – The Independent

AI Detection Pushed My Students to Use AI – Chronicle of Higher Ed

The risks of AI in schools outweigh the benefits, report says - NPR

The Solution to AI Cheating Is Within Our Grasp - Chronicle of Higher Ed

As Schools Embrace A.I. Tools, Skeptics Raise Concerns - The New York Times

More Teachers Are Using AI in Their Classrooms. Here’s Why - EdWeek

To Solve the Student-Attention Problem, Professors Turn to Pencils and Paper - Chronicle of Higher Ed

A veteran teacher explains how to use AI in the classroom the right way – Scientific American

California schools debate how much AI belongs in classrooms – EdSource

This MIT prof says we don't know enough about AI to teach it - KJZZ

AI Glasses During Exams

“What is to stop someone from sitting in the back of a classroom and whispering into their glasses to say, ‘Hey, I need help with solving this problem,’” said Luke Hobson, an assistant director of instructional design at MIT. “Every time I see someone saying, ‘Blue books are the future,’ I’m like, ‘So are we going to ban students from wearing glasses?’” -Inside Higher Ed

26 Recent Articles about AI & Teaching

You Can’t AI-Proof the Classroom, Experts Say. Get Creative Instead. – Inside Higher Ed

Teachers are using software to see if students used AI. What happens when it's wrong? – NPR

Professors are turning to this old-school method to stop AI use on exams – Washington Post 

I’m a Professor. A.I. Has Changed My Classroom, but Not for the Worse – New York Times

OpenAI Is Giving Teachers Their Own ChatGPT, Free Through 2027 - Newsweek 

How AI Is Changing Higher Education – Chronicle of Higher Ed  

AI-generated lesson plans fall short on inspiring students and promoting critical thinking – The Conversation

Universities are embracing AI: will students get smarter or stop thinking? – Nature

Is AI dulling our minds? Experts weigh in on whether tech poses threat to critical thinking, pointing to cautionary tales in use of other cognitive labor tools – The Harvard Gazette

Are we teaching students AI competence or dependence? - London School of Economics  

AI Has Joined the Faculty - Chronicle of Higher Ed 

To adopt or to ban? Student perceptions and use of generative AI in higher education – Nature

What are the clues that ChatGPT wrote something? - Washington Post

Stop Pretending You Know How to Teach AI - Chronicle of Higher Ed

Their Professors Caught Them Cheating. They Used A.I. to Apologize. - New York Times

Teaching Students to Think Critically About AI – Harvard Graduate School of Education

AI-powered textbooks fail to make the grade in South Korea – Rest of World  

More college students are using AI for class. Their professors aren't far behind – NPR

From Yale to MIT to UCLA: The AI policies of the nation's biggest colleges – Mashable

A researcher’s view on using AI to become a better writer – Hechinger Report  

I Want My Students’ Effort, Not AI’s Shortcut to Perfect Writing – Edsurge

AI-resistant strategies - Chronicle of Higher Ed 

What’s working, what’s not on front lines of AI in classroom - The Harvard Gazette

AI Tutors Are Now Common in Early Reading Instruction. Do They Actually Work? – EdWeek

Bridging pedagogy and technology: a generative AI and IoT approach to transformative English language education – Nature

Teaching: How to respond when students don’t want to work with AI - Chronicle of Higher Ed 

What a computer science degree should look like now

Experts suggest that computer science degree requirements should move away from coding and align with the expectations of a liberal arts degree: critical thinking and communication skills, along with computational thinking and AI literacy. The new CS coursework would include basic principles of computing and AI, along with hands-on experience designing software with new AI tools. AI tools can help build prototype programs, check for coding errors, and serve as a digital tutor.

Computational thinking involves breaking down problems into smaller tasks, developing step-by-step solutions, and using data to reach evidence-based conclusions. AI literacy is an understanding — at varying depths for students at different levels — of how AI works, how to use it responsibly, and how it is affecting society. Nurturing informed skepticism should be a goal.

Read more at the NYT: How Do You Teach Computer Science in the AI Era?

AI Learning Options

“In facilitating learning, AI gets us more quickly to the important work (examples might include providing suggestions for how to start researching a topic or possible ways to phrase something). In replacing learning, AI does the important work for us (such as answering exam questions). To these I would add a third category: supplementing learning, the murky middle where AI is used alongside or incorporated into one’s own work (such as providing supporting data or creating an essay outline). Naming this usually unrecognized middle ground is important, because whether AI is helping or harming in these cases will often depend on the context and goals. Managing appropriate forms of AI use will likely be one of our society’s major challenges going forward, in education and elsewhere.” -Chronicle of Higher Ed

How Students are Using AI: Here's what the Data Tell Us

  • AI use by students is increasing.

  • The higher the education level, the more likely that students will use AI. 

  • Business, STEM, and social-science majors are more likely to use AI and are less likely to have concerns about using it than humanities majors. 

  • Top uses by students: finding information or getting explanations (50-70 percent of respondents in the studies cited); generating ideas or brainstorming (40-50 percent); and writing support, including checking grammar, editing, starting a paper, and drafting an essay (30-50 percent).

  • 86 percent of students who use ChatGPT for assignments say their use was undetected.

  • A plurality of students think AI will have both positive and negative consequences.

  • A study of high-school students conducted before and after AI became mainstream found no increase in the percentage of students who cheat.

  • 15-25 percent of students across several studies feel AI should not be allowed at all in education or refuse to use it themselves.

  • In a survey asking students why they use AI, the strongest agreement was with the statement that AI “will not judge me,” followed by its anonymity.

  • Four out of five students think their institutions have not integrated AI sufficiently.

  • 55 percent of students think overreliance on AI in teaching decreases the value received from a course.

  • 89 percent are worried about AI grading.

  • Students think AI is important, in other words, but not that it should replace professors.

Read more in The Chronicle of Higher Ed

What the Humanities are For

As one student said to his professor at New York University, in an effort to justify using AI to do his work for him, “You’re asking me to go from point A to point B, why wouldn’t I use a car to get there?” It’s a completely logical argument — as long as you accept the utilitarian vision. The real solution, then, is to be honest about what the humanities are for: You’re in the business of helping students with the cultivation of their character. -Sigal Samuel writing in Vox

The Student Herself is the Product

In a humanities education the student herself is the product. She is what’s getting created and recreated by the learning process. This vision of education — as a pursuit that’s supposed to be personally transformative — is what Aristotle proposed back in Ancient Greece. He believed the real goal was not to impart knowledge, but to cultivate the virtues: honesty, justice, courage, and all the other character traits that make for a flourishing life. -Sigal Samuel writing in Vox

19 Recent Articles about the Impact of AI on Students

California colleges spend millions to catch plagiarism and AI. Is the faulty tech worth it? – Cal Matters

My students think it’s fine to cheat with AI. Maybe they’re onto something. – Vox  

Panel with AI experts to review appeal of NTU student penalised for academic misconduct - The Straits Times 

How AI Is Helping Students Find the Right College – Wired

Chinese AI firms block features amid high-stakes university entrance exams – Washington Post

6 College Majors That Will Thrive In An AI-Driven Economy – Forbes

For Some Recent Graduates, the A.I. Job Apocalypse May Already Be Here – New York Times

AI cheating surge pushes schools into chaos – Axios

Here are some guiding ideas to keep in mind as you navigate college in the era of artificial intelligence – Student Guide to AU

A New Headache for Honest Students: Proving They Didn’t Use A.I. – New York Times

What My Students Had To Say About AI – The Broken Copier

Using ChatGPT, students might pass a course, but with a cost – PhysOrg

How Are Students Using AI? – AI and How We Teach

Students Are Humanizing Their Writing—By Putting It Through AI – Wall Street Journal

Why misuse of generative AI is worse than plagiarism – Springer

Students, early career workers use ChatGPT as a mentor - Axios

How Students Use and Think About Their Use of AI – Daily Nous

How AI Helps Our Students Deepen Their Writing (Yes, Really) – EdWeek

As if graduating weren’t daunting enough, now students like me face a jobs market devastated by AI – The Guardian

We miss the thoughts of our students

For a lot of us, our motivation to enter academe was primarily about helping to form students as people. We’re not simply frustrated by trying to police AI use, the labor of having to write up students for academic dishonesty, or the way that reading student work has become a rather nihilistic task. Our frustration is not merely that we don’t care about what AI has to say and therefore get bored grading; it is that we actively miss reading the thoughts of our human students. -Megan Fritts writing in the Chronicle of Higher Ed

"Current AI Detectors are Not Ready"

"A new study of a dozen A.I.-detection services by researchers at the University of Maryland found that they had erroneously flagged human-written text as A.I.-generated about 6.8 percent of the time, on average. 'At least from our analysis, current detectors are not ready to be used in practice in schools to detect A.I. plagiarism,' said Soheil Feizi, an author of the paper and an associate professor of computer science at Maryland." -New York Times


"Madness" on Campus

On campus, we’re in a bizarre interlude: everyone seems intent on pretending that the most significant revolution in the world of thought in the past century isn’t happening. The approach appears to be: “We’ll just tell the kids they can’t use these tools and carry on as before.” This is, simply, madness. And it won’t hold for long. -D. Graham Burnett writing in The New Yorker

Fake AI Students

“By the end of the first two weeks of the semester, Smith had whittled down the 104 students enrolled in her classes, including those on the waitlist, to just 15. The rest, she’d concluded, were fake students, often referred to as bots. ‘It’s a surreal experience and it’s just heartbreaking,’ Smith said. ‘I’m not teaching, I’m playing a cop now.’” - Voice of San Diego

What’s our job?

Last year, I sat in a faculty meeting while a guest lecturer gleefully explained how they had used AI to design their class, craft PowerPoint presentations, and develop exams. At the end of the presentation, a colleague leaned over and asked, “Then what’s our job?” I have thought long and hard about that question. If faculty hope to survive, much less prosper, in the age of AI, they need to come up with a compelling answer to that question: “What’s our job?” -Scott Latham writing in the Chronicle of Higher Ed

AI Attending Class

Two students in Austria created a program that attends classes and is treated like any other student. It attends lectures, turns in artwork for assignments, collaborates with classmates, and will receive grades on submitted work. ‘Flynn’ is testing the boundaries of artificial intelligence tools and could, in theory, progress toward a diploma. - Washington Post