Addressing AI tools such as ChatGPT
Not the end of the world, but needs attention nonetheless
If you are concerned that students in your classes may be tempted to use an AI to circumvent the learning process, there are steps you can take to refocus their attention on doing the work themselves.
Be clear and specific about your expectations
Companies are actively promoting generative AI tools as labor-saving devices. The default faculty senate policy is to ban all AI use in classes, while allowing individual instructors to set their own policies. With so many mixed messages, it is important to be very clear with your students about your expectations. Standard academic honesty statements about plagiarism and paper-buying should cover the most obvious misuses of AI, but adding specific language will help make your position clear (see below for example language).
Generative AI can be used for more than just content creation. GenAI tools can also help with other cognitive processes such as summarizing readings, organizing ideas, planning projects, and getting feedback on drafts. If your course objectives are focused on developing these skills, make it clear to students that using AI for these activities is not allowed because it will interfere with that development. If AI use for cognitive support will not interfere with your course objectives, addressing this in your AI policy will let students know that this kind of use is allowed, which can be especially helpful for first-generation, international, and neurodivergent students.
Crafting an AI Syllabus Statement
The first place to set expectations about AI is in your syllabus. Without a clear statement, students are left to interpret and make assumptions about what is and is not allowed (especially when some companies actively push AI use on students by implying certain uses are OK 🙄).
Resources and Templates to help craft an AI policy:
- Should You Add an AI Policy to Your Syllabus? by Kevin Gannon. (spoiler: yes.)
- Draft / Refine an AI policy for your course - Center for Teaching, UMass Amherst.
- Sample Syllabus AI Statements - Digital Learning, College of Education (based on CTL advice, this Google Doc offers a choice of statements based on the level of AI use allowed).
Enforcing GenAI Policies
As we get more familiar with work produced by AI (generic, bland, slightly off topic, and sometimes riddled with hallucinations), it is becoming easier to spot AI-generated work submitted as a student's own. However, there are two important issues to keep in mind:
- Automated "AI checkers" do not have a good track record of accurately identifying AI-generated content, and should only be used as a tool to flag work for further checking by a human.
- Some language that gets flagged as AI-generated (by both humans and automated tools) is often the work of a human writer: someone writing in a language they learned later in life, or someone whose cognitive divergence affects their style of writing.
If you have the capacity (or a small class), it is best to assume good intent and approach students you suspect of using AI with questions before making accusations of unethical behavior.
Design assignments that are harder to complete with AI
When an assignment is crafted like an AI prompt ("Please summarize the main points of..."), it can be hard to resist using AI as a shortcut. If you are concerned about AI use and have the time to make adjustments, there are assignment designs that are harder to complete using AI.
Starter suggestions:
- Personal reflections and application of content to individual experience.
- Low stakes writing assignments (or survey responses) early in the class to get a sense of writing style and ability before assigning longer, higher-stakes assignments.
- Split larger assignments into stages that are turned in along the way for review and revision.
- Formats that require in-person work or are not (yet) as easy to produce with AI, such as handwritten responses on paper, visualizations, in-person presentations, or recordings of live demonstrations. (Caution: many of these approaches will require accommodations for students who typically use technologies to overcome physical or cognitive limitations. Be prepared to offer alternatives.)
Many of these are adapted from strategies developed by Peter Elbow and outlined in several of his teaching handouts and articles on Low Stakes Writing Assignments.
Design assignments that critique and reflect on AI use
Use of GenAI is becoming commonplace in the workplace. Before students are asked to use AI for a job, we can help them better understand what AI is capable of doing and the risks of over-relying on large language models that spout the "most likely answer" with a confidence that leaves no room for bias, uncertainty, or edge cases that exist no matter what the probabilities say. If you have the bandwidth, making AI use an active part of your course can help students in the long run.
Starter suggestions:
- Explain the probabilistic nature of genAI tools and discuss the issues that may arise from this approach (not to mention the environmental impact and the ethical issues around how many of these tools are trained).
- Run an assignment prompt through an AI and share the output with students to critique before they write their own responses to the assignment.
- Ask students to write their own response to a prompt, then pose the same prompt to an AI, and write up a comparison of the two works.
- Assign prompts that are known to be problematic (my favorite is "explain Bloom's Taxonomy") and ask students to critique the responses.
- Prompt students to interrogate the AI after getting a response and report on the exchange: "Why did you give this answer?", "How certain are you that this is correct?", "What did you leave out of this response?"
- Have students examine large language model output for patterns that reveal how it is shaped by probability, not by an actual intelligent agent.
Where to learn more
Torrey Trust in the College of Education continues to provide excellent advice on adapting to student use of AI in ways that look more closely at motivations of students, concerns of faculty, and the true functionality of AI tools. These posts are a good place to start:
- Addressing AI tools (Digital Learning, College of Education)
- Essential Considerations for Addressing the Possibility of AI-Driven Cheating - Part One (Trust)
- Essential Considerations for Addressing the Possibility of AI-Driven Cheating - Part Two (Trust)
- How Do I Consider the Impact of AI Tools in My Courses? (CTL)
- How Do I (Re)design Assignments and Assessments in an AI-Impacted World? (CTL)
- From Tool to Temptation: AI’s Impact on Academic Integrity (IDEAS)
- Teaching Students to Be Critical AI Users (IDEAS)
If you have questions or concerns about AI tools, the folks in the college's Digital Learning group are happy to chat about creative options and connect you with strategies that match your objectives and your capacity. Contact digitallearning [at] umass [dot] edu or visit their page on Addressing AI tools.