Addressing AI tools such as ChatGPT
Not the end of the world, but needs attention nonetheless
If you are concerned that students in your classes may be tempted to use an AI to circumvent the learning process, there are steps you can take to refocus their attention on doing the work themselves. At minimum, be clear with your students about your expectations for the use of AI tools in your classroom. Assuming that all use of AI is cheating can harm students from certain groups, who may be falsely accused or prevented from using AI-based accommodations that would otherwise help them level the playing field.
Torrey Trust in the College of Education continues to provide excellent advice on adapting to student use of AI in ways that look more closely at the motivations of students, the concerns of faculty, and the true functionality of AI tools. These latest posts are a good place to start:
Essential Considerations for Addressing the Possibility of AI-Driven Cheating - Part One
Essential Considerations for Addressing the Possibility of AI-Driven Cheating - Part Two
If you are looking for ways to adapt your syllabus to address AI, this recent column by Kevin Gannon in the Chronicle offers a good, concise approach:
Should You Add an AI Policy to Your Syllabus? (spoiler: yes.)
If you have questions or concerns about AI tools, the folks in the college's digital learning group are also happy to chat about creative options and to connect you with strategies that match your objectives and your capacity. Contact digitallearning [at] umass [dot] edu or visit their page of advice and resources:
Addressing AI tools (Digital Learning, College of Education)