On Monday, February 24, 2025, faculty members and teaching assistants from the Kinesiology Department at UMass Amherst gathered for a seminar on “Designing or Modifying Assignments with AI in Mind.” Tim Sheaffer and Joan Giovannini from the Instructional Design, Engagement, and Support (IDEAS) group led the seminar, which explored how generative AI (GenAI) can support student learning, the ethical concerns it raises, and how it can be integrated strategically into assignments and assessments.
Key Takeaways:
- Transparency: Faculty should clearly define AI usage policies in their syllabi, specifying when and how AI can be used in assignments to align with learning objectives.
- AI Can Support (But Not Replace) Learning: AI tools can aid brainstorming, drafting, and feedback. Faculty should guide students in using AI to supplement critical thinking rather than as a shortcut to answers.
- AI Literacy is Crucial: Students and faculty need to understand AI’s strengths, limitations, and ethical considerations to ensure its use fosters learning rather than widening the digital divide.
Warming Up: AI in the Classroom
The seminar kicked off with a discussion about AI’s presence in students’ academic work. Faculty acknowledged that students are already using AI tools and discussed the need to guide their use rather than ignore or prohibit it outright. The key takeaway? Educators need to navigate the role of AI in assignments intentionally and transparently.
The seminar explored how students interact with GenAI tools. Insights included:
- Common Roadblocks:
  - Hallucination: AI sometimes generates false or misleading information.
  - Data Limitations: AI is not always up to date and relies on historical training data.
  - Prompting Challenges: The quality of AI output depends on the quality of the user’s input.
  - Bias: AI responses can reflect implicit biases in the data they were trained on.
- How AI Can Be Useful: Despite its limitations, AI can be a powerful learning tool when used effectively. Some productive applications include:
  - Brainstorming ideas
  - Generating first drafts
  - Providing instant, personalized feedback
What’s Ethical and Productive? AI in Assignments
The next discussion revolved around the ethical and productive use of AI, particularly in assignments. A key question surfaced: Can AI support your students in assignments and assessments that are aligned with the learning goals of your course? Joan emphasized that the answer depends on the assignment goals and expectations. Faculty should:
- Clearly Define AI Use in the Syllabus: Faculty should state when and how AI is allowed (or not) in assignments; otherwise, students are left guessing where the boundaries lie.
- AI as a Tool: One productive example of AI use is when students use AI to explore different viewpoints rather than just generate answers.
- Creativity Concerns: Over-reliance on AI may limit originality, as AI tends to generate similar responses. Students should understand this drawback.
- AI and Coding: Should students rely on AI for coding help? The discussion emphasized that students should understand coding basics before turning to AI for assistance, especially if they plan to code in the workforce.
- AI Detector Tools Are Unreliable: Faculty should not depend on AI detection tools, as they often produce false positives and can discriminate against non-native English speakers.
Approaches to AI in Coursework
What’s the best way to integrate AI into coursework? Joan introduced an EDUCAUSE framework outlining three strategies.
- Mitigate: Faculty can opt to ban AI use, though this approach may not be effective given how pervasive AI has become. Rather than trying to design AI-proof assignments, faculty are better served by designing assignments that students will not want to outsource to AI.
- Support: Faculty explore AI with students, fostering discussions on its strengths and limitations.
- Elevate: This strategy encourages faculty to fully integrate AI into coursework for real-world skill development.
Final Thoughts: Embracing AI Thoughtfully
Tim and Joan encouraged faculty to engage with AI thoughtfully by designing assignments that foster critical thinking, transparency, and adaptability. One simple starting point is to explicitly communicate AI guidelines in the syllabus and in individual assignments.
These guidelines should cover:
- Purpose: Why are students doing the assignment?
- Boundaries: When is AI use allowed or restricted?
- Evaluation: How will student work be assessed?
- Ethical Considerations: Why should (or shouldn’t) AI be used?
As AI becomes a permanent part of academic and professional life, the challenge is not just how to prevent misuse but rather how to harness AI’s potential to enhance student learning.