This document is prepared by the Graduate Studies Committee (GSC) and outlines the department policy on graduate student use of generative AI technology in the areas of research and teaching while enrolled as a graduate student in the department. AI use may include, but is not limited to, content research, code generation and optimization, graphic generation, and writing aid. 

  1. Research
    1. Research Related to Degree Requirements

      When submitting research related to their degree requirements, such as the second-year paper and dissertation chapters, graduate students must disclose their use of AI tools. An AI disclosure statement must be included in the submitted work and clearly state the names of the AI tools used, the specific purposes of their use, and the extent of human oversight. Failure to disclose AI use may constitute academic dishonesty, and the GSC may impose penalties in accordance with the UMass Academic Integrity Policy.

    2. Research Publications

      Graduate students seeking to publish their research papers must disclose their use of AI tools. This applies to working papers and to research that is not part of the student’s degree requirements, regardless of whether a specific journal requires a disclosure. An AI disclosure statement must be included in the submitted work and clearly state the names of the AI tools used, the specific purposes of their use, and the extent of human oversight. Failure to disclose AI use may constitute academic dishonesty, and the GSC may impose penalties in accordance with the UMass Academic Integrity Policy.

    3. Collaborative Research

      When working as a co-author or a research assistant (RA) on a research project, graduate students are required to disclose their use of AI tools to their collaborators. This includes both faculty and non-faculty collaborators (students, independent researchers, etc.), whether at UMass or at other institutions. Graduate students should discuss allowable AI use with project collaborators at the start of a working relationship. Before using AI tools, the student must ensure that the agreement is documented in writing (e.g., in an email exchange) and accepted by everyone involved in the project. Be mindful that agreement to use AI tools on a specific aspect of the project (e.g., coding) does not extend to other aspects (e.g., writing).

  2. Teaching
    1. As Course Instructor

      Graduate students who teach a course as instructor of record must abide by the university policy on AI use and include an AI use policy in their syllabus. Instructors should be mindful that students may use AI in their assignments and should design assignments and assessments that are not easily answered by AI tools. Instructors are encouraged to monitor for possible AI use. When confronted with a suspected case of inappropriate AI use, instructors should address it in accordance with the course and university policies. Instructors may also seek advice from faculty mentors when this issue arises.

      Note that uploading assignments containing Family Educational Rights and Privacy Act (FERPA) protected information to an AI tool constitutes a violation of FERPA rights and may lead to disciplinary or legal action.

    2. As Teaching Assistant 

      Graduate students who serve as teaching assistants (TAs) for a course should familiarize themselves with the university and course AI policies and work with the course instructor to monitor for possible violations by students.

      TAs may not use AI tools to grade assignments, provide written feedback, or compile grades unless there is a written agreement with the course instructor. TAs should not assume that an agreement to use AI tools to grade one assignment (or one aspect of an assignment) applies to other assignments, unless the written agreement says so.

      Note that uploading assignments containing FERPA-protected information to an AI tool constitutes a violation of FERPA rights and may lead to disciplinary or legal action.