The University of Massachusetts Amherst
University News

UMass-Based Center Issues New Guidance on AI Use in Dispute Resolution


The National Center for Technology and Dispute Resolution (NCTDR) at UMass Amherst is helping to shape how artificial intelligence is used in dispute resolution, as the technology becomes ubiquitous in legal settings, whether online or in person.

NCTDR and the International Council for Online Dispute Resolution (ICODR) recently released new guidance for third parties, including judges, attorneys, mediators and arbitrators, on applying AI within dispute resolution processes.

The document expands on online dispute resolution standards first issued by NCTDR and ICODR in 2022 and adopted last year by the International Organization for Standardization (ISO). It’s designed to help practitioners decide when and how to use AI tools responsibly as they become increasingly integrated into both online and in-person conflict resolution.


“We are facing the profound and rapidly expanding impacts of AI in every sector of society. This raises ethical and practical questions about how best to harness it to increase access to justice and other social goods while preventing its worst externalities,” says NCTDR Director and 2026 ICODR President Leah Wing, senior lecturer II and honors director of legal studies at UMass Amherst. “It is our hope that this new guidance and our previously issued standards will valuably contribute towards that goal.”

The guidance cautions that while AI can improve efficiency and access to justice, it also introduces risks, including bias, data security concerns and lack of transparency. It calls for consistent human oversight and stresses that practitioners remain responsible for decisions made with the help of AI.

It also reinforces core ethical principles, such as confidentiality, competence and transparency. Practitioners are advised to clearly disclose when AI is used, understand the systems they rely on and ensure that sensitive information is protected.

The guidance also includes a practical checklist of questions to help dispute resolution professionals evaluate AI tools before using them, covering access, fairness, legal compliance and data protection.

By offering a foundation for maintaining trust and accountability in dispute resolution systems, the document aims to support both practitioners and developers as AI adoption continues to outpace formal regulation.