April 28, 2025
EdTech IDEAS Digest - Trends

What the Text Is About
EDUCAUSE, a nonprofit that aims to advance higher education through the use of information technology, conducted a study that summarizes current sentiments and experiences related to strategy and leadership, policies and guidelines, use cases, the higher education workforce, and the institutional digital divide. Based on the findings, we share key trends on these topics in relation to artificial intelligence (AI).

2025 EDUCAUSE AI Landscape Study: Into the Digital AI Divide

EDUCAUSE's 2025 AI Landscape Study: Into the Digital AI Divide investigates current AI sentiment and experiences within academia, based on responses from 788 professionals across higher education institutions. The study examines key areas such as use cases, strategy, and existing policies.

Drawing on participants' insights, we will outline AI usage and the sentiments surrounding policies and the future landscape.

01 AI's already widespread use

AI is being used by faculty, staff, and students across various functional areas within institutions, and adoption varies among stakeholder groups. Notably, student adoption of AI has been a significant concern and strategic focus throughout the report. A majority of respondents indicated that students primarily use AI for getting answers to problems (69%), proofreading and editing their work (67%), summarizing content such as lecture notes and articles (61%), brainstorming (55%), and generating images or audio (54%).

Chart titled "Student use of AI-powered tools" - top responses: Getting answers to problems 69%; Proofreading/editing their work 67%; Summarizing content 61%; Brainstorming 55%; Image or audio generation 54%; Personal entertainment 49%; Language translation 49%; Creating study materials 44%; Developing AI literacy 33%; Explaining solutions to problems without giving answers 29%; Learning discipline-specific workforce skills 22%; Obtaining personalized feedback on course assignments or exams 22%


On the faculty and staff side, over 80% of respondents reported they are already using AI for at least one work-related task. At the top of the list of AI uses are summarizing content (74%) and brainstorming or ideating on work challenges (71%). By understanding which tasks benefit most from AI assistance, institutions can better allocate resources, improve efficiency, and enhance productivity among faculty and staff. This information can also guide future investments in AI technologies that support educational goals.


Chart titled "Administrative tasks for which respondents are using AI-powered tools" - top results: Summarizing content 71%; Brainstorming or ideating on work challenges 71%; Creating presentations or slides 51%; Writing email 49%; Creating meeting notes 48%; Developing project plans 43%; creating teaching materials 38%; Writing letters of recommendation 31%; Completing performance related tasks such as annual reviews 21%; hiring 6%; Other 11%; none 8%


02 More policies and training are essential

Attitudes toward AI appear to be improving. Comparing EDUCAUSE's 2024 report with the 2025 edition, there is a marked increase in enthusiasm for AI and a slight decrease in apprehension among respondents.

Chart titled "Impressions of Institutional Leader's General Attitudes Towards AI" - 2024, group of 910 surveyed: 23% Very cautious/cautious, 8% indifferent, 19% Very enthusiastic/enthusiastic, 42% a mix of caution and enthusiasm, 7% don't know/other; 2025 group of 788: 20% very cautious/cautious, 6% indifferent, 25% very enthusiastic/enthusiastic, 46% a mix of caution and enthusiasm, 4% don't know/other


Leaders are still grappling with how to strategize around AI in their institutions. More than half of responding institutions have some areas where they are working on AI-related strategy, but there is currently no institution-wide approach. Only 11% of respondents noted that there is no AI-related strategy at their institution.

Chart titled "Institutions' approaches to AI-related strategy": We do not have any AI-related strategy 11%; Some areas of the institution are working on AI-related strategy but there is no institution-wide approach 55%; We have an institution-wide approach to AI-related strategy 22%; Other 9%; Don't know 3%


This is particularly noteworthy as AI is transforming numerous facets of higher education. From administrative tasks to personalized learning experiences, AI's influence is pervasive. Consequently, the policies impacted by AI are extensive and wide-ranging, necessitating careful consideration and adaptation to ensure they effectively address the evolving landscape of academia.

Chart titled "Types of institutional policies impacted by AI": Teaching and learning already impacted 70% in 2024, 68% in 2025; Technology already impacted 44% in 2024, 56% in 2025; Cybersecurity already impacted 40% in 2024, 53% in 2025


Campuses are adopting a human-centric approach to AI strategy. With AI ubiquitous and easily accessible, the focus is on supporting and managing the human aspects of AI integration rather than on the tools themselves.

Chart titled "Elements of AI-related strategy" - top results: Providing training for faculty to learn new AI technology or skills 63%; Providing training for staff to learn new AI technology or skills 56%; Increasing access to AI tools 50%; Implementing or improving data privacy policies or guidelines 49%; Implementing or improving cybersecurity policies or guidelines 44%; Providing training for students to learn new AI technology and skills 41%; Increasing collaboration across the institution 40%


Risks and Opportunities

AI presents both promising opportunities and potential concerns, and the concerns appear to be very consistent. Ninety-one percent (91%) of respondents said they are concerned about increased misinformation, 90% are concerned about use of data without consent, and 88% have concerns about the inability to evaluate AI-generated content. Of the 22 areas of concern raised, more than three-quarters of respondents said they were at least "somewhat" concerned about 18 of them.

The chart titled "Concern about AI-related risks" shows increased misinformation, use of data without consent, and insufficient data protection as the three leading concerns.

Looking ahead to the next two years, respondents generally held an optimistic view of AI's impact on higher education. They were particularly positive about AI's potential to enhance learning analytics and improve accessibility for people with disabilities. However, there were concerns about students' increasing dependency on AI as well as a rise in academic dishonesty.

Chart titled "Respondents' predictions about the impacts of AI on higher education by 2027": Students trust AI too much 56%; Academic dishonesty has increased 55%; AI tools reduce workloads 52%; AI tools widen the digital divide 42%

These trends highlight the need for policy to help mitigate the unethical uses of AI that concern many respondents. Developing AI governance, providing additional training for faculty, staff, and students, and conducting risk assessments are just some of the ways institutions can mitigate AI-related risks.

Inspiring resources for faculty: