Despite the visibility and energy surrounding artificial intelligence, higher education is taking measured steps toward incorporating this technology.
EDUCAUSE is helping institutional leaders, IT professionals, and other staff address their pressing challenges by gathering and sharing data. This report is based on an EDUCAUSE QuickPoll. QuickPolls enable us to rapidly gather, analyze, and share input from our community about specific emerging topics.Footnote1
We are on the verge of peak hype about how artificial intelligence (AI) can (and will) transform our lives. No fewer than seven emerging AI technologies were prominently featured on "The Gartner Hype Cycle for Emerging Technologies, 2020."Footnote2 Several technologies on EDUCAUSE's "The Top 10 Strategic Technologies for 2020" explicitly incorporate or are reliant on AI.Footnote3 And while AI might seem to be a technology in search of a campus, some promising applications have been emerging in domains such as teaching and learning, student success, and accessibility.Footnote4 But how widespread is the use of AI in higher education today? In this QuickPoll, we operationalized Elana Zeide's categories of AI applications in higher educationFootnote5 to better understand how and how widely AI is being used for institutional tasks, student success and support tasks, and instructional tasks.Footnote6
The Bottom Line
AI is most developed for instructional use, especially for monitoring student behavior during exams and ferreting out plagiarism. AI is being used the least for institutional tasks. Significant numbers of respondents reported that they don't know the status of AI at their institutions across all categories, suggesting that AI use may be hidden from view or hard to observe.Footnote7 Immature data governance, concerns about algorithmic bias, and ineffective data management and integration pose the greatest challenges to the implementation of AI in higher education. For now, the hype surrounding the revolutionary impact of AI on higher education appears to be just that—hype.
The Data: Instructional Use
AI is being used to monitor students and their work. The most prominent uses of AI in higher education are attached to applications designed to protect or preserve academic integrity through the use of plagiarism-detection software (60%) and proctoring applications (42%) (see figure 1). Although both applications, especially the former, have been in use for some time, the latter has experienced considerable growth due to the expansion of online learning during the pandemic. Both types of tools have come under scrutiny for violating the privacy of students and producing false positives; the use of proctoring software is also associated with a litany of problems related to exam performance due to anxiety, technology failures, and socioeconomic and racial bias.Footnote8 Responding to some of these concerns, one provider recently announced that it will no longer provide systems based solely on AI, requiring a human being to analyze the captured video.Footnote9
Figure 1. AI Usage for Instructional Tasks
AI is not going to replace instructors anytime soon. Most respondents reported that AI is not in use at their institutions as it relates to instructional tasks, excepting plagiarism-detection software and proctoring applications. Majorities of respondents told us that AI is not in use—and that there are no plans to use it in the future—for key instructional tasks such as providing feedback on assignments, tutoring, conducting assessments, and grading assignments. Although substantial percentages of respondents told us that their institution is tracking AI for these tasks, usage appears to be limited.
The Data: Student Success and Support Use
The chatbots are coming! The chatbots are coming! A sizable percentage (36%) of respondents reported that chatbots and digital assistants are at least somewhat in use on their campuses, with another 17% reporting that their institutions are in the planning, piloting, or initial stages of use (see figure 2). The use of chatbots in higher education by admissions, student affairs, career services, and other student success and support units is not entirely new, but the pandemic has likely contributed to an increase in their use as they help students get efficient, relevant, and correct answers to their questions without long waits.Footnote10 Chatbots may also liberate staff from repeatedly responding to the same questions and reduce errors by deploying updates immediately and universally.
Figure 2. AI Usage for Student Success and Support Tasks
Student success tools are a potential area of growth for AI. A limited but comparatively sizable group of respondents reported that AI is being used for student success tools such as identifying students who are academically at risk (22%) and sending early academic warnings (16%); another 14% reported that their institutions are in the planning, piloting, or initial stages of use of AI for these tasks. That said, these numbers seem low, given that student success tools have been around for nearly a decade and are deployed widely.Footnote11 One possible explanation for this discrepancy is semantic—some might not view the analytics that power many student success tools as AI when, in fact, analytics is a type or subset of AI.Footnote12
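The semantic gap is easy to see once the machinery is laid bare: many early-alert analytics are, at bottom, a fitted predictive model scoring students on engagement signals. The sketch below is purely illustrative—the feature names, weights, and threshold are hypothetical, not drawn from any actual vendor tool—but it shows the kind of logistic scoring that commercial student success products often brand as AI.

```python
import math

# Hypothetical weights for an early-alert model; real tools fit these
# from historical student outcomes (e.g., via logistic regression).
WEIGHTS = {"missed_logins": 0.8, "late_assignments": 1.1, "low_quiz_avg": 1.5}
BIAS = -3.0
RISK_THRESHOLD = 0.5  # illustrative cutoff for triggering an alert

def risk_score(signals: dict) -> float:
    """Map engagement signals to a logistic risk score in [0, 1]."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in signals.items())
    return 1 / (1 + math.exp(-z))

def flag_at_risk(students: dict) -> list:
    """Return IDs of students whose score crosses the alert threshold."""
    return [sid for sid, signals in students.items()
            if risk_score(signals) >= RISK_THRESHOLD]

students = {
    "s001": {"missed_logins": 1, "late_assignments": 0, "low_quiz_avg": 0},
    "s002": {"missed_logins": 3, "late_assignments": 2, "low_quiz_avg": 1},
}
print(flag_at_risk(students))  # s002's signals push its score past 0.5
```

Whether a campus calls this "analytics" or "AI" may come down to branding, which could help explain why reported AI usage trails the known footprint of these tools.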
The Data: Institutional Use
AI is sparsely used for institutional tasks. Most respondents reported that AI is not in use at their institutions as it relates to institutional tasks (see figure 3). Clear majorities of respondents reported a complete lack of interest in using AI for institutional tasks such as planning curricula, making or contributing to financial aid decisions, development and fundraising, and making or contributing to admissions decisions. The tasks with the most AI use are nudging accepted applicants to put down deposits (17%), planning academic support resources (15%), and marketing and recruiting (15%). Some respondents identified additional tasks that use AI, including the analysis of student evaluations of teaching, instructional planning, social media analysis, support desk services, and attendance on campus.
Figure 3. AI Usage for Institutional Tasks
The Data: What You Don't Know…
Can what you don't know hurt you? Significant percentages of respondents reported that they don't know the status of AI at their institutions across all categories. Ranges of "don't know" responses:
Instructional tasks: 8% (using plagiarism-detection software) to 23% (tutoring)
Institutional tasks: 20% (planning academic support resources) to 33% (development and fundraising)
Student success and support tasks: 10% (using chatbots and digital assistants) to 32% (assessing financial need)
The lack of knowledge about AI on one's campus could be attributed to a vague or incorrect understanding of what AI is, an inability to observe AI work (because it tends to be baked into applications and tools), a lack of awareness of the ways in which AI might be used in different units across campus, and/or an actual lack of AI usage on campus. Regardless, that such large percentages responded with "don't know" suggests that the importance of AI to higher education may be presently overstated.
We're just not ready. About two-thirds of respondents reported that deficiencies in their institutions' capacity to adopt and maintain AI are the main challenges to the implementation of AI at their institutions (see figure 4). Nearly three-quarters of respondents said that ineffective data management and integration (72%) and insufficient technical expertise (71%) present at least a moderate challenge to AI implementation. Financial concerns (67%) and immature data governance (66%) also pose challenges. Insufficient leadership support (56%) is a foundational challenge that is related to each of the previously listed challenges in this group.
Figure 4. Common Challenges to the Implementation of AI
Show me the ethics! Concerns about ethics related to AI use (68%) and concerns about algorithmic bias (67%) pose significant challenges to AI implementation. Echoing the findings of Safiya Umoja Noble's book Algorithms of Oppression, one respondent whose campus primarily serves minority populations told us that "Bias issues in AI are rampant. As it stands now, [using AI] would have too great a negative impact on our students." Another expressed concerns that "AI has too much bias built in that is very difficult to remove or mitigate." Risk to institutional reputation poses a challenge as well, but what remains unclear is whether respondents see implementing AI on campus as desirable…or derisible. Figuring out how, if at all, AI aligns with current institutional missions is the least threatening concern.
Current use of AI is a mile wide and an inch deep. We asked respondents to share some promising practices in the use of AI at their institutions. The responses run the gamut of tasks identified above and a few that we hadn't considered:
Chatbots for informational and technical support, HR benefits questions, parking questions, service desk questions, and student tutoring
Research applications, conducting systematic reviews and meta-analyses, and data science research
Recruitment of prospective students
Providing individualized instructional material pathways, assessment feedback, and adaptive learning software
Proctoring and plagiarism detection
Student engagement support and nudging, monitoring well-being, and predicting the likelihood of disengaging from the institution
Detection of network attacks
All QuickPoll results can be found on the EDUCAUSE QuickPolls web page. For more information and analysis about higher education IT research and data, please visit the EDUCAUSE Review EDUCAUSE Research Notes topic channel, as well as the EDUCAUSE Research web page.
QuickPolls gather data in a single day instead of over several weeks, are distributed by EDUCAUSE staff to relevant EDUCAUSE Community Groups rather than via our enterprise survey infrastructure, and do not enable us to associate responses with specific institutions. Jump back to footnote 1 in the text.↩
Kasey Panetta, "5 Trends Drive the Gartner Hype Cycle for Emerging Technologies, 2020," Gartner, August 18, 2020. Jump back to footnote 2 in the text.↩
Mark McCormack, D. Christopher Brooks, and Ben Shulman, Higher Education's 2020 Trend Watch and Top 10 Strategic Technologies, research report (Louisville, CO: ECAR, January 2020). See The Top 10 Strategic Technologies for 2020. Jump back to footnote 3 in the text.↩
Bryan Alexander, "5 AIs in Search of a Campus," EDUCAUSE Review, October 14, 2019; Kathe Pelletier, Malcolm Brown, D. Christopher Brooks, Mark McCormack, Jamie Reeves, and Nichole Arbino, with Aras Bozkurt, Steven Crawford, Laura Czerniewicz, Rob Gibson, Katie Linder, Jon Mason, and Victoria Mondelli, 2021 EDUCAUSE Horizon Report, Teaching and Learning Edition (Boulder, CO: EDUCAUSE, 2021); Thomas Miller and Melissa Irvin, "Using Artificial Intelligence with Human Intelligence for Student Success," EDUCAUSE Review, December 9, 2019; and Judy Brewer, Carly Gerard, and Mark Hakkinen, "The Impact of AI on Accessibility," EDUCAUSE Exchange, EDUCAUSE Review, November 4, 2020. Jump back to footnote 4 in the text.↩
Elana Zeide, "Artificial Intelligence in Higher Education: Applications, Promise and Perils, and Ethical Questions," EDUCAUSE Review, August 26, 2019. Jump back to footnote 5 in the text.↩
The poll was conducted on June 7–8, 2021, consisted of 8 questions, and resulted in 195 responses. Poll invitations were sent to participants in EDUCAUSE community groups focused on IT leadership. Our sample represents a range of institution types and FTE sizes, and most respondents (88%) represented US institutions. Jump back to footnote 6 in the text.↩
For this report, cited percentages are among those respondents who reported knowing the status of AI at their respective institution. Jump back to footnote 7 in the text.↩
Daniel Woldeab and Thomas Brothen, "21st Century Assessment: Online Proctoring, Test Anxiety, and Student Performance," International Journal of E-Learning & Distance Education 34, no. 1 (2019); D. Christopher Brooks, Student Experiences Learning with Technology in the Pandemic, research report (Boulder, CO: EDUCAUSE, April 2021); Shea Swauger, "Software That Monitors Students During Tests Perpetuates Inequality and Violates Their Privacy," MIT Technology Review, August 7, 2020; and Todd Feathers, "Proctorio Is Using Racist Algorithms to Detect Faces," Vice, April 8, 2021. Jump back to footnote 8 in the text.↩
Scott Jaschik, "ProctorU Abandons Business Based Solely on AI," Inside Higher Ed, May 24, 2021. Jump back to footnote 9 in the text.↩
D. Christopher Brooks, "EDUCAUSE QuickPoll Results: Student Success Technologies," EDUCAUSE Review, April 9, 2021. Jump back to footnote 10 in the text.↩
See "The EDUCAUSE Student Success Almanac." Jump back to footnote 11 in the text.↩
Zeide, "Artificial Intelligence in Higher Education." Jump back to footnote 12 in the text.↩