Jenzabar
2025 CIO/CTO Survey: Progress, Pitfalls, and Priorities of AI in Higher Education


Artificial intelligence has been steadily gaining momentum in the consumer landscape over the past few years, and that momentum has spilled over into other sectors, including higher education. The use of AI on campus is now virtually inevitable; whether they like it or not, institutional decision-makers must find ways to embrace and support it, even if they have concerns. This was one of the takeaways regarding AI from Inside Higher Ed's 2025 Survey of Campus Chief Technology/Information Officers.

Conducted in collaboration with Hanover Research, the 2025 Inside Higher Ed survey polled more than 100 CTOs and CIOs and found that interest in AI is rising, but adoption remains slow. Part of the issue stems from concerns over the technology’s potential impact on academic integrity; another part comes from the lack of institution-wide strategies. So, what’s going on with AI in higher education, and how can colleges and universities navigate this uncharted territory with better results?

Concerns Over the Real AI Risk: Academic Integrity

Many campus decision-makers have concerns regarding AI’s impact on honesty, truth, and responsibility. In IHE’s survey, a whopping 74% of CTOs said generative AI (GenAI) was a moderate or significant risk to academic integrity. Nearly half of respondents said they do not provide students with access to GenAI systems.

But GenAI tools like ChatGPT, Copilot, and DALL-E are among the most widely used consumer-oriented AI applications on the market. This means they will, if they haven’t already, find their way onto campus—regardless of whether an institution has a policy for them.

Anxiety over academic integrity may be one of the leading factors behind cautious exploration of artificial intelligence. And while tentative steps into the AI pool can minimize risk, many institutions are prioritizing fear over exploration: they limit the scope of AI so severely that misconceptions, misunderstandings, and misfires follow.

Cautious Experimentation Leads to Mixed Feelings

The survey found that only about half (54%) of responding CTOs said AI's overall impact on higher education so far has been positive or very positive. The fact that roughly half of respondents feel neutral or negative toward AI underscores the industry's hesitation to fully embrace it. In fact, only 34% of CTOs said investing in GenAI is a priority. Other AI technologies, such as AI agents and predictive AI tools that can promote student success, ranked even lower on the list of priorities.

The bottom line is that higher education has been somewhat slow to adopt AI. Another reason behind the hesitancy is that many institutions are taking too narrow an approach. According to the survey, 53% of respondents said they are still focusing on individual use cases for AI instead of implementing a campus-wide strategy that would enable more universal adoption. In fact, only 11% of CTOs said their institution has a comprehensive AI strategy.

With such a narrow, cautious approach, institutions may be missing out on some of the larger innovations and opportunities AI presents. They also risk being forced to play catch-up when unsanctioned AI technologies crop up on campus.

A Broader Approach, A Better Approach

Micromanagement stifles innovation, so it's no wonder the words "governance" and "ownership" can strike fear in any organization looking to become or remain agile. Yet only 35% of CTOs believe their institution is handling AI adeptly, and only 19% think higher education, overall, is effectively managing its use of AI. These numbers suggest institutions would benefit from more holistic policies: roughly 31% of CTOs report having no AI policies at all.

Institutions looking to adopt AI should start with governance. That means establishing safe yet flexible parameters and identifying who is responsible for finding and evaluating AI tools and building policy. We suggest putting together a diverse committee composed of members from different teams across campus (faculty, IT, legal experts, data managers, etc.) that can broaden the institution's perspective on AI and take responsible action.

We also suggest bridging the gap between IT and campus leadership, a divide some survey respondents say still exists, and working with a trusted technology partner that can support your AI journey. Artificial intelligence is a seismic technology sending waves throughout higher education and beyond. While such a monumental technology can, understandably, invite concern, a well-planned strategy for adoption, implementation, and long-term use can mitigate much of that risk.

AI is just one of the many topics covered in Inside Higher Ed’s 2025 Survey of Campus Chief Technology/Information Officers. For a deeper dive into what CTOs and CIOs are thinking, you can download your copy of the survey here.

