The Conversational Infrastructure: moving beyond the chatbot in higher education

Peter Thomas is Senior Adviser for Higher Education Strategy at CDI, where the focus is on building institutional capability for Conversational AI to create digital experiences that reflect the quality and values of the educational environments they represent.

How useful is Conversational AI in Higher Ed?

It seems like no topic is more on the minds of university CIOs, CXOs, and digital transformation committees than Conversational AI. The chatbot has entered the building: ubiquitous, well-intentioned—and mostly underwhelming.

Most higher education institutions now have some form of conversational interface. Usually positioned as a helpdesk, sometimes as student support, but always advertised as "available 24/7." The technology is deployed, the vendor contracts signed, the newsletter updates posted. But something fundamental is missing: the capability to make these systems genuinely useful.

The eternal return

There’s a lot of talk about the future, but it might be more useful to look at the evolution of Conversational AI and at what’s happening right now in the gap between deployment and effectiveness.

There's something Nietzschean about the cycle of educational technology adoption. New technology emerges, promises transformation, gets deployed with fanfare, underperforms, then gets quietly relegated to maintenance mode while the next technological salvation appears on the horizon. Chatbots, it seems, are experiencing their own version of this eternal return.

But unlike previous waves of ed-tech enthusiasm, Conversational AI isn't going away any time soon. It's becoming infrastructure—the layer through which students, staff, and stakeholders encounter the institution.

Which raises a new question: if these systems are becoming part of the institutional fabric, shouldn't we be treating them like it?


The flattening of complexity

To talk about chatbots is to flatten out complexity. There will be multiple futures for how Conversational AI is used—from institution to institution, department to department, and even use case to use case. Conversational AI will develop in fundamentally different ways and at different speeds in different contexts.

Consider the difference between a student's panicked 2 AM question about a last-minute assessment extension and a staff member's inquiry about the finance procedure for expensing that meal on a conference trip. Both can involve "Conversational AI," but they represent entirely different challenges of design, content strategy, and institutional capability.

The first requires sensitivity. The second demands integration with backend systems and an understanding of the rules. Treating both as "chatbot problems" misses the point entirely.
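
To make the distinction concrete for the technically inclined, here is a minimal sketch, in Python, of how an institution might route these two kinds of requests to different handling strategies rather than through a single generic bot. The category names, escalation paths, and placeholder handlers are illustrative assumptions, not a reference implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto


class QueryKind(Enum):
    """Hypothetical categories; a real taxonomy would be institution-specific."""
    PASTORAL = auto()       # sensitive and time-critical, may need a human
    TRANSACTIONAL = auto()  # policy lookup plus backend integration


@dataclass
class Query:
    text: str
    kind: QueryKind
    after_hours: bool = False


def route(query: Query) -> str:
    """Pick a handling strategy instead of pushing everything through one generic bot."""
    if query.kind is QueryKind.PASTORAL:
        # Sensitivity first: acknowledge the student, keep context, and make
        # escalation to a human adviser easy, especially out of hours.
        if query.after_hours:
            return "empathetic reply, then escalate to the on-call adviser"
        return "empathetic reply, then offer an adviser appointment"
    # Transactional requests need the rules and the systems behind them:
    # policy content plus, say, a finance-system lookup.
    return "policy answer, backed by a finance-system lookup"


if __name__ == "__main__":
    extension = Query("Can I get an extension on tomorrow's assessment?",
                      QueryKind.PASTORAL, after_hours=True)
    expenses = Query("How do I expense a meal from my conference trip?",
                     QueryKind.TRANSACTIONAL)
    print(route(extension))
    print(route(expenses))
```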

The What, Where and When of Conversational AI

Perhaps we need to ask different questions about Conversational AI in higher education—questions that mirror the complexity of the work that goes on inside a university.

The What Conversation: What kinds of interactions are we actually trying to enable? If we are preparing students to graduate into workplaces where the ability to learn, unlearn, and relearn is more valuable than any specific knowledge set, shouldn't our conversational systems be preparing them for intellectual agility rather than just institutional compliance?

The Where Conversation: Where do these interactions fit within the broader ecosystem of institutional touchpoints? Conversational AI doesn't exist in isolation—it sits alongside learning management systems, student information systems and human support staff. The question isn't whether it works, but how it fits.

The When Conversation: When do these systems add value versus when do they create friction? We all know that the most meaningful connections happen in person, so knowing when not to automate may be as important as knowing when to deploy.


The belonging problem

Research into the nature of work reveals that people don’t leave their jobs because of what they get paid, what their office looks like or even their job title. They leave because they don’t feel a sense of belonging.

The implication for Conversational AI in higher education is that students don't just want answers—they want to feel understood, supported, and connected. Staff don't just want efficient processes—they want systems that augment their capability.

The most successful Conversational AI implementations we see aren't just answering questions more efficiently. They're creating moments of belonging. A system that remembers a student's previous inquiry and follows up appropriately. An interface that understands the difference between urgent and routine requests. Conversations that feel contextually aware rather than algorithmically predetermined.
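
What might "remembering a previous inquiry and following up appropriately" look like in practice? A small Python sketch, under assumed names (ConversationMemory, follow_up_prompt) and without the consent, privacy, and data-governance layers a real institutional system would need:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class Interaction:
    """A single remembered inquiry from a student."""
    topic: str
    timestamp: datetime
    urgent: bool = False


@dataclass
class ConversationMemory:
    """Per-student history so the next conversation can acknowledge the last one."""
    history: list[Interaction] = field(default_factory=list)

    def remember(self, topic: str, urgent: bool = False) -> None:
        self.history.append(Interaction(topic, datetime.now(), urgent))

    def follow_up_prompt(self) -> str | None:
        """If there was a recent urgent inquiry, open by checking in on it."""
        recent = [i for i in self.history
                  if i.urgent and datetime.now() - i.timestamp < timedelta(days=7)]
        if recent:
            return f"Last time you asked about {recent[-1].topic}. Did that get sorted out?"
        return None  # Nothing urgent on file; open the conversation normally.


if __name__ == "__main__":
    memory = ConversationMemory()
    memory.remember("an assessment extension", urgent=True)
    print(memory.follow_up_prompt())
```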

Perhaps one of the essential questions for Conversational AI isn't about natural language processing or intent recognition, but about how these systems help people love their experience with the institution.


Making place

When we think about the physical spaces we occupy, the most satisfying ones are those that make people feel comfortable, happy, and healthy. They provide the opportunity for people to learn, innovate, and grow together.

And in terms of Conversational AI? Instead of thinking about chatbots as customer service automation, we might consider them as experiential spaces where learning, discovery, and problem-solving happen.

This shifts the design challenge from efficiency metrics to experience quality. It means creating conversational environments that feel welcoming rather than transactional, that encourage exploration rather than just provide predetermined paths, that connect people to resources and each other rather than simply deflecting inquiries.


The capability question

Rather than asking "Do we have a chatbot?", the more strategic question becomes: are we building conversational infrastructure, or just automating existing inefficiencies?

Technology is rarely the limiting factor. The constraint is organizational capability: the ability to think systematically about conversational design, to integrate these systems with broader student and staff experience strategies, and to iterate based on actual usage patterns rather than vendor promises.

These capabilities—design thinking for conversation, content strategy, cross-functional collaboration, continuous improvement processes—are what distinguish institutions that get value from Conversational AI from those that just have chatbots.


The hard facts and our emotions

All of this is set in the context of, to use Noreen Malone’s phrase, how "the hard facts of the economy interact with our emotions."

The hard economic facts of higher education—increasing competition, changing demographics, evolving student expectations, intense financial pressures—interact with our emotional investment in education.

Conversational AI sits at this intersection. It's both a practical response to operational pressures and an expression of institutional values about accessibility, support, and student success. Getting it right requires attending to both the technical and the emotional dimensions of the challenge.

The question isn't whether Conversational AI will reshape higher education—it already is. The question is whether institutions will develop the capabilities needed to shape that transformation intentionally, or whether they'll find themselves shaped by it instead.

Perhaps we need to think the unthinkable. In the context of Conversational AI, that might mean imagining educational institutions where every interaction—from first inquiry to graduation and beyond—feels personal, contextual, and genuinely helpful.

That's not a technology problem. That's a design challenge.




CDI also offers a free Conversational AI Maturity Assessment tailored specifically for the higher education sector. This tool is a great way to get an overarching read on how developed your institution's Conversational AI capability is.