Holding the space: How creative universities should respond to AI

31 March 2026

Russ Crawford

This article was written by Deputy Vice-Chancellor (Interim) Professor Russell Crawford.

Artificial intelligence has exposed a recurring tension at the heart of higher education: the pressure to take a position versus the responsibility to hold uncertainty. 

For creative universities, that pressure is particularly acute. 

Across higher education and society at large, conversations about AI, particularly the generative tools, tend to collapse into two familiar responses: a rush towards moral certainty, or an equally rapid move into practical application.  

In practice, this means we are frequently expected to declare a position for or against this new technology based on limited literature, empirical evidence or longitudinal lived experience. 

This instinct is understandable. Although AI has existed in research contexts for decades, its public arrival has been rapid and disruptive. We have seen similar moments and instincts during previous technological advancement, where urgency outpaced understanding. 

As a creative arts institution, Falmouth has been on the front line of conversations about AI. A new Masters, AI in Creative Practice, was seen as deeply problematic by some, who were concerned about the impact of generative AI, while students have petitioned for the University to ban the technology entirely. Academic colleagues also embody a variety of perspectives. 

My concern is simple. Universities are not advocacy organisations or resistance movements.  

Historically, and legally, we exist to hold space for debate, uncertainty and academic disagreement. Our purpose is to give our colleagues and students the space and time to explore the new and the complex, particularly at moments when cultural, ethical and social implications have not yet been worked through. 

The risk of rushing to certainty

It is precisely the absence of a single “correct position” on AI that should be firing our collective academic and creative imaginations. Without academic miles on the clock, educational rigour risks collapsing into group-think or, worse, no-think, stripping away critical engagement with a technology as transformative as AI. 

Ultimately neither advocacy nor resistance serves students well, even acknowledging they themselves may exert pressure in either direction. 

For institutions rooted in creative practice, this pressure to stabilise uncertainty also cuts against how creativity itself operates. 

There remains relatively little longitudinal data on AI’s impact on the creative sector, and even less insight on how creative practitioners are adapting in practice.  

Much of the public conversation around AI focuses on efficiency, optimisation and output, which sits uneasily with creative practice. If we frame our response around how consumer AI tools seek to optimise creativity, we risk fundamentally misunderstanding what creativity is, both within higher education and beyond it. 

Creative disciplines have historically been most resilient when exploratory rather than solution-driven. They value qualities such as ambiguity and tension, qualities that also sit at the heart of contemporary debates about AI. 

A distinct role for creative universities

Creative specialist universities occupy a position distinct from many of our colleagues in the sciences, engineering and other disciplines. Our role is not to immediately solve problems, rule things in or out, or stabilise uncertainty too quickly, but to work within tension in order to question and explore. 

From a learning and teaching perspective, this means resisting the urge to train staff and students in specific tools that will quickly date. Instead, it means equipping them with frameworks to critically understand AI, and the skills to harness, challenge or subvert it. 

In research, the same logic applies. Work in Falmouth’s Centre for Blended Realities begins with people and experience, not applications or systems, placing human and cultural context before technological capability and asking different questions about impact and value. 

It is not only possible but healthy for creative universities to hold multiple, sometimes contradictory positions on AI and its applications. Some disciplines have engaged deeply with AI for years, while others choose to engage differently. This should not be seen as a problem to resolve or a party line to impose. 

At the core, higher education has a responsibility to ensure all students leave with a baseline of AI awareness, literacy and critical thinking capability. This is ultimately less about AI itself and more about equipping learners with the confidence and capacity to question it, whatever context they encounter beyond university. For those in creative disciplines, this is no less the case.  

At a moment when AI is increasingly commodified and public debate pushes towards speed and certainty, universities have a different responsibility: to slow things down and foster thinking that helps society catch up with technology. 

This is not hesitation. It is meaning-making in action and that is, in itself, a creative act. 

 

Further reading

Read more from William Huber, who designed the Artificial Intelligence for Creative Practice MA.