- Story
AI triggers emotions in higher education
23.07.2025 In collaboration with swissuniversities, ZHAW, the University of Education Zurich and the University of Neuchâtel, BFH has explored how artificial intelligence (AI) is transforming academic work. Project leader Elizabeth Steele offers her insights.
Key points at a glance:
- The project reveals how AI is reshaping academic practices in higher education.
- Lecturers are experiencing significant uncertainty – they are concerned about their roles and the quality of student learning processes.
- Clear guidelines, open discussions and support are essential for using AI responsibly and reflectively.
What was the focus of the ‘Digital Literacy in University Contexts’ project?
At its core, the project addressed the question: how should educational institutions respond when students increasingly delegate parts of their work to machines? It investigated the implications of this shift for student work, for assessments and for the role played by lecturers.
The aim was to better understand the emotions surrounding the AI revolution. After all, there was what could be described as panic following the release of ChatGPT, and this deep uncertainty remains palpable in 2025.
How was the project conducted?
In the sub-project ‘Automatic Text Generation’, we conducted 16 focus-group interviews in the summer of 2023 and completed the data analysis by December 2024. Interestingly, everything we heard in the summer of 2023 remains relevant in the summer of 2025.
The interviews showed us that people need more than just information. There is a deep desire to talk about the new technology, as AI triggers many emotions. Lecturers need a space to process these feelings.
Why do lecturers react so strongly to AI?
Lecturers are worried. They believe AI is fundamentally changing how students learn. Before AI, learning involved actively processing information through reading and writing. With the AI shortcut, students are no longer internalising content; instead, they are primarily learning how to operate AI.
Additionally, lecturers face existential concerns. Their roles will need to evolve. Assessments must shift: what once was about testing memorised knowledge now revolves around evaluating what students can do and whether they truly understand what they have produced with AI’s help.
Moreover, lecturers’ knowledge advantage is diminishing. Students are often a step ahead in using AI, meaning lecturers are increasingly having to learn from them.
We must acknowledge that, as lecturers, we know less about AI than our students. That can be challenging. But when you ask students for help, you’re often pleasantly surprised.
What role do lecturers play in the age of AI?
Lecturers remain crucial in determining why and how AI should be used. We must all develop guidelines and ethical frameworks to foster a responsible and critical approach to the new technology.
Initiatives like BFH’s AI policy or its Machine Translation Use in University Contexts Policy are a great start; however, separate AI policies may be necessary for each module, as the questions arising in practice are highly context-dependent.
Students are asking just as much as lecturers: ‘What is allowed, and what isn’t?’ Clear rules are likely needed, and they could differ from one module to the next. Lecturers must transparently outline the potential uses of AI in their courses while critically evaluating its application. It is essential to discuss AI in the classroom. Ignoring it is not a solution.
Lecturers need training and guidelines to navigate this shift. Projects like Kerstin Denecke’s Education 6.0 are a step forward. Yet many lecturers still feel bewildered by the shifting technological landscape.
Why this uncertainty?
For non-experts in particular, the technological shift happened very suddenly and overwhelmed many people. While technology advances rapidly, the human issues and emotional aspects of the digital transformation are often overlooked. As a result, some lecturers lack the confidence to experiment with AI.