How can AI be used in education?

March 18, 2024

Artificial intelligence (AI) is here to stay in education. Plenty of students are already experimenting with AI for their assignments. Teachers are struggling with the question of how to use the technology in a supportive and constructive way that leaves room for human values. In short, how should we deal with AI in higher education?

This interview previously appeared in Resource.

This question was at the heart of this year’s edition of the annual ICAB conference for innovation centres in academic science education. In early March, 165 people from higher education institutions gathered in Omnia to look for answers and share their expertise. ‘Don’t teach general AI skills as a separate module. That will only bore students.’

‘What do you want to write today?’ asks Jenni.
‘I have to do a literature study on materials for transparent electrodes,’ says Andy.
‘Good assignment,’ says Jenni, and she starts writing.
After a few sentences have appeared on the screen, she asks: ‘What do you think of this?’
After she gets the OK, Jenni continues writing and in no time she has produced a complete literature review.

Jenni is not a person; ‘she’ is an AI tool developed specifically to generate academic texts. And it works, says YouTuber Andy Stapleton, an academic who tries out AI tools, in his video. ‘It keeps on building up the literature review without you even having to think about it! Of course I need to check it, but I don’t think it could be any easier.’

The video described above serves as the start of the talk given by Laura Koenders, education consultant at Utrecht University. Her talk is about the impact of generative AI tools (tools that can generate their own content) on testing. ‘This is the level of AI capability our students can use now,’ she says. That raises questions among the attendees. Because if students can already let AI do most of the work for writing assignments such as essays and literature studies, what is the point of such assignments? How do you know what the students did themselves and what they used AI to generate? How can you set up your teaching and testing in such a way that AI is used as an aid but is not misused? And how do you avoid a situation where rich students get better grades because they can access the premium versions of such tools, which tend to be better?

Redesign

Koenders helps teachers redesign courses and degree programmes. ‘In the past year, nearly all questions were about ChatGPT,’ she says. ‘If you are here to get some concrete answers, you are going to be disappointed because I don’t have them yet.’

She is able to share some insights, though. ‘If you want students to learn how to write academic texts, test that in a controlled environment, for example with a writing assignment they complete in the classroom. If the emphasis is on critical reflection, you can have an oral exam instead of a written assessment. And if you actually want students to use AI tools for a written assignment, set clear rules. And make them reflect on the use of the tool. Ask students what prompts (the instructions or hints you give the AI tool, ed.) they used for ChatGPT and how they think using the tool made their essay better.’

Teachers also need to ask themselves questions if they want to use AI effectively in their teaching, says Koenders: ‘What is the learning objective? Analysing the literature? Learning to reflect critically? Writing? And is being able to write well a skill we will still find valuable in 10 years’ time?’

Sjoerd van Gurp, a teacher and education developer at Avans University of Applied Sciences, argues that higher education cannot ignore the use of artificial intelligence. He encourages teachers to reflect systematically on its use in education. ‘It starts with assessing the impact of the technology: what are the positive and negative effects on teaching? In the case of positive effects: support its use and see what students need to be able to benefit from AI. In the case of negative effects: adapt the learning activity so as to minimise those negative effects.’ You could offer completely different learning activities, for example. Another option is to discuss the negative effects with the students beforehand so they understand why you should not use AI for absolutely everything.

Three tools to play with:

There are a lot of generative AI tools that can help students, teachers and researchers. Here are three tools you can try out.

Elicit: a good tool for finding specific data in scientific articles and then automatically summarising that information.

Jenni: can take over the academic writing work to a large extent, including citations.

ChatGPT: can summarise texts, generate practice exam questions, restructure sentences, translate, explain complex concepts and help with brainstorming.

AI skills

Another recurring question at the conference is what skills students should learn now for their jobs in the future. It is clear AI literacy is one such skill. But how can you teach that? In his keynote speech, WUR professor Ioannis Athanasiadis (AI and Data Science) advocates designing lessons in AI skills that are geared to the specific degree subject. ‘AI can help resolve complex problems such as detecting skin cancer or solving difficult maths questions. But you won’t get those solutions just by being skilled in the application of AI; you need to combine that with knowledge of the field.’

In teaching too, that combination needs to be the focus, says Athanasiadis. There will soon come a time when you won’t get far if you are an expert in the field but don’t have AI skills, or if you have AI skills but lack the specific expertise. ‘It is all about the combination, and that is our added value as a university.’

If you deal with artificial intelligence in a separate course and in abstract terms, the students will get bored

Willem-Paul Brinkman, associate professor in Interactive Intelligence and programme director for Computer Science at Delft University of Technology, continues on this theme with his talk on AI-ready Curricula. ‘Students choose to study a particular subject. If you deal with artificial intelligence in a separate course and in abstract terms, the students will get bored. They didn’t come to your university to attend lectures on AI.’ So start by teaching the basics of that degree subject, for example entomology, and then use practical examples from that field in your AI teaching. This means AI education needs to be more than an elective minor or a compulsory course on AI skills that is separate from the degree subject.

Brinkman sees AI education as something that should be incorporated in various courses at all stages of the curriculum and that should be taught by staff who are experts in the degree subject. So the course ‘AI for entomologists’ would be taught by an entomologist, not by an AI expert. ‘But if you are going to do this, you need to figure out how you are going to set it up in the curriculum.’

To incorporate subject-specific AI skills in the curriculum, Brinkman has formulated five principles (see inset). He ends with a warning. ‘Don’t jump in at the deep end with AI: if you let generative AI take over too much of the subject-specific knowledge, you won’t have any experts in that field left.’

Calculator

Brinkman is not the only one to mention this risk during the conference. ‘Basic skills are important,’ says Cynthia Liem, associate professor in Artificial Intelligence at Delft, in her keynote speech. ‘We teach children to do sums before we give them a calculator.’ She calls for renewed appreciation of the inherent value in struggling to get good at something. Liem herself trained as a concert pianist. ‘It sometimes seems as if only the worlds of music and sport still have that painful learning process. In other areas, we increasingly opt for convenience, for example by letting AI do all the hard work. But at what expense? Do we still understand what we are doing when we let a tool do everything?’

Who is going to invest time and effort in learning a trade?

She points to the video game industry. ‘A lot of people in junior positions are getting laid off now because they do work that can easily be replaced by AI tools. But the experts in the industry had to go through a tough learning process themselves to get good at what they do. They invested time and effort in learning their trade. Who is going to do that now? If we only look at what AI tools can do in the short term and not at what they will cost us in the long term, we will end up without any experts.’

Figuring it out

The attendees at the conference discuss many other possibilities of generative AI (such as using tools to help compile practice exam questions and check tests) and risks (privacy and copyright issues, whether AI detection software is needed or even possible, or the energy consumption of AI tools like ChatGPT). What is clear at any rate is that AI is going full steam ahead. It is a reality higher education will have to deal with. Or as Arthur Mol (still rector at the time of the conference) said in his welcome speech: ‘We are seeing a mere glimpse of what AI is capable of, how we can use it and what we should definitely not use it for. What role do we want to give AI in our education and research? We will be figuring that out for a long while yet.’

Five principles for teaching AI in a specific degree subject:

  1. AI should strengthen the students’ skills and knowledge about the subject, not weaken them. So teach the basics of the subject BEFORE you teach subject-related AI.
  2. Students need to understand how artificial intelligence works and what it can and can’t do before you teach them subject-related AI.
  3. Subject experts should give the subject-related AI courses, not AI experts.
  4. Students also need general AI skills, such as fact-checking and designing prompts (hints and instructions for the AI tool).
  5. Make AI education cohesive. Create a curriculum path.

These tips come from Willem-Paul Brinkman, associate professor in Interactive Intelligence and programme director for Computer Science at Delft University of Technology.