AI revolution requires new teaching plan

'What counts as responsible use of AI?'
The rise of generative AI tools in education is raising questions such as whether being able to write an essay is still an important skill worth teaching. Photo: Sven Menschel


From literature studies to Bachelor’s theses, generative AI tools like ChatGPT are becoming capable of performing increasingly complex tasks. Many students and staff are unclear about how and when they are allowed to make use of AI tools. The Education & Student Affairs department is now working on a plan to clarify things.

‘There is a new AI tool from OpenAI called Deep Research. You give it an instruction such as “Write a Bachelor’s thesis on this topic”. The tool then asks you a couple of questions, after which it writes a thesis for you in no time. An amazingly good one in some fields of research.’
Tijmen Kerstens, a member of the ‘AI in Education’ working group in the Teaching & Learning Centre, advises teachers on the use of AI in their teaching. ‘Deep Research doesn’t yet function perfectly, but students can still outsource a lot of the writing and thinking.’

Many teachers assume AI is a parrot that can’t generate anything new; they need to change their views

‘The increasing diversity of AI tools is making it ever harder to determine whether a text was written by a human being. That is putting pressure on assessment methods like essays, literature studies and Bachelor’s theses.’ Kerstens advises teachers to change their assignments in such a way that using AI is either pointless or of educational benefit in its own right. Learning objectives should also be revised. ‘If a learning objective is based on a skill that AI could replace in its entirety, you need to consider whether that objective is still relevant.’

Is being able to write an essay still an important skill in a world with generative AI? Nelleke Lafeber, education policy advisor at Education & Student Affairs (ESA), has her doubts. Within ESA, she is working on a plan for AI and education.
That plan, with the working title AI in Education: Towards an Integrated Approach, is partly about the learning objectives and assessments mentioned by Kerstens. But it is also about clear guidelines, support and training for teachers, as well as about what IT facilities are needed, such as access to paid versions of AI tools. ‘The plan has two aims,’ says Lafeber. ‘We want to educate students so they have the right knowledge and skills for their future field of work. And we want to set up the teaching in such a way that AI has added value and actually improves the education.’

Not a parrot

A recent survey study by Omid Noroozi (Education and Learning Sciences) on the use and perception of generative AI within WUR showed that many students and staff don’t know what is allowed and what is not when it comes to AI in education. Noroozi was surprised by the results. ‘As far as I’m concerned, the institutional policy is clear: the teacher decides. I always hand out instructions about the use of generative AI at the start of any course I give. In those instructions, I explain how students can use AI to maximize the educational value.’
Not all teachers provide that clarity, though: some aren’t aware they are expected to, while others simply lack the expertise or time. Kerstens: ‘There are still a lot of teachers who think: hmm, we’ll see. Or who assume AI is a kind of parrot that can’t generate anything new. They really need to change their views. This technology is hugely disruptive. You can compare it to the introduction of the internet and all the innovations that brought.’

Noroozi thinks it would be a good idea to offer teachers standardized workshops on AI. ‘We also need a platform where teachers can learn from one another what works and what doesn’t. Generative AI is constantly being developed further, so we need ongoing discussions with one another on how to use it in education.’
Since January, the Teaching & Learning Centre has been giving weekly generative AI workshops for thesis students. Kerstens: ‘This isn’t really our responsibility, as we are supposed to focus on supporting the teachers. But teachers often don’t get round to tackling this properly, so for now we are doing it. So far, 263 students have taken part in these optional workshops. Once AI skills are better integrated into the regular courses, we’ll phase out the workshops.’

Grey area

One year ago, Kerstens chaired a conference on the question of how higher education should deal with generative AI. Twelve months later, students and teachers are still unclear about many things, but distinct progress has nonetheless been made, he says. ‘We now have a policy: the teacher decides.’ Communication clearly needs to improve, though, to make sure students and teachers are aware of that policy. Further details are needed too, for example answers to questions such as ‘Should using AI for spelling and grammar be documented?’ and ‘Should WUR keep a list of permitted AI tools?’
Lafeber also wants to get on with fleshing out the AI policy. ‘But we also need to be able to respond flexibly to developments. The policy shouldn’t be set in stone.’

Kerstens and Lafeber also urge the university Board to come up with a clear vision on the role of AI more broadly within WUR. ‘We see great initiatives from teachers and students who are experimenting with AI,’ says Lafeber. ‘Now is the time to state clearly how WUR as an institution views this development. What counts as responsible use of AI? How can we seize the opportunities offered by AI? There is a big grey area where we need more clarity. Once you provide that clarity, you create room to make proper use of AI’s capabilities.’

The plan AI in Education: Towards an Integrated Approach being drawn up by ESA will be ready in the spring. In the meantime, students and teachers can consult the AI support pages for information on reliable tools, manuals and security aspects.
