NewsBite

AI poses a dilemma for universities over what to teach

Should universities stop teaching students how to do things which are likely to be offloaded to artificial intelligence? Photo: Johannes Simon

It’s pretty amazing to play with the flurry of new generative AI tools readily available online. Want a summary of any topic under the sun? Done. Want to compare apples and oranges? Easy. Want an analysis of a lengthy, complex report? Click-click-boom!

If I were a university student, I would be quickly getting my head around the power of these tools and how they can help me in my study.

If I were a university educator, I’d be thinking carefully about what learning and assessment activities I am asking students to engage in, and how I can harness generative AI tools in the classroom. I would also cast back to the introduction of the calculator in maths education and how it changed the ways teachers taught and how students engaged in learning.

With the rise of generative AI tools and their adoption in education, our university students will clearly need to develop new skills in using these tools effectively and in generative AI “prompting”.

As “prompt engineers” students will need to become adept in creating and refining requests of generative AI tools to deliver useful results.

But as the generative AI tsunami continues to wash over us, it is worth pausing to consider how the use of these tools might undermine students’ learning processes and outcomes, even if they become sophisticated prompt engineers.

The fancy term used to describe students’ use of external tools in their learning is cognitive offloading. As with the maths calculator, the idea has been around for decades, and it basically means getting something else to support your thinking, or to do the thinking for you.

For any given educational task, students’ cognitive offloading to generative AI tools may be significant in both quantity and quality and may, therefore, represent a significant omission from current or ‘established’ learning processes.

If our current processes of learning are significantly disrupted by cognitive offloading to generative AI tools, students’ development of learning skills on the one hand, and their understanding of the concepts being taught on the other, may very well be diminished.

If students get generative AI to ‘do more of the work’ of generating, sifting, comparing, and compiling ideas and material, this may undermine their development of skills educators refer to as ‘learning’ or ‘study’ skills. These skills are, of course, not restricted to when students are studying, they are essential skills for a life of work and learning.

Students’ active, cognitive engagement in processes such as ‘compare and contrast’, ‘analyse’, and ‘summarise’, and their regular and deliberate practice of these as part of learning, is intrinsic to their development of these skills over time.

These learning processes are also very important to students’ learning outcomes; their deep understanding of discipline-based ideas, concepts and principles.

By not offloading and doing the work themselves, students engage in cognitive strategies such as organisation, elaboration and self-regulation that allow them – demand of them really – to think through the material they are studying, leading to a richer understanding of it. This often takes time and is sometimes challenging. Genuinely learning something new is rarely effortless; it is effortful.

If a fair bit of this hard work is offloaded to a generative AI tool then the development of a deep understanding of the discipline-based material is potentially undermined.

Even if students become adept in prompting generative AI tools and use them ‘well’, they may start to sidestep learning processes that support the development of essential thinking skills and the learning outcomes in their chosen area of study.

Broadly speaking, if we assume that we want to maintain a commitment to the learning outcomes we have asked students to achieve in the past then we need to think very carefully about how the use of generative AI tools might undermine these outcomes.

That is, we want to be confident that university graduates of the near future, who are expert prompt engineers, are still able to understand the fundamentals of engineering or biology or politics, and are able to both critically interrogate and apply these fundamentals, and are able to communicate them.

Perhaps educators will be prepared to forego some of these previously established learning outcomes. Some types of knowledge and skills may no longer be required if they can be offloaded to generative AI or easily prompted. If this is the case then we need to think carefully and specifically about what these are, and be explicit about this in our curriculum and with our students.

And of course, our industry partners and accrediting bodies will be pretty interested in decisions we make about any areas of knowledge that have typically been expected of students, and now might be easily offloaded to generative AI tools.

New generative AI tools may well be akin to other educational technology evolutions and revolutions we have seen before; time will tell. But I am not so sure comparisons to the maths calculator or the printing press quite fit here: in those cases, less cognitive offloading was in play, and the processes being offloaded were more operational.

Becoming an excellent prompt engineer will become an essential capability for university students and will require honed skills.

But let’s not be duped into thinking the skills involved in becoming an excellent prompt engineer are the same creative, critical and analytical thinking skills that students develop after fully engaging in their university studies.

Gregor Kennedy is deputy vice-chancellor (academic) at the University of Melbourne and a professor of higher education in the Centre for the Study of Higher Education. His research has focused on educational technology, the learning sciences and learning analytics.


Original URL: https://www.theaustralian.com.au/higher-education/ai-could-make-some-types-of-knowledge-and-skills-obsolete/news-story/4328a6fd05d6cbe5104c05c959145c67