Commentary

AI is taking away educators’ role in helping young people know themselves and the power they have to change the world

Many of us are not noticing – or are willing to ignore – AI’s wider impact because of the personal conveniences it affords us.

My mother is about to turn 90. She has just successfully completed two short online courses in Chinese history through HarvardX (and has emailed copies of the certificates to her six children).

What motivated her? She didn’t need it for her career. It was completely disconnected from any practical or financial concerns. So was it for pleasure? No, because she could have just continued to enjoy the various history documentaries she watches and books she reads. “It was very difficult,” she says.

So why did she put herself through it?

“Because I wanted to keep my mind active and needed to be sure that what I was learning was true, based on the knowledge of the academic experts who chose the readings and ran the course, not just opinion and propaganda.”

In other words, she was driven by what Canadian Jesuit philosopher Bernard Lonergan has described as “the eros of the human spirit, our pure and unrestricted desire to know”.

While academic expertise and opinion and propaganda are not always and everywhere mutually exclusive – the boundary between evidence and ideology is sometimes more porous than we would like to imagine – my mother needed to be confident that she would finish the course with correct understanding, and this confidence would flow from it being measured against an external standard set by authoritative specialists.

It goes without saying that she did not use ChatGPT to help her write any of her assignments. There was no reason for her to plagiarise its output in any way. To have done so would have completely undermined her intrinsic motivation, namely deepening her own understanding.

But for school or university students, the situation is not so straightforward. For many, if not most of them, the award of the credential is a more powerful, extrinsic motive, more important than the underlying understanding for which it purports to provide evidence. This is a function of the way our education system has evolved into an instrument at the service of the economy, a sorting mechanism and a tool for training that prepares people for work.

If cheating using artificial intelligence improves one’s chances of gaining the credential but reduces the actual depth of understanding, then many may make the calculation that the credential is more important and act accordingly.

But there are much bigger problems. First, in the age of AI, what employment opportunities will there be for these graduates once they gain these credentials? Many of the jobs in the so-called knowledge economy will be done much more effectively and cheaply by AI systems. All those people studying and working in data analytics across multiple industry sectors should be worried.

When Geoffrey Hinton quit his job at Google last month, he said many of these jobs would disappear. On cue, the share price of Chegg, the US education technology company that produces homework study guides, plummeted by 50 per cent overnight on fears about competition from ChatGPT. What happens to Chegg’s “knowledge workers”?

Artificial intelligence pioneer Geoffrey Hinton.

Perhaps even more worrying, Hinton said he was concerned that developments in artificial intelligence – with its ability to produce “deep fakes” – meant we were entering a world in which we would “not be able to know what’s true any more”, a concern recently echoed by Australia’s Human Rights Commissioner, Lorraine Finlay.

So the ChatGPT phenomenon, in addition to raising questions of academic integrity, is raising important questions about how we govern technological change, about how we warrant the accuracy and reliability of sources of information, about the nature of truth and knowledge, and about humanity’s self-perception as the smartest species on the planet.

And so it also raises questions about the wider purposes of education – the why, not just the what (curriculum) or the how (pedagogy). What are these wider purposes?

The earliest known curriculum document was a two-word inscription on the Temple of Apollo at Delphi in Greece. It read simply “Know yourself”. Knowing yourself and examining your life in a systematic and fundamentally honest way, so as to become wise – this is the most profound outcome of a successful education.

You won’t find the answer to that on ChatGPT.

Once called “character education”, this notion tended to carry the sense of compliance with a set of extrinsic and perhaps socially shaped behaviours. But developing in our students a commitment to thoughtful, honest, purposeful human agency, respectful of others and embracing the common concerns of their communities – this is indeed the wider objective of those who call themselves educators: to help young people come to know themselves and the power they have to change the world.

If AI is going to do our jobs so much more cheaply and effectively, what does that mean for the way we think about education’s purpose? Picture: iStock

It used to be that to describe someone as educated meant that their innate talents, interests and desires had been cultivated, enhanced, refined, deepened, broadened and developed with the help of elders and experts in a variety of disciplines who understood that learning was as much about attitudes as aptitudes. Now the term education connotes in the popular mind something much thinner – a process of acquiring skills and knowledge that will make us employable.

But if AI is going to do our jobs so much more cheaply and effectively, what does that mean for the way we think about education’s purpose? Perhaps we need to return to that earlier meaning of being educated – the meaning embodied by my mother’s efforts – because the more utilitarian rationale is being seriously undermined in the age of AI.

The problem with many of the current conversations about AI and education is that they tend to focus on questions such as how we can ensure academic integrity and how we can use AI in the classroom to improve student learning.

While these are perfectly reasonable and important questions to ask, they are bounded by a very short time-horizon, and they frame AI as simply the latest in a long list of technologies – like paper replacing slate and calculators replacing logarithm books – that can be integrated across time into the existing paradigm of schooling and its employment-related purpose.

AI is being framed as the latest in a long list of technologies – like paper replacing slate and calculators replacing logarithm books. But there is more to it than that. Picture: Richard A. Brooks/AFP

It’s akin to asking: “How do we harness this powerful new stallion to our existing educational buggy?” A better way to consider the situation is: “How do we surf this tsunami?” But to put the issue into sharper focus, it’s arguably more akin to the passengers on the Titanic asking: “Shall we put the deckchairs here or there?”

AI is an iceberg that is going to sink the current schooling paradigm because it is disrupting the society for which schooling is supposed to prepare our children.

Many, if not most, of us are not noticing – à la the boiled frog – or we are willing to ignore AI’s wider impact because of the personal conveniences it affords us. The change AI most resembles is the invention of the printing press. The medieval church did not rejoice in the democratising of knowledge that it allowed. It didn’t delight in the fact that monks would no longer need to sit at their desks all day transcribing the ancient texts that represented centuries of traditional and fixed knowledge over which the church exercised a monopoly.

The feudal power-wielders were very much alive to the fact knowledge was power, much more alive to the printing press’s social, economic, cultural, religious, scientific and political significance than we appear to be when it comes to the impact of AI.


Where the analogy breaks down is that AI is reversing the democratising dynamic of the printing press.

It is being driven by a very small number of global mega-corporations that are sucking up the personal data we are handing to them on a plate and selling our increasingly shortened attention spans via their worldwide network platforms. Those companies are racing one another for AI dominance and leaving governments and regulatory agencies well behind them. Their commercial motives are not aligned with the interests of democracy, which relies on open debate and free exchange of ideas about factual situations.

The global network platforms are designed to give us more of what we want – feeding our own biases and prejudices – not what we need: objective, reliable facts.

So what is the new paradigm of education that will ensure that the economic, social and cultural disruption being caused by AI serves humanity rather than enslaves it?

The global network platforms are designed to give us more of what we want – feeding our own biases and prejudices – not what we need: objective, reliable facts. Picture: iStock

During the next 18 months education ministers will consider a new version of the National School Reform Agreement, which aims to give effect to the Alice Springs (Mparntwe) Education Declaration signed in December 2019.

That statement says our education system should promote both excellence and equity, and that it should produce young people who are “successful lifelong learners, confident and creative individuals, and active and informed members of the community”.

Right now there is a particular need to focus on the third of these: active and informed citizens. This is because of the irony that, despite advances in technology such as ChatGPT giving us access to more information than ever, that access has not ushered in a new Age of Enlightenment.

Rather, as Tom Nichols has written in his 2017 book The Death of Expertise, it has “helped fuel a surge in narcissistic and misguided anti-intellectual egalitarianism that has crippled the possibility of informed reasoned debate on all manner of public issues”.

Building on the epistemic relativism encouraged by postmodernism, our age is one in which all opinions are equally valid, and the internet will provide us all with whatever “alternative facts” we need to give our prejudices the veneer of rationality.

There are so many “facts” at our disposal that we feel free simply to choose the ones that fit our prejudices, and not bother asking what, ultimately, is true and good.

While postmodernism has led to many important insights about power and ideology, a damaging aspect of its legacy has been the undermining of the importance of factual truth and cultural reference points. The undermining of the solid ground of reality – the ground on which individuals can base their sense of self and community, and their sense of agency in a world that is knowable, a world in which things can be true or false – represents a serious threat to our democratic society.

The undermining of the solid ground of reality on which individuals can base their own solid sense of self and community and of their own agency in a world that is knowable, a world in which things can be true or false, represents a threat to people’s mental health. Picture: iStock

It also represents a threat to people’s mental health as they confront a world of meaninglessness and find themselves standing on the edge of a bottomless abyss of relativity.

Our sense of personal autonomy and agency comes under further threat when more and more of the decisions that affect our lives are being made by machines.

So schooling, while preparing our children as best it can for the changing world of work, is going to need to focus more on those aspects of humanity that are exclusively human and vitally important for our development as democratic communities.

In addition to reading, writing, numeracy and digital literacy, the other four general capabilities in the Australian Curriculum – ethical understanding, personal and social capability, intercultural understanding, and critical and creative thinking – are going to be more and more important.

But this focus cannot come at the expense of factual knowledge and an emphasis on truth. Rather, it has to come through the teaching of a knowledge-rich curriculum, taught by teachers whose social and cultural role as authoritative sources of information, knowledge and wisdom needs urgent buttressing. Our children are inheriting a dystopian brave new world and need to be equipped with the knowledge, capabilities and attitudes needed to renew and re-humanise that world. How we convert that purpose of schooling into concrete action is the challenge for all of us.

David de Carvalho.

In a few weekends, four generations of my family will celebrate my mother’s 90 years of life, an occasion for reflection as well as for joy. Of her many virtues, perhaps the one I will celebrate the most is her unquenchable curiosity about important things. The desire and ability to keep asking the big questions about life are what define us as human beings. We need to ensure our schooling system keeps fanning the flames of our very human wonder.

David de Carvalho is chief executive of the Australian Curriculum, Assessment and Reporting Authority.

