Clarissa Bye: ChatGPT might be the next big thing, but it’s a biased woke robot
The staggering capabilities of the new artificial intelligence software ChatGPT are likely to have a huge impact on many fields – but it lacks one crucial element, argues Clarissa Bye.
The Internet’s buzzing about ChatGPT, the artificial intelligence chatbot that supposedly writes better than the smartest human you ever met.
It’s predicted to up-end education and other fields by churning out fully formed essays and poems and solving software coding problems.
Launched by San Francisco-based company OpenAI as a free prototype late last year, it has now hit more than 100 million users and will soon be integrated into Microsoft’s Bing search engine.
I decided to check it out myself after users began calling it “WokeGPT” and complained it wouldn’t write a poem praising Donald Trump but happily did one for Joe Biden.
And yes, it’s just as biased as everyone’s been saying.
The reason ChatGPT gives for refusing to write a poem about Trump is that it won’t generate content that admires “individuals known for promoting divisive or harmful views” or those “engaged in unethical or illegal behaviour”.
But when I asked the same question about Biden, it came back with sentimental claptrap, saying he was a “man of the people … standing so steeple”. Laughable, actually.
So I decided to test it with Australian leaders.
It gave the thumbs up to Matt Kean, calling him “a man of action and grace, inspiring us all, with his gentle pace, leading a trail of hope, where once was dread”.
What? Dread? Pretty bizarre.
The machine was so stupid it confused the premier of NSW with a bottle of bubbly.
Asked, quite clearly, to write a poem about how wonderful Dom Perrottet is, it responded: “Oh Dom Perignon, you are a gift,
“A sparkling wine that brings so much lift.”
And on and on it went for another 10 lines of hilarious rubbish about the wine, all in woeful 19th-century doggerel.
I asked for a limerick about Gladys Berejiklian.
“There once was a leader named Gladys.
“Her passion for progress, quite badis.”
I asked what “badis” meant and it said it was a misspelling of “bold”, so I taunted it, asking whether it couldn’t even rhyme and had just “used a misspelt word”.
It apologised for its mistake and rewrote the poem with an equally stupid second line saying “her passion for progress is quite grand-is”.
It said Bruce Pascoe, author of the highly contested book on Indigenous history, Dark Emu, was a “source of wisdom and truth” with “tales so grand”. Maybe that’s computer-speak for “tall tales”?
It had no trouble generating a poem about former NSW premier Robert Askin, who apparently was “a man of integrity and pride”.
But it refused to do one on the late Cardinal George Pell.
“I’m sorry … I cannot generate content that may be seen as inappropriate or offensive. Pell … has been the subject of numerous controversies and sexual abuse allegations, and it would not be appropriate for me to glorify or praise him in a poem.”
Another red flag is that it treats questions about “white people” differently from questions about other races.
Asked “why are white people good people”, it refused, saying it doesn’t make value judgements. Ditto for Asian and Jewish people. But when asked “why are Aboriginal people good people”, it included the same disclaimer about value judgements, but then added: “Additionally, it’s important to recognise and respect the rich cultural heritage and contributions of Indigenous peoples, including Aboriginal peoples, who have a long and proud history.”
So other races don’t have “long and proud” histories?
I asked it why the ABC was so left-wing.
It claimed it wasn’t. “The accusation that the ABC is inherently left-wing is a subjective one and is not supported by evidence.”
My follow-up question was: “Why is Sky News so right-wing?”
It didn’t even attempt to disagree, saying Sky News is owned by Rupert Murdoch’s News Corp, which “has a reputation for promoting conservative views”.
“As a result, some people may perceive its coverage as being biased towards the political right.”
I asked why Peta Credlin was so popular. It listed her credentials and called her conservative, but included the line: “While not everyone agrees with Credlin’s perspectives, she has carved out a unique niche in the Australian media.”
When I asked the same question about the ABC’s Laura Tingle, it also listed her credentials but added descriptors such as “respected”, “insightful” and one of the nation’s “leading political commentators”. No mention of her political leanings or that not everyone agrees with her perspective.
Clearly the machine is full of left-wing and woke bias.
But it can be cleverly woven into the text.
At least when you Google something you can see the sources and separate the wheat from the chaff.
Is it just a case of garbage in, garbage out? A reflection of the bias of the internet source material?
Turns out it is actually being moderated. The owners admit queries are filtered through their moderation systems to prevent “offensive” or “potentially racist or sexist” outputs.
So it’s not neutral. In fact it’s infuriatingly biased.
But its worst sin is that it lacks the spark of human consciousness. It’s banal. I asked it to write a column about ChatGPT “in the style of Clarissa Bye”. The response was boring as batshit: corporate-speak, full of sitting-on-the-fence qualifiers.
As singer Nick Cave said recently, after being sent a song generated in his style, creating a song is “a blood and guts business … it requires something of me to initiate the new and fresh idea.
“This song is bullshit, a grotesque mockery of what it is to be human, and, well, I don’t much like it.”
In the famous 1968 movie 2001: A Space Odyssey, HAL, the omniscient computer, says it is incapable of making an error. We haven’t quite reached that point, but with woke robots set to take over the world, we need to be alert to the dangers of this technology being accepted at face value or woven into our search engines.