Opinion
The AI bubble might blow up in our face
By Andrew Orlowski
Apple briefly became the most valuable company in the world again last week, but this is no surprise. The stock market is rewarding everyone who says they have added some AI magic to their business – and punishing anyone who hasn’t.
After Apple announced some very modest AI additions to its platforms, its share price rocketed, despite there being no solid evidence that the new features would boost sales.
“At the minute AI is a very expensive, non-profit-generating PR expense,” noted one sceptical investment analyst.
These are very frothy times. How frothy? Professor Robert Shiller, a Yale economist, devised a useful metric to measure it. Shiller’s PE ratio compares share prices with the average of the past decade’s inflation-adjusted earnings, in essence measuring how much investors are willing to pay for each dollar of profit.
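For the mechanically minded, here is a minimal sketch of that calculation. The index level and earnings figures below are illustrative placeholders, not Shiller’s actual data:

```python
# Minimal sketch of the Shiller (cyclically adjusted) PE ratio:
# today's index level divided by the average of the past ten years of
# inflation-adjusted annual earnings. All numbers here are illustrative.

def shiller_pe(index_level, real_earnings_last_10_years):
    average_real_earnings = sum(real_earnings_last_10_years) / len(real_earnings_last_10_years)
    return index_level / average_real_earnings

# An index at 5,000 points with average real earnings of 150 per share
# gives a ratio of roughly 33, a high reading by historical standards.
print(shiller_pe(5000, [130, 135, 140, 145, 150, 150, 155, 160, 165, 170]))
```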
This measure shows that US markets have been as bubbly as they are now only three times in the past 150 years. The last high was in August 2021, as the world emerged from Covid lockdowns and the meme stock craze was in full swing. Before that was 1999, shortly before the dotcom bubble collapsed. And before that, it was 1929.
Today’s bubble is based on a confident belief that AI will automate lots of tasks that humans perform, increasing productivity and therefore profits. But this belief so far isn’t evident in the underlying numbers. In fact, there are some ominous warnings for anyone who cares to look more closely.
Early-stage investors, who tend to study a business’s underlying technology, are cooling on AI. So-called “seed” investment is down 76 per cent from its peak last autumn. Funders are “chiefly concerned about how questions around profitability will shake out”, Pitchbook reports.
Elsewhere, after a flurry of initial excitement, only 2 per cent of Britons are now using ChatGPT daily, a Reuters-Oxford study has found. The authors identified a “mismatch” between the hype and genuine public interest.
Then there are those pesky “hallucinations”, where the AI generates moronic errors. They are an intrinsic part of this kind of “connectionist” AI, which is at heart a simple but expensive-to-run word-completion engine.
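To see why such errors are baked in rather than a bug awaiting a patch, consider a toy caricature of the underlying idea (an illustrative sketch, not any real model): the system only ever asks which word plausibly comes next, never whether the result is true.

```python
# Toy caricature of a "word completion engine" (not any real model).
# Each word is sampled from whatever followed the previous word in the
# training text, so the output reads fluently but carries no notion of
# truth, which is why fabrication is intrinsic rather than a glitch.
import random
from collections import defaultdict

training_text = "the cat sat on the mat and the dog sat on the rug".split()

followers = defaultdict(list)
for prev, nxt in zip(training_text, training_text[1:]):
    followers[prev].append(nxt)

word = "the"
output = [word]
for _ in range(8):
    choices = followers[word]
    word = random.choice(choices) if choices else "the"
    output.append(word)

print(" ".join(output))
```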
When Google tells us to stay healthy by eating one small rock a day, it’s funny and probably harmless. It is not so funny when you’ve automated your key business decisions using what risks being a compulsive fabricator.
Businesses’ hallucination rate
Anecdotally, businesses are encountering a 20 per cent hallucination rate across many different applications, which is nowhere near good enough. So today’s very best AI generates lots of what we don’t need, but doesn’t automate what we would like automated. It’s a solution looking for a problem.
Ask the Tony Blair Institute, or any of the Government’s own AI advisors, about all this and you’ll find that such thoughts don’t seem to have entered their heads.
Hallucinations did not concern the AI Safety Summit last year, and the word does not appear in the Blair Institute’s grandly titled paper Governing in the Age of AI: A New Model to Transform the State. Politicians and think tanks appear to be taking the promises of the carnival barkers – the AI producers and their venture capital backers – at their word.
“Top-level business and technology leaders do fall prey to collective hallucinations and become irrational,” is how Andrew Odlyzko describes it. He is a professor of mathematics at the University of Minnesota and a former industrial research scientist who has studied the phenomenon of technology bubbles.
He puts them down to gullibility, a willingness to believe a fantasy. The fields of economics, psychology, sociology and tech are all prone to these delusions, he argues. Even very clever people are not immune – Isaac Newton lost most of his fortune in the South Sea bubble, a notorious bout of stock market speculation in the 18th century.
Bubbles become self-reinforcing. The technology industry has spawned a sprawling lackey class that promises miracles and berates those who don’t get on board. It takes a brave manager to resist this and an even braver one to suggest that the Emperor may not be wearing any clothes.
While this column does not offer investment advice, I have a small list of areas where I do expect investments in machine learning – the underlying technology powering generative AI – to pay off. These are almost entirely background processes and will supplant existing techniques.
However, the list of fields where there is no plausible business model is far longer. People may love novelty automation, but they don’t want to pay for it.
‘AI is wasting all the energy in California’
“I don’t see what point there is to GPT except helping some student fake an exam,” mused Noam Chomsky, the MIT linguist and a veteran critic of AI fantasies, last year.
“It has no scientific contribution. It doesn’t seem to have any engineering contribution. So as far as I can see it’s just wasting all the energy in California.”
Ultimately, we all pay for the gullibility of our elites – bubbles are not harmless.
In his book Irrational Exuberance, Professor Shiller warned: “If we exaggerate the present and future value of the stock market, then as a society we may invest too much in business start-ups and expansions, and too little in infrastructure, education, and other forms of human capital.”
All AI bubbles to date have ended in a “winter”, and the next one may be the chilliest of all.
Telegraph, London