Then there are those pesky “hallucinations”, where the AI generates moronic errors. They are an intrinsic part of this kind of “connectionist” AI, which is a simple but expensive-to-run word completion engine.
When Google tells us to stay healthy by eating one small rock a day, it’s funny and probably harmless. It is not so funny when you’ve automated your key business decisions using what risks being a compulsive fabricator.
Businesses’ hallucination rate
Anecdotally, businesses are encountering a 20 per cent hallucination rate across many different applications, which is nowhere near good enough. So today’s very best AI generates lots of what we don’t need, but doesn’t automate what we would like automated. It’s a solution looking for a problem.
Ask the Tony Blair Institute, or any of the Government’s own AI advisors, about all this and you’ll find that such thoughts don’t seem to have entered their heads.
Hallucinations did not concern the AI Safety Summit last year and the word does not appear in the Blair Institute’s grandly titled paper Governing in the Age of AI: A New Model to Transform the State. Politicians and think tanks appear to be taking the promises of the carnival barkers – the AI producers and their venture capital backers – at their word.
“Top-level business and technology leaders do fall prey to collective hallucinations and become irrational,” is how Andrew Odlyzko describes it. He’s a professor of mathematics at the University of Minnesota, and a former research scientist in industry, who has studied the phenomenon of technology bubbles.
He puts them down to gullibility, a willingness to believe a fantasy. The fields of economics, psychology, sociology and tech are all prone to these delusions, he argues. Even very clever people are not immune – Isaac Newton lost most of his fortune in the South Sea bubble, a notorious bout of stock market speculation in the 18th century.
Bubbles become self-reinforcing. The technology industry has spawned a sprawling lackey class that promises miracles and berates those who don’t get on board. It takes a brave manager to resist this and an even braver one to suggest that the Emperor may not be wearing any clothes.
While this column does not offer investment advice, I have a small list of areas where I do expect investments in machine learning – the underlying technology powering generative AI – to pay off. These are almost entirely background processes and will supplant existing techniques.
However, the list of fields where there is no plausible business model is far longer. People may love novelty automation, but don’t want to pay for it.
‘AI is wasting all the energy in California’
“I don’t see what point there is to GPT except helping some student fake an exam,” mused Noam Chomsky, the MIT linguist and a veteran critic of AI fantasies, last year.
“It has no scientific contribution. It doesn’t seem to have any engineering contribution. So as far as I can see it’s just wasting all the energy in California.”
Ultimately, we all pay for the gullibility of our elites – bubbles are not harmless.
In his book Irrational Exuberance, the economist Robert Shiller warned: “If we exaggerate the present and future value of the stock market, then as a society we may invest too much in business start-ups and expansions, and too little in infrastructure, education, and other forms of human capital.”
All AI bubbles to date have ended in a “winter”, and the next one may be the chilliest of all.
Telegraph, London