AI monster is leaving dungeon
When I studied computer science in the 70s, artificial intelligence was crude. Now it’s set to be in our lives every day.
When I studied computer science in the early 1970s, artificial intelligence was on the syllabus. It was crude. As students we programmed a computer to play a binary guessing game. Was the object blue: yes or no? Square: yes or no? Lying flat: yes or no? Eventually, by elimination, we’d arrive at an answer.
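To give a flavour of how simple those programs were, here is a minimal sketch in modern Python of that kind of yes/no guessing game. The questions, objects and tree are invented for illustration, not taken from the original coursework; the student program would have been similar in spirit, not in detail.

```python
# A minimal sketch of a 1970s-style yes/no guessing game.
# Each yes/no question halves the space of candidate objects.

def guess(tree):
    """Walk a binary decision tree by asking yes/no questions."""
    while isinstance(tree, tuple):  # internal node: (question, yes_branch, no_branch)
        question, yes_branch, no_branch = tree
        answer = input(question + " (y/n): ").strip().lower()
        tree = yes_branch if answer.startswith("y") else no_branch
    return tree  # leaf: the program's final guess

# Hypothetical question tree for illustration.
TREE = ("Is the object blue?",
        ("Is it square?", "a blue tile", "a blue ball"),
        ("Is it lying flat?", "a red mat", "a red cone"))

if __name__ == "__main__":
    print("I guess:", guess(TREE))
```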
For the ensuing decades, AI remained like Frankenstein’s monster: something hideously unfathomable, better left battened down in an underground lab for fear it would destroy the world. Like triffids, machines controlled by AI were a potential pox on humanity.
But now the monster is out of the dungeon and, whether we like it or not, it will be with us every hour of every day, not at some future point but from now on.
AI is no longer just for elites. Remarkable progress in machine learning has increased the power of computers to learn on their own, thanks to today’s incredible computational speeds and the vast amounts of digital data available to crunch. We no longer have to tell machines when they are right or wrong; we don’t have to label the animal in a photo as a dog rather than a cat for the machine to learn the difference.
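To see what learning without labels can look like in practice, here is a minimal sketch using the k-means clustering algorithm from scikit-learn. The data, the feature interpretation and the cluster count are invented for illustration, and k-means is only one of many such techniques:

```python
# A minimal sketch of unsupervised learning: k-means groups points into
# clusters without ever being told what the groups mean. The data here is
# invented; imagine each point as two features measured from a photo.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two loose blobs of points; note we never label either blob "dog" or "cat".
X = np.vstack([rng.normal(loc=[0, 0], scale=0.5, size=(50, 2)),
               rng.normal(loc=[3, 3], scale=0.5, size=(50, 2))])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(model.labels_[:5], model.labels_[-5:])  # the machine's own grouping
```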
And while some say Moore’s Law is reaching the end of the line and we can’t keep shrinking nanometre-scale transistors forever, the exponentially increasing power of computer systems is assured by our ability to link devices across the internet, pooling their computing capacity and what they learn.
Take an automated car collision-avoidance system or a computer voice-recognition system: replicated across the globe in one million locations, it conceivably could learn as much in one day as a single copy in one location would in a million days, just over 2700 years. That’s if we pool the learning, which we can. It may be a crude analogy, but it’s an unfathomable rate of artificial comprehension nonetheless.
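The arithmetic behind that figure is straightforward: a million machine-days of pooled experience, converted to years, is

$$
\frac{1{,}000{,}000\ \text{days}}{365.25\ \text{days per year}} \approx 2738\ \text{years}.
$$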
Interest in artificial intelligence and machine learning was at fever pitch last week when chip maker Nvidia announced the Tesla P100, an AI-specific chip with 15 billion transistors on a single piece of silicon. Its chief executive said it was the largest chip yet made.
Nvidia is showing that it wants to be centre stage in the new AI world. At CES in January it showed off a computer board that car makers can install to offer driverless capabilities without having to build them from scratch. The board links up displays and an array of sensors (radar, lidar, infra-red and cameras), and such systems are close to up and running.
Last week Nvidia also announced it would offer its own AI computer with eight Tesla P100 chips (about 120 billion transistors in all) for $US129,000 ($158,275), the cost of a high-end car. At that price, medium-sized as well as large businesses probably could afford their own AI hardware powerhouse if they are not happy with AI services in the cloud.
Nvidia’s chief executive Jen-Hsun Huang said one of these DGX-1 computers could process in two hours an AI chore that typically would take 150 hours on a standard server, a 75-fold speed-up. That’s a lot of AI firepower for your average suburban legal practice or a hospital processing patient images.
Globally and locally, AI and machine-learning systems are starting to proliferate. The Australian has reported how machine learning will soon decide which delivery van or motorcyclist delivers your home-ordered restaurant food, how machines will decide which journalists receive which press releases based on their writing interests, and how AI bots will help us book hotel rooms and airline tickets in the blink of an eye.
It is now relatively easy to build systems that classify images, compile video analytics, transcribe prerecorded speech, and parse and understand everyday language: so-called natural language processing. Ahead lie transformations of the financial system, medicine, marketing and driverless cars and, in particular, artificial intelligence that manages data centres.
However, humanity urgently needs to keep its eye on fast-learning computer networks and to stay across what’s happening, so we can debate what’s acceptable and stave off the undesirable consequences of Frankenstein’s monster.
Hopefully, reining in the monster won’t be necessary.