Today’s AI systems are superhuman. Computer models based loosely on the neural networks in our brains are trained on vast amounts of data using huge clusters of processors. They can now classify objects in images better than we can. And as IBM and Google’s DeepMind have demonstrated, they can beat us at games such as chess and Go, and even achieve the highest rank in the computer game StarCraft II.
But at the same time, AI systems are inhuman. Even inhumane. Our AIs do not comprehend our world or their place within it.
Biological creatures are not trained once on a static pool of data in the way we train an AI. It would be like presenting a newborn baby with the complete Encyclopaedia Britannica and telling them, “learn that perfectly, and that’s all you’ll ever need”.
We require years of experience in ever-changing environments before we can understand our world. Research has shown that, if we’re trying to focus on an object, our brains aren’t fully able to filter out visual distractions until age 17, and our ability to perceive faces keeps developing until age 20.
We’re prebuilt to learn, having descended from 3.5 billion years of creatures who each faced life-and-death situations, in which they had to perceive and act correctly to survive. AIs have none of this. Their algorithms use highly simplified ideas of learning, mostly doing little more than data classification or prediction.
The only way we can make our AI algorithms work is by training them on large, narrowly focused datasets with clearly defined objectives. They still cannot handle changing scenarios the way we can.
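To see what "narrowly focused data with a defined objective" means in practice, here is a minimal sketch of supervised classification. The data, labels, and nearest-centroid rule are illustrative assumptions, not any specific real system; the point is that the model can only ever answer within the labels it was trained on, however far a new input strays from its training data.

```python
def train(examples):
    """Compute one centroid (average point) per label from (features, label) pairs."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign the label whose centroid is nearest (squared distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda label: dist(centroids[label]))

# A narrow training set: two small clusters with fixed labels.
data = [([1.0, 1.0], "cat"), ([1.2, 0.9], "cat"),
        ([5.0, 5.0], "dog"), ([4.8, 5.2], "dog")]
model = train(data)

print(predict(model, [1.1, 1.0]))      # inside the training distribution: "cat"
print(predict(model, [100.0, 100.0]))  # far outside it, yet still forced to pick a label
```

However strange the input, the model never says "I don't know" or notices that its world has changed; it simply returns whichever trained label is least far away.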
They do not understand cause and effect. They cannot properly link words such as ‘chair’ or ‘vehicle’ to real physical objects, because they never experience reality the way we do.
And while some AIs may be able to classify emotions by processing images of faces, research is still in its infancy into how an AI might actually feel emotions, empathise, or understand how its behaviour might affect us.
Over time, AIs might come closer to us. Maybe we can help them to think more like us by developing algorithms that learn and process information in new ways. But the gap between us will likely always be there.
AIs do not share our evolutionary history, and they may never have a brain as complex or as fine-tuned as ours. They can become masters of the digital universes they inhabit, whether computer games, image processing, or the internet.
But, for the foreseeable future, we will remain masters of our own world.