While it might seem like the ever-increasing advancement of artificial intelligence will eventually reach a point where it exceeds human intelligence and control, there’s something to remember: The human brain is the most complex machine in the known universe.
That’s what neuroscientist and electrical engineer Dr Alan Finkel told CIO Australia ahead of the Creative Innovation 2015 event, which will take place in Melbourne on 23-25 March.
“We’ve got 100 billion neurons, and each neuron has 10,000 connections. Each of those 100 billion neurons is not like a transistor in an integrated circuit; each neuron is like a little processing unit. So we have enormous capability,” Finkel said of the complexity of the human brain.
“People can do calculations that show today the complexity of the human brain far exceeds the complexity of the most sophisticated computers, not in our ability to do accurate and rapid calculations but in our overall ability to reason, to be creative, and to recognise faces and things like that.”
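The figures Finkel quotes lend themselves to a quick back-of-envelope check. As an illustrative sketch only (the 100 billion and 10,000 figures are his round numbers, not precise measurements), the implied connection count works out to a quadrillion:

```python
# Back-of-envelope arithmetic for the figures Finkel quotes (illustrative only).
neurons = 100_000_000_000        # ~100 billion neurons
connections_per_neuron = 10_000  # ~10,000 synaptic connections each

total_connections = neurons * connections_per_neuron
print(f"Total connections: {total_connections:.0e}")  # on the order of 1e15
```

That 10^15 connection count, with each neuron acting as a small processor rather than a simple switch, is the basis of the comparison with integrated circuits above.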
One advantage our brains hold over machines is energy efficiency: we don’t require huge amounts of power because we carry out cognitive tasks far more efficiently, Finkel said.
“The human brain needs the chemical energy that you can get from eating four squares of chocolate every hour, whereas computers use huge amounts of electricity.
“If you try to build a computer today that has the same level of complexity, it would occupy many, many dozens or hundreds of hectares [of space] and probably use a gigawatt of electricity, which is about as much electricity as the city of Melbourne uses,” he said.
Finkel pointed out that a 1 gigahertz microprocessor cycles, or updates itself, a billion times per second, which makes the clock cycle in computers quite energy-hungry.
“Each neuron, this little processor, looks at its environment and gets inputs coming from 10,000 other cells. It just slowly waits till the various inputs reach a threshold that it finds interesting for its particular job function, and then sends out a transmission to one of the neighbouring neurons. So instead of being clocked, they are all working independently and talking to each other only when it makes sense. So there’s a lot lower energy use.”
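Finkel’s description of an unclocked, threshold-driven neuron can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the class name, threshold value, and reset behaviour are assumptions for the sake of the example, loosely resembling an integrate-and-fire model), not a claim about how real neurons compute:

```python
# Minimal sketch of the event-driven behaviour Finkel describes: a "neuron"
# accumulates incoming signals and emits an output only when its running
# total crosses a threshold -- there is no global clock driving it.

class EventDrivenNeuron:
    def __init__(self, threshold=1.0):
        self.threshold = threshold  # level the neuron "finds interesting"
        self.potential = 0.0        # accumulated input so far

    def receive(self, signal):
        """Accumulate one input; return True only if the neuron fires."""
        self.potential += signal
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True             # spike passed on to downstream neurons
        return False                # otherwise stay silent, spending nothing

neuron = EventDrivenNeuron(threshold=1.0)
spikes = [neuron.receive(0.3) for _ in range(5)]
print(spikes)  # fires only on the input that pushes the total past 1.0
```

Because the neuron does nothing between interesting events, there is no per-cycle cost like the billion updates per second of a clocked 1 GHz processor, which is the energy advantage Finkel is pointing at.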
Another advantage of the human brain over machines is diverse thinking and creativity, which are not easy to replicate in a machine, Finkel said. Feed the same input data to a group of humans and the outputs will vary, thanks to highly complex transient states in the brain.
“With 7 billion brains [on Earth], if you gave them all the same information you would get some variation of 7 billion results. And that adds at a global level to our creative capability.”
However, Finkel believes it’s a “realistic possibility” that artificial intelligence will reach a point where it will stand up against the human brain, in what is referred to as the 'Singularity'.
“If you look at the rate of improvement of computer capability at the moment, in 15 to 20 years from now a computer with the complexity of the human brain will be relatively small and use a relatively small amount of power or electricity,” he said.
The algorithms are also becoming smarter. Finkel gave the example of the University of Hong Kong, which last year developed face-recognition algorithms running on a supercomputer. Tested on more than 6,000 people under different lighting conditions and changing hairstyles, the computer outperformed humans at the task.
Machines are also composing music and creating original artworks; the University of Konstanz’s e-David painting robot, unveiled last year, is an example of the latter.
“They are doing it in a way that looks spontaneous in the sense they are not just duplicating something that has been done before," Finkel said.
“Programs like that will have a massive impact on jobs, and not just jobs where there’s an element of repetition and you expect them to be automated, but also jobs where there’s an element of intellectual capacity.
“The things that we think will never happen keep happening. And the rate of change is enormous. It’s not just in Moore’s Law, which applies to the electronic hardware, but the algorithms are improving at a rapid rate,” he said.