Bringing brains to computers
- 17 December, 2013 21:48
For decades, scientists have fantasized about creating robots with brain-like intelligence. This year, researchers tempted by that dream made great progress toward achieving what has been called the holy grail of computing.
Today, a wide variety of efforts are aimed at creating intelligent computers that can progressively learn and make smarter decisions. Millions of dollars this year were poured into efforts to create "silicon brains," or neuromorphic chips that mimic brain-like functionality to make computers smarter.
The new chips could give eyes and ears to smart robots, which will be able to drive, identify objects, or even point out rotten fruit. The technology could let humans control machines with their minds, let mobile devices anticipate their users' actions, and let wearable devices like Google Glass help diagnose diseases. In the long run, neural chip implants could boost the mental, visual and cognitive capabilities of humans.
Scientists are looking to create advanced computers with these neural chips, which replicate the brain's circuitry and can retain information and make decisions based on patterns discovered through probabilities and associations. Projects funded by the U.S. government, European Union and private organizations are attempting to re-create the manner in which the brain's neurons and synapses work by redesigning the memory, computation and communication features of traditional circuitry.
The brain has roughly 100 billion interconnected neurons, nerve cells that process and transmit information via electrical and chemical signals. These neurons compute in parallel and communicate via trillions of connections, called synapses. Connections among neurons in the neural network are either strengthened or pruned as the brain learns. Today's processors are wired and regulate voltage differently than the brain's neural network, but researchers are keen on exploiting the brain's parallelism, which, among other things, reduces power requirements.
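The strengthening and pruning the article describes can be illustrated with a toy Hebbian-style learning rule. This sketch is not drawn from any of the chips mentioned; the neuron count, learning rate, and pruning threshold are all illustrative assumptions.

```python
# Toy illustration (not from any chip described here): connections between
# co-active "neurons" are strengthened, unused ones decay and are pruned.

NUM_NEURONS = 8
PRUNE_THRESHOLD = 0.05   # synapses weaker than this are removed
LEARNING_RATE = 0.1

# weights[(i, j)] is the synaptic strength from neuron i to neuron j
weights = {(i, j): 0.5 for i in range(NUM_NEURONS)
           for j in range(NUM_NEURONS) if i != j}

def hebbian_step(active):
    """Strengthen synapses between co-active neurons, decay the rest."""
    for (i, j), w in list(weights.items()):
        if i in active and j in active:
            # "neurons that fire together, wire together"
            weights[(i, j)] = min(1.0, w + LEARNING_RATE)
        else:
            weights[(i, j)] = w * 0.99        # slow decay
            if weights[(i, j)] < PRUNE_THRESHOLD:
                del weights[(i, j)]           # prune the weak connection

# Repeatedly co-activate neurons 0 and 1; their connection saturates
# while the rest of the network slowly weakens.
for _ in range(20):
    hebbian_step({0, 1})

print(weights[(0, 1)])   # reinforced synapse, near its maximum of 1.0
print(weights[(2, 3)])   # untouched synapse, decayed below 0.5
```

Real neuromorphic hardware implements rules of this family (such as spike-timing-dependent plasticity) in circuitry rather than software, but the strengthen-or-prune dynamic is the same.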
Researchers hope neural chips will accomplish cognitive tasks and respond to a wide range of stimuli. Computers can already see and hear; robots have already been built to respond to sensory input. Within five years, computers could get smell and taste, and this sort of sensory information could be fed to chips for processing.
To be sure, most of the chip development efforts are in early experimental phases. The brains of small insects and worms have been simulated on prototype neural chips, but human brains operate on a different scale. While it could be decades until chips simulate the human brain, the groundwork is being laid by new models of computing that are now being established.
Among other things, new data-processing techniques are needed that allow more information to be fed to computers, researchers said. Helping this effort, the physical limits of manufacturing techniques for the chips that power today's computers could fall within a decade, opening the door for new computing designs and chip architectures, said Robert Colwell, director of the microsystems technology office at DARPA (Defense Advanced Research Projects Agency), in a speech earlier this year.
Currently, computers don't have the capacity to learn from past experiences, and rely on preprogrammed code to make decisions. Brain cells, on the other hand, do not require programming, are fault tolerant, can regenerate, and can draw conclusions that computers are not able to reach, said Karlheinz Meier, professor and chair of experimental physics at the University of Heidelberg.
Traditional computers won't go away, meanwhile, as some activities don't require intelligent processing, said Meier, who is also co-director of the European Union-funded Human Brain Project.
"You will always do your text processing and email," Meier said.
But like the brain, neural chips will excel at certain things, like cutting through "noisy" data to make intelligent decisions, said Nabil Imam, a computer scientist and researcher at Cornell University.
The neuromorphic chips will complement, not replace, other processors in a computer, Imam said.
Chips modeled after the human brain have electronic neurons that can dynamically rewire the connections among them, blast information at each other, and forage for relevant data -- a process more power efficient than throwing lots of data at CPUs and other coprocessors like GPUs. IBM's Watson supercomputer made history when it beat human contestants at Jeopardy, but it did so by throwing lots of data at processors to find answers.
"Our brains were wired to do certain things very well like pattern recognition. Computers can't do that. These processors have a different class of applications," Imam said.
Imam is involved in the development of neuromorphic chips as part of the multiphase Synapse (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) project funded by DARPA. The Synapse project, initiated in 2008, involves IBM, Hewlett-Packard, Cornell, Stanford University and other universities.
The first tangible results for Synapse came in 2011, when IBM demonstrated a prototype chip with 256 digital neurons running at a slow 10MHz. The chip demonstrated navigation and pattern recognition abilities.
One chip core had 262,144 programmable synapses, while another core had 65,536 learning synapses. The connections between digital neurons got stronger depending on the number of signals sent. If an electronic spike from one neuron affects the voltage of another neuron, the two are synaptically connected. In chips, spiking neurons communicate with other neurons when triggers, such as certain values, are reached.
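The trigger behavior described above -- a spiking neuron fires only when accumulated input crosses a threshold -- can be sketched with a simple leaky integrate-and-fire model. This is a textbook abstraction, not the actual circuit design of the Synapse chip, and the threshold and leak values here are illustrative assumptions.

```python
# A leaky integrate-and-fire neuron: input accumulates in the membrane
# potential, which leaks over time; when the potential crosses a
# threshold, the neuron spikes and resets. Parameters are illustrative.

THRESHOLD = 1.0   # potential at which the neuron fires
LEAK = 0.9        # fraction of potential retained each time step
RESET = 0.0       # potential immediately after a spike

def run_neuron(inputs):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * LEAK + current   # leak, then integrate
        if potential >= THRESHOLD:               # trigger value reached
            spikes.append(t)
            potential = RESET                    # reset after firing
    return spikes

# A weak input never accumulates enough to cross the threshold;
# a stronger input produces a regular spike train.
print(run_neuron([0.05] * 10))   # []
print(run_neuron([0.4] * 10))    # [2, 5, 8]
```

Because such neurons only communicate when they spike, most of the network sits idle most of the time, which is one source of the power efficiency the researchers cite.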
The next big Synapse announcement will come next year, when a new neural chip system that mimics a "very big brain" will be announced, Imam said. The chip will have a novel design of memory arrays so that large numbers of connections can be made among digital neurons. An asynchronous design will ensure communication signals are organized by local circuits. The chip will be made using a new manufacturing process.
"It's the largest neuromorphic system that's been built to date," Imam said.
IBM, one of the lead researcher companies in Synapse, this year said that it ultimately wants to build a "chip system" that has 10 billion neurons and a hundred trillion synapses but draws just 1 kilowatt of power.
Another research project drawing interest is Qualcomm's Zeroth chip, which the company calls a "neural processing unit." By analyzing patterns of human behavior, the chip could make interaction with mobile devices easier by anticipating user actions, said company CEO Paul Jacobs during a speech last month.
Qualcomm has already demonstrated a robot based on Zeroth that can make navigation decisions. The company wants to expand Zeroth's capabilities and is researching possibilities, said Sameer Kumar, director of business development at Qualcomm.
The Synapse and Qualcomm research efforts are based on digital neurons, but one neuromorphic system due in Europe will be based on analog circuitry, which keeps it truer to the brain. The system, located at the University of Heidelberg in Germany, is part of the Human Brain Project, a 10-year, US$1.6 billion effort backed by the European Union to understand the brain's inner workings.
The university already has a neuromorphic computing system operational with a silicon wafer containing 200,000 neurons and 50 million synapses. In two years, researchers hope to offer a 20-wafer system with a combined 4 million analog neurons, said Meier, who is spearheading the project. The highly parallel chip design has configurable electronic neurons, and the goal is to understand the dependencies, synchronization and communication among neurons and synapses, and adopt them to computing.
The project's intent is not to develop the best neural chip, but to understand architectures, Meier said. That could pave the way for neuromorphic computing models.
Other neural chip research efforts include Stanford's Neurogrid and the University of Manchester's Spinnaker, which is part of the E.U.'s Human Brain Project. HP is developing memristor memory technology, which could bolster a computer's decision-making ability by recognizing patterns in previously collected data, much as human brains collect memories of a series of events and draw understanding from them.
It's easy to create theories, but what's important is to make the chips usable, said Guy Paillet, who holds a 1995 patent on neural circuit design, along with IBM and others. Paillet is the executive chairman of General Vision, which sells a chip called CM1K, based on a neural network design.
Research efforts under way are focused on so-called spiking neurons, which Paillet said are "close to biology to replicate the synapse model."
Copying features of the way the brain works and applying them to chip technology is easier said than done. Neuron behavior is hard to predict, and making a chip that rewires millions of connections is a challenge. Moreover, the brain is yet to be fully understood, and neuroscience researchers are uncovering new facts every day.
But neural chip researchers are sharing data and taking complementary approaches, Meier said, adding that a little competition among peers doesn't hurt.
"This is a chance to produce a new way of computing and we have to do whatever we can," Meier said.