
Sunday, February 8, 2009

Cognitive Computing Project Aims to Reverse-Engineer the Mind

By Priya Ganapati

Imagine a computer that can process text, video and audio in an instant, solve problems on the fly, and do it all while consuming just 10 watts of power.

It would be the ultimate computing machine if it were built with silicon instead of human nerve cells.

Compare that to current computers, which require extensive, custom programming for each application, consume hundreds of watts of power, and are still not fast enough. So it's no surprise that some computer scientists want to go back to the drawing board and try building computers that more closely emulate nature.

"The plan is to engineer the mind by reverse-engineering the brain," says Dharmendra Modha, manager of the cognitive computing project at IBM Almaden Research Center.

In what could be one of the most ambitious computing projects ever, neuroscientists, computer engineers and psychologists are coming together in a bid to create an entirely new computing architecture that can simulate the brain's abilities for perception, interaction and cognition. All that, while being small enough to fit into a lunch box and consuming extremely small amounts of power.

The 39-year-old Modha, a Mumbai, India-born computer scientist, has helped assemble a coalition of the country's best researchers in a collaborative project that includes five universities, among them Stanford, Cornell and Columbia, in addition to IBM.

The researchers' goal is first to simulate a human brain on a supercomputer. Then they plan to use new nano-materials to create logic gates and transistor-based equivalents of neurons and synapses, in order to build a hardware-based, brain-like system. It's the first attempt of its kind.

In October, the group bagged a $5 million grant from Darpa -- just enough to get the first phase of the project going. If successful, they say, we could have the basics of a new computing system within the next decade.

"The idea is to do software simulations and build hardware chips that would be based on what we know about how the brain and how neural circuits work," says Christopher Kello, an associate professor at the University of California-Merced who's involved in the project.

Computing today is based on the von Neumann architecture, a design whose building blocks -- the control unit, the arithmetic logic unit and the memory -- are the stuff of Computing 101. But that architecture presents two fundamental problems: The connection between the memory and the processor can get overloaded, limiting the speed of the computer to the pace at which it can transfer data between the two. And it requires specific programs written to perform specific tasks.
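To see that bottleneck in miniature, here is a rough sketch in Python of the von Neumann fetch-execute cycle, using a made-up three-instruction machine rather than any real hardware: every instruction and every operand has to cross the single link between memory and processor, so the machine can never run faster than that link.

```python
# Toy von Neumann machine: one memory, one processor, one bus between them.
# Every fetch of an instruction or operand crosses the same bus, which is
# what limits the machine's speed (the "von Neumann bottleneck").

memory = {
    0: ("LOAD", 10),   # put memory[10] into the accumulator
    1: ("ADD", 11),    # add memory[11] to the accumulator
    2: ("STORE", 12),  # write the accumulator back to memory[12]
    3: ("HALT", None),
    10: 2, 11: 3, 12: 0,
}

def run(memory):
    acc, pc, bus_transfers = 0, 0, 0
    while True:
        op, addr = memory[pc]              # fetch instruction: 1 bus transfer
        bus_transfers += 1
        if op == "LOAD":
            acc = memory[addr]; bus_transfers += 1
        elif op == "ADD":
            acc += memory[addr]; bus_transfers += 1
        elif op == "STORE":
            memory[addr] = acc; bus_transfers += 1
        elif op == "HALT":
            return acc, bus_transfers
        pc += 1

result, transfers = run(memory)
print(result, transfers)  # 5, after 7 trips across the memory bus
```

Even this three-line program spends most of its bus traffic just moving instructions and data back and forth, which is the cost the brain's distributed design avoids.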

In contrast, the brain distributes memory and processing functions throughout the system, learning through situations and solving problems it has never encountered before, using a complex combination of reasoning, synthesis and creativity.

"The brain works in a massively multi-threaded way," says Charles King, an analyst with Pund-IT, a research and consulting firm. "Information is coming through all the five senses in a very nonlinear fashion and it creates logical sense out of it."

The brain is composed of billions of interlinked neurons, or nerve cells that transmit signals. Each neuron receives input from 8,000 other neurons and sends an output to another 8,000. If the input is enough to agitate the neuron, it fires, transmitting a signal through its axon in the direction of another neuron. The junction between two neurons is called a synapse, and that's where signals move from one neuron to another.
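A rough way to picture that firing rule is a simple threshold model. The sketch below is purely illustrative (the weights and threshold are invented, not the project's parameters): a neuron sums the signals arriving across its input synapses, weighted by synaptic strength, and fires only if the total crosses a threshold.

```python
# Illustrative threshold neuron: sum the weighted inputs, fire if the total
# crosses a threshold. The weights stand in for synaptic strengths; a real
# cortical neuron integrates input from thousands of synapses over time.

def neuron_fires(inputs, weights, threshold=1.0):
    total = sum(x * w for x, w in zip(inputs, weights))
    return total >= threshold

# Three input neurons, two firing (1) and one silent (0).
inputs = [1, 1, 0]
weights = [0.6, 0.5, 0.9]             # made-up synaptic strengths
print(neuron_fires(inputs, weights))  # True: 0.6 + 0.5 = 1.1 >= 1.0
```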

"The brain is the hardware," says Modha, "and from it arises processes such as sensation, perception, action, cognition, emotion and interaction." Of this, the most important is cognition, the seat of which is believed to reside in the cerebral cortex.

The structure of the cerebral cortex is the same in all mammals. So researchers started with a real-time simulation of a small brain, about the size of a rat's, in which they put together simulated neurons connected through a digital network. It took 8 terabytes of memory on a 32,768-processor BlueGene/L supercomputer to make it happen.

The simulation doesn't replicate the rat brain itself, but rather imitates just the cortex. Despite being incomplete, the simulation is enough to offer insights into the brain's high-level computational principles, says Modha.

The human cortex has about 22 billion neurons and 220 trillion synapses, making it roughly 400 times larger than the rat-scale model. A supercomputer capable of running a software simulation of the human brain doesn't exist yet. Researchers would need a machine with a computational capacity of at least 36.8 petaflops and a memory capacity of 3.2 petabytes, a scale that supercomputer technology isn't expected to hit for at least three years.
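Those figures are consistent with a back-of-envelope scaling from the rat-scale run: if memory grows roughly in proportion to the size of the model, a simulation 400 times larger than one that needed 8 terabytes lands at about 3.2 petabytes. The quick check below is only a sanity check; the bytes-per-synapse figure is an implied average, not a published design number.

```python
# Back-of-envelope check of the human-scale memory estimate.
human_synapses = 220e12          # 220 trillion synapses (from the article)
rat_scale_memory_tb = 8          # terabytes used for the rat-scale simulation
scale_factor = 400               # human cortex is ~400x the rat-scale model

human_memory_tb = rat_scale_memory_tb * scale_factor
print(human_memory_tb / 1000)                    # ~3.2 petabytes
print(human_memory_tb * 1e12 / human_synapses)   # ~15 bytes per synapse, implied
```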

While waiting for the hardware to catch up, Modha is hoping the coalition's partners can inch forward toward their own targets.

Software simulation of the human brain is just one half of the solution. The other is to create a new chip design that will mimic the neuron and synaptic structure of the brain.

That's where Kwabena Boahen, associate professor of bioengineering at Stanford University, hopes to help. Boahen, along with other Stanford professors, has been working on implementing neural architectures in silicon.

One of the main challenges to building this system in hardware, explains Boahen, is that each neuron connects to others through 8,000 synapses. It takes about 20 transistors to implement a synapse, so building the silicon equivalent of 220 trillion synapses is a tall order, indeed.
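The arithmetic makes the problem concrete: at 20 transistors per synapse, a human-scale cortex would need on the order of quadrillions of transistors for the synapses alone, which is why the team is looking to nanotechnology for a denser substitute. The chip comparison in the sketch below is only for rough scale, not a reference to any specific product.

```python
# Why 20 transistors per synapse doesn't scale to human-cortex size.
synapses = 220e12                  # 220 trillion synapses
transistors_per_synapse = 20

total_transistors = synapses * transistors_per_synapse
print(f"{total_transistors:.1e}")  # ~4.4e15 transistors -- 4.4 quadrillion

# For rough comparison, a large chip of the era held on the order of a
# billion transistors, so this is millions of chips' worth of silicon.
print(total_transistors / 1e9)     # ~4.4 million such chips
```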

"You end up with a technology where the cost is very unfavorable," says Boahen. "That's why we have to use nanotech to implement synapses in a way that will make them much smaller and more cost-effective."

Boahen and his team are trying to create a device smaller than a single transistor that can do the job of 20 transistors. "We are essentially inventing a new device," he says.

Meanwhile, at the University of California-Merced, Kello and his team are creating a virtual environment that could train the simulated brain to experience and learn. They are using the Unreal Tournament videogame engine to help train the system. When it's ready, it will be used to teach the neural networks how to make decisions and learn along the way.

Modha and his team say they want to create a fundamentally different approach. "What we have today is a way where you start with the objective and then figure out an algorithm to achieve it," says Modha.

Cognitive computing is hoping to change that perspective. The researchers say they want to create an algorithm that is capable of handling most problems thrown at it.

The virtual environment should help the system learn. "Here there are no instructions," says Kello. "What we have are basic learning principles so we need to give neural circuits a world where they can have experiences and learn from them."
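Here is a hedged sketch of what "learning principles rather than instructions" can look like in code: a toy agent strengthens connections between inputs that are active when the outcome is good, a simple Hebbian-style rule. The environment, reward signal and learning rate are all invented for illustration; this is not the project's actual training setup or the Unreal Tournament environment.

```python
import random

# Toy illustration of learning from experience rather than explicit rules:
# a Hebbian-style update strengthens weights on inputs that are active when
# the outcome is good, and weakens them otherwise. All details are invented.

def make_experience():
    """Fake environment: two sensory inputs; reward follows the first one."""
    s1, s2 = random.choice([0, 1]), random.choice([0, 1])
    reward = 1 if s1 == 1 else -1
    return [s1, s2], reward

weights = [0.0, 0.0]
learning_rate = 0.1

for _ in range(1000):
    inputs, reward = make_experience()
    # Hebbian-style rule: adjust each weight by (input activity * reward).
    for i, x in enumerate(inputs):
        weights[i] += learning_rate * x * reward

print(weights)  # the first weight grows large; the second hovers near zero
```

No one tells the agent which input matters; the correlation between its experiences and the reward does the teaching, which is the spirit of what Kello describes.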

Getting there will be a long, tough road. "The materials are a big challenge," says Kello. "The nanoscale engineering of a circuit that is programmable, extremely small and that requires extremely low power requires an enormous engineering feat."

There are also concerns that the $5 million Darpa grant and IBM's largesse -- researchers and resources -- while enough to get the project started, may not be sufficient to see it through to the end.

Then there's the difficulty of explaining that mimicking the cerebral cortex isn't exactly the same as recreating the brain. The cerebral cortex is associated with functions such as thought, computation and action, while other parts of the brain handle emotions, co-ordination and vital functions. These researchers haven't even begun to address simulating those parts yet.

