Wednesday, November 18, 2015

Single Artificial Neuron Taught to Recognize Hundreds of Patterns


This breakthrough comes thanks to the work of Jeff Hawkins and Subutai Ahmad at Numenta, a Silicon Valley startup focused on understanding and exploiting the principles behind biological information processing. Their contribution is a new theory that finally explains the role of the vast number of synapses on real neurons, together with a model based on this theory that reproduces many of the intelligent behaviors of real neurons.

Real neurons consist of a cell body, known as the soma, that contains the cell nucleus. From the soma extend a number of nearby, or proximal, dendrites, as well as the axon, a fine cable-like projection that can stretch many centimeters to connect to other neurons. The dendritic tree also branches much farther out, and these more distant branches are known as distal dendrites because of their distance from the soma.

Proximal and distal dendrites make thousands of connections, called synapses, with the axons of other nerve cells. These connections famously influence the rate at which the nerve cell produces the electrical signals known as spikes.

The consensus is that a neuron “learns” by recognizing certain patterns of activity arriving at its synapses, and fires when it sees such a pattern.
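As a rough illustration of this conventional “point neuron” view, the sketch below fires the cell when enough of its synapses see activity at the same time. The input indices, set sizes, and threshold are purely illustrative, not values from Hawkins and Ahmad's paper.

```python
# A minimal sketch of the conventional view: the cell fires when the overlap
# between the currently active inputs and its synapses crosses a threshold.
# All numbers here are made up for illustration.

def fires(active_inputs, synapses, threshold):
    """Return True if enough of the cell's synapses are active to trigger a spike."""
    overlap = len(active_inputs & synapses)
    return overlap >= threshold

# The cell has formed synapses onto inputs 3, 7, 12, 20 and 31.
synapses = {3, 7, 12, 20, 31}

print(fires({3, 7, 12, 20, 31}, synapses, threshold=4))  # True: the learned pattern
print(fires({3, 7, 99}, synapses, threshold=4))          # False: too few matches
```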

But while it’s easy to understand how proximal synapses can influence the cell body and the rate of firing, it’s hard to understand how distal synapses can do the same thing, because they are so far away.

Hawkins and Ahmad now say they know what’s going on. Their new idea is that distal and proximal synapses play entirely different roles in the process of learning. Proximal synapses play the conventional role of triggering the cell to fire when certain input patterns crop up.

This is the conventional process of learning. “We show that a neuron can recognize hundreds of patterns even in the presence of large amounts of noise and variability as long as overall neural activity is sparse,” say Hawkins and Ahmad.
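The sketch below gives a feel for why sparsity matters here: if each pattern activates only a few dozen inputs out of thousands, a single cell can store many patterns as separate sets of synapses and still recognize noisy versions by simple thresholded overlap, while unrelated patterns almost never match by accident. The population size, sparsity, and threshold are my own illustrative choices, not the paper's parameters.

```python
# Illustrative sketch: store 200 sparse patterns as separate synapse sets
# ("segments") and recognize noisy versions by thresholded overlap.
import random

random.seed(0)
N_INPUTS = 2048       # size of the input space
ACTIVE = 40           # active inputs per sparse pattern
THRESHOLD = 15        # synapses that must be active to count as a match

# One stored segment per pattern.
segments = [frozenset(random.sample(range(N_INPUTS), ACTIVE)) for _ in range(200)]

def recognized(active_inputs):
    """True if any stored segment overlaps the input above threshold."""
    return any(len(active_inputs & seg) >= THRESHOLD for seg in segments)

# A noisy version of a stored pattern: keep half its bits, add 20 random ones.
original = set(segments[17])
noisy = set(random.sample(sorted(original), 20)) | set(random.sample(range(N_INPUTS), 20))
print(recognized(noisy))      # True despite the noise

# A completely unrelated sparse pattern almost never matches by accident.
unrelated = set(random.sample(range(N_INPUTS), ACTIVE))
print(recognized(unrelated))  # almost certainly False
```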

But distal synapses do something else. They also recognize when certain patterns are present, but do not trigger firing. Instead, they influence the electrical state of the cell in a way that makes firing more likely if another specific pattern occurs. So distal synapses prepare the cell for the arrival of other patterns. Or, as Hawkins and Ahmad put it, these synapses help the cell predict what the next pattern sensed by the proximal synapses will be.
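A simplified rendering of these two roles is sketched below: proximal input alone drives firing, while distal input alone never fires the cell but puts it into a “predictive” (depolarized) state, priming it for the next feedforward pattern. The class and parameter names are my own shorthand, not taken from the paper.

```python
# Sketch of the two zones described above: distal input depolarizes,
# proximal input fires. Thresholds and synapse sets are illustrative.

class TwoZoneNeuron:
    def __init__(self, proximal_synapses, distal_segments,
                 proximal_threshold=4, distal_threshold=3):
        self.proximal = proximal_synapses    # set of feedforward input indices
        self.distal = distal_segments        # list of sets of context-cell indices
        self.p_thresh = proximal_threshold
        self.d_thresh = distal_threshold
        self.predictive = False

    def see_distal(self, active_cells):
        """Distal (context) input: may depolarize the cell, never fires it."""
        self.predictive = any(len(active_cells & seg) >= self.d_thresh
                              for seg in self.distal)

    def see_proximal(self, active_inputs):
        """Proximal (feedforward) input: returns True if the cell fires."""
        return len(active_inputs & self.proximal) >= self.p_thresh

neuron = TwoZoneNeuron(proximal_synapses={1, 4, 9, 16, 25},
                       distal_segments=[{100, 101, 102, 103}])

neuron.see_distal({100, 101, 103})          # familiar context: cell becomes predictive
print(neuron.predictive)                    # True, but the cell has not fired
print(neuron.see_proximal({1, 4, 9, 16}))   # True: the expected feedforward pattern arrives
```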

That’s hugely important. It means that in addition to learning when a specific pattern is present, the cell also learns the sequence in which patterns appear. “We show how a network of neurons with this property will learn and recall sequences of patterns,” they say.
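To make the sequence idea concrete, here is a heavily reduced, first-order sketch: after one pattern is seen, the cells that take part in the next pattern become predictive, so the network “expects” it. This is only a toy version of the sequence memory described in the paper, with made-up pattern names and sizes.

```python
# Toy first-order sequence sketch: cells grow distal segments onto the cells
# that were active in the previous pattern, then become predictive when that
# context reappears. Names and sizes are illustrative.
from collections import defaultdict

# Each letter names a sparse set of active cells.
patterns = {"A": {0, 1, 2}, "B": {3, 4, 5}, "C": {6, 7, 8}}

# Learning: for each transition, every cell in the next pattern grows a
# distal segment onto the cells active in the previous pattern.
distal_segments = defaultdict(list)      # cell -> list of learned contexts
sequence = ["A", "B", "C"]
for prev, nxt in zip(sequence, sequence[1:]):
    for cell in patterns[nxt]:
        distal_segments[cell].append(patterns[prev])

def predicted_cells(active_cells, threshold=2):
    """Cells whose distal segments match the currently active cells."""
    return {cell for cell, segs in distal_segments.items()
            if any(len(active_cells & seg) >= threshold for seg in segs)}

# Recall: presenting A makes the cells of B predictive, and so on.
print(predicted_cells(patterns["A"]))    # {3, 4, 5} -> pattern B is expected
print(predicted_cells(patterns["B"]))    # {6, 7, 8} -> pattern C is expected
```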

What’s more, they show that all this works well, even in the presence of large amounts of noise, as is always the case in biological systems.

That’s a significant new way of thinking about neurons and one that reproduces some of the key features of information processing in the human brain. For example, Hawkins and Ahmad show that this system doesn’t remember every detail of every pattern in a sequence but instead stores the difference between one pattern and the next.

So what’s important is not the total amount of information in a pattern but the difference between this pattern and the next.


- More Here

