21 Jan, 2016 20:40

‘Bombshell discovery’: Human brain can hold 10 times more memories than previously thought – study

The human brain can hold 10 times more memories than previously believed, according to a new study. The key to its amazing ability lies in synapses, the neural connections responsible for storing memories.

Researchers from the Salk Institute found that each synapse can hold about 4.7 bits of information. This means that the human brain has a capacity of one petabyte, or 1,000,000,000,000,000 bytes. This is equivalent to approximately 20 million four-drawer filing cabinets filled with text.
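
For a rough sense of where a petabyte-scale figure comes from, here is a back-of-the-envelope check in Python. The synapse count is an assumption for illustration only: commonly cited estimates for the human brain run from about 10^14 to 10^15 synapses, and the article does not say which figure the researchers used.

    import math

    bits_per_synapse = math.log2(26)           # about 4.7 bits, per the study
    for synapse_count in (1e14, 1e15):         # assumed range, not from the article
        total_bytes = synapse_count * bits_per_synapse / 8
        print(f"{synapse_count:.0e} synapses -> {total_bytes:.2e} bytes")

    # The filing-cabinet comparison: one petabyte split across 20 million
    # cabinets works out to about 50 megabytes of text per cabinet.
    print(1e15 / 20e6 / 1e6, "MB of text per cabinet")

At the upper end of that assumed range the total comes to roughly 0.6 petabytes, the same ballpark as the "at least a petabyte" figure quoted below.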

"This is a real bombshell in the field of neuroscience...our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web,” said Terry Sejnowski, Salk professor and co-senior author of the paper, which was published in the journal eLife.

The discovery of the human brain's impressive capability came while the scientists were building a 3D reconstruction of rat hippocampus tissue, using it as a proxy for human brain cells.

While examining the reconstruction, they noticed something unusual – but to understand the finding, one must first know the basic science of memory.

Memories and thoughts are the results of patterns of electrical and chemical activity in the brain. A vital part of that activity happens when branches of neurons intersect at certain junctions, known as synapses.

At a synapse, an output 'wire' (axon) from one neuron connects to an input 'wire' (dendrite) of a second neuron. Signals travel across the synapse as chemicals called neurotransmitters tell the receiving neuron whether to convey an electrical signal to other neurons.

However, when the researchers reviewed the 3D reconstruction, they saw that “in some cases, a single axon from one neuron formed two synapses reaching out to a single dendrite of a second neuron, signifying that the first neuron seemed to be sending a duplicate message to the receiving neuron,” according to a Salk Institute press release.

The team, including Salk staff scientist Tom Bartol, decided to examine these duplicate synapses more closely, measuring the difference in size between each pair.

"We were amazed to find that the difference in the sizes of the pairs of synapses were very small, on average, only about eight percent different in size. No one thought it would be such a small difference. This was a curveball from nature," said Bartol.

The team went on to conclude that synapses come in roughly 26 distinct size categories, rather than just a few. In information terms, 26 distinguishable sizes correspond to log2(26), or about 4.7 bits per synapse, the figure behind the capacity estimate above.

"Our data suggests there are 10 times more discrete sizes of synapses than previously thought," said Bartol. It was previously believed that the brain was capable of just one to two bits for short and long-term storage in the hippocampus.

According to Bartol, events in the brain cause the synapses to change in size, “adjusting themselves according to the signals they receive.”

"The implications of what we found are far-reaching," said Sejnowski. "Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us."

The findings also offer an explanation for the brain's surprising efficiency. According to the press release, the waking adult brain generates only about 20 watts of continuous power – as much as a very dim light bulb.

The study could help computer scientists build energy-efficient computers, particularly ones that employ “deep learning” and artificial neural nets – techniques capable of sophisticated tasks such as speech recognition, object recognition, and translation.

"This trick of the brain absolutely points to a way to design better computers," said Sejnowski. "Using probabilistic transmission turns out to be as accurate and require much less energy for both computers and brains."
