All right, so we have a neural network in the brain, right? Let’s say there are around 100 billion neurons and around 100 trillion or more synapses. What I’d like to know is how the actual information in that network is stored in relation to the neurons and synapses. Do neurons themselves store information, like “0” and “1” neurons in binary code, with the way they are connected giving us something meaningful? Or are neurons more passive, not really carrying any information themselves, and their purpose is only to allow synapses to exist and to relay signals?
In other words: where does the processing power come from? Does an individual neuron have processing capacity of its own? That is, when you send a signal into a neuron, is some operation performed on that signal inside the neuron, or does the processing depend purely on the network and the synapses?
I’ll draw a little metaphor that crossed my mind:
I have a computer screen with a resolution of 1366 x 768, which gives me around a million pixels. These pixels are all I have; I cannot stuff more onto my screen. The pixels represent neurons. The images my screen can show would represent the output of the neural network (e.g. a mental image that I imagine in my mind). The system that synchronizes my pixels, gives them the right color, etc., would represent the synapses (and just as a single neuron has many inputs, its dendrites, so a single pixel can take various inputs, i.e. colors). So with a limited number of neurons, I can create a big number of synaptic configurations and thus get a high variety of output. Does this make sense, or is this not how neural processing works? What do you guys think?
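Just to put a number on the “big number of configurations” part of your metaphor, here is a toy back-of-the-envelope calculation in Python. The unit count is made up for illustration, nothing like real neuron counts, but it shows how fast the number of possible wiring patterns explodes even when the number of units stays fixed:

```python
# Toy numbers for illustration only, nothing like real neuron counts.
n_units = 10
possible_connections = n_units * (n_units - 1)  # directed pairs, no self-connections

# If each connection is simply "present" or "absent", the number of
# distinct wiring configurations is 2 ** possible_connections.
configurations = 2 ** possible_connections
print(f"{n_units} units -> {possible_connections} possible connections "
      f"-> {configurations:.3e} wiring patterns")
# prints: 10 units -> 90 possible connections -> 1.238e+27 wiring patterns
```

So the combinatorics are on your side: a fixed pool of units supports an astronomically larger space of connection patterns.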
Information isn’t really stored in the neurons themselves (well, apart from the information in DNA etc. that is stored in any cell, and has nothing especially to do with the brain or thinking). It is stored in the number, arrangement, types, and strengths of the synapses that are made onto the neuron (mostly amongst its dendrites) from other neurons. There are typically hundreds of synapses on each neuron in the brain. When these synapses are activated (by the other neurons they come from) they either stimulate or inhibit the neuron they are on, and when the level of stimulation passes a certain threshold the neuron itself will fire and send stimulatory or inhibitory signals (depending on what sort of neuron it is) to other neurons down the line.

The way the incoming stimulatory and inhibitory signals add together to bring the neuron to its firing threshold is very complicated and (AFAIK) still not well understood, but it depends upon (amongst other things) the way the synapses are arranged in the dendritic tree, which itself can have a very complex structure. Not all synapses have equal effects; their effect depends on just where they sit on the cell.
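To make that summation idea concrete, here is a deliberately crude Python sketch. It is not a biophysical model; the threshold, strengths, and distance-attenuation rule are all invented for illustration, just to capture “excitatory and inhibitory inputs add up, weighted by where they sit on the cell, and the neuron fires past a threshold”:

```python
THRESHOLD = 1.0  # arbitrary firing threshold for this toy model

# (strength, distance_from_soma) for each currently active synapse;
# negative strength = inhibitory synapse
active_synapses = [
    (0.6, 0.1),   # excitatory synapse close to the cell body
    (0.8, 0.9),   # excitatory synapse far out on a dendrite
    (-0.4, 0.2),  # inhibitory synapse close to the cell body
]

def effect(strength: float, distance: float) -> float:
    """Toy rule: a synapse's influence falls off with distance from the soma."""
    return strength * (1.0 - 0.5 * distance)

total = sum(effect(s, d) for s, d in active_synapses)
print(f"summed input = {total:.2f}, fires: {total >= THRESHOLD}")
# prints: summed input = 0.65, fires: False
```

Real dendritic integration is nonlinear and far messier, as the post above says, but the point stands: the same neuron gives different answers depending on which synapses are active and where they are.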
If you really want to understand this, though, you might do better looking at a textbook of neuroscience, rather than asking people on an internet message board.
I was once in a PhD program in behavioral neuroscience. The short answer to your question is that no one knows. There are theories and some progress, but progress has not come as fast as everyone hoped. We do know the brain and nervous system are almost completely unlike a digital computer, despite the popular comparisons, even though there have been some successes at interfacing the two with things like artificial retinas.
I don’t mean to dismiss your question; it is a good one. It is just something that no one can answer right now. I got out of the field when I found out just how primitive it still is, and how far it was from being able to answer questions like this, the ones I wanted to know about.
If you are really interested in this topic, you will have to read the latest research in neuroscience journals, or at least summaries of it in reputable popular media like Discover magazine (easy) or Scientific American (harder). It won’t answer all your questions, because again, no one knows, but it can show you the work in progress.
Saying that no one knows is overstating the case. We know that there are at least two kinds of information storage. One is the current electrical state of the brain, i.e. all the action potentials at a given instant of time; this can change rapidly in response to stimuli. The second is the number and strength of the synapses. Synaptic strengths change over time in response to being activated. The details are complex and not well understood, but this certainly plays a role in long-term memory.
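The “synaptic strength changes with activation” part is the easiest to sketch in code. Below is a minimal Hebbian-style update in Python; the learning rate, weakening rule, and bounds are arbitrary choices of mine, not a model from the literature:

```python
LEARNING_RATE = 0.1  # arbitrary

def hebbian_update(weight: float, pre_active: bool, post_active: bool) -> float:
    """Toy plasticity rule: strengthen on coincident activity, weaken otherwise."""
    if pre_active and post_active:
        weight += LEARNING_RATE          # fired together -> strengthen
    elif pre_active and not post_active:
        weight -= LEARNING_RATE * 0.5    # fired alone -> weaken a little
    return max(0.0, min(1.0, weight))    # keep the strength bounded

w = 0.5
for pre, post in [(True, True), (True, True), (True, False)]:
    w = hebbian_update(w, pre, post)
print(f"final weight: {w:.2f}")  # prints: final weight: 0.65
```

The point is just that the “memory” lives in w, which persists between inputs, while the action potentials themselves come and go.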
The processing power comes from the neuron itself, which decides whether or not to fire an action potential depending on the inputs it gets from its synapses. Think of it as a threshold logic gate with an enormous fan-in and fan-out: the inputs are analog, the action potential is digital, but each of the outputs is an analog signal again.
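Here is that analogy spelled out as code, a rough sketch with made-up weights and threshold: analog inputs are integrated, the spike decision is all-or-nothing, and the same digital spike then has a different analog effect at each downstream synapse:

```python
weights = [0.9, -0.3, 0.5, 0.2]   # synaptic strengths (+ excitatory, - inhibitory)
inputs  = [1.0, 1.0, 0.0, 1.0]    # analog activity levels from upstream neurons
THRESHOLD = 0.7                   # arbitrary firing threshold

weighted_sum = sum(w * x for w, x in zip(weights, inputs))  # analog integration
spike = weighted_sum >= THRESHOLD                           # digital, all-or-nothing

# Fan-out: one spike reaches many downstream synapses, and each synapse
# turns it back into an analog effect scaled by its own strength.
downstream_strengths = [0.4, 0.8, 0.1]
effects = [s if spike else 0.0 for s in downstream_strengths]
print(f"sum = {weighted_sum:.2f}, spike = {spike}, downstream effects = {effects}")
# prints: sum = 0.80, spike = True, downstream effects = [0.4, 0.8, 0.1]
```

Real neurons have thousands of inputs and outputs rather than four and three, and the integration is not a clean linear sum, but the analog-digital-analog shape of the pipeline is the same.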
Coursera has an upcoming online class on this - https://www.coursera.org/course/bluebrain
Some interesting recent bits of info that show the complexity and lack of full knowledge regarding this topic:
White matter plays a larger role in computation than previously thought, and it also changes as a result of learning.
In some areas of the brain, strong and weak dendrites propagate signals differently, and weak dendrites can be transformed into strong ones. That change takes place at the dendrite rather than at the synapse, a different form of learning- and memory-related change than was previously recognized.
Brain wave synchronization (from an article a couple days ago):
“The brain holds in mind what has just been seen by synchronizing brain waves in a working memory circuit, an animal study supported by the National Institutes of Health suggests. The more in-sync such electrical signals of neurons were in two key hubs of the circuit, the more those cells held the short-term memory of a just-seen object.”