Frankly, I still think you misunderstand how a neuron works. Clearly, you have a decent understanding, but bear with me for a moment. A typical neuron will have a large number of inputs directly from other neurons, and a single output. Inputs are summed over time and space, and if these inputs reach a certain threshold, the neuron will fire, often repeatedly. The threshold, itself, is often a moving target based on all sorts of shifting electrochemical equilibria. That output is not a single, discrete event, however. The strength of a single firing can vary, the frequency of firing is variable. Both of those quantities, which really describe the output of a neuron, are continuously variable.
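To make that concrete, here's a toy leaky integrate-and-fire sketch in Python. The threshold, leak rate, and reset behaviour are all invented for illustration, not taken from any real cell:

```python
# Hypothetical sketch of a leaky integrate-and-fire neuron: inputs are
# summed over time, and the cell "fires" when the sum crosses a threshold.
# All constants here are illustrative, not a biophysical model.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron fires."""
    potential = 0.0
    spikes = []
    for t, i in enumerate(inputs):
        potential = potential * leak + i   # leak a little, then add input
        if potential >= threshold:
            spikes.append(t)
            potential = 0.0                # reset after firing
    return spikes

# A steady drip of sub-threshold inputs still fires, repeatedly:
print(simulate_lif([0.3] * 10))
```

Note that even this toy version shows the "repeated firing" behaviour: a constant input produces a spike train whose frequency depends on the input strength.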
Well, speaking of neurons and analog computers, one of the advantages of analog computing is high temporal resolution. I know a group of researchers who are building analog chips that simulate neurons. The chips are analog, i.e. responses are simulated by assembling resistors, capacitors and so on. The idea is to plug about 100 of them together in order to simulate a little network of ‘realistic’ neurons.
In brief, the idea is to answer questions such as ‘how do my parameters affect the shape of the PSEP (i.e. the dynamics of the electrical response) in a neuron that receives multiple inputs?’
Now it gets interesting: if you want to simulate things such as ‘the voltage in one given neuron at time t’, you are going to need a sampling rate of 10 kHz at least. If you have a network of 100 neurons with, let’s say, 20 connections per neuron, you will have a hard time simulating it on a digital computer. With analog computing, on the other hand, all you have to do is plug in a DAC.
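Just to put rough numbers on that claim (the neuron count, sampling rate, and connection count are the ones from the post; the ops-per-synapse figure is a pure guess to show the shape of the estimate):

```python
# Back-of-envelope cost of the digital simulation described above.
# ops_per_synapse is an assumed placeholder, not a measured figure.

neurons = 100
sample_rate_hz = 10_000        # at least 10 kHz per neuron
synapses_per_neuron = 20
ops_per_synapse = 50           # assumed cost of one synaptic update

updates_per_s = neurons * sample_rate_hz
synapse_evals_per_s = updates_per_s * synapses_per_neuron
flops = synapse_evals_per_s * ops_per_synapse

print(f"{updates_per_s:,} neuron updates/s")
print(f"{synapse_evals_per_s:,} synapse evaluations/s")
print(f"{flops:,} ops/s just for the synapses")
```

Twenty million synapse evaluations per second, before you even solve the membrane equations, is why the digital version gets painful; the analog circuit just runs.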
So, one point for analog computing is time resolution. Notice, however, that the example I gave is of a very specific computation (the circuit is designed to simulate neurons) and not a multi-purpose programmable machine.
Notice also that this has nothing to do with artificial intelligence.
Modems use analog tones instead of Morse code to communicate. So do, to an extent, Gigabit Ethernet and Wi-Fi. When you get down to it, it’s just a better, denser way of relaying information.
I think that this is actually what the multi-value voltage folk were getting at. Maybe the signal quality and overshoot/undershoot problems really are a nail. But maybe it’s doable and just really, really hard. I can maybe see future computers, if they don’t go quantum, moving toward multi-value voltage or quote-unquote analog for lack of another way to improve. (And YES, you can still do error correction.) But it’d be a huge jump in difficulty of design, and maybe it’d be our super-intelligent AI descendants that’d be creating them.
I see what you’re saying, but I think that’s getting a little nitpicky and playing with definitions (after all, one could just as well argue that on an atomic level everything is digital because of quantum mechanics).
In My Ever-So Humble Opinion, synapses are far more complicated than the input to a resistor in a logic array, sufficiently so that (if the terms are going to be useful at all) a resistor in a logic array is digital but a synapse is analog.
For instance, the resistor input has no ‘memory’ – the state depends entirely on a single variable: the input voltage. In contrast, a synapse depends on a huge number of variables, including past history and how much glands on the other side of the body are pumping things into the bloodstream. And all of those variables are smoothly varying.
But **lazybratsche**, can neurons really fire at different strengths (as opposed to different frequencies)?
This isn’t exactly answering your question, but some neurons don’t fire, instead they have a graded potential that is a signal that varies continuously in strength according to the strength of the stimulus.
The other thing that seems non-digital is the communication between neurons and glial cells which is based on chemical signaling. Although the neuron firing causes signaling to the glial cells, the glial cells don’t fire in response but do release chemical signals that interact with other glial cells and moderate neuron firing.
Yep, though it’s mostly dependent on frequency. One action potential results in one squirt of neurotransmitters out into the synapse. In isolation, the amount released will be roughly constant for a given cell. Current understanding is that there are various pools of neurotransmitter waiting to be released: an active pool waiting right at the synapse, various reserve pools to replenish the active pool, and even pools of a second type of neurotransmitter. As the neuron fires repeatedly, the active pool can be depleted faster than it can be replenished, meaning the amount of neurotransmitter released by each action potential gets smaller each time. Also, for some of those neurons, the amount of secondary neurotransmitter released will be zero for an isolated action potential, but will slowly increase with repeated firing.
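A toy Python model of that depletion effect (the pool size, release fraction, and refill rate are all invented for illustration):

```python
# Toy model of the depletion described above: each action potential
# releases a fixed fraction of the "active pool", which refills slowly
# from a reserve. All rates are made up.

def release_train(n_spikes, pool=100.0, release_frac=0.3, refill=5.0):
    """Amount of transmitter released on each of n_spikes firings."""
    released = []
    for _ in range(n_spikes):
        amount = pool * release_frac
        released.append(amount)
        pool = pool - amount + refill   # deplete, then partially refill
    return released

# Successive releases shrink as the pool drains faster than it refills:
for amount in release_train(5):
    print(round(amount, 1))
```

The point being: even with a fixed "digital" spike, the quantity carried per spike is a smoothly varying analog value.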
In the synapse, you can further modulate the signal that makes it to the receiving neuron, independent of the frequency and pattern of the firing neuron. You can change how fast the neurotransmitter is cleared from the synapse – this is how a lot of antidepressants work. You can also modulate how neurotransmitter receptors behave in a lot of ways, and there are many classes of drugs that act here as well.
Whether you want to count the synaptic modulation in a neuron’s output could be debatable, but it’s definitely a big part of how the nervous system works.
There is more than one kind of “optical computing”.
See Butts’ good post upthread for details, of the type that sure as hell needs lenses to work.
Quoth billfish678:
Being sequential and deterministic has nothing whatsoever to do with being digital. A slide rule has a program, data, and physical structure, and its operation is very sequential and strongly deterministic, but it’s completely analog. Meanwhile, some data analysis techniques implemented on digital computers are not sequential or deterministic.
[Bolding mine] I find this very hard to believe. Care for an example?
True enough.
But are you of the ‘brain is basically digital’ camp or not?
You’ve never used Windows have you?
Which would count:
Monte Carlo techniques using, say, random.org and a PRNG?
Merge sort with duplicates?
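For what it’s worth, here’s a minimal Monte Carlo example: unseeded runs give a different answer every time, which is the sense in which the computation isn’t deterministic, while a seeded PRNG makes it repeatable again (the π example is just the standard textbook illustration, not anything specific to random.org):

```python
import random

def estimate_pi(samples, seed=None):
    """Monte Carlo estimate of pi by sampling points in the unit square."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / samples

# Unseeded runs differ from call to call; seeded runs are repeatable.
print(estimate_pi(100_000))          # varies each run
print(estimate_pi(100_000, seed=1))  # same value every time
```

Swap the PRNG for hardware or atmospheric noise (as random.org does) and the run is genuinely non-deterministic, yet still executed on a perfectly digital machine.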
In my understanding, the brain can and does grow new connections, which neither digital nor analog systems do, in general. However, there are Field Programmable Gate Arrays (FPGAs) in which the actual circuit is controlled by a memory array. There is a distinct set of resources, but the resources used, and the connections between them, can be changed more or less on the fly. The main reason for using these is that since you have only one chip design for many applications, you get economies of scale, but some people have made use of this capability in a system. Say you are building a chess-playing computer. Some of these, like Belle, have special hardware to accelerate the computation needed. If you use FPGAs, you can have one design optimized for openings, then change it for the middle game, and then change it again for the endgame, with only one chip. I think Belle did use FPGAs, but I don’t think the version I heard about used this capability.
IANA brain expert by ANY definition.
But you might be able to also phrase it this way.
In a brain, the structure IS the data, the structure IS the programming, the data IS the programming, the programming IS the data…blah blah blah.
It would be a real stretch IMO to say the same of current digital computers and computing, much less the growing aspect you brought up.
Now, whether that is true, or how important it is, is a whole ’nother matter for debate.
No, that is pretty much my understanding, but I hadn’t remembered the output modulation. Digital devices are designed to avoid these kinds of problems, by sizing cells so that they won’t be affected by driving too many outputs, or by putting in repeaters to make sure signals arrive at the other end of long paths in good shape.
But the fundamental difference between an analog and a digital design is that the output of an analog design is directly affected by its inputs, while in a digital design the signals going through it are regenerated at each cell. None of this implies that there must be a single bit of information coming out of a cell or neuron.
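A toy sketch of that regeneration point, with made-up noise figures: push a value through a chain of noisy stages, with and without snapping it back to 0/1 at each stage:

```python
import random

# Illustrative only: each stage adds a little noise. The analog chain
# accumulates error; the digital chain regenerates the value at every
# stage, so the noise never compounds.

def noisy_stage(v, rng):
    return v + rng.uniform(-0.05, 0.05)   # small per-stage noise

def through_chain(v, stages, digital, rng):
    for _ in range(stages):
        v = noisy_stage(v, rng)
        if digital:
            v = 1.0 if v >= 0.5 else 0.0  # regenerate at each cell
    return v

rng = random.Random(0)
print(through_chain(1.0, 100, digital=False, rng=rng))               # drifts
print(through_chain(1.0, 100, digital=True, rng=random.Random(0)))   # 1.0
```

With per-stage noise well under the decision threshold, the digital value survives any number of stages; the analog value random-walks away from where it started.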
Almost all digital design these days is synchronous, with frequencies fixed in each clock domain. There has been plenty of work on asynchronous designs, in which the frequency of firing is variable. They are still digital, but have all sorts of practical problems; I know of several projects that started out asynchronous and ended up synchronous. The difference between computers and the brain is that computers evolve faster and use intelligent design, though some days I have my doubts.
Sure - I’m definitely not saying that the brain has the structure of a computer or is directly analogous to a digital design. However the fact that the brain works on structure says nothing about whether it is digital or analog, since you can construct a computer that works the same way, if you wished to.
Software and hardware are equivalent at the fundamental level. These days hardware designs at the top level are expressed in software languages, and are synthesized into hardware. I know people who pretty much automatically converted a compute intensive part of a program they had into hardware by compiling the C into hardware. This has been true since the early days of computing, when some floating point was done in software or hardware depending on the price and throughput requirements of two machines in the same family.
How do you define sequential here? That’s a word that has a lot of meanings depending on the context.
By not being deterministic, I assume you mean the use of random methods, which might be pseudorandom in practice but could be truly random if necessary. That I have no problem with.
Do you mean transistor? No one has used resistors in ICs since RTL, except maybe for some very special sensor blocks or other mixed signal pieces.
States are preserved in digital designs by feedback among several memoryless elements. A brief look at a semiconductor memory book shows a static RAM cell using six transistors. The heart of a flop is two cross-coupled gates, but more are used for things like control.
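If it helps, here’s that cross-coupled loop as a few lines of Python: an SR latch built from two NOR gates, each memoryless by itself, holding a bit between them (the settle-by-iteration loop is just a simulation convenience, not how the hardware works):

```python
# Minimal model of "feedback among memoryless elements": an SR latch
# from two cross-coupled NOR gates.

def nor(a, b):
    return int(not (a or b))

def sr_latch(s, r, q, qbar):
    """Iterate the two cross-coupled NOR gates until they settle."""
    for _ in range(4):                 # a few passes is enough to settle
        q, qbar = nor(r, qbar), nor(s, q)
    return q, qbar

q, qbar = sr_latch(s=1, r=0, q=0, qbar=1)     # set the bit
print(q, qbar)                                # 1 0
q, qbar = sr_latch(s=0, r=0, q=q, qbar=qbar)  # inputs released
print(q, qbar)                                # still 1 0: the loop holds it
```

Neither gate stores anything; the state lives entirely in the feedback loop between them, which is exactly the point about digital state above.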
I wish the analog nature of digital cells were nitpicking: almost all the interesting debug problems we see come from the fact that you can’t ignore the analog effects, not at over 1 GHz. Logic problems can be debugged on a simulator and are a lot easier to handle.
I’m interested in this statement, because I’m not sure it really demonstrates analog technology. Rather, I think you are describing non-binary technology. I think that ternary, quaternary or even decimal systems can be used that are not analog. Since ultimately the modem signals have to be converted to binary, I have trouble imagining that a modem can be analog.
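That multi-level-but-still-discrete idea is roughly what modern PAM-4 style signalling does. A toy Python sketch (the voltage levels and noise amount are arbitrary illustrative values, not any real line standard):

```python
# Pack bits two at a time into four discrete levels, then recover them
# by snapping each received value to the nearest level. Discrete symbols,
# not analog, even though more than two levels are in play.

LEVELS = [0.0, 1.0, 2.0, 3.0]   # four symbols carry two bits each

def encode(bits):
    """Map pairs of bits to one of four levels (len(bits) must be even)."""
    return [LEVELS[2 * bits[i] + bits[i + 1]] for i in range(0, len(bits), 2)]

def decode(symbols):
    """Round each received level back to the nearest symbol."""
    out = []
    for v in symbols:
        n = min(range(4), key=lambda k: abs(LEVELS[k] - v))
        out += [n >> 1, n & 1]
    return out

bits = [1, 0, 0, 1, 1, 1]
noisy = [v + 0.2 for v in encode(bits)]   # tolerate some analog noise
print(decode(noisy))                       # recovers [1, 0, 0, 1, 1, 1]
```

The decode step is the giveaway: because the receiver rounds to the nearest of a finite set of levels, small analog disturbances are discarded, which is the defining digital move, just with four symbols instead of two.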