What exactly are modern analog computers and how are they better?

This doesn’t seem correct, but I’m happy to be educated. When I google to verify I see definitions based on continuous vs discrete, which is what I expected. Can you point me to any definitions?

It’s a perfect example of a mixed signal device. The transmitter takes digital input and produces analog waveforms, the receiver does just the opposite.

I don’t think there is anyplace that is going to give a definition about what we are covering here.

Let me explain. When you go down deep enough, everything is analog. (Ignoring quantum level.) The waveforms between gates are not square waves, and signals between neurons are not digital either.

The next level up, you can have devices like amplifiers or resistors for which the output is a direct function of the inputs, or you can have an output controlled by the inputs but which does not flow directly from the inputs. lazybratsche mentioned a pool of neurotransmitters sitting at the output of a neuron which gets squirted into the synapse. The neurotransmitters received at the input don’t go right through. In a gate, the electrical signal received is used to turn on and off transistors, and the voltage on the output comes from a distinct voltage source, not the input. That is how a small signal at the input, which might be very messy, gets regenerated into a strong clean signal at the output.

My source is 30 years in this business, being program chair of conferences which cover both analog and digital designs, and doing book reviews on books containing chapters on analog and digital designs. I’m not an expert in analog or mixed signal stuff by any means, but I’ve been exposed to more of it than most people.

Voyager, this seems different from what you said before, and what you said before seems a lot more in keeping with the ideas of analog and digital I’m familiar with.

To quote your previous comment I’m referring to:

Yes, completely understand all of that. But if the output is continuous as a function of the input and not discrete, to me that is analog, regardless of the number intermediaries or the specific mapping of input to output.

It seems to me that you are saying that even if the output is not discrete it can still be considered analog. Now I fully understand the fact that digital computers are created using analog mechanisms, but in the case of digital computers we can follow the chain all the way through and be sure it’s digital.

The brain, on the other hand, has so many analog actions that don’t seem to be clearly transformed into digital (e.g. graded potentials) that it seems like we can’t really say yet which it is.

To the extent that most neurons have an output that is either “fire” or “not fire”, the brain is mostly digital. Now, there may be neurons which can fire at varying strengths, and if so, those portions are analog, but this is the first I’ve heard of them.

There are, of course, some fundamental differences between the way the brain works and the way most non-brain computers work, but they’re differences other than the analog-digital divide.

I’m probably not expressing myself well. Take this comment from lazybratsche:

The input is continuously varying, check. The output is continuously varying, check. But the output does nothing until the threshold is achieved. If you have ten inputs from other neurons, and reduce the strength of one by 50%, you are not going to get a 10% reduction on the output. You might get a total reduction, if you’ve missed the threshold, or practically none.
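A toy sketch of that all-or-nothing behavior (the input strengths and threshold below are invented for illustration, not taken from any real neuron):

```python
# Toy thresholding "neuron": output is fire / not-fire, so weakening one
# input by 50% does not weaken the output proportionally.
def neuron_fires(inputs, threshold=5.0):
    """Fire (1) only if the summed input reaches the threshold."""
    return 1 if sum(inputs) >= threshold else 0

full = [0.6] * 10              # ten inputs from other neurons
weak = [0.3] + [0.6] * 9       # one input cut by 50%: still over threshold
miss = [0.25] + [0.5] * 9      # same 50% cut elsewhere: threshold missed
print(neuron_fires(full), neuron_fires(weak), neuron_fires(miss))  # 1 1 0
```

Cutting one input in half either changes nothing at all (1 → 1) or kills the output entirely (1 → 0); there is no 10% middle ground.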

The input to a gate is somewhat similar to this. If you have a three input AND, and change all inputs from 0 to 1, you get a voltage rise on the inputs, and the gate output changes value based on when all three exceed the threshold voltage of the input. Now the gate output holds steady, so we don’t have the interesting varying output we see in a neuron. There are some specialized cells that do, but I think we’re talking basic building blocks here, so they shouldn’t count.
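That gate behavior can be sketched the same way (the 0.5 V switching threshold here is an invented illustration, not a real process parameter):

```python
# 3-input AND viewed at the voltage level: each input ramps up at its own
# rate, and the output flips only once all three cross the threshold.
V_THRESHOLD = 0.5  # assumed input switching threshold (illustrative)

def and_gate(v_a, v_b, v_c):
    """Digital abstraction: 1 only when every input voltage reads 'high'."""
    return int(all(v >= V_THRESHOLD for v in (v_a, v_b, v_c)))

for t in range(5):                                # arbitrary time steps
    v_a, v_b, v_c = 0.2 * t, 0.15 * t, 0.25 * t  # inputs rising at different rates
    print(t, and_gate(v_a, v_b, v_c))            # flips to 1 only at t = 4
```

After the slowest input crosses the threshold, the output holds steady at 1, unlike the continuously varying output of a neuron.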

I’d guess that since the brain is based more on living cells and structures that carry signal, moderated by chemicals, things are inherently messier (more analog) than designed computer cells. I suspect evolution took advantage of some of this, and now uses multivalue logic.

When I was a freshman they thought CS majors weren’t ready for real hardware, so we learned about circuits with blocks called integrators and dividers and stuff. Then they told us that these were really capacitors and resistors and the like. You can learn analog circuits that way - you can’t describe a digital circuit that way. Unless I totally misunderstand the design of the brain, you can’t model a batch of neurons with resistors and capacitors either.
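For what it’s worth, that “integrator block” view is easy to sketch numerically: an RC circuit integrates (low-passes) its input. A minimal Euler-step simulation, with component values invented purely for illustration:

```python
# RC low-pass "integrator" block: dVc/dt = (Vin - Vc) / (R * C).
# Component values and step size are illustrative choices.
R, C = 1_000.0, 1e-6    # 1 kOhm, 1 uF -> time constant of 1 ms
dt = 1e-5               # 10 us Euler step
v_c = 0.0               # capacitor starts discharged
for _ in range(1_000):  # simulate 10 ms of a 1 V step input
    v_in = 1.0
    v_c += dt * (v_in - v_c) / (R * C)
print(v_c)              # has charged to very nearly 1 V
```

You can describe an analog circuit entirely with blocks like this; a batch of neurons, with its thresholds and chemical moderation, resists that kind of description.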

Why is there anything mixed about it? Certainly the signal that goes to the modem is digital. The modem converts it to whatever the modem uses, then converts it back to digital. The carrier signal may be analog, but it is clearly communicating in digital format. I don’t see how it is any less digital than flash memory. Certainly, each stored bit on a flash drive could technically exist in an intermediate state, but it would be interpreted as an error the same way a modem would interpret something that doesn’t fit its own format.

That’s the definition of mixed signal! You start, say, with digital, using gates and stuff, and then feed a digital signal into a d2a (digital to analog) circuit which feeds a traditionally analog chunk. And vice versa, of course. Draw a box around some circuitry. If it has only resistors and capacitors and such, it is an analog circuit. If it only has gates, flops, and memory, it is a digital circuit. If it has both, it is mixed signal.
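The digital-to-analog edge of such a box is easy to sketch. An idealized binary-weighted DAC (the 3.3 V reference and 4-bit width are invented for illustration):

```python
# Idealized 4-bit DAC: a digital code in, a continuous voltage out.
V_REF = 3.3  # assumed reference voltage (illustrative)

def dac(code, bits=4):
    """Map a digital code (0 .. 2**bits - 1) onto an analog voltage."""
    return V_REF * code / (2**bits - 1)

for code in (0, 5, 15):               # the digital side produces codes...
    print(code, round(dac(code), 3))  # ...the analog side sees voltages
```

Draw the box around the gates feeding this plus the amplifier after it, and you have a mixed-signal circuit.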

Want to make it more complicated? Inside the digital parts, there are buffers that do nothing except strengthen the signal. You don’t put them into a schematic; they get added when you “power up” the design to make timing. Old-style normal I/Os have an analog part, since they have to be powerful enough to drive a signal off a chip. I was involved in a debug once where the internals of a chip were working perfectly, but it totally failed any test applied at the inputs, and pretty much all the outputs were bad. It turned out that there was an analog input to our I/O cells which people had forgotten, and it was not connected. The digital-only netlist was fine, and digital people like me ignore the analog signals because they are just irrelevant to what really matters - except when they’re not. It is really hard to see that something is missing when you’ve trained yourself to ignore it.

I’m too tired to fight tonight. I’ll just say this regarding the brain digital/analog debate.

It’s my impression that folks are stretching the analog side toward the digital in one direction and the digital side toward the analog in the other, and with enough stretching from both directions you can “get” them to meet in the middle and say “taa daa, see, they are both the same!”

What about that previous post where the researchers are building a physical analog model that will model a whopping 100 neurons! Then the poster goes on to claim that the reason they are doing that is because it would take one hell of a digital computer to simulate it.

A digital computer that has millions of transistors, gates, and reticulated framuses, operates at gigahertz speeds, and has access to megabytes/terabytes of data can’t easily simulate a measly 100 neurons.

If true, that tells me that calling neurons digital, even if technically true at some level, is just barely this side of a lie.

Just my opinion mind you.

And get off my analog lawn!

Oops!

Regarding my previous post: note, I am NOT, repeat NOT, calling anyone here a liar, claiming they are lying, or saying anything lie-ish is going on.

I was just trying to make the point that, IMO, if condition XYZ were true, then saying ABC would be so misleading, even if technically correct, that it would be the informational equivalent of a lie.

Please, forgive this old analog fool !

No offense was intended.

And still get off my lawn!

Here is an interesting article on various ways neurons/synapses encode information:

http://www.pnas.org/content/94/24/12740.full

Some methods:

  1. Firing rate
  2. Relative timing of previous action potentials
  3. Phase of action potential relative to oscillations
  4. and others
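The contrast between the first two schemes is easy to sketch: two spike trains with the same firing rate can still differ in the timing of their spikes. (Spike times below are invented.)

```python
# Two made-up spike trains (times in ms) over the same 100 ms window.
# Same spike count -> same firing rate, but different interspike intervals:
# a rate code can't tell them apart, a timing code can.
train_a = [10, 20, 30, 40, 50]
train_b = [10, 12, 30, 48, 50]

def firing_rate(spikes, window_ms=100):
    return len(spikes) / (window_ms / 1000.0)  # spikes per second

def intervals(spikes):
    return [b - a for a, b in zip(spikes, spikes[1:])]

print(firing_rate(train_a), firing_rate(train_b))  # 50.0 50.0
print(intervals(train_a))  # [10, 10, 10, 10]
print(intervals(train_b))  # [2, 18, 18, 2]
```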

Here’s a paper exploring the digital/analog debate.
http://watarts.uwaterloo.ca/~celiasmi/Papers/ce.2000.continuity.debate.csq.html

“As well, the neurophysiological evidence for discreteness is not conclusive. Though it strongly suggests that we are discrete in state (with respect to neurons), this does not mean we are discrete in time (van Gelder 1995). Even though the spikes themselves are all-or-none, the precise distance between any two spikes can only be expressed as a real number. If these distances are the basis of neural information processing, the brain is clearly continuous.”

If this person is correct with the statement quoted above, and given the above encoding schemes, which are time dependent, that would seem to imply analog.
But I think this is a key statement, because clearly both analog and digital activity are happening:
“…the heart of the debate lies in the question of whether cognition can be explained by analog or digital processes. The strict analogicity of the depolarization of neuron membranes may be uninteresting if its analog nature does not affect the cognitive behavior of the system. Similarly, even though the transmission of neural spikes is digital, if this digitalness is not relevant to cognition (perhaps only the time between spikes is relevant) then we should not consider the brain to be digital.”
I haven’t read the rest of this person’s paper yet, but I do know he thinks the brain is digital.

That’s okay. I published a column by an analog guy about how analog was just better. A colleague of mine at Bell Labs, a pretty well known analog guy, had a good time telling all us digital guys that everything was actually analog. And I had lunch with Bob Pease not all that long ago. So I know that analog guys are kind of, well, odd. :smiley:

I think the problem is the several levels of meaning of analog, and that we don’t see the abstraction levels of the brain all that clearly, since we don’t have to design the damn thing. I bet if all we had to go on was the most basic semiconductor level representation of a microprocessor, we’d be damn sure the thing was an analog design.

Thanks for the links. I’ll try to read them over the weekend. If it turns out that the fundamental unit of information in the brain is a continuously varying delta between events, and that this carries through the neuron (perfectly possible) then I agree the brain is analog. It almost sounds like a neuron is doing Fourier transforms or something.

I just finished that second link, and he takes apart the analog argument by showing that there is a limit to the information that can be encoded by the firing rate. He didn’t reach a final conclusion (i.e., that all cognition is clearly digital) due to the many levels to consider, but it’s interesting logic.

Yup, exactly. You can make digital out of analog, and you can make analog out of digital. These things change depending on scale (i.e., level of abstraction), and are not decided by the most basic constituent.

(This supposed “true nature” gets erased with each shift in scale, at least enough to become part of the background noise. The analog nature of digital is effectively erased when the rate of hardware errors is smaller than that of software crashes. The digital nature of a weather simulation is erased when the quality of the pseudorandom number generator and the precision of the floating-point numbers have less significance than the quality of the atmospheric model. It’s all relative, baby.)

I read it again to make sure I followed it. I enjoyed his explanation, logical but short and sweet. A few thoughts:

  1. He states that a real number contains infinite information because it requires an infinite bit string to represent it. This seems to imply a real number contains more information than an integer, which doesn’t make sense to me. An infinite bit string has to do with converting the value to some other representation, which is not the value itself. I would imagine there is a different transformation that would result in the integer requiring infinite bits to represent and the real number requiring finite bits.

  2. I’m glad I read his explanation of noise constraining the amount of information that can be transmitted; I can see now how even a continuously varying signal can be more limited than it first appears.

  3. At the end, he leaves it open as to whether the brain is analog at a higher level. Thoughts on how it could be analog at a higher level if digital at the level he discusses?

I’m glad this guy is on my side. The paper is badly flawed, and since it supports my position I don’t need to spend as much time refuting it as if it supported the other one. He references Shannon, but he clearly does not understand him. Information transfer requires energy - infinite information would require infinite energy. It is not a matter of analog or digital - Shannon developed information theory on very analog telephone lines. Also, transmitting infinite information would require either infinite bandwidth or infinite time. I know he comes down on the right side of this argument, but he doesn’t seem to understand the basics of information.
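Shannon’s point can be made concrete with the Shannon-Hartley capacity formula, C = B·log2(1 + S/N): finite bandwidth plus a finite signal-to-noise ratio means finite information, analog line or not. (The numbers below are illustrative, not from the thread.)

```python
import math

# Shannon-Hartley channel capacity in bits per second.
def channel_capacity(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz "very analog telephone line" with 30 dB SNR (S/N = 1000):
print(channel_capacity(3_000, 1_000))  # roughly 30,000 bits per second
```

Transmitting infinite information would require infinite bandwidth, infinite S/N, or infinite time, which is the flaw in the “a real number carries infinite information” argument.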

He is making a better argument when he says

“Examine” is metaphorical here, but it captures the discrete nature of the structure of the brain. Resistors and other analog components don’t examine anything - a logic gate does. But he is saying this as an assertion, so some might object.

The real problem, which is connected to your question 3, is that analog and digital don’t make much sense at a higher level. Most people would say a computer is digital, but it has tons of analog - in the screen driver, the disk, etc. We can make a clear distinction in small parts of the design, but when you get high enough it is mixed. We might say that the computer is digital because most of its computation is done digitally, and that requires you look at the nature of the majority of its components. So, it all boils down to what you think the neuron is. Without seeing a direct path from input to output, the way an analog component has, I’ll still call it digital.

Dang it, Pushkin! I clicked on your innocent-looking link, and then spent hours at Wikipedia!

I feel like we’re debating “does a dog have Buddha nature?” here. Still, I’m going to stick with “analog” for the brain, with full disclosure that it’s just a gut feeling and that reality usually doesn’t pay much attention to my gut.

On a more practical level, I’m not sure if it’s useful to call a neuron or circuit “digital”. We’ve got a pretty good understanding of how a single neuron behaves, and can model it really well (by biological standards at least). We have a decent understanding of very small circuits… but beyond that, the brain is still very mysterious. Questions on the nature of cognition are still the domain of philosophers. Trying to call it “analog” or “digital” leads to interesting debates on a message board, but I’m not sure if it’s a productive way of trying to understand the brain. Personally, I’m unimpressed by the “Continuity Debate” linked above – way too many arguments based on how things “seem”. Sure, you can model neural systems with a discrete computer with some success, but you can also do that for the weather.

I guess that’s why I’m throwing in with the neuroscientists. I likes me some good empirical data.