What exactly are modern analog computers and how are they better?

What does this mean? If you’re willing to think of, e.g., integers as naught but strings of 0s, 1s, and “null”s, well, analog data (i.e., a real number) is just a string of 0s, 1s, and “null”s too, albeit a very long one; e.g., the k-th symbol of the string specifies whether the real is below (0), above (1), or equal to (“null”) the k-th rational number. (Incidentally, in the system I have in mind, it would not generally be possible to affirmatively determine of a value that it is “null”, so this is probably not the use of the term you had in mind; perhaps I should pick a different name for it, “|”, say.)
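To make that concrete, here’s a rough Python sketch of the encoding I have in mind (the particular enumeration of the rationals is just one arbitrary choice; any fixed enumeration would do):

```python
from fractions import Fraction
from itertools import count

def rationals():
    """One fixed enumeration of the rationals, walking diagonals of the
    (numerator, denominator) grid and skipping duplicates."""
    seen = set()
    for n in count(1):                      # n = |numerator| + denominator
        for q in range(1, n + 1):
            p = n - q
            for signed in (p, -p):
                r = Fraction(signed, q)
                if r not in seen:
                    seen.add(r)
                    yield r

def encode(x, k):
    """First k symbols of the string for the real x: '0' if x is below the
    k-th rational, '1' if above, '|' if equal. We can only detect '|' here
    because x is given exactly; for a real known only through approximations,
    equality could never be affirmed, as noted above."""
    out = []
    gen = rationals()
    for _ in range(k):
        r = next(gen)
        out.append('|' if x == r else ('0' if x < r else '1'))
    return ''.join(out)

print(encode(Fraction(1, 3), 20))
```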

It works on the Data In Garbage Out theory :slight_smile:

I can’t speak much to the computer science side of things, but as a biologist I have to strongly object to this statement. It’s true that there are some behaviors in (biological) neural networks that are discrete - the neuron fires, or it doesn’t. However, there’s a lot more to it than that. What threshold does the neuron fire at? What are the rates of signal conduction within a neuron? What rate does it fire at? How long does it keep firing? How much neurotransmitter does it release? Which neurotransmitters are released, and in which relative quantities? What is the strength of the connection to the next neuron? Then there are numerous modulating signals at every step of the process, and stochastic behaviors throughout.

At the circuit level, sometimes there are nice discrete behaviors. These are well-studied, comparatively, because they’re easy to deal with. Ultimately, neural circuits are built from incredibly squishy, analogue things.
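Even the simplest textbook neuron model captures that mix: the spike is all-or-nothing, but everything leading up to it is continuous. A minimal leaky integrate-and-fire sketch in Python (the parameter values are made up for illustration, not measured from any real neuron):

```python
import random

def simulate(inputs, threshold=1.0, leak=0.9, noise=0.02):
    """Leaky integrate-and-fire: the membrane potential v is a continuous
    quantity; the only discrete event is v crossing the threshold."""
    v, spike_times = 0.0, []
    for t, current in enumerate(inputs):
        v = leak * v + current + random.gauss(0.0, noise)  # analog dynamics
        if v >= threshold:      # the one all-or-nothing event in the model
            spike_times.append(t)
            v = 0.0             # reset after firing
    return spike_times

# Whether and when this fires depends on the continuous knobs (threshold,
# leak, noise level), not on anything resembling a clean 0/1 input story.
print(simulate([0.15] * 100))
```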

But it is more digital than analog. The signals don’t go from one end of the brain to the other - they cascade through neurons, which regenerate the signal just like gates do. They are a lot more complex than simple gates, but so are the cells in modern cell libraries. So you are more digital than you think you are.

Hmm. The real problem was that 15 years ago he was already about 15 years behind the times.

The real distinction between analog and digital is that the output of an analog block is a smooth function of its inputs, while the output of a digital cell may be controlled by its inputs, but isn’t directly proportional to them. That’s true no matter how complex the input function is that causes the output to fire. You can (and I think I have, for some reason or other) build a cell that only fires when you get a certain number of 1s on the input. You can have as many inputs as you want, to get any level of precision. You can even add control inputs to change this. It is still digital, because the output pretty much looks the same no matter what the input is.
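Here’s that kind of cell as a toy sketch in Python (my own illustration, not any real cell library):

```python
def threshold_gate(inputs, k):
    """Fires (outputs 1) only when at least k of the binary inputs are 1.
    However elaborate the firing condition, the output is still just 0 or 1."""
    return 1 if sum(inputs) >= k else 0

print(threshold_gate([1, 0, 1, 1, 0], 3))  # -> 1
print(threshold_gate([1, 0, 1, 0, 0], 3))  # -> 0
```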

A while back there was a lot of work on multivalue logic, which used more than 0 and 1 at an input. There was even an IEEE technical committee on this. I think it vanished, since voltages are so low these days that you’d have major problems with noise at the inputs, but IIRC they were inspired by neurons.

What kind of work on multivalued logic was this? What I mean is, in a sense, the computer on my desk handles 256-valued logic just fine, only when it operates on such values, we call them “bytes” instead of “bits”. What above and beyond that sort of thing was meant by a computer architecture which used multivalued logic (or am I misguided from the start in thinking this was work on computer architectures)?

ETA: Or perhaps you were talking specifically about multivalued logic using real values and continuous functions upon them, rather than just a discrete set (“fuzzy logic” and all that)? I suppose that would make more sense in the context of this thread. In fact, that must be what you meant. Yeah, I’m dumb. Ignore me…

Check out Fuzzy logic. It can be implemented without analog, but it is mighty useful. For instance, in data mining, not all things fall into neat clusters, and it is useful to classify things as kind of tall or kind of short.
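A toy version of “kind of tall” in Python (the breakpoints are made up):

```python
def tall(height_cm):
    """Degree of membership in 'tall': 0 below 160 cm, 1 above 190 cm,
    rising linearly in between. 'Kind of tall' is any value in the middle,
    rather than a hard yes/no bucket."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

for h in (155, 170, 180, 195):
    print(h, round(tall(h), 2))
```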

This was in building gates which inherently handled more than two values at the input. I think Intel came out with a memory that did this, but I don’t know what happened to it. I don’t think it is used today.

The benefit is that you can store two bits of information in the space required for one. The downside, which is the thing that I suspect killed it, is that you have less room for fluctuations of input voltage. This means you have to go slowly, since there is a lot of voltage swing, and that you have to use high voltages, which eats up power. For a 5 volt design, a digital circuit might treat anything under a volt and a half as a 0 and anything over 3 volts as a 1, and guarantee that the signal never stabilizes in the middle. You could still give each of the 4 values a range of about a volt in multivalue logic, which might work. Today we use 1 volt supplies, so there isn’t a lot of margin. If you’ve ever seen a 1 GHz waveform, it would be obvious why this isn’t too useful any more.
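The arithmetic is easy to run yourself (the numbers are just the ballpark figures from above):

```python
def volts_per_level(supply, levels):
    """Rough voltage window available to each logic level."""
    return supply / levels

print(volts_per_level(5.0, 2))  # 2.5 V  - two-level logic, lots of margin
print(volts_per_level(5.0, 4))  # 1.25 V - four levels at 5 V, workable
print(volts_per_level(1.0, 4))  # 0.25 V - four levels at 1 V, swamped by noise
```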

My expertise in this area mostly comes from eating dinner with the head of the multivalue logic committee at Computer Society Tech Board meetings when I was on that, so I’m not a good cite. Fuzzy logic is kind of like this, but runs on normal computers.

Multivalue logic was more of a circuit design thing than an architecture thing.

If you want to go the other way, you get into an invention of mine, Base 1 arithmetic, which I invented my first year of grad school when three classes felt the need to teach me binary logic yet again. I published a short summary of it as my first column. Base 1 has several advantages - it is immune to noise, quite fault tolerant, and, if you plug base 1 into the standard information theory equations, you will find that it is very energy efficient. I would venture you could represent most web pages in base 1 with very little loss of content.
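A reference implementation, in the same spirit (mine, not peer reviewed):

```python
import math

def to_base_1(n):
    """Base 1: every numeral is just a run of the single available digit."""
    return "|" * n

def from_base_1(s):
    return len(s)

# Noise immunity: there is no other digit for a "|" to flip into.
# Energy efficiency: each digit carries log2(1) = 0 bits of information.
print(to_base_1(5), "->", from_base_1(to_base_1(5)))
print("bits per digit:", math.log2(1))
```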

:smiley:

At the gate level, yes, at the real circuit level, where you have to worry about waveforms and the like, things have gotten pretty messy also. But see my comment above - I’d say neurons are fundamentally digital, while admitting they are a lot more complex than simple gates.

No, I’d say you think I am more digital than I think I am.

IMO there are enough fundamental differences (both in the physical aspects and the operation) between brains and digital computers that to call the brain fundamentally digital is a REAL stretch.

We will have to just agree to disagree.

You’re only half right. It’s mostly true that a neuron either fires or it doesn’t (with some complications), but how it decides to fire is an inherently analog, non-digital process. It’s not like a single wired connection, where neuron A firing always leads to neuron B firing. Instead, whether a neuron fires depends on the recent history of the neuron; the concentrations of excitatory and inhibitory neurotransmitters released by various other neurons (which in turn depend on when nearby neurons released them, the size of the physical gap between the neurons, and how active the various reuptake systems are); the concentrations of various chemicals released by non-neurons in the brain; and the concentration of hormones released into the bloodstream by glands outside of the brain (or inserted into the bloodstream by that nice anesthesiologist). Plus probably other factors that I’ve forgotten (or that we just don’t know about yet).
So the brain is really more like a whole bunch of analog calculators that use a digital connection as one channel of communicating with each other.

That’s right, and I’d say it supports the view of the brain as analog, since things like catching a ball, walking, having sex, and so on require the ability to vary output smoothly according to inputs.

Of course the brain can implement various digital machines as well, but that doesn’t in itself make it fundamentally digital.

Something interesting to read related to these issues is Dynamic Systems by Kelso. He argues (and a lot of cognitive scientists have since taken up his view) that the brain (/body/world system) functions by switching between various dynamic states. Each dynamic state is in itself an analog machine (that’s my reading, not a phrasing he himself uses), but the switching process between these various analog machines is itself discrete. An example is gait. A horse, for example, can walk, trot or gallop. Each of these gaits can be seen as the implementation of an analog machine. However, the switching between these three gaits doesn’t happen smoothly. It happens practically instantaneously once the conditions appropriate to a new gait obtain.
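The gait example is easy to sketch: within a gait, the output varies smoothly with speed, but the gait itself flips discretely when the speed crosses a threshold (the thresholds and formulas below are invented for illustration; they are not from Kelso):

```python
def gait(speed):
    """Discrete switching between dynamic states as speed crosses thresholds."""
    if speed < 1.7:
        return "walk"
    if speed < 4.5:
        return "trot"
    return "gallop"

def stride_frequency(speed):
    """Within each gait, the output is a smooth (here, linear) function of
    speed; jumps happen only at the discrete gait transitions."""
    base = {"walk": 1.0, "trot": 1.8, "gallop": 2.2}
    return base[gait(speed)] + 0.2 * speed

for s in (1.0, 1.6, 1.8, 4.4, 4.6, 8.0):
    print(s, gait(s), round(stride_frequency(s), 2))
```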

I think the upshot of all this might be that the analog/digital distinction doesn’t quite apply (at least not usefully) to neurological systems. They’re analog, but (at least in the case of humans) seem apt to the implementation of digital systems. On the other hand, one of the things about them that is pretty digital–their ability to switch into different dynamic states–serves to make them apt to implement various analog systems.

The answer would seem to be, the brain is analog in certain fundamental ways, and digital in certain fundamental ways.

-FrL-

I’m talking about the fundamental primitive level - a neuron for the brain, and a cell for a digital design. Whether or not you can call the brain a computer is another matter entirely, I agree.

Check out a CD player. It has smoothly varying outputs, yet is basically digital. However, you are talking about the brain, and I’ve been talking about building blocks. There are clearly analog things in the brain and body, but there are analog things in mostly digital systems also - they are called mixed signal designs. I don’t know where the digital/analog boundary in the brain and body is. Muscles are clearly purely analog, but don’t they get control signals from the brain, which might be considered digital?
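The CD player point in miniature (a cartoon in Python: discrete samples in, smoothly varying output after reconstruction; linear interpolation standing in for a real DAC’s filtering):

```python
import math

# The digital part: a 440 Hz tone stored as discrete 16-bit samples.
RATE = 8000
samples = [round(32767 * math.sin(2 * math.pi * 440 * n / RATE))
           for n in range(20)]

def output(t):
    """The analog part: a smoothly varying output between sample points."""
    n = int(t)
    frac = t - n
    nxt = min(n + 1, len(samples) - 1)
    return samples[n] * (1 - frac) + samples[nxt] * frac

print(samples[:5])
print([round(output(t), 1) for t in (0.0, 0.25, 0.5, 0.75, 1.0)])
```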

Switching between states is an architectural thing, and neither digital nor analog in a fundamental sense. Our peripherals, and peripherals for a computer, are often analog. Maybe states are a new idea to biologists (no offense, I’m married to one) but they are very fundamental in digital design, where anything interesting involves the design of a state machine where the outputs depend on the inputs and the current state. But there are states at the higher level also. For instance, to save power, microprocessors turn off chunks of themselves which are not being used.
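A state machine in that sense is tiny to write down (a made-up two-state Mealy machine, not any real circuit):

```python
def step(state, inp):
    """Mealy machine: the output depends on both the input bit and the
    current state, not on the input alone."""
    if state == "idle":
        return ("busy", 1) if inp else ("idle", 0)
    else:  # state == "busy"
        return ("idle", 0) if inp else ("busy", 1)

state = "idle"
for bit in [0, 1, 0, 0, 1, 1]:
    state, out = step(state, bit)
    print(bit, "->", state, out)
```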

However, in the sense of this thread, which is analog computation not the status of peripherals such as muscles, I still contend that the fundamental computational building block of the brain looks digital, in the sense I’ve already defined.

If you look deeply enough, the “decision” for a gate to change value is analog also - at this level there are no 1s and 0s, but only voltages which put a bunch of transistors into different states. In fact, modern designs have elements called tristate gates where the output can be 1, 0, or Z (high impedance), which effectively doesn’t drive the wire at all. These are commonly used in buses and as the heart of multiplexers, since you can have many gates tied together. The bus will take the value of the gate that is non-Z, driving it. It gets more complicated than this, since you can have gates producing strong 1s and weak 1s, strong 0s and weak 0s also, and a strong 1 can dominate a weak 0. When you have both a strong 1 and a strong 0, you have what is called bus contention, the output is undefined, and nasty things might happen.
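Those resolution rules look roughly like this (a simplified Python version of the kind of strength model hardware description languages use; real models have more strength levels):

```python
STRENGTH = {"strong": 2, "weak": 1}

def resolve(drivers):
    """Resolve multiple drivers on one bus. Each driver is (value, strength)
    or "Z" (not driving). The strongest driver wins; an equal-strength
    conflict is bus contention, so the result is undefined ("X")."""
    best, best_s = "Z", 0
    for d in drivers:
        if d == "Z":
            continue
        value, s = d[0], STRENGTH[d[1]]
        if s > best_s:
            best, best_s = value, s
        elif s == best_s and value != best:
            return "X"
    return best

print(resolve(["Z", ("1", "strong"), "Z"]))         # '1': the non-Z driver wins
print(resolve([("0", "weak"), ("1", "strong")]))    # '1': strong dominates weak
print(resolve([("0", "strong"), ("1", "strong")]))  # 'X': bus contention
```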

I was thinking about this too as I grapple with “is the brain analog or digital”. Clearly you can make something digital out of something analog. However, given that firing rate (which is analog) appears to be used to transmit information in the brain, are we back to analog?
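Rate coding is easy to sketch: encode a continuous quantity as spikes per window, decode by counting (a toy code, not a claim about how real neurons do it):

```python
import random

def encode(value, window=1000):
    """An analog value in [0, 1] becomes the probability of a spike at each
    of `window` time steps - each individual spike is all-or-nothing."""
    return [1 if random.random() < value else 0 for _ in range(window)]

def decode(spikes):
    """Recover the analog value from the firing rate."""
    return sum(spikes) / len(spikes)

random.seed(0)
print(decode(encode(0.37)))  # close to 0.37: digital spikes carrying an analog value
```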

I’m also wondering what is the definition of analog vs digital with respect to a system like the brain. Does analog mean that there is a continuous landscape of states the brain can move between? Or does it mean something else?

Another way to look at brain vs computer (smackdown down down down).

A computer has a program, data, and a physical structure. All VERY digital, very sequential, and strongly deterministic.

A brain’s program, data, and physical structure are all highly intermeshed. I might even go so far as to say you can’t really separate them out. And I also think calling it digital is a stretch as well.

Just a quick thought for pondering…

I don’t know what the state of optical computing is, but I do know it doesn’t depend on “lenses”. Optical computing depends on non-linear optics. Essentially, you have to design a way for one electromagnetic wave to affect the way that another electromagnetic wave interacts with a material. Then you have to develop a logic circuit with it. I don’t know what the state of development is anymore, and I have no idea how it compares with quantum computing. I don’t know anything about quantum computing, but I do know photonic computing is not analog.
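A cartoon of the idea (my own toy model of an intensity-thresholded nonlinear element, not any real device):

```python
def optical_and(beam_a, beam_b, threshold=1.5):
    """Toy nonlinear optical gate: the material transmits only when the
    combined intensity of the control beams exceeds a threshold.
    Inputs are normalized intensities (1.0 = beam on)."""
    return 1 if beam_a + beam_b > threshold else 0

print(optical_and(1.0, 1.0))  # both beams on -> transmits (logic 1)
print(optical_and(1.0, 0.0))  # one beam on   -> blocked   (logic 0)
```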

Also, it will generate heat, but not as much as electronic computing.

Oh god, I’ve been doing it all wrong!