What exactly are modern analog computers and how are they better?

Do you have a link to this theory, or a name? I would be interested in reading their reasoning. It seems to me that a digital system with enough discrete states (for each neuron, for example) would at some point be indistinguishable from a continuous system, at least within the age of the universe.

There is a big difference between processing an instruction or microoperation and a digital calculation. Digital gates obviously work in parallel. The cite talked about neural networks, which can be implemented digitally also. It has nothing to do with analog computing or even analog circuits.
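To make that concrete, here is a minimal sketch, with hand-picked weights invented purely for illustration, of a “neural network” evaluated with nothing but ordinary digital arithmetic; no analog circuitry is involved anywhere:

```python
# A tiny feed-forward network computing XOR with a hidden layer (an OR
# unit and an AND unit).  Everything is plain digital arithmetic.

def step(x):
    """All-or-nothing activation: the unit fires (1) or it doesn't (0)."""
    return 1 if x >= 0 else 0

def xor_net(x1, x2):
    h_or  = step(x1 + x2 - 0.5)   # fires if at least one input is 1
    h_and = step(x1 + x2 - 1.5)   # fires only if both inputs are 1
    return step(h_or - h_and - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))   # prints the XOR truth table
```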

Actually, totally wrong. The great thing about digital logic is that each gate is a little amplifier that cleans up the signal. Analog circuits are not like this, and analog circuitry inside a chip is usually very small, tens of components, as opposed to the hundreds of millions inside a microprocessor.

I think this is a case of you mistaking neural nets for analog.

That word “precision” doesn’t mean what you think it means. Digital logic is extremely robust. Plus, there is a gigantic field of fault tolerance that shows how digital systems can recover from any number of errors, assuming you are willing to pay for enough coding and redundancy. ECCs are just the tip of the iceberg in this area.
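For a taste of what that coding-and-redundancy machinery looks like, here is a toy Hamming(7,4) sketch written from scratch (not taken from any real ECC hardware): it stores 4 data bits in 7 and repairs any single flipped bit.

```python
# Hamming(7,4): 4 data bits, 3 parity bits, single-error correction.

def hamming_encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                     # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                     # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                     # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def hamming_correct(c):
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # 0 = no error, else bad position
    if syndrome:
        c[syndrome - 1] ^= 1              # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]       # recover the 4 data bits

word = [1, 0, 1, 1]
code = hamming_encode(word)
code[5] ^= 1                              # corrupt one bit in "storage"
print(hamming_correct(code) == word)      # True: the error was repaired
```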

Your source is 11 years old. The lack of more recent links should be a dead giveaway.

That kind of thing (and quantum computing is inherently stochastic) is probably the wave of the future. Luckily it won’t happen until long after I retire. I work in hardware testing, which is tough enough when the design is deterministic! This stuff isn’t analog in the commonly used sense of the word, though.

When I was in Bell Labs some people in my center were working on manufacturing issues for optical computing and optical switches. They failed because they never could get cheaper and better than digital switches, but the theory was all there. My old director, who came from Area 11, was an expert on this, and she is a lot smarter than me. :slight_smile:

Not easily dismissed? Just try me.

Some things are inherently analog, and lots of chips these days have analog blocks that interact with the digital ones through DACs and ADCs. There are also SiPs (systems in package), which have analog and digital dies sitting together in one package. But you can do anything in digital that you can in analog, and I assure you the ambiguity is not an advantage. There is also no way in hell that you can build a practical analog design big enough to do AI.
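For anyone wondering what actually happens at that boundary, here is a rough sketch of an ideal ADC/DAC round trip; the 8-bit width and ±1 V full-scale range are arbitrary numbers picked for illustration, not any particular part.

```python
import math

# An ideal ADC maps a continuous voltage onto one of 2**BITS discrete
# codes; a DAC maps the code back to a voltage.  The residual is the
# quantization error, bounded by half a step.

BITS = 8
FULL_SCALE = 1.0                       # volts
STEP = 2 * FULL_SCALE / (2 ** BITS)    # quantization step size

def adc(voltage):
    """Clamp to the input range and round to the nearest digital code."""
    v = max(-FULL_SCALE, min(FULL_SCALE - STEP, voltage))
    return round((v + FULL_SCALE) / STEP)

def dac(code):
    """Map a digital code back to its nominal voltage."""
    return code * STEP - FULL_SCALE

v_in = 0.3337 * math.sin(1.234)        # some arbitrary "analog" value
code = adc(v_in)
v_out = dac(code)
print(code, v_out, abs(v_in - v_out))  # error is at most STEP/2, ~3.9 mV
```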

It should be noted that there is not a dichotomy between optical computing and digital computing. Most of the optical computing work you hear about is digital, just using photons instead of electrons. Using a lens to do a Fourier transform is a completely different sort of operation from constructing a NAND gate that works on light.
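To put the contrast in concrete terms: what a lens does for free in its focal plane is (up to scaling) a 2-D Fourier transform of its input, and the digital counterpart is just an FFT over a sampled grid. A throwaway numpy sketch, with a made-up 256×256 aperture:

```python
import numpy as np

# Sample an aperture on a grid and take its 2-D FFT: the same transform
# a lens performs optically, computed digitally instead.

aperture = np.zeros((256, 256))
aperture[96:160, 120:136] = 1.0            # a rectangular slit

far_field = np.fft.fftshift(np.fft.fft2(aperture))
intensity = np.abs(far_field) ** 2         # roughly the diffraction
print(intensity.shape, intensity.max())    # pattern a screen would show
```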

Totally correct. When I think of optical computing, I think of soliton dragging and similar technologies. Optical signal processing is the spatial/spectral frequency-based processing done with refraction/diffraction…

I’ve got a 140 pound 1.0 version sitting right here that does it fine. Though it is rather stinky and ugly. It even runs on SPAM and beer under the right conditions.

I am, and it seems to have nothing to do with analog. But thanks!

Yep… like the professor who taught me computer architecture and had a huge hardon for stack architecture computers, and was convinced that once he worked out the problems with running multiple processes, they’d take the world by storm.

It’s 15 years and counting…

And here I sit: a GIS programmer stunned by this discussion, and my satellite dish is basically down. 80k down and 12k up (I’ve been on the phone for two hours with ‘Rachael’ in India).

Stunning ideas from what I can manage to read.

Consider what GIS was 20 years ago. You now have it in your phone.

ANALOG computers?

Or is it Analog programming and interpretation? Are we looking for a Boolean field that includes ‘maybe’? Is that it?

I’m going to have to reset the breakers. I do believe you just blew my mind. Very interesting concept.

But it is still data, is it not? It has to be checked and compared. I also think that you are perhaps looking at real-time temporal systems that deal with the fourth dimension, which is time. Time can’t be measured. Not in a computer sense.

Your mind’s been blown by nonsense. As others have pointed out, analog computers are the ones which are susceptible to error buildup; digital computation is the one which can error-correct. Digital data is far more robust than analog data: if a signal is meant to be either a 0 or a 1, there’s rarely any ambiguity about what to boost it back up to, even if it degrades a little. On the other hand, if a signal can vary across a continuous range, then once some small error is introduced, it is generally not possible to detect that this has happened and correct for it.
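A quick simulation of that error-buildup point, with made-up noise figures: pass a value through a thousand noisy “repeaters,” once forwarding it as-is (analog) and once snapping it back to 0 or 1 at every hop (digital).

```python
import random

random.seed(1)
HOPS, NOISE = 1000, 0.02          # per-hop Gaussian noise (arbitrary)

analog = 1.0
digital = 1.0
for _ in range(HOPS):
    analog += random.gauss(0, NOISE)            # noise just piles up
    digital += random.gauss(0, NOISE)
    digital = 1.0 if digital > 0.5 else 0.0     # regenerated at each hop

print(f"analog after {HOPS} hops:  {analog:.3f}")   # random walk away from 1.0
print(f"digital after {HOPS} hops: {digital:.1f}")  # still exactly 1.0
```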

You’re also spouting some nonsense of your own… “Time can’t be measured. Not in a computer sense.” What does that mean?

You really shouldn’t eat junk mail, you know.

Actually, no. The human brain is heavily digital, when you get right down to it. A neuron fires, or it doesn’t. A neurotransmitter is released, or it isn’t. That digital aspect is how a living nervous system can function, despite organic sloppiness; the analog uncertainty is pared away into “this neuron has fired”. “1”, instead of a “0”, in other words.
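As a cartoon of that all-or-nothing behavior (made-up numbers, not a physiological model): sloppy, graded inputs go in, and a clean fired-or-didn’t comes out.

```python
import random

random.seed(0)
THRESHOLD = 1.0   # arbitrary firing threshold

def neuron(inputs):
    """Sum noisy graded inputs; spike only if the total crosses threshold."""
    potential = sum(x + random.gauss(0, 0.05) for x in inputs)  # organic sloppiness
    return 1 if potential >= THRESHOLD else 0                   # fired / didn't

print(neuron([0.2, 0.3, 0.1]))   # well below threshold: almost surely 0
print(neuron([0.6, 0.5, 0.4]))   # well above threshold: almost surely 1
```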

I have seen a device constructed that can decide whether a given number is rational, which is not possible for a digital computer to do. I’m not going to try to give the details, but it’s in chapter 33 of this book.

How do you even specify the number to the device, without making it transparently clear whether it’s rational?

Why say no digital computer can do this? Well, it depends on what exactly “this” is; in particular, it depends on how the input is provided. I mean, if arbitrary real-number input is meant to be provided as an analog quantity, then the computation is automatically, at least in part, analog, to the extent that it manipulates that quantity. In that sense, sure, no digital computer can pull this off, but that’s trivial.

(Incidentally, in case anyone is curious, the method given in that book is “Shoot a laser into a pinhole at the corner of a square box lined with mirrors, the slope of its direction being the input number. If it ever comes back out of the pinhole, that slope was rational.” Of course, this method has zero error tolerance (as would any computation trying to distinguish the rationals from the irrationals); it depends on the pinhole being exactly one point with zero width, and so forth.)
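For the curious, the standard “unfolding” picture of that construction (the textbook argument, not necessarily the book’s exact presentation), done with exact rational arithmetic, which of course dodges the whole point: a digital program can only ever be handed a rational in the first place.

```python
from fractions import Fraction

# Instead of reflecting the beam, reflect the box: the beam then travels
# along the straight line y = slope * x in the plane.  It re-emerges from
# the pinhole exactly when that line hits an integer lattice point, i.e.
# exactly when the slope p/q is rational, after crossing (p-1)+(q-1)
# grid lines, one wall bounce per crossing.  An irrational slope never
# hits a lattice point, so the beam never comes back out.

def bounces_before_return(slope: Fraction) -> int:
    p, q = slope.numerator, slope.denominator   # Fraction is in lowest terms
    return (p - 1) + (q - 1)

print(bounces_before_return(Fraction(3, 5)))    # 6 bounces, then back out
print(bounces_before_return(Fraction(22, 7)))   # 27 bounces, then back out
```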

ETA: Chronos beat me to the first point, somewhat

But the threshold at which a neuron fires (i.e., the strength of incoming signal required to cause the neuron to fire) changes over time and can vary smoothly.

So perhaps at any given point in time, the brain can be construed as a digital computer. But it seems more accurate to say that the brain is an analogue machine. If there’s something to characterizing the brain as digital, it involves the fact that the analogue machine that is the brain functions to create temporary digital machines.

-FrL-

Because the mind is not able to easily count more than minutes or hours, or keep track of them.

Time is interesting to me in how it influences our decision making process. And yes. It can be coded. We do it every day when we look at our watch.

But I’m getting off subject.

The idea of an analog computer intrigued me. I write code, and all the results of my code are the basis for analyzing information. It’s either 1, 0, or null.

I can write code that can create ‘maybes’ or ‘perhaps’ based on information.

The idea that there may be direct information besides 1, 0 or null intrigues me.
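If a “maybe” is all you’re after, you don’t need analog hardware for it. Here is a sketch of fuzzy truth values, anywhere between 0 and 1, using the standard min/max/complement operators; the example memberships are invented for illustration.

```python
# Fuzzy logic: every statement gets a degree of truth in [0, 1].

def f_and(a, b): return min(a, b)
def f_or(a, b):  return max(a, b)
def f_not(a):    return 1.0 - a

warm   = 0.7            # "it is warm": mostly true
cloudy = 0.25           # "it is cloudy": somewhat true

nice_day = f_and(warm, f_not(cloudy))
print(nice_day)         # 0.7, a "maybe": neither 1, 0, nor null
```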