What exactly are modern analog computers and how are they better?


No Wikipedia Cites
02-25-2009, 08:02 AM
A professor told me that the future of computers was with analog, and that they would be capable of greater power such as more sophisticated artificial intelligence. Attempting to review what the benefits of a modern analog computer would be only made things worse in my head.

So what exactly are modern analog computers (not astrolabes but CPU electric-run) and why are they better? What do they do that is better and how could they even work?

Keeve
02-25-2009, 08:21 AM
modern analog computers (not astrolabes but CPU electric-run)
Sounds like a contradiction to me. The most recent analog computer I can think of is a slide rule.

Then I looked it up on Wikipedia: Electronic analog computers (http://en.wikipedia.org/wiki/Analog_computer#Electronic_analog_computers)

Q.E.D.
02-25-2009, 09:01 AM
They are not better. They are worse. Digital signals are unambiguous, in the absence of severe noise. Even mild to moderate noise is enough to skew an analog system, and when you're doing multiple billions of calculations in your program, those errors are cumulative. Error correction of the sort used in digital systems is not possible, compounding the problem.

billfish678
02-25-2009, 09:11 AM
Well, WAG here.

The human brain is more analog than digital (maybe) and certainly noisy (hush Bob, I don't care about the army of the 12 monkeys! I am trying to type here), so one could make the argument that analog might be required for artificial intelligence.

You can simulate analog with digital, but that might still cause problems.

And certain calculations are probably, IMO, good enough when done analog, even though in theory they might not be as precise as digital. A crappy answer with somewhat uncertain precision/accuracy is still better than one where the accuracy/precision is well known but you never get the answer.

friedo
02-25-2009, 09:15 AM
A professor told me that the future of computers was with analog, and that they would be capable of greater power such as more sophisticated artificial intelligence.

Like most professors, this guy has his head in the clouds and is completely disconnected from reality.

We've spent the past 60 years inventing and improving the digital computer. Digital signals have discrete values which can be communicated without degradation and checked for errors.

Digital data can be stored in an unambiguous way, with a known level of precision. The precision of an analog system depends upon its components, and there's no guarantee that two uncalibrated systems will treat a given analog value exactly the same way.

Analog calculations are generally one-way, involving the summing of currents and similar operations. They require massive amounts of hardware to implement complex operations that a digital system would simply handle in software.

But if you want to see a cool, modern analog computer, this guy (http://www.meccano.us/differential_analyzers/robinson_da/) built an entire differential analyzer out of Meccano.

Pushkin
02-25-2009, 09:46 AM
A professor told me that the future of computers was with analog

He wasn't mistaking it for Quantum computing (http://en.wikipedia.org/wiki/Quantum_computer) by any chance?

Tyrrell McAllister
02-25-2009, 11:28 AM
A professor told me . . .
A professor of what?

smiling bandit
02-25-2009, 12:28 PM
Digital computing is more unambiguous, which is why we use it. However, analog computing has advantages precisely because it is ambiguous. That's its basic advantage, and it allows analog to do things digital systems can't. There's a non-trivial argument that you can't fundamentally make an intelligent digital AI, because you wind up destroying the thing you're trying to make; it can't think with just on/off states. (The theory is more complex and deeper than I present here, but it's not easily dismissed.)

Tyrrell McAllister
02-25-2009, 12:35 PM
Could you point to a presentation of this theory that does justice to its complexity and depth?

I'm skeptical that anyone has built an analogue computer that can "do things", in any precise functional sense, that a digital computer cannot.

billfish678
02-25-2009, 12:48 PM
Could you point to a presentation of this theory that does justice to its complexity and depth?

I'm skeptical that anyone has built an analogue computer that can "do things", in any precise functional sense, that a digital computer cannot.

Back in the day optical computing was all the rage in theory.

Take an image, many pixels by many pixels. Do a 2-D Fourier transform on it. That's computationally intensive as hell.

A lens "automatically" does that to a whole image at the speed of light. So, in theory, it's massively parallel and about as FAST as you can get. And it doesn't generate any heat in the process.

I guess the problem is that most computations are not easily/efficiently "transformable" into a 2-D Fourier transform problem in order to be solved. And then there is the data I/O problem. But again, in theory it's da bomb.
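
(To make the scale of that concrete, here is a minimal Python/NumPy sketch; the image size is arbitrary and just for illustration.)

# The computation a lens does "for free": a 2-D Fourier transform of an image.
import numpy as np

n = 1024                          # image is n x n pixels
image = np.random.rand(n, n)      # stand-in for a real input image

# The digital route: an FFT, roughly n^2 * log2(n^2) operations.
spectrum = np.fft.fft2(image)
print(spectrum.shape)

# A naive direct DFT would instead need (n^2)^2 multiply-adds (about 10^12
# here), which is the work the lens performs in a single pass of light.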

Tyrrell McAllister
02-25-2009, 12:57 PM
I would say that such a computer is not da bomb even in theory if most computations cannot be efficiently formulated in terms of the computer's functions. At best it is da bomb in a very incomplete theory.

smiling bandit
02-25-2009, 01:03 PM
A lens "automatically" does that to a whole image at the speed of light. So, in theory, its massively parallel and about as FAST as you can get. And it doesnt generate any heat in the process.

That's basically the principle.

Neurons, for example, don't just fire in sequence. They fire in weird patterns, according to their own internal logic and connections, and their nature can change over time. Digital calculations are always done in sequence and each is entirely unconnected to the next.

Now, the problem (among others) here is that, ironically, digital can't handle ambiguity. An analog system can be self-correcting. A digital one will rapidly go out of whack, because it can never check its own work properly. It makes a mistake - maybe a tiny one, or it's even not a mistake at all, but just bad data. But that knocks the next calculation out, and the next, and the next. Bam, the system breaks.

Analog systems can take in the whole data set, and errors are adjusted for automatically. Missing data can be assumed. You can adjust errors on digital systems, but then you have to have a whole 'nother system checking for them, and then another to check that, and another. A closely related feature is massive parallelism: analog systems are inherently parallel-function designs.

Basically, digital is extremely precise but limited in "robustness". Analog is unlimited in robustness but limited in precision.

http://www.mikiko.net/library/weekly/1998articles/aa053198.htm - disagrees but explains the basic idea. I'm having trouble finding better sources right now because of too much garbage on Google, and some older ones are dead links.

Tyrrell McAllister
02-25-2009, 01:10 PM
smiling bandit, where are you getting these notions about how digital computers work? Wherever you are getting them, I suggest finding another source. You have been severely misinformed.

Start by learning about error correcting codes (http://en.wikipedia.org/wiki/Error_correcting_codes).
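
(For the curious, here is a toy Python illustration of the simplest such code, Hamming(7,4), which corrects any single flipped bit in a 7-bit codeword; real systems use far stronger codes.)

def hamming74_encode(d):                      # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # codeword positions 1..7

def hamming74_correct(c):
    # Each syndrome bit re-checks one parity group; together they spell out
    # the 1-based position of the error (0 means no error detected).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1                       # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]           # recover the 4 data bits

word = hamming74_encode([1, 0, 1, 1])
word[4] ^= 1                                  # simulate a noise-induced bit flip
assert hamming74_correct(word) == [1, 0, 1, 1]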

billfish678
02-25-2009, 01:29 PM
I would say that such a computer is not da bomb even in theory if most computations cannot be efficiently formulated in terms of the computer's functions. At best it is da bomb in a very incomplete theory.


Are you here to contribute to the discussion or nitpick?

How many qualifiers does this place require before somebody will let something "slide"?

Geezus.

I am a walking biological 3-D AND temporal transform system. Get back to me when you've got a digital equivalent of me :)

smithsb
02-25-2009, 01:33 PM
Personally, I think Pushkin had the answer - should have been Quantum Computing.

Digital vs. analog. Perhaps not a computer, but sound recording and reproduction may be an apt analogy. Digital slices an analog waveform into smaller and smaller samples to "approximate" the information; sampling rates and bit depth are factors. An analog recording system records the actual waveform on tape or other media. Digital reproduction at high rates and depths gets close to the original waveform/sound, but it can also sound pretty sh**y and lifeless at low rates (crappy MP3s). Analog reproduction gets closer to realism but does come with the crackles, hiss, and pops. Your favorite LP or tape may develop "noise" but still sounds like music. The CD, if damaged/deteriorated, simply won't play. I am aware that many / perhaps most sound reproduction systems may incorporate digital elements (switching amplifiers, for example), but analog front to back is still viable.

I can see a few cases where a system that will accept higher noise levels could be preferable to one that breaks down in spite of error correction algorithms.

erislover
02-25-2009, 01:42 PM
Man, and I just read about some new chip that uses fuzzy logic or analog or something. They even created an example chip and its power consumption was much lower. Please, someone has to remember this announcement.

friedo
02-25-2009, 01:51 PM
Man, and I just read about some new chip that uses fuzzy logic or analog or something. They even created an example chip and its power consumption was much lower. Please, someone has to remember this announcement.

You can do fuzzy logic digitally. In fact, a digital computer can simulate any analog computation (to within a certain degree of accuracy).
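
(A minimal Python sketch of that point: a digital machine approximating, step by step, the analog integrator a differential analyzer would wire up to solve dy/dt = -y. The step size is arbitrary; shrinking it buys accuracy.)

import math

dt = 0.001                    # step size: smaller = closer to the analog answer
y = 1.0
for step in range(1000):      # integrate from t = 0 to t = 1
    y += dt * (-y)            # Euler step: feed -y back into the integrator

print(y)                      # ~0.3677, the digital approximation
print(math.exp(-1))           # ~0.3679, the exact "analog" answer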

erislover
02-25-2009, 02:00 PM
Yeah my post was way too hasty and now I cannot find the reference anymore. I've been searching blogs I usually visit and some tech sites but I cannot find it anymore.

Happy Fun Ball
02-25-2009, 02:26 PM
Statements that "analog computers are not better" are misleading at best, if not flat out wrong, at least in my opinion. I should probably say "cite?" to these claims, but that's not likely to be productive.

The term analog computing is pretty vague and covers a lot of ground. I am not a computer scientist, so it would be difficult for me to talk generally about the differences and advantages of analog computers. I can, however, talk about stuff I have done: in grad school I worked on a project to use spectral holography to do signal processing of range/Doppler lidar signals (here (http://www.opticsinfobase.org/abstract.cfm?&uri=ao-45-25-6409) is a publication with the initial research). This is absolutely analog processing / computing, using optical signals instead of more conventional electronics, and it is orders of magnitude higher speed, with orders of magnitude more bandwidth, than is possible with current digital technology.

More generally, note that a simple lens performs a 2-D Fourier transform of coherent fields at its focal plane. Take a pixellated image (say a 1000 x 1000 spatial light modulator) and a good laser and you can easily do a 2-D Fourier transform in less than a microsecond. This is equivalent to 10^18 analog multiplies/s. If you were using a standard FFT N log(N) calculation, this rate would be equivalent to 2 teraflops. This is only for a single lens. The problem with most optical processors is that they perform only very specific operations (Fourier transforms, convolution / correlation / filtering / pattern recognition, etc...). But they do it very well, much faster than a digital system could hope to.
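
(A back-of-envelope check of those figures in Python; the arithmetic follows the post, and the exact digital-equivalent rate depends on what you count as an operation.)

import math

N = 1000 * 1000                           # samples per transform (1000 x 1000 SLM)
t = 1e-6                                  # one transform per microsecond

print("%.1e" % (N ** 2 / t))              # 1.0e+18 analog multiplies/s (direct DFT count)
print("%.1e" % (N * math.log2(N) / t))    # ~2.0e+13 FFT-equivalent ops/s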

Here (http://optics.colorado.edu/~kelvin/vugraphs/pqe02_SSHarrayimaging.pdf) is an excellent presentation (warning PDF) by a co-student (is this a word?) of mine on a squint compensated RF imaging array system capable of more than 10^17 flops of image processing. Find me a computer that can match it...

beowulff
02-25-2009, 02:35 PM
Yeah my post was way too hasty and now I cannot find the reference anymore. I've been searching blogs I usually visit and some tech sites but I cannot find it anymore.

Are you thinking of this:
http://www.electronista.com/articles/09/02/08/rice.university.pcmos/ ?

RaftPeople
02-25-2009, 03:23 PM
Digital computing is more unambiguous, which is why we use it. However, analog computing has advantages precisely because it is ambiguous. That's its basic advantage, and it allows analog to do things digital systems can't. There's a non-trivial argument that you can't fundamentally make an intelligent digital AI, because you wind up destroying the thing you're trying to make; it can't think with just on/off states. (The theory is more complex and deeper than I present here, but it's not easily dismissed.)

Do you have a link to this theory, or a name? I would be interested in reading their reasoning. It seems to me that a digital system with enough discrete states (for each neuron, for example) would at some point be indistinguishable from a continuous system, within the boundaries of the age of the universe, for example.

Voyager
02-25-2009, 03:29 PM
That's basically the principle.

Neurons, for example, don't just fire in sequence. They fire in weird patterns, according to their own internal logic and connections, and their nature can change over time. Digital calculations are always done in sequence and each is entirely unconnected to the next.

There is a big difference between processing an instruction or microoperation and a digital calculation. Digital gates obviously work in parallel. The cite talked about neural networks, which can be implemented digitally also. It has nothing to do with analog computing or even analog circuits.

Now, the problem (among others) here is that, ironically, digital can't handle ambiguity. An analog system can be self-correcting. A digital one will rapidly go out of whack, because it can never check its own work properly. It makes a mistake - maybe a tiny one, or it's even not a mistake at all, but just bad data. But that knocks the next calculation out, and the next, and the next. Bam, the system breaks.

Actually, totally wrong. The great thing about digital logic is that each gate is a little amplifier that cleans up the signal. Analog circuits are not like this, and analog circuitry inside a chip is usually very small, tens of components, as opposed to the hundreds of millions inside a microprocessor.

Analog systems can take in the whole data set, and errors are adjusted for automatically. Missing data can be assumed. You can adjust errors on digital systems, but then you have to have a whole 'nother system checking for them, and then another to check that, and another. A closely related feature is massive parallelism: analog systems are inherently parallel-function designs.

I think this is a case of you mistaking neural nets for analog.

Basically, digital is extremely precise but limited in "robustness". Analog is unlimited in robustness but limited in precision.

That word precision doesn't mean what you think it means. Digital logic is extremely robust. Plus there is a gigantic field of fault tolerance that says how digital systems can recover from any number of errors - assuming you want to pay for enough coding and redundancy. ECCs are just the tip of the iceberg in this area.

http://www.mikiko.net/library/weekly/1998articles/aa053198.htm - disagrees but explains the basic idea. I'm having trouble finding better sources right now because of too much garbage on Google, and some older ones are dead links.

Your source is 11 years old. The lack of more recent links should be a dead giveaway.

Voyager
02-25-2009, 03:32 PM
Are you thinking of this:
http://www.electronista.com/articles/09/02/08/rice.university.pcmos/ ?

That kind of thing (and quantum computing, which is inherently stochastic) is probably the wave of the future. Luckily it won't happen until long after I retire. I work in hardware testing, which is tough enough when the design is deterministic! This stuff isn't analog in the commonly used sense of the word, though.

Voyager
02-25-2009, 03:36 PM
I would say that such a computer is not da bomb even in theory if most computations cannot be efficiently formulated in terms of the computer's functions. At best it is da bomb in a very incomplete theory.

When I was in Bell Labs some people in my center were working on manufacturing issues for optical computing and optical switches. They failed because they never could get cheaper and better than digital switches, but the theory was all there. My old director, who came from Area 11, was an expert on this, and she is a lot smarter than me. :)

Voyager
02-25-2009, 03:40 PM
Digital computing is moe unambiguous, which is why we use it. However, analog computing has advantages precisely because it is ambiguous. That's its basic advantage, and allows it to do things digital systems can't. There's a non-trivial argument that you can't fundamentally make an intelligent digital AI, because you wind up destroying the thing you're trying to make; it can't think with just on/off states. (the theory is more complex and deeper than I present here, but it's not easily dismissed).

Not easily dismissed? Just try me.

Some things are inherently analog, and lots of chips these days have analog blocks which interact with the digital ones through DACs and ADCs. There are also SiPs (system-in-package) which have analog and digital dies sitting together in one package. But you can do anything in digital that you can in analog, and I assure you the ambiguity is not an advantage. There is also no way in hell that you can build a practical analog design big enough to do AI.

Chronos
02-25-2009, 04:29 PM
It should be noted that there is not a dichotomy between optical computing and digital computing. Most of the optical computing work you hear about is digital, just using photons instead of electrons. Using a lens to do a Fourier transform is a completely different sort of operation from constructing a NAND gate that works on light.

Happy Fun Ball
02-25-2009, 05:24 PM
It should be noted that there is not a dichotomy between optical computing and digital computing. Most of the optical computing work you hear about is digital, just using photons instead of electrons. Using a lens to do a Fourier transform is a completely different sort of operation from constructing a NAND gate that works on light.
Totally correct. When I think of optical computing, I think of soliton dragging and similar technologies. Optical signal processing is the spatial/spectral frequency based processing done with refraction/diffraction...

billfish678
02-25-2009, 06:52 PM
There is also no way in hell that you can build a practical analog design big enough to do AI.

I've got a 140 pound 1.0 version sitting right here that does it fine. Though it is rather stinky and ugly. It even runs on SPAM and beer under the right conditions.

erislover
02-25-2009, 06:58 PM
Are you thinking of this:
http://www.electronista.com/articles/09/02/08/rice.university.pcmos/ ?
I am, and it seems to have nothing to do with analog. But thanks!

bump
02-25-2009, 07:25 PM
Like most professors, this guy has his head in the clouds and is completely disconnected from reality.


Yep... like the professor who taught me computer architecture and had a huge hardon for stack architecture computers, and was convinced that once he worked out the problems with running multiple processes, they'd take the world by storm.

It's 15 years and counting....

enipla
02-25-2009, 07:51 PM
And here I sit. A GIS programmer stunned by this discussion, and my satellite dish is basically down. 80k down and 12k up (I've been on the phone for two hours to 'Rachael' in India).

Stunning ideas from what I can manage to read.

Consider what GIS was 20 years ago. You now have it in your phone.

ANALOG computers?

Or is it Analog programming and interpretation? Are we looking for a Boolean field that includes ‘maybe’? Is that it?

enipla
02-25-2009, 08:09 PM
That's basically the principle.

Neurons, for example, don't just fire in sequence. They fire in weird patterns, according to their own internal logic and connections, and their nature can change over time. Digital calculations are always done in sequence and each is entirely unconnected to the next.

Now, the problem (among others) here is that, ironically, digital can't handle ambiguity. An analog system can be self-correcting. A digital one will rapidly go out of whack, because it can never check its own work properly. It makes a mistake - maybe a tiny one, or it's even not a mistake at all, but just bad data. But that knocks the next calculation out, and the next, and the next. Bam, the system breaks.

Analog systems can take in the whole data set, and errors are adjusted for automatically. Missing data can be assumed. You can adjust errors on digital systems, but then you have to have a whole 'nother system checking for them, and then another to check that, and another. A closely related feature is massive parallelism: analog systems are inherently parallel-function designs.

Basically, digital is extremely precise but limited in "robustness". Analog is unlimited in robustness but limited in precision.

http://www.mikiko.net/library/weekly/1998articles/aa053198.htm - disagrees but explains the basic idea. I'm having trouble finding better sources right now because of too much garbage on Google, and some older ones are dead links.
I'm going to have to reset the breakers. I do believe you just blew my mind. Very interesting concept.

But it is still data, is it not? It has to be checked and compared. I also think that you are perhaps looking at real-time temporal systems that deal with the 4th dimension, which is time. Time can't be measured. Not in a computer sense.

Indistinguishable
02-25-2009, 08:16 PM
Your mind's been blown by nonsense. As others have pointed out, analog computers are the ones which are susceptible to error-buildup; digital computation is the one which can error-correct. Digital data is far more robust than analog data (if a signal is meant to be either a 0 or a 1, there's rarely any ambiguity as to what to boost it back up to, even if it degrades a little, so to speak. On the other hand, if a signal can vary across a continuous range, then once some small error is introduced, it is generally not possible to determine that such has happened and correct for it).
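
(The point is easy to demonstrate; a quick Python sketch, with a made-up noise level, of a value copied through a thousand noisy stages both ways:)

import random

analog = digital = 0.7            # treat 0.7 as "the value", and as a logic 1
for stage in range(1000):
    noise = random.gauss(0, 0.01)
    analog += noise                                   # drift just accumulates
    digital = 1.0 if digital + noise > 0.5 else 0.0   # snapped back every stage

print(analog)    # has wandered a long way from 0.7
print(digital)   # still a clean 1.0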

You're also spouting some nonsense of your own... "Time can't be measured. Not in a computer sense." What does that mean?

Q.E.D.
02-25-2009, 08:22 PM
I've got a 140 pound 1.0 version sitting right here that does it fine. Though it is rather stinky and ugly. It even runs on SPAM and beer under the right conditions.

You really shouldn't eat junk mail, you know.

Der Trihs
02-25-2009, 08:34 PM
There is also no way in hell that you can build a practical analog design big enough to do AI.
I've got a 140 pound 1.0 version sitting right here that does it fine. Though it is rather stinky and ugly. It even runs on SPAM and beer under the right conditions.
Actually, no. The human brain is heavily digital, when you get right down to it. A neuron fires, or it doesn't. A neurotransmitter is released, or it isn't. That digital aspect is how a living nervous system can function, despite organic sloppiness; the analog uncertainty is pared away into "this neuron has fired". "1", instead of a "0", in other words.

ultrafilter
02-25-2009, 08:44 PM
I have seen a description of a device that can decide whether a given number is rational, which is not possible for a digital computer to do. I'm not going to try to give the details, but it's in chapter 33 of this book (http://www.amazon.com/New-Turing-Omnibus-Sixty-Six-Excursions/dp/0805071660/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1235616116&sr=8-1).

Chronos
02-25-2009, 08:54 PM
How do you even specify the number to the device, without making it transparently clear whether it's rational?

Indistinguishable
02-25-2009, 08:57 PM
Why say no digital computer can do this? Which is to say, it depends on what exactly this is. It depends on how the input is provided. I mean, if arbitrary real number input is meant to be provided as an analog quantity, then the computation is automatically, at least in part, analog, to the extent that it manipulates that quantity. In that sense, sure, no digital computer can pull this off, but that's trivial.

(Incidentally, in case anyone is curious, the method given in that book is "Shoot a laser into a pinhole at the corner of a square box lined with mirrors, the slope of its direction being the input number. If it ever comes back out of the pinhole, that slope was rational." Of course, this method has zero error-tolerance (as would any computation trying to distinguish the rationals from the irrationals); it depends on the pinhole being exactly one point with 0 width and so forth)
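
(The standard "unfolding" trick behind that gadget: reflections in the unit square become a straight ray y = s*x across a grid of mirrored copies, and the ray revisits an image of the pinhole iff the slope s is rational. A toy Python sketch, which only works because exact fractions are available; with floating point, i.e. with any real measurement error, it can never confirm "rational":)

from fractions import Fraction

def returns_to_pinhole(s, max_cells=10**6):
    # The unfolded ray hits the lattice point (q, p) exactly when s = p/q.
    for q in range(1, max_cells):
        if (s * q).denominator == 1:
            return True
    return False

print(returns_to_pinhole(Fraction(3, 7)))    # True: returns after 7 cells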

ETA: Chronos beat me to the first point, somewhat

Frylock
02-25-2009, 09:08 PM
Actually, no. The human brain is heavily digital, when you get right down to it. A neuron fires, or it doesn't. A neurotransmitter is released, or it isn't. That digital aspect is how a living nervous system can function, despite organic sloppiness; the analog uncertainty is pared away into "this neuron has fired". "1", instead of a "0", in other words.

But the threshold at which a neuron fires (i.e., the strength of incoming signal required to cause the neuron to fire) changes over time and can vary smoothly.

So perhaps at any given point in time, the brain can be construed as a digital computer. But it seems more accurate to say that the brain is an analogue machine. If there's something to characterizing the brain as digital, it involves the fact that the analogue machine that is the brain functions to create temporary digital machines.

-FrL-

enipla
02-25-2009, 09:19 PM
You're also spouting some nonsense of your own... "Time can't be measured. Not in a computer sense." What does that mean?
Because the mind is not able to easily count more than minutes or hours, and keep track of them.

Time is interesting to me in how it influences our decision making process. And yes. It can be coded. We do it every day when we look at our watch.

But I'm getting off subject.

The idea of an analog computer intrigued me. I write code. All results of my code are the basis of analysis of information. It's either 1, 0 or Null.

I can write code that can create 'maybes' or 'perhapses' based on information.

The idea that there may be direct information besides 1, 0 or null intrigues me.

Indistinguishable
02-25-2009, 09:32 PM
The idea that there may be direct information besides 1, 0 or null intrigues me.
What does this mean? If you're willing to think of, e.g., integers as naught but strings of 0s, 1s, and "null"s, well, analog data (i.e., a real number) is just a string of 0s, 1s, and "null"s too. (Albeit a very long string; e.g., the k-th bit of the string specifies whether the real is below (0), above (1), or equal to ("null") the k-th rational number.) (Incidentally, in the system I have in mind, it would not generally be possible to affirmatively determine of a value that it is "null", though this is probably not the use of the term you had in mind, and so I should perhaps pick a different name for it; "_|_", say.)

billfish678
02-25-2009, 10:05 PM
You really shouldn't eat junk mail, you know.

It works on the Data In Garbage Out theory :)

lazybratsche
02-25-2009, 10:31 PM
Actually, no. The human brain is heavily digital, when you get right down to it. A neuron fires, or it doesn't. A neurotransmitter is released, or it isn't. That digital aspect is how a living nervous system can function, despite organic sloppiness; the analog uncertainty is pared away into "this neuron has fired". "1", instead of a "0", in other words.

I can't speak much to the computer science side of things, but as a biologist I have to strongly object to this statement. It's true that there are some behaviors in (biological) neural networks that are discrete - the neuron fires, or it doesn't. However, there's a lot more to it than that. What threshold does the neuron fire at? What are the rates of signal conduction within a neuron? What rate does it fire at? How long does it keep firing? How much neurotransmitter does it release? Which neurotransmitters are released, and in which relative quantities? What is the strength of the connection to the next neuron? Then there are numerous modulating signals at every step of the process, and stochastic behaviors throughout.

At the circuit level, sometimes there are nice discrete behaviors. These are well-studied, comparatively, because they're easy to deal with. Ultimately, neural circuits are built from incredibly squishy, analogue things.

Voyager
02-26-2009, 01:51 AM
I've got a 140 pound 1.0 version sitting right here that does it fine. Though it is rather stinky and ugly. It even runs on SPAM and beer under the right conditions.

But it is more digital than analog. The signals don't go from one end of the brain to the other - they cascade through neurons, which regenerate the signal just like gates do. They are a lot more complex than simple gates, but cell libraries these days are also. So you are more digital than you think you are.

Voyager
02-26-2009, 01:53 AM
Yep... like the professor who taught me computer architecture and had a huge hardon for stack architecture computers, and was convinced that once he worked out the problems with running multiple processes, that they'd take the world by storm.

It's 15 years and counting....

Hmm. The real problem was that 15 years ago he was already about 15 years behind the times.

Voyager
02-26-2009, 02:07 AM
But the threshold at which a neuron fires (i.e., the strength of incoming signal required to cause the neuron to fire) changes over time and can vary smoothly.

So perhaps at any given point in time, the brain can be construed as a digital computer. But it seems more accurate to say that the brain is an analogue machine. If there's something to characterizing the brain as digital, it involves the fact that the analogue machine that is the brain functions to create temporary digital machines.

-FrL-

The real distinction between analog and digital is that the output of an analog block is a smooth function of its inputs, while the output of a digital cell may be controlled by its inputs, but isn't directly proportional to them. That's true no matter how complex the input function is that causes the output to fire. You can (and I think I have, for some reason or other) build a cell that only fires when you get a certain number of 1s on the input. You can have as many inputs as you want, to get any level of precision. You can even add control inputs to change this. It is still digital, because the output pretty much looks the same no matter what the input is.
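
(A sketch of that kind of cell in Python: a threshold gate that fires only when at least k inputs are 1, yet still emits a clean 0 or 1.)

def threshold_gate(inputs, k):
    return 1 if sum(inputs) >= k else 0

print(threshold_gate([1, 0, 1, 1, 0, 1], k=3))    # 1: four inputs high
print(threshold_gate([1, 0, 0, 0, 0, 0], k=3))    # 0: only one input high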

A while back there was a lot of work on multivalue logic, which used more than 0 and 1 at an input. There was even an IEEE technical committee on this. I think it vanished, since voltages are so low these days that you'd have major problems with noise at the inputs, but IIRC they were inspired by neurons.

Indistinguishable
02-26-2009, 02:11 AM
A while back there was a lot of work on multivalue logic, which used more than 0 and 1 at an input.
What kind of work on multivalued logic was this? What I mean is, in a sense, the computer on my desk handles 256-valued logic just fine, only when it operates on such values, we call them "bytes" instead of "bits". What above and beyond that sort of thing was meant by a computer architecture which used multivalued logic (or am I misguided from the start in thinking this was work on computer architectures)?

ETA: Or perhaps you were talking specifically about multivalued logic using real values and continuous functions upon them, rather than just a discrete set ("fuzzy logic" and all that)? I suppose that would make more sense in the context of this thread. In fact, that must be what you meant. Yeah, I'm dumb. Ignore me...

Voyager
02-26-2009, 02:12 AM
The idea of an analog computer intrigued me. I write code. All results of my code are the basis of analysis of information. It's either 1, 0 or Null.

I can right code that can create ‘maybes’ or ‘perhaps’ based on information.

The idea that there may be direct information besides 1, 0 or null intrigues me.

Check out Fuzzy logic (http://en.wikipedia.org/wiki/Fuzzy_logic). It can be implemented without analog, but it is mighty useful. For instance, in data mining, not all things fall into neat clusters, and it is useful to classify things as kind of tall or kind of short.
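
(A tiny Python sketch of that kind of classification; the breakpoints are made up purely for illustration.)

def tall_membership(height_cm):
    # Degree of "tallness" between 0 and 1, instead of a hard cutoff.
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30     # smooth ramp between the anchors

for h in (150, 170, 180, 195):
    print(h, round(tall_membership(h), 2))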

Voyager
02-26-2009, 02:26 AM
What kind of work on multivalued logic was this? What I mean is, in a sense, the computer on my desk handles 256-valued logic just fine, only when it operates on such values, we call them "bytes" instead of "bits". What above and beyond that sort of thing was meant by a computer architecture which used multivalued logic (or am I misguided from the start in thinking this was work on computer architectures)?

Or perhaps you meant multivalued logic using real values and continuous functions upon them, rather than just a discrete set ("fuzzy logic" and all that)? I suppose that would make more sense in the context of this thread. In that case, ignore me...

This was in building gates which inherently handled more than two values at the input. I think Intel came out with a memory that did this, but I don't know what happened to it. I don't think it is used today.

The benefit is that you can store two bits of information in the space required for one. The downside, which is the thing that I suspect killed it, is that you have less room for fluctuations of input voltage. This means you have to go slowly, since there is a lot of voltage swing, and that you have to use high voltages, which eats up power. For a 5 volt design, a digital circuit may treat anything under a volt and a half as a 0 and anything over 3 volts as a 1, and guarantee that it never stabilizes in the middle. You could still give each of the 4 values a range of about a volt in multivalue logic, which might work. Today we use 1 volt supplies, so there isn't a lot of margin. If you've ever seen a 1 GHz waveform, it would be obvious why this isn't too useful any more.
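
(A sketch of the decoding side of that scheme in Python, with illustrative band edges: four levels on a 5 V wire give two bits per symbol, but only about a quarter of the noise margin of a 2-level design.)

def decode_4level(v):
    thresholds = [1.25, 2.5, 3.75]         # carve 0-5 V into four bands
    return sum(v > t for t in thresholds)  # symbol 0, 1, 2, or 3 = two bits

print(decode_4level(0.6))    # 0
print(decode_4level(3.1))    # 2
print(decode_4level(4.4))    # 3, with well under a volt of headroom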

My expertise in this area mostly comes from eating dinner with the head of the multivalue logic committee at Computer Society Tech Board meetings when I was on that, so I'm not a good cite. Fuzzy logic is kind of like this, but runs on normal computers.

Multivalue logic was more of a circuit design thing than an architecture thing.

If you want to go the other way, you get into an invention of mine, Base 1 arithmetic, which I invented my first year of grad school when three classes felt the need to teach me binary logic yet again. I published a short summary of it as my first column. Base 1 has several advantages - it is immune to noise, quite fault tolerant, and, if you plug in base 1 to the standard information theory equations, you will find that it is very energy efficient. I would venture you could represent most web pages in base 1 with very little loss of content.










:D

Voyager
02-26-2009, 02:29 AM
At the circuit level, sometimes there are nice discrete behaviors. These are well-studied, comparatively, because they're easy to deal with. Ultimately, neural circuits are built from incredibly squishy, analogue things.

At the gate level, yes, at the real circuit level, where you have to worry about waveforms and the like, things have gotten pretty messy also. But see my comment above - I'd say neurons are fundamentally digital, while admitting they are a lot more complex than simple gates.

billfish678
02-26-2009, 07:41 AM
So you are more digital than you think you are.


No, I'd say you think I am more digital than I think I am.

IMO there are enough fundamental differences (both in physical structure and in operation) between brains and digital computers that to call the brain fundamentally digital is a REAL stretch.

We will have to just agree to disagree.

Quercus
02-26-2009, 08:28 AM
Actually, no. The human brain is heavily digital, when you get right down to it. A neuron fires, or it doesn't. A neurotransmitter is released, or it isn't. That digital aspect is how a living nervous system can function, despite organic sloppiness; the analog uncertainty is pared away into "this neuron has fired". "1", instead of a "0", in other words.
You're only half right. It's mostly true that a neuron either fires or it doesn't (with some complications), but how it decides to fire is an inherently analog, non-digital process. It's not like a single wired connection, where neuron A firing always leads to neuron B firing. Instead, whether a neuron fires depends on: the recent history of the neuron; the concentration of exciting and inhibiting chemical neurotransmitters released by various other neurons (which in turn depends on when nearby neurons released them, the size of the physical gap between the neurons, and how active the various reabsorption systems are); the concentration of various chemicals released by non-neurons in the brain; and the concentration of hormones released into the bloodstream by glands outside of the brain (or inserted into the bloodstream by that nice anesthesiologist). Plus probably other factors that I've forgotten (or that we just don't know about yet).
So the brain is really more like a whole bunch of analog calculators that use a digital connection as one channel of communicating with each other.
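
(The textbook toy version of that picture is the leaky integrate-and-fire neuron; a minimal Python sketch with made-up constants, not measured biology: the membrane voltage integrates input smoothly, and the only digital step is the fire/don't-fire decision.)

v, v_rest, v_threshold = 0.0, 0.0, 1.0
leak, dt = 0.1, 1.0

spikes = []
for current in [0.3, 0.3, 0.05, 0.3, 0.3, 0.3, 0.0, 0.0]:
    v += dt * (current - leak * v)    # smooth, history-dependent integration
    if v >= v_threshold:              # the all-or-nothing part
        spikes.append(1)
        v = v_rest                    # reset after the spike
    else:
        spikes.append(0)

print(spikes)    # [0, 0, 0, 0, 1, 0, 0, 0]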

Frylock
02-26-2009, 09:25 AM
The real distinction between analog and digital is that the output of an analog block is a smooth function of its inputs, while the output of a digital cell may be controlled by its inputs, but isn't directly proportional to them.

That's right, and I'd say it supports the view of the brain as analog, since things like catching a ball, walking, having sex, and so on require the ability to vary output smoothly according to inputs.

But of course the brain can implement various digital machines as well, but that doesn't in itself make it fundamentally digital.

Something interesting to read related to these issues is Dynamic Systems by Kelso. He argues (and a lot of cognitive scientists have since taken up his view) that the brain (/body/world system) functions by switching between various dynamic states. Each dynamic state is in itself an analog machine (that's my reading, not a phrasing he himself uses), but the switching process between these various analog machines is itself discrete. An example is gait. A horse, for example, can walk, trot or gallop. Each of these gaits can be seen as the implementation of an analog machine. However, the switching between these three gaits doesn't happen smoothly. It happens practically instantaneously once the conditions appropriate to a new gait obtain.

I think the upshot of all this might be that the analog/digital distinction doesn't quite apply (at least not usefully) to neurological systems. They're analog, but (at least in the case of humans) seem apt to the implementation of digital systems. On the other hand, one of the things about them that is pretty digital--their ability to switch into different dynamic states--serves to make them apt to implement various analog systems.

The answer would seem to be, the brain is analog in certain fundamental ways, and digital in certain fundamental ways.

-FrL-

Voyager
02-26-2009, 11:22 AM
No, I'd say you think I am more digital than I think I am.

IMO there are enough fundamental differences (both in physical structure and in operation) between brains and digital computers that to call the brain fundamentally digital is a REAL stretch.

We will have to just agree to disagree.

I'm talking about the fundamental primitive level - a neuron for the brain, and a cell for a digital design. Whether or not you can call the brain a computer is another matter entirely, I agree.

Voyager
02-26-2009, 11:51 AM
That's right, and I'd say it supports the view of the brain as analog, since things like catching a ball, walking, having sex, and so on require the ability to vary output smoothly according to inputs.

Check out a CD player. It has smoothly varying outputs, yet is basically digital. However, you are talking about the brain, and I've been talking about building blocks. There are clearly analog things in the brain and body, but there are analog parts in mostly digital systems also - they are called mixed signal designs. I don't know where the digital / analog boundary in the brain and body is. Muscles are clearly purely analog, but don't they get control signals from the brain, which might be considered digital?


But of course the brain can implement various digital machines as well, but that doesn't in itself make it fundamentally digital.

Something interesting to read related to these issues is Dynamic Systems by Kelso. He argues (and a lot of cognitive scientists have since taken up his view) that the brain (/body/world system) functions by switching between various dynamic states. Each dynamic state is in itself an analog machine (that's my reading, not a phrasing he himself uses), but the switching process between these various analog machines is itself discrete. An example is gait. A horse, for example, can walk, trot or gallop. Each of these gaits can be seen as the implementation of an analog machine. However, the switching between these three gaits doesn't happen smoothly. It happens practically instantaneously once the conditions appropriate to a new gait obtain.

I think the upshot of all this might be that the analog/digital distinction doesn't quite apply (at least not usefully) to neurological systems. They're analog, but (at least in the case of humans) seem apt to the implementation of digital systems. On the other hand, one of the things about them that is pretty digital--their ability to switch into different dynamic states--serves to make them apt to implement various analog systems.

The answer would seem to be, the brain is analog in certain fundamental ways, and digital in certain fundamental ways.

-FrL-
Switching between states is an architectural thing, and neither digital nor analog in a fundamental sense. Our peripherals, and peripherals for a computer, are often analog. Maybe states are a new idea to biologists (no offense, I'm married to one) but they are very fundamental in digital design, where anything interesting involves the design of a state machine where the outputs depend on the inputs and the current state. But there are states at the higher level also. For instance, to save power, microprocessors turn off chunks of themselves which are not being used.

However, in the sense of this thread, which is analog computation not the status of peripherals such as muscles, I still contend that the fundamental computational building block of the brain looks digital, in the sense I've already defined.

Voyager
02-26-2009, 11:57 AM
You're only half right. It's mostly true that a neuron either fires or it doesn't (with some complications), but how it decides to fire is an inherently analog non-digital process.
If you look deeply enough, the "decision" for a gate to change value is analog also - at this level there are no 1s and 0s, but only voltages which put a bunch of transistors into different states. In fact, modern designs have elements called tristate gates where the output can be 1, 0, or Z, a high-impedance state that drives neither 1 nor 0. These are commonly used in buses and as the heart of multiplexers, since you can have many gates tied together. The bus will take the value of the gate that is non-Z, driving it. It gets more complicated than this, since you can have gates producing strong 1s and weak 1s, strong 0s and weak 0s also, and a strong 1 can dominate a weak 0. When you have both a strong 1 and a strong 0, you have what is called bus contention, the output is undefined, and nasty things might happen.
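
(Those resolution rules can be written down as a little table; here is a toy Python version, loosely modeled on how HDL simulators do it, with '0'/'1' strong, 'L'/'H' weak, 'Z' not driving, and 'X' for contention.)

STRENGTH = {'0': 2, '1': 2, 'L': 1, 'H': 1, 'Z': 0}
LOGIC = {'0': 0, 'L': 0, '1': 1, 'H': 1}

def resolve(drivers):
    strongest = max(STRENGTH[d] for d in drivers)
    if strongest == 0:
        return 'Z'                            # nobody driving the bus
    values = {LOGIC[d] for d in drivers if STRENGTH[d] == strongest}
    return 'X' if len(values) > 1 else str(values.pop())

print(resolve(['Z', '1', 'Z']))   # '1': the one tristate driver wins
print(resolve(['1', 'L']))        # '1': strong 1 beats weak 0
print(resolve(['1', '0']))        # 'X': bus contention, undefined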

RaftPeople
02-26-2009, 03:29 PM
If you look deeply enough, the "decision" for a gate to change value is analog also - at this level there are no 1s and 0s, but only voltages which put a bunch of transistors into different states...

I was thinking about this too as I grapple with "is the brain analog or digital". Clearly you can make something digital out of something analog. However, given that firing rate (which is analog) appears to be used to transmit information in the brain, are we back to analog?

I'm also wondering what is the definition of analog vs digital with respect to a system like the brain. Does analog mean that there is a continuous landscape of states the brain can move between? Or does it mean something else?

billfish678
02-26-2009, 04:37 PM
Another way to look at brain vs computer (smackdown down down down).

A computer has a program, data, and a physical structure. All VERY digital, very sequential, and strongly deterministic.

A brain's program, data, and physical structure are all highly intermeshed. I might even go so far as to say you can't really separate them out. And I also think calling it digital is a stretch as well.

Just a quick thought for pondering...

WarmNPrickly
02-26-2009, 04:49 PM
Back in the day optical computing was all the rage in theory.

Take an image, many pixels by many pixels. Do a 2-D Fourier transform on it. That's computationally intensive as hell.

A lens "automatically" does that to a whole image at the speed of light. So, in theory, it's massively parallel and about as FAST as you can get. And it doesn't generate any heat in the process.

I guess the problem is that most computations are not easily/efficiently "transformable" into a 2-D Fourier transform problem in order to be solved. And then there is the data I/O problem. But again, in theory it's da bomb.

I don't know what the state of optical computing is, but I do know they don't depend on "lenses". Optical computing depends on non-linear optics. Essentially, you have to design a way for one electromagnetic wave to affect the way that another electromagnetic wave interacts with a material. Then you have to develop a logic circuit with it. I don't know what the state of development is anymore, and I have no idea how it compares with quantum computing. I don't know anything about quantum computing, but I do know photonic computing is not analog.

Also, it will generate heat, but not as much as electronic computing.

Indistinguishable
02-26-2009, 05:16 PM
That's right, and I'd say it supports the view of the brain as analog, since things like catching a ball, walking, having sex, and so on require the ability to vary output smoothly according to inputs.
Oh god, I've been doing it all wrong!

lazybratsche
02-26-2009, 09:17 PM
Check out a CD player. It has smoothly varying outputs, yet is basically digital. However, you are talking about the brain, and I've been talking about building blocks. There are clearly analog things in the brain and body, but there are analog parts in mostly digital systems also - they are called mixed signal designs. I don't know where the digital / analog boundary in the brain and body is. Muscles are clearly purely analog, but don't they get control signals from the brain, which might be considered digital?

Frankly, I still think you misunderstand how a neuron works. Clearly, you have a decent understanding, but bear with me for a moment. A typical neuron will have a large number of inputs directly from other neurons, and a single output. Inputs are summed over time and space, and if these inputs reach a certain threshold, the neuron will fire, often repeatedly. The threshold, itself, is often a moving target based on all sorts of shifting electrochemical equilibria. That output is not a single, discrete event, however. The strength of a single firing can vary, the frequency of firing is variable. Both of those quantities, which really describe the output of a neuron, are continuously variable.

Oukile
02-27-2009, 12:46 AM
Well, speaking of neurons and analog computers, one of the advantages of analog computing is the high temporal resolution. I know a group of researchers who are building analog chips that simulate neurons. The chips are analog, i.e. responses are simulated by assembling resistances, capacitors and so on. The idea is to plug about 100 of them together in order to simulate a little network of 'realistic' neurons.

In brief, the idea is to answer questions such as 'how do my parameters affect the shape of the PSEP (i.e. the dynamics of electrical responses) in a neuron that receives multiple inputs?'

Now it gets interesting: if you want to simulate things such as 'the voltage in one given neuron at time t', you are going to need a sampling rate of 10 kHz at least. If you have a network of 100 neurons with, let's say, 20 connections per neuron, you will have a hard time simulating it on a digital computer. With analog computing, on the other hand, all you have to do is plug in a DAC.

So, one point for analog computing is time resolution. Notice however that the example I gave is an example of a very specific computation (the circuit is designed to simulate neurons) and not a multi-purpose programmable machine.
Notice also that this has nothing to do with artificial intelligence.
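
(The arithmetic behind that, following the post's numbers:)

neurons, connections, rate_hz = 100, 20, 10_000
print(neurons * connections * rate_hz)    # 20,000,000 synaptic updates/s

# And each update is itself a small differential-equation step (membrane
# voltage, channel dynamics, ...), so the digital cost multiplies out fast,
# while the analog chip produces every voltage continuously, in parallel.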

Alex_Dubinsky
02-27-2009, 01:51 AM
Modems use analog tones instead of Morse code to communicate. So do, to an extent, Gigabit Ethernet and Wi-Fi. When you get down to it, it's just a better, denser way of relaying information.

I think that this is actually what the multi-value voltage folk were getting at. Maybe the signal quality and overshoot/undershoot problems really are a nail in the coffin. But maybe it's doable and just really, really hard. I can maybe see future computers, if they don't go quantum, moving toward multi-value voltage or quote-unquote analog for lack of another way to improve. (And YES, you can still do error correction.) But it'd be a huge jump in difficulty of design, and maybe it'd be our super-intelligent AI descendants that'd be creating them.
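
(The simplest wired version of the "analog tones" idea is PAM-4 signaling, two bits per symbol on four voltage levels; a Python sketch with illustrative levels:)

LEVELS = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}   # Gray-coded bit pairs

def pam4_encode(bits):
    pairs = zip(bits[0::2], bits[1::2])    # two bits per transmitted symbol
    return [LEVELS[p] for p in pairs]

print(pam4_encode([1, 0, 0, 1, 1, 1, 0, 0]))    # [3, -1, 1, -3]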

Quercus
02-27-2009, 08:22 AM
If you look deeply enough, the "decision" for a gate to change value is analog also - at this level there are no 1s and 0s, but only voltages which put a bunch of transistors into different states. In fact, modern designs have elements called tristate gates where the output can be 1, 0, or Z, a high-impedance state that drives neither 1 nor 0. These are commonly used in buses and as the heart of multiplexers, since you can have many gates tied together. The bus will take the value of the gate that is non-Z, driving it. It gets more complicated than this, since you can have gates producing strong 1s and weak 1s, strong 0s and weak 0s also, and a strong 1 can dominate a weak 0. When you have both a strong 1 and a strong 0, you have what is called bus contention, the output is undefined, and nasty things might happen.
I see what you're saying, but I think that's getting a little nitpicky and playing with definitions (after all, one could just as well argue that on an atomic level everything is digital because of quantum mechanics).
In My Ever-So Humble Opinion, synapses are far more complicated than the input to a resistor in a logic array, sufficiently so that --if the terms are going to be useful at all-- a resistor in a logic array is digital but a synapse is analog.

For instance, the resistor input has no 'memory' -- the state depends entirely on a single variable: the input voltage. In contrast, a synapse depends on a huge number of variables, including past history and how much glands on the other side of the body are pumping things into the bloodstream. And all of those variables are smoothly varying.

But lazybratsche, can neurons really fire at different strengths (as opposed to different frequencies)?

RaftPeople
02-27-2009, 09:27 AM
But lazybratsche, can neurons really fire at different strengths (as opposed to different frequencies)?

This isn't exactly answering your question, but some neurons don't fire; instead they have a graded potential, a signal that varies continuously in strength according to the strength of the stimulus.

RaftPeople
02-27-2009, 09:49 AM
The other thing that seems non-digital is the communication between neurons and glial cells which is based on chemical signaling. Although the neuron firing causes signaling to the glial cells, the glial cells don't fire in response but do release chemical signals that interact with other glial cells and moderate neuron firing.

lazybratsche
02-27-2009, 10:17 AM
But lazybratsche, can neurons really fire at different strengths (as opposed to different frequencies)?

Yep, though it's mostly dependent on frequency. One action potential results in one squirt of neurotransmitters out into the synapse. In isolation, the amount released will be roughly constant for a given cell. Current understanding is that there are various pools of neurotransmitter waiting to be released: an active pool waiting right at the synapse, various reserve pools to replenish the active pool, and even pools of a second type of neurotransmitter. As the neuron fires repeatedly, the active pool can be depleted faster than it can be replenished, meaning the amount of neurotransmitter released during each action potential is less each time. Also, for some of those neurons, the amount of secondary neurotransmitters release will be zero for an isolated action potential, but will slowly increase with repeated firing.

In the synapse, you can further modulate the signal that makes it to the receiving neuron, independent of the frequency and pattern of the firing neuron. You can change how fast the neurotransmitter is cleared from the synapse -- this is how a lot of antidepressants work. You can also modulate how neurotransmitter receptors behave in a lot of ways, and there are many classes of drugs that act here as well.

Whether you want to count the synaptic modulation in a neuron's output could be debatable, but it's definitely a big part of how the nervous system works.

billfish678
02-27-2009, 10:55 AM
I don't know what the state of optical computing is, but I do know they don't depend on "lenses". Optical computing depends on non-linear optics. Essentially, you have to design a way for one electromagnetic wave to affect the way that another electromagnetic wave interacts with a material. Then you have to develop a logic circuit with it. I don't know what the state of development is anymore, and I have no idea how it compares with quantum computing. I don't know anything about quantum computing, but I do know photonic computing is not analog.

Also, it will generate heat, but not as much as electronic computing.

There is more than one kind of "optical computing".

See Butt's good post upthread for details, of the type that sure as hell needs lenses to work.

Chronos
02-27-2009, 12:04 PM
Quoth billfish678: Another way to look at brain vs computer (smackdown down down down).

A computer has a program, data, and a physical structure. All VERY digital, very sequential, and strongly deterministic.

A brain's program, data, and physical structure are all highly intermeshed. I might even go so far as to say you can't really separate them out. And I also think calling it digital is a stretch as well.
Being sequential and deterministic has nothing whatsoever to do with being digital. A slide rule has a program, data, and physical structure, and its operation is very sequential and strongly deterministic, but it's completely analog. Meanwhile, some data analysis techniques implemented on digital computers are not sequential or deterministic.

beowulff
02-27-2009, 12:06 PM
...Meanwhile, some data analysis techniques implemented on digital computers are not sequential or deterministic.

[Bolding mine] I find this very hard to believe. Care for an example?

billfish678
02-27-2009, 12:09 PM
Quoth Chronos: Being sequential and deterministic has nothing whatsoever to do with being digital. A slide rule has a program, data, and physical structure, and its operation is very sequential and strongly deterministic, but it's completely analog. Meanwhile, some data analysis techniques implemented on digital computers are not sequential or deterministic.


True enough.

But are you of the "brain is basically digital" camp, or not?

billfish678
02-27-2009, 12:10 PM
[Bolding mine] I find this very hard to believe. Care for an example?

You've never used Windows, have you? :)

erislover
02-27-2009, 12:14 PM
[Bolding mine] I find this very hard to believe. Care for an example?
Which would count:

Monte Carlo techniques using, say, random.org and a PRNG?

Merge sort with duplicates?
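
(For the first suggestion, a minimal Python example: estimating pi by Monte Carlo. With a PRNG the run is actually deterministic given the seed; fed from a true entropy source, it would not be.)

import random

random.seed(42)                   # PRNG: same seed, same "random" answer
n = 1_000_000
hits = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
           for _ in range(n))
print(4 * hits / n)               # ~3.14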

Voyager
02-27-2009, 12:25 PM
Another way to look at brain vs computer (smackdown down down down).

A computer has a program, data, and a physical structure. All VERY digital, very sequential, and strongly deterministic.

A brain's program, data, and physical structure are all highly intermeshed. I might even go so far as to say you can't really separate them out. And I also think calling it digital is a stretch as well.

Just a quick thought for pondering...

In my understanding, the brain can and does grow new connections, which neither digital nor analog systems do - in general. However there are Field Programmable Gate Arrays (FPGAs) in which the actual circuit is controlled by a memory array. There is a distinct set of resources, but the resources used, and the connections between them, can be changed more or less on the fly. The main reason for using these is that since you have only one chip design for many applications, you get economies of scale, but some people have made use of this capability in a system. Say you are building a chess playing computer. Some of these, like Belle, have special hardware to accelerate the computation needed. If you use FPGAs, you can have one design optimized for openings, then change it for the middle game, and then change it again for the end game, with only one chip. I think Belle did use FPGAs, but I don't think the version I heard about used this capability.
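A toy illustration of "the actual circuit is controlled by a memory array" (a Python sketch, not real FPGA tooling): a two-input gate is just a four-entry lookup table, so reprogramming a few bits of memory rewires the logic.

# A 2-input "gate" is a 4-entry truth table indexed by its inputs,
# so reconfiguring the hardware means rewriting memory, not wiring.
AND_CFG = [0, 0, 0, 1]   # configuration bits for AND
XOR_CFG = [0, 1, 1, 0]   # configuration bits for XOR

def lut_gate(config, a, b):
    """Evaluate a lookup-table gate on bits a and b."""
    return config[(a << 1) | b]

config = AND_CFG                  # circuit tuned for the opening
print(lut_gate(config, 1, 1))     # 1
config = XOR_CFG                  # reprogrammed on the fly for the endgame
print(lut_gate(config, 1, 1))     # 0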

billfish678
02-27-2009, 12:35 PM
In my understanding, the brain can and does grow new connections, which neither digital nor analog systems do - in general.

IANA brain expert by ANY definition.

But you might be able to also phrase it this way.

In a brain, the structure IS the data, the structure IS the programming, the data IS the programming, the programming IS the data... blah blah blah.

It would be a real stretch IMO to say the same of current digital computers and computing, much less the growing aspect you brought up.

Now, whether that is true, or how important it is, is a whole 'nother matter for debate.

Voyager
02-27-2009, 12:38 PM
Frankly, I still think you misunderstand how a neuron works. Clearly, you have a decent understanding, but bear with me for a moment. A typical neuron will have a large number of inputs directly from other neurons, and a single output. Inputs are summed over time and space, and if these inputs reach a certain threshold, the neuron will fire, often repeatedly. The threshold, itself, is often a moving target based on all sorts of shifting electrochemical equilibria. That output is not a single, discrete event, however. The strength of a single firing can vary, the frequency of firing is variable. Both of those quantities, which really describe the output of a neuron, are continuously variable.

No, that is pretty much my understanding, but I hadn't remembered the output modulation. Digital devices are designed to avoid these kinds of problems, by sizing cells so that they won't be affected by driving too many outputs, or by putting in repeaters to make sure signals arrive at the other end of long paths in good shape.

But the fundamental difference between an analog and a digital design is that the output of an analog design is directly affected by its inputs, while in a digital design the signals going through it are regenerated at each cell. None of this implies that there must be a single bit of information coming out of a cell or neuron.
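A toy numerical sketch of that regeneration point (assuming independent Gaussian noise at each stage; nothing here models a real circuit): the analog chain lets errors accumulate, while the digital chain re-decides the bit at every cell.

import random

def noisy(v, sigma=0.05):
    """Add a little Gaussian noise, standing in for one stage of wiring."""
    return v + random.gauss(0.0, sigma)

def analog_chain(v, stages=100):
    for _ in range(stages):
        v = noisy(v)                     # errors ride along and accumulate
    return v

def digital_chain(bit, stages=100):
    v = float(bit)
    for _ in range(stages):
        v = noisy(v)
        v = 1.0 if v > 0.5 else 0.0      # signal regenerated at each cell
    return v

print(analog_chain(0.7))    # wanders, e.g. 0.31 or 1.05
print(digital_chain(1))     # 1.0, barring an absurdly unlucky stage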

Almost all digital design these days is synchronous, with frequencies fixed in each clock domain. There has been plenty of work on asynchronous designs, in which the timing of "firing" is not fixed by a clock. They are still digital, though, but have all sorts of practical problems, such that I know of several projects that started out asynchronous and ended up synchronous. The difference between computers and the brain is that computers evolve faster and use intelligent design, though some days I have my doubts.

Voyager
02-27-2009, 12:45 PM
IANA brain expert by ANY definition.

But you might be able to also phrase it this way.

In a brain, the structure IS the data, the structure IS the programming, the data IS the programming, the programming IS the data... blah blah blah.

It would be a real stretch IMO to say the same of current digital computers and computing, much less the growing aspect you brought up.

Now, whether that is true, or how important it is, is a whole 'nother matter for debate.

Sure - I'm definitely not saying that the brain has the structure of a computer or is directly analogous to a digital design. However the fact that the brain works on structure says nothing about whether it is digital or analog, since you can construct a computer that works the same way, if you wished to.

Software and hardware are equivalent at the fundamental level. These days hardware designs at the top level are expressed in software languages, and are synthesized into hardware. I know people who pretty much automatically converted a compute-intensive part of a program they had into hardware by compiling the C into hardware. This has been true since the early days of computing, when some floating point was done in software or hardware depending on the price and throughput requirements of two machines in the same family.

Voyager
02-27-2009, 12:51 PM
Quoth Chronos: Being sequential and deterministic has nothing whatsoever to do with being digital. A slide rule has a program, data, and physical structure, and its operation is very sequential and strongly deterministic, but it's completely analog. Meanwhile, some data analysis techniques implemented on digital computers are not sequential or deterministic.

How do you define sequential here? That's a word that has a lot of meanings depending on the context.

By not being deterministic, I assume you mean the use of random methods, which might be pseudorandom in practice but could be truly random if necessary. That I have no problem with.

Voyager
02-27-2009, 12:58 PM
I see what you're saying, but I think that's getting a little nitpicky and playing with definitions (after all, one could just as well argue that on an atomic level everything is digital because of quantum mechanics).
In My Ever-So Humble Opinion, synapses are far more complicated than the input to a resistor in a logic array, sufficiently so that --if the terms are going to be useful at all-- a resistor in a logic array is digital but a synapse is analog.

For instance, the resistor input has no 'memory' -- the state depends entirely on a single variable: the input voltage. In contrast, a synapse depends on a huge number of variables, including past history and how much glands on the other side of the body are pumping things into the bloodstream. And all of those variables are smoothly varying.

But lazybratsche, can neurons really fire at different strengths (as opposed to different frequencies)?

Do you mean transistor? No one has used resistors in ICs since RTL, except maybe for some very special sensor blocks or other mixed signal pieces.

States are preserved in digital designs by feedback among several memoryless elements. A brief look at a semiconductor memory book shows a static RAM cell using 6 transistors. The heart of a flop is two gates cross-coupled, but more are used for things like control.
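A toy sketch of that feedback idea (a cross-coupled NOR pair, i.e. an SR latch, iterated in Python purely for illustration): each gate is memoryless, but the loop remembers.

def nor(a, b):
    """A NOR gate: memoryless on its own."""
    return 0 if (a or b) else 1

def settle(s, r, q, qbar, iters=4):
    """Iterate two cross-coupled NOR gates until the loop settles."""
    for _ in range(iters):
        q, qbar = nor(r, qbar), nor(s, q)
    return q, qbar

q, qbar = settle(s=1, r=0, q=0, qbar=1)      # set the latch
print(q, qbar)                                # 1 0
q, qbar = settle(s=0, r=0, q=q, qbar=qbar)   # inputs released...
print(q, qbar)                                # ...still 1 0: it remembers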

I wish the analog nature of digital cells was nitpicking - almost all the interesting debug problems we see come from the fact that you can't ignore the analog effects, not at over 1 GHz. Logic problems can be debugged on a simulator, and are a lot easier to handle.

WarmNPrickly
02-27-2009, 02:21 PM
Modems use analog tones instead of Morse code to communicate. So do, to an extent, Gigabit Ethernet and Wi-Fi. When you get down to it, it's just a better, denser way of relaying information.

I think that this is actually what the multi-value voltage folk were getting at. Maybe the signal quality and overshoot/undershoot problems really are a nail. But maybe it's doable and just really, really hard. I can maybe see future computers, if they don't go quantum, moving toward multi-value voltage or quote-unquote analog for lack of another way to improve. (And YES, you can still do error correction.) But it'd be a huge jump in difficulty of design, and maybe it'd be our super-intelligent AI descendants that'd be creating them.

I'm interested in this statement, because I'm not sure if it really demonstrates analog technology. Rather, I think you are describing non-binary technology. I think a ternary, quaternary, or even decimal system can be used that is not analog. Since ultimately the modem signals have to be converted to binary, I have trouble imagining that a modem can be analog.

RaftPeople
02-27-2009, 02:26 PM
But the fundamental difference between an analog and a digital design is that the output of an analog design is directly affected by its inputs, while in a digital design the signals going through it are regenerated at each cell. None of this implies that there must be a single bit of information coming out of a cell or neuron.

This doesn't seem correct, but I'm happy to be educated. When I google to verify I see definitions based on continuous vs discrete, which is what I expected. Can you point me to any definitions?

Voyager
02-27-2009, 02:26 PM
I'm interested in this statement, because I'm not sure if it really demonstrates analog technology. Rather, I think you are describing non-binary technology. I think a ternary, quaternary, or even decimal system can be used that is not analog. Since ultimately the modem signals have to be converted to binary, I have trouble imagining that a modem can be analog.

It's a perfect example of a mixed signal device. The transmitter takes digital input and produces analog waveforms, the receiver does just the opposite.
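A toy sketch of that round trip (a 2-bits-per-symbol PAM scheme with made-up noise levels; real modems use far richer modulations): digital bits go in, a noisy analog waveform crosses the line, and slicing recovers the bits.

import random

LEVELS = [-3.0, -1.0, 1.0, 3.0]    # one analog level per 2-bit symbol

def transmit(bits):
    """Digital in, analog out: map bit pairs to noisy line levels."""
    return [LEVELS[(bits[i] << 1) | bits[i + 1]] + random.gauss(0, 0.2)
            for i in range(0, len(bits), 2)]

def receive(wave):
    """Analog in, digital out: slice each sample to the nearest level."""
    bits = []
    for v in wave:
        sym = min(range(4), key=lambda s: abs(LEVELS[s] - v))
        bits += [(sym >> 1) & 1, sym & 1]
    return bits

data = [1, 0, 0, 1, 1, 1, 0, 0]
print(receive(transmit(data)) == data)   # True, for modest line noise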

Voyager
02-27-2009, 02:40 PM
This doesn't seem correct, but I'm happy to be educated. When I google to verify I see definitions based on continuous vs discrete, which is what I expected. Can you point me to any definitions?

I don't think there is any place that is going to give a definition covering exactly what we're discussing here.

Let me explain. When you go down deep enough, everything is analog. (Ignoring quantum level.) The waveforms between gates are not square waves, and signals between neurons are not digital either.

The next level up, you can have devices like amplifiers or resistors for which the output is a direct function of the inputs, or you can have an output controlled by the inputs but which does not flow directly from the inputs. lazybratsche mentioned a pool of neurotransmitters sitting at the output of a neuron which gets squirted into the synapse. The neurotransmitters received at the input don't go right through. In a gate, the electrical signal received is used to turn on and off transistors, and the voltage on the output comes from a distinct voltage source, not the input. That is how a small signal at the input, which might be very messy, gets regenerated into a strong clean signal at the output.

My source is 30 years in this business, being program chair of conferences which cover both analog and digital designs, and doing book reviews on books containing chapters on analog and digital designs. I'm not an expert in analog or mixed signal stuff by any means, but I've been exposed to more of it than most people.

Frylock
02-27-2009, 02:44 PM
Voyager, this seems different from what you said before, and what you said before seems a lot more in keeping with the ideas of analog and digital I'm familiar with.

To quote your previous comment I'm referring to:

The real distinction between an analog and digital is that the output of an analog block is a smooth function of its inputs, while the output of a digital cell may be controlled by its inputs, but isn't directly proportional to them.

RaftPeople
02-27-2009, 02:53 PM
Let me explain. When you go down deep enough, everything is analog. (Ignoring quantum level.) The waveforms between gates are not square waves, and signals between neurons are not digital either.

The next level up, you can have devices like amplifiers or resistors for which the output is a direct function of the inputs, or you can have an output controlled by the inputs but which does not flow directly from the inputs. lazybratsche mentioned a pool of neurotransmitters sitting at the output of a neuron which gets squirted into the synapse. The neurotransmitters received at the input don't go right through. In a gate, the electrical signal received is used to turn on and off transistors, and the voltage on the output comes from a distinct voltage source, not the input. That is how a small signal at the input, which might be very messy, gets regenerated into a strong clean signal at the output.

Yes, completely understand all of that. But if the output is continuous as a function of the input and not discrete, to me that is analog, regardless of the number of intermediaries or the specific mapping of input to output.

It seems to me that you are saying that even if the output is not discrete it can still be considered digital. Now I fully understand the fact that digital computers are created using analog mechanisms, but in the case of digital computers we can follow the chain all the way through and be sure it's digital.

The brain, on the other hand, has so many analog actions that don't seem to be clearly transformed into digital (e.g. graded potentials) that it seems like we can't really say yet which it is.

Chronos
02-27-2009, 04:22 PM
True enough.

But are you of the "brain is basically digital" camp, or not?

To the extent that most neurons have an output that is either "fire" or "not fire", the brain is mostly digital. Now, there may be neurons which can fire at varying strengths, and if so, those portions are analog, but this is the first I've heard of them.

There are, of course, some fundamental differences between the way the brain works and the way most non-brain computers work, but they're differences other than the analog-digital divide.

Voyager
02-27-2009, 05:12 PM
Voyager this seems different from what you said before, and what you said before seems a lot more in keeping with the ideas of analog and digital I'm familiar with.

To quote your previous comment I'm referring to:

I'm probably not expressing myself well. Take this comment from lazybratsche:
A typical neuron will have a large number of inputs directly from other neurons, and a single output. Inputs are summed over time and space, and if these inputs reach a certain threshold, the neuron will fire, often repeatedly. The threshold, itself, is often a moving target based on all sorts of shifting electrochemical equilibria. That output is not a single, discrete event, however. The strength of a single firing can vary, the frequency of firing is variable. Both of those quantities, which really describe the output of a neuron, are continuously variable.

The input is continuously varying, check. The output is continuously varying, check. But the output does nothing until the threshold is achieved. If you have ten inputs from other neurons, and reduce the strength of one by 50%, you are not going to get a 10% reduction on the output. You might get a total reduction, if you've missed the threshold, or practically none.
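A toy threshold sketch of exactly that point (numbers invented for illustration): halving one of ten inputs either kills the output entirely or changes nothing, but never scales it by a proportional amount.

def fires(weights, threshold):
    """Toy neuron: fire if and only if the summed input crosses threshold."""
    return sum(weights) >= threshold

full = [1.0] * 10          # ten inputs at full strength
weak = [0.5] + [1.0] * 9   # one of the ten reduced by 50%

print(fires(full, 9.8), fires(weak, 9.8))   # True False -> total reduction
print(fires(full, 9.0), fires(weak, 9.0))   # True True  -> practically none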

The input to a gate is somewhat similar to this. If you have a three-input AND, and change all inputs from 0 to 1, you get a voltage rise on the inputs, and the gate output changes value when all three exceed the threshold voltage of the input. Now the gate output holds steady, so we don't have the interesting varying output we see in a neuron. There are some specialized cells that do, but I think we're talking basic building blocks here, so they shouldn't count.

I'd guess that since the brain is based more on living cells and structures that carry signal, moderated by chemicals, things are inherently messier (more analog) than designed computer cells. I suspect evolution took advantage of some of this, and now uses multivalue logic.

When I was a freshman they thought CS majors weren't ready for real hardware, so we learned about circuits with blocks called integrators and dividers and stuff. Then they told us that these were really capacitors and resistors and the like. You can learn analog circuits that way - you can't describe a digital circuit that way. Unless I totally misunderstand the design of the brain, you can't model a batch of neurons with resistors and capacitors either.

WarmNPrickly
02-27-2009, 06:40 PM
It's a perfect example of a mixed signal device. The transmitter takes digital input and produces analog waveforms, the receiver does just the opposite.

Why is there anything mixed about it? Certainly the signal that goes to the modem is digital. The modem converts it to whatever the modem uses, then converts it back to digital. The carrier signal may be analog, but it is clearly communicating in digital format. I don't see how it is any less digital than flash memory. Certainly, each stored bit on a flash drive could technically exist in an intermediate state, but it would be interpreted as an error, the same way a modem would interpret something that doesn't fit its own format.

Voyager
02-27-2009, 06:59 PM
Why is there anything mixed about it? Certainly the signal that goes to the modem is digital. The modem converts it to whatever the modem uses, then converts it back to digital. The carrier signal may be analog, but it is clearly communicating in digital format. I don't see how it is any less digital than flash memory. Certainly, each stored bit on a flash drive could technically exist in an intermediate state, but it would be interpreted as an error, the same way a modem would interpret something that doesn't fit its own format.

That's the definition of mixed signal! You start, say, with digital, using gates and stuff, and then feed a digital signal into a d2a (digital to analog) circuit which feeds a traditionally analog chunk. And vice versa, of course. Draw a box around some circuitry. If it has only resistors and capacitors and such, it is an analog circuit. If it only has gates, flops, and memory, it is a digital circuit. If it has both, it is mixed signal.

Want to make it more complicated? Inside the digital parts, there are buffers that do nothing except strengthen the signal. You don't put them into a schematic; they get added when you "power up" the design to make timing. Old style normal I/Os have an analog part, since they have to be powerful enough to drive a signal off a chip. I was involved in a debug once where the internals of a chip were working perfectly, but it totally failed any tests where the test was applied at the inputs, and pretty much all the outputs were bad. It turned out that there was an analog input to our I/O cells which people forgot about, and which was not connected. The digital-only netlist was fine, and digital people like me ignore the analog signals because they are just irrelevant to what really matters - except when they're not. It is really hard to see that something is missing when you've trained yourself to ignore it.

billfish678
02-27-2009, 07:07 PM
I'm too tired to fight tonight. I'll just say this regarding the brain digital/analog debate.

It's my impression that folks are stretching the analog toward the digital in one direction and the digital toward the analog in the other, and with enough stretching from both directions you can "get" them to meet in the middle and say "taa daa, see, they are both the same!"

What about that previous post where the researchers are building a physical analog model that will model a whopping 100 neurons! Then the poster goes on to claim that the reason they are doing that is because it would take one hell of a digital computer to simulate it.

A digital computer that has millions of transistors, gates, and reticulated framuses, operates at gigahertz speeds, and has access to megabytes/terabytes of data can't easily simulate a measly 100 neurons.

If true, that tells me that calling neurons digital, even if technically true at some level, is just barely this side of a lie.

Just my opinion mind you.

And get off my analog lawn!

billfish678
02-27-2009, 07:37 PM
Oops!

Regarding my previous post: note, I am NOT, repeat NOT, calling anyone here a liar, claiming they are lying, or saying anything lie-ish is going on.

I was just trying to make the point that, IMO, if condition XYZ was true, saying ABC would be so factually misleading, even though it may be technically correct, that it would be the informational equivalent of a lie.

Please, forgive this old analog fool!

No offense was intended.

And still get off my lawn!

RaftPeople
02-27-2009, 07:52 PM
Here is an interesting article on various ways neurons/synapses encode information:

http://www.pnas.org/content/94/24/12740.full

Some methods (a toy sketch of the first two follows the list):
1) Firing rate
2) Relative timing of previous action potentials
3) Phase of action potential relative to oscillations
4) and others
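A toy sketch of the first two schemes (spike times invented for illustration): the very same train carries both a rate and a timing pattern.

spikes = [0.010, 0.022, 0.031, 0.055, 0.061]   # spike times, in seconds

# 1) Rate code: the information is in how often the neuron fires.
window = 0.1                                    # observation window (s)
rate = len(spikes) / window

# 2) Timing code: the information is in the gaps between spikes.
intervals = [round(b - a, 3) for a, b in zip(spikes, spikes[1:])]

print(rate)        # 50.0 spikes per second
print(intervals)   # [0.012, 0.009, 0.024, 0.006]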



Here's a paper exploring the digital/analog debate.
http://watarts.uwaterloo.ca/~celiasmi/Papers/ce.2000.continuity.debate.csq.html

"As well, the neurophysiological evidence for discreteness is not conclusive. Though it strongly suggests that we are discrete in state (with respect to neurons), this does not mean we are discrete in time (van Gelder 1995). Even though the spikes themselves are all-or-none, the precise distance between any two spikes can only be expressed as a real number. If these distances are the basis of neural information processing, the brain is clearly continuous."

If this person is correct in that last statement, and given the above encoding schemes, which are time dependent, that would seem to imply analog.


But I think this is a key statement because clearly both analog and digital activity is happening:
"...the heart of the debate lies in the question of whether cognition can be explained by analog or digital processes. The strict analogicity of the depolarization of neuron membranes may be uninteresting if its analog nature does not affect the cognitive behavior of the system. Similarly, even though the transmission of neural spikes is digital, if this digitalness is not relevant to cognition (perhaps only the time between spikes is relevant) then we should not consider the brain to be digital."


I haven't read the rest of this person's paper yet, but I do know he thinks the brain is digital.

Voyager
02-27-2009, 07:59 PM
Oops!

Regarding my previous post: note, I am NOT, repeat NOT, calling anyone here a liar, claiming they are lying, or saying anything lie-ish is going on.

I was just trying to make the point that, IMO, if condition XYZ was true, saying ABC would be so factually misleading, even though it may be technically correct, that it would be the informational equivalent of a lie.

Please, forgive this old analog fool!

No offense was intended.

And still get off my lawn!

That's okay. I published a column by an analog guy about how analog was just better. A colleague of mine at Bell Labs, a pretty well known analog guy, had a good time telling all us digital guys that everything was actually analog. And I had lunch with Bob Pease not all that long ago. So I know that analog guys are kind of, well, odd. :D

I think the problem is the several levels of meaning of analog, and that we don't see the abstraction levels of the brain all that clearly, since we don't have to design the damn thing. I bet if all we had to go on was the most basic semiconductor level representation of a microprocessor, we'd be damn sure the thing was an analog design.

Voyager
02-27-2009, 08:02 PM
Here's a paper exploring the digital/analog debate.
http://watarts.uwaterloo.ca/~celiasmi/Papers/ce.2000.continuity.debate.csq.html

"As well, the neurophysiological evidence for discreteness is not conclusive. Though it strongly suggests that we are discrete in state (with respect to neurons), this does not mean we are discrete in time (van Gelder 1995). Even though the spikes themselves are all-or-none, the precise distance between any two spikes can only be expressed as a real number. If these distances are the basis of neural information processing, the brain is clearly continuous."

Thanks for the links. I'll try to read them over the weekend. If it turns out that the fundamental unit of information in the brain is a continuously varying delta between events, and that this carries through the neuron (perfectly possible) then I agree the brain is analog. It almost sounds like a neuron is doing Fourier transforms or something.

RaftPeople
02-27-2009, 08:07 PM
Thanks for the links. I'll try to read them over the weekend. If it turns out that the fundamental unit of information in the brain is a continuously varying delta between events, and that this carries through the neuron (perfectly possible) then I agree the brain is analog. It almost sounds like a neuron is doing Fourier transforms or something.

I just finished that second link, and he takes apart the analog argument by showing that there is a limit to the information that can be encoded by the firing rate. He didn't reach a final conclusion (i.e., that all cognition is clearly digital) due to the many levels to consider, but it's interesting logic.

Alex_Dubinsky
02-27-2009, 08:50 PM
I think the problem is the several levels of meaning of analog, and that we don't see the abstraction levels of the brain all that clearly, since we don't have to design the damn thing. I bet if all we had to go on was the most basic semiconductor level representation of a microprocessor, we'd be damn sure the thing was an analog design.
Yup, exactly. You can make digital out of analog, and you can make analog out of digital. These things change depending on scale (i.e., level of abstraction), and are not decided by the most basic constituent.

(This supposed "true nature" gets erased with each shift in scale, at least enough to become part of the background noise. The analog nature of digital is effectively erased when the rate of hardware errors is smaller than of software crashes. The digital nature of a weather simulation is erased when the quality of the pseudorandom number generator and precision of the floating-point numbers has less significance than the quality of the atmospheric model. It's all relative, baby.)

RaftPeople
02-28-2009, 04:15 PM
Thanks for the links. I'll try to read them over the weekend. If it turns out that the fundamental unit of information in the brain is a continuously varying delta between events, and that this carries through the neuron (perfectly possible) then I agree the brain is analog. It almost sounds like a neuron is doing Fourier transforms or something.

I read it again to make sure I followed it. I enjoyed his explanation, logical but short and sweet. A few thoughts:

1) He states that a real number contains infinite information because it requires an infinite bit string to represent it. This seems to imply a real number contains more information than an integer, which doesn't make sense to me. An infinite bit string has to do with converting the value to some other representation, which is not the value itself. I would imagine there is a different transformation that would result in the integer requiring infinite bits to represent and the real number requiring finite bits.

2) I'm glad I read his explanation of noise constraining the amount of information that can be transmitted; I can see now how even a continuously varying signal can be more limited than it first appears.

3) At the end, he leaves it open as to whether the brain is analog at a higher level. Thoughts on how it could be analog at a higher level if digital at the level he discusses?

Voyager
02-28-2009, 07:56 PM
I read it again to make sure I followed it. I enjoyed his explanation, logical but short and sweet. A few thoughts:

1) He states that a real number contains infinite information because it requires an infinite bit string to represent it. This seems to imply a real number contains more information than an integer, which doesn't make sense to me. An infinite bit string has to do with converting the value to some other representation, which is not the value itself. I would imagine there is a different transformation that would result in the integer requiring infinite bits to represent and the real number requiring finite bits.

2) I'm glad I read his explanation of noise constraining the amount of information that can be transmitted; I can see now how even a continuously varying signal can be more limited than it first appears.

3) At the end, he leaves it open as to whether the brain is analog at a higher level. Thoughts on how it could be analog at a higher level if digital at the level he discusses?

I'm glad this guy is on my side. The paper is badly flawed, and since it supports my position I don't need to spend as much time refuting it as if it supported the other one. He references Shannon, but he clearly does not understand him. Information transfer requires energy - infinite information would require infinite energy. It is not a matter of analog or digital - Shannon developed information theory on very analog telephone lines. Also, transmitting infinite information would require either infinite bandwidth or infinite time. I know he comes down on the right side of this argument, but he doesn't seem to understand the basics of information.
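For concreteness, the standard Shannon-Hartley formula, C = B log2(1 + S/N), makes the finite-information point with made-up numbers: any noise at all caps the rate, and infinite information would need infinite bandwidth, time, or signal-to-noise ratio.

import math

def capacity(bandwidth_hz, snr):
    """Shannon-Hartley limit: the most bits per second an analog channel
    of given bandwidth and signal-to-noise ratio can carry error-free."""
    return bandwidth_hz * math.log2(1 + snr)

# A ~3 kHz phone line at ~30 dB SNR (linear SNR = 1000):
print(capacity(3000, 1000))   # ~29900 bits/s: finite, because noise > 0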

He is making a better argument when he says
The case is analogous in neural systems. For a neuron to receive information from its neighbor, it must examine its neighbor’s message in the presence of noise. For a neuron to send a message to its neighbor, it must encode the message in the presence of noise. It is truly amazing at how effective neurons are at preserving the information they pass through encoding and decoding spike trains. In fact, neural systems approach the theoretical limits of coding efficiency. Nevertheless, there is a finite amount of information passed per neural spike. This allows us to consider the system as discrete in time on a time scale of about one millisecond.

"Examine" is metaphorical here, but it captures the discrete nature of the structure of the brain. Resistors and other analog components don't examine anything - a logic gate does. But he is saying this as an assertion, so some might object.

The real problem, which is connected to your question 3, is that analog and digital don't make much sense at a higher level. Most people would say a computer is digital, but it has tons of analog - in the screen driver, the disk, etc. We can make a clear distinction in small parts of the design, but when you get high enough it is mixed. We might say that the computer is digital because most of its computation is done digitally, and that requires you look at the nature of the majority of its components. So, it all boils down to what you think the neuron is. Without seeing a direct path from input to output, the way an analog component has, I'll still call it digital.

brujaja
03-02-2009, 06:59 AM
He wasn't mistaking it for Quantum computing (http://en.wikipedia.org/wiki/Quantum_computer) by any chance?

Dang it, Pushkin! I clicked on your innocent-looking link, and then spent hours at Wikipedia!

lazybratsche
03-03-2009, 04:09 PM
I feel like we're debating "does a dog have Buddha nature?" here. Still, I'm going to stick with "analog" for the brain, with full disclosure that it's just a gut feeling and that reality usually doesn't pay much attention to my gut.

On a more practical level, I'm not sure if it's useful to call a neuron or circuit "digital". We've got a pretty good understanding of how a single neuron behaves, and can model it really well (by biological standards at least). We have a decent understanding of very small circuits... but beyond that, the brain is still very mysterious. Questions on the nature of cognition are still the domain of philosophers. Trying to call it "analog" or "digital" leads to interesting debates on a message board, but I'm not sure it'll be a productive way of trying to understand the brain. Personally, I'm unimpressed by the "Continuity Debate" linked above -- way too many arguments based on how things "seem". Sure, you can model neural systems with a discrete computer with some success, but you can also do that for the weather.

I guess that's why I'm throwing in with the neuroscientists. I likes me some good empirical data.

Voyager
03-04-2009, 01:28 AM
Personally, I'm unimpressed by the "Continuity Debate" linked above -- way too many arguments based on how things "seem". Sure, you can model neural systems with a discrete computer with some success, but you can also do that for the weather.


It lives! I'm unimpressed with the continuity argument also. As for modeling, I don't think that would prove anything. There is structural modeling, and there is functional modeling. I don't see how weather can be modeled structurally. The brain might be (not that we understand it well enough to do a good job).
But I also don't think the question of whether the brain is analog or digital has much meaning - it is clear it is both. Neurons, on the other hand ...

Jragon
03-04-2009, 11:04 PM
Back to the whole "what's the future of the computer" thing: wouldn't it make (some) sense that, at an engineering level, once they start hitting roadblocks with digital circuitry, a somewhat logical progression would be to experiment with adding analog components onto an existing digital system, using the digital system only for digitally accomplishable tasks and as a controller for the new analog components?

I think someone on the first page used an example about how a digital device can't determine if a given number is rational but an analog device can (not sure of the veracity of that, but I'll go with it since it sounds plausible). So you enter your number into the computer and hit the "findIsRational" button; the computer takes in the number and passes it to some analog device sticking off the motherboard, which does its thing and then returns a simple boolean back to the digital part of the computer, which then prints it for you. Obviously this sort of boolean computing is a really simple example, but you could probably accomplish some pretty amazing stuff if you were creative with how the device structures its return to the digital part of the computer.

Alex_Dubinsky
03-05-2009, 02:29 AM
I'm unimpressed with the continuity argument.

You should be.

The argument, though, doesn't prove that analog is digital. It proves that, philosophically, there's not much difference. Digital signals carry finite information, and analog signals carry finite information. Moreover, that information can be transformed from one representation to another and it's the same fn information.

Analog and digital aren't philosophical concepts. They're labels for a couple specific technologies we often use. It's easy to have other technologies, like the aforementioned modem and ethernet, that don't really fit either of those labels. And there's nothing mystical about it.

I think when people talk philosophically about analog vs digital, they're really trying to point out the difference between functions which are smooth and stable and functions which are highly dependent on starting conditions and have constrained behavior. But, in fact, a digital program can easily be smooth and well-behaved and an analog circuit can easily be chaotic.

Voyager
03-05-2009, 02:38 AM
Back to the whole "what's the future of the computer" thing: wouldn't it make (some) sense that, at an engineering level, once they start hitting roadblocks with digital circuitry, a somewhat logical progression would be to experiment with adding analog components onto an existing digital system, using the digital system only for digitally accomplishable tasks and as a controller for the new analog components?

As I mentioned, System-on-Chip devices commonly include analog blocks. The rational number finder is nonsense, but there are plenty of things analog does better than digital - like the radio transmission in your cellphone, for example. Nothing new about that at all.

Voyager
03-05-2009, 02:47 AM
You should be.

The argument, though, doesn't prove that analog is digital. It proves that, philosophically, there's not much difference. Digital signals carry finite information, and analog signals carry finite information. Moreover, that information can be transformed from one representation to another and it's the same fn information.

The thing that impressed me the least was that he thought he had to find a complex counterargument against infinite information in an analog symbol - especially when he mentioned information theory.


Analog and digital aren't philosophical concepts. They're labels for a couple specific technologies we often use. It's easy to have other technologies, like the aforementioned modem and ethernet, that don't really fit either of those labels. And there's nothing mystical about it.

I think when people talk philosophically about analog vs digital, they're really trying to point out the difference between functions which are smooth and stable and functions which are highly dependent on starting conditions and have constrained behavior. But, in fact, a digital program can easily be smooth and well-behaved and an analog circuit can easily be chaotic.
I think I covered all this already. Digital computing has won because it is more reliable and more scalable - and much easier to design and program.
The philosophical meaning of analog and digital seems kind of pointless. What we are doing here is looking at a design and wondering how to classify it. I don't think the philosophers have ever read a netlist, and so don't really get what digital is.