computer = human brain

If there were a computer with the strength and performance of an (average?) human brain, what would it be like? How much memory, what speed, how big a hard drive?

I assume the OS wouldn’t be Windows.

This is a bad comparison; brains are not computers, and they work in fundamentally different ways. (Hahahahaha!)

Seriously, brains work with electrochemical signals which are much slower than the electrical signals a computer uses.

I’m sure there are other problems too, but I can only think of that one extemporaneously.

P.S. There would be no OS. Unless you believe in mind control.

OK, but how much slower? Can’t the speed be measured in megahertz? And is there no way of measuring the memory capacity of the human brain that could then be translated into megabytes?

No accurate way. Besides being used for completely different purposes (brains control a body, whereas computers are built to interact with one), every brain varies from the next, which would seriously skew any attempt at measurement. In addition, we created computers and thus know how they work; we are still figuring out brains. The components are also very different.

This will never be a fair comparison.

The speed has been adequately covered in “Light vs Thought.”

I’ll look into some of this; I know I’ve seen it recently. However, the above posters are right. A brain’s circuitry works in a fundamentally different way from conventional computers, and we are only just beginning to explore the details of this difference.

Furthermore, because of that difference, any conventional computer designed to “think” the way a human does would require even more computing power and memory than a straight comparison would indicate. The computer would have to emulate the human thought process, sort of like the software programs out there that emulate old console video games.
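To make the emulation point a bit more concrete, here’s a toy sketch (Python, purely illustrative; the instruction set and the program are made up) of why emulation is expensive: every single step of the “guest” machine turns into several steps on the host, just as a console emulator burns many host cycles interpreting each guest instruction.

    # Toy illustration: emulating a made-up 3-instruction machine.
    # Each guest instruction costs many host operations, which is why an
    # emulated brain would need far more raw power than a native one.

    def run_guest(program, registers):
        """Interpret a list of (op, a, b) tuples for a fictional machine."""
        for op, a, b in program:
            if op == "mov":      # registers[a] = b (immediate value)
                registers[a] = b
            elif op == "add":    # registers[a] += registers[b]
                registers[a] += registers[b]
            elif op == "mul":    # registers[a] *= registers[b]
                registers[a] *= registers[b]
        return registers

    regs = run_guest([("mov", "r0", 6), ("mov", "r1", 7), ("mul", "r0", "r1")],
                     {"r0": 0, "r1": 0})
    print(regs["r0"])   # 42 -- but each guest op took a dozen-plus host ops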

Here’s a neat little bit on neurocircuitry, or whatever you want to call it:

http://www.lucent.com/press/0600/000621.bla.html

And here’s a quote from a fascinating article that I have only briefly skimmed:

For ease of analysis we make the assumption that all the 10^14 [ten to the fourteenth] synapses function as memory elements, remembering some state information about past actions. The sampling time is taken to be limited by the width of the ionic pulse travelling along a neuron. The brain effectively runs asynchronously at a bit rate of about 100 bit/s. Although not used here (since we are not comparing systems) the input bit rate is the sum of all the sensory nerves. This is dominated by the eye which has about 127M rods and cones concentrated down to about 1M neurons, and thus gives rise to an input bit rate of about 100 Mbit/s. The rest (sound, tactile, taste and smell) adding up to no more than one tenth of this figure.

From http://btlabs1.labs.bt.com/people/cochrap/papers/brain9a/brain9a.html
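Putting the quoted figures together (taken at face value; the one-byte-per-synapse figure in the last line is my own simplifying assumption, not the paper’s):

    # Back-of-the-envelope using the Cochrane et al. figures quoted above.
    synapses  = 1e14          # memory elements, per the paper
    firing    = 100           # bit/s effective asynchronous rate, per the paper
    eye_input = 1e6 * 100     # ~1M optic-nerve neurons at ~100 bit/s each

    print(f"Eye input rate: {eye_input / 1e6:.0f} Mbit/s")     # ~100 Mbit/s
    # If each synapse held just one byte of state (a big assumption):
    print(f"Synaptic store: {synapses * 1 / 1e12:.0f} TB")     # ~100 TB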

I recall reading the World Book Encyclopedia as a kid (1967 edition, IIRC), which informed us that a computer that could do the work of a human brain would be the size of the Empire State Building.

So there you go.

As I recall, Hans Moravec came up with some estimates of processing and memory capacity in his book Mind Children (available on Amazon: http://www.amazon.com/exec/obidos/ASIN/0674576187/o/qid=963944540/sr=8-3/ref=aps_sr_b_1_5/104-0703661-0398305).

It is fairly difficult to make a one-to-one comparison of brains and computers, because one uses a massively parallel set of quite slow “processors” while the other uses a much faster but serial processor. You can, of course, hook up a whole bunch o’ Pentiums or PowerPCs in parallel, but the hardware and software issues are complex and probably become insurmountable at some fairly small (< 100) number of processors. At that point you have to switch to networked groups of shared-memory parallel processors, and coordinating all of those processors becomes a major problem. And you still don’t end up with anything that strongly resembles the brain in anything except, possibly, raw computing power.
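One standard way to see why “just add more Pentiums” runs out of steam is Amdahl’s law (my framing, not the previous poster’s): if any fraction of the work is serial or spent on coordination, the speedup flattens out no matter how many processors you bolt on. The 5% overhead below is an arbitrary illustrative number.

    # Amdahl's law: speedup = 1 / (serial_fraction + (1 - serial_fraction) / n)
    def speedup(n_procs, serial_fraction=0.05):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

    for n in (1, 10, 100, 1000, 1_000_000):
        print(f"{n:>9} processors -> {speedup(n):6.1f}x speedup")
    # With even 5% serial/coordination overhead, a million processors
    # top out around 20x -- nothing like the brain's massive parallelism.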

curwin asks:

Technically, the clock speed of the human brain is about 500 Hz. Where the brain recoups its processing speed is in the architecture: it’s highly parallel. So where your PC has only one monolithic processor churning away at several hundred megahertz, the brain has effectively billions of tiny processors churning away at only 500 Hz.
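As a rough sanity check on those numbers (the neuron count is a common textbook estimate that I’m supplying; the 500 Hz and the several-hundred-megahertz PC are from the post above):

    # Crude aggregate-throughput comparison.
    neurons     = 1e11        # ~100 billion neurons (common textbook estimate)
    neuron_rate = 500         # Hz, per the post above
    cpu_rate    = 500e6       # a nominal "several hundred megahertz" PC

    brain_ops = neurons * neuron_rate     # ~5e13 very simple "ops" per second
    cpu_ops   = 1 * cpu_rate              # ~5e8 instructions per second

    print(f"Brain aggregate: {brain_ops:.0e} ops/s")
    print(f"Single CPU:      {cpu_ops:.0e} ops/s")
    print(f"Ratio:           {brain_ops / cpu_ops:.0e}x")   # ~1e5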

I’ll assume you mean megabytes… [wink]
The problem is that you’re assuming the brain stores information the same way computers do. However, the mechanism for memory storage in the human brain is not well understood. In the first place, it almost certainly isn’t binary. In the second place, the mechanisms aren’t hard-wired: it’s as if the architecture is constantly shuffling stuff around in memory, optimizing and correlating related information. In the third place, there are apparently many levels of redundancy. However, I did hear one projection (I think it was from Roger Penrose, but I’ll try to double-check that) that the human capacity for memory is probably hundreds of terabytes.
Sofa King quoted the paper from Cochrane, Winter, and Hardwick, but I thought it only fair to point out that the 100 bit/s figure is possibly misleading. It presumes a binary transfer of information, whereas an actual synaptic response is weighted; depending on the resolution, the amount of information transferred in one synaptic event could therefore be equivalent to dozens or even hundreds of bits. Also, their 100 Mbit/s input rate from the eye suffers from a similar failing. Hans Moravec of the Robotics Institute at Carnegie Mellon University has placed this estimate much closer to 100 million Mbit/s (a million times greater). See: http://www.transhumanist.com/volume1/moravec.htm
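To put a number on the “weighted, not binary” point: if one synaptic event can be distinguished at N different strengths, it carries log2(N) bits rather than 1. The resolutions below are hypothetical; nobody knows the real figure.

    import math

    # Information per synaptic event if the response is graded rather than
    # binary, at a few hypothetical resolutions.
    for levels in (2, 16, 256, 4096):
        print(f"{levels:>5} distinguishable levels -> {math.log2(levels):4.0f} bits/event")
    # A binary model (2 levels) gives 1 bit; 256 levels would give 8 bits,
    # which is why the 100 bit/s figure above may be a serious underestimate.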

Here, Moravec estimates the memory capacity at around 100 million megabytes (roughly 100 terabytes). Again, we know so little about the way the human brain stores information that Moravec could be off by several orders of magnitude on the low side…

To illustrate the differences in human and computer memories: Suppose you have an image that you want to memorize, say, the face of a person. A computer will store it as a bitmap or jpeg, or some other such format, with an easily definable size. A person, on the other hand, will use a series of templates: A human memory of that picture might look something like “Human face… Female… Caucasian… Brown hair… Long, curly hair… Pointy nose… big eyes… rounded cheeks… looking to the left”, or even something like “Aunt Sue… Looking to the left”. In either case, it’s stored very compactly, being constructed from elements of already-memorized images, but with loss of many of the details present in a computer image. For instance, you may remember a picture very well, but do you remember which pair of earrings Aunt Sue was wearing? Probably not, unless you specifically took note of them when you looked at the picture.
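A toy way to see the size difference Chronos is describing (the image dimensions and the tag list are just for illustration):

    # Rough storage comparison: raw bitmap vs. a template/feature description.
    width, height, bytes_per_pixel = 640, 480, 3
    bitmap_bytes = width * height * bytes_per_pixel          # ~900 KB uncompressed

    description = ["human face", "female", "caucasian", "brown hair",
                   "long curly hair", "pointy nose", "big eyes",
                   "rounded cheeks", "looking left"]
    template_bytes = sum(len(tag) for tag in description)    # ~100 bytes of tags

    print(f"Bitmap:   {bitmap_bytes:,} bytes")
    print(f"Template: {template_bytes:,} bytes (plus pointers to stored concepts)")
    # The compact form wins enormously -- but the earrings are gone.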

Wood Thrush’s comments are beyond useless. The fact that brains and computers work differently doesn’t make it impossible to estimate a comparison. Anybody can point out why things are difficult; the OP asked a specific question, and there is a generally agreed-upon body of answers to that question.

curwin, here are a couple of good reads on the subject:
http://www.merkle.com/humanMemory.html
http://www.merkle.com/brainLimits.html

Chronos wrote:

While I agree that “associative memory” is definitely one of the tools used by the human brain to store information, it is by no means the only mechanism. I’m not sure if that’s what you were trying to imply.

Bill wrote:

Maybe not, but it might temper how relevant that comparison is. For instance, I tend to think it’s pointless to compare Wintel computers to Macintoshes in terms of MIPS, and they are a lot closer, architecturally, than the human brain and binary monolithic memories are.

Actually, I think you’ll find pretty wide variance across the “panel of experts”. For instance, in your first link, Landauer estimates the human brain capacity at a few hundred megabytes. Most other estimates I’ve seen hold that figure to be many orders of magnitude larger. Landauer’s approach was interesting, but I question his fundamental finding that human beings remember “nearly two bits per second under all the experimental conditions”. This doesn’t seem to pass the general sanity test. Consider the two-word phrase “CHOCOLATE ELEPHANT” and commit it to memory. It probably took you less than 2 seconds to read and memorize it, yet it represents at least 126 bits (probably more, since you undoubtedly have some pseudo-visual cues involved in the memorization). For another example, go to the SD homepage and take a quick look at the graphic. How much information from that picture do you think you could commit to memory in, say, 10 seconds? I dare say it would be a lot more than a mere 20 bits…
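For anyone checking the arithmetic (7 bits per character is the usual ASCII assumption):

    phrase = "CHOCOLATE ELEPHANT"
    bits_in_phrase = len(phrase) * 7    # 18 characters * 7 bits (ASCII) = 126 bits
    landauer_2s    = 2 * 2              # Landauer's rate over the ~2 s it takes to read
    landauer_10s   = 2 * 10             # and over the 10 s spent on the graphic

    print(bits_in_phrase, landauer_2s, landauer_10s)   # 126 vs 4 vs 20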