PhineasJWhoopy asks:
> How about some clarification on IQ? Cecil
> disparages the estimate that Sidis’ IQ was
> 250-300 (“on what basis could the scale be
> run up that high?” he asks rhetorically).
> My understanding is that IQ (“intelligence
> quotient”) is simply the ratio of mental
> age to chronological age. So if Sidis is
> as intelligent as a 9-year-old at age
> three, he’s got an IQ of 300. Does this
> not jibe with what we know of Sidis?
>
> That said, I see problems with the whole
> IQ concept for adults. The concept assumes
> that mental ability increases linearly
> with age. Does it? If a 25-year-old has an
> IQ of 300, does it mean he has the mind of
> a 75-year-old? Would you want that
> distinction?
There are two definitions of I.Q. The old one was, as you suggest, based on mental age: passing the test that a typical child of a given age can pass defines that mental age. If a three-year-old passes a test that assigns him a mental age of nine, he has an I.Q. of 100 times the quotient of 9 divided by 3, or 300. The problem with this definition is that it makes it impossible for a person's I.Q. to stay constant over his lifetime, and constancy was supposed to be part of the point of an I.Q. Thus this definition hasn't been popular for a while.
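If you want the arithmetic spelled out, here's a quick sketch in Python (the function name is mine, not anything official):

    # Ratio I.Q., the old definition: 100 * mental age / chronological age.
    def ratio_iq(mental_age, chronological_age):
        return 100 * mental_age / chronological_age

    print(ratio_iq(9, 3))    # the three-year-old with a mental age of nine: 300.0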
When Marilyn Vos Savant claims to have an I.Q. of 228, she's using the old definition. Apparently she passed a test which gave her a mental age of 16 when she was 7, and 100 times 16 divided by 7 comes out to about 228.6. For a long time I was convinced she was simply lying about her I.Q., since I only knew the current definition of the term.
The modern definition doesn't require the idea of a mental age at all. All you have to do is give a test to everyone in a certain group (all children of a given age, or all adults) and rank them according to how well they did. This modern definition of I.Q. is based on the idea of a standard deviation. Get yourself a statistics book and look up the term. Here's a URL with some sketchy definitions of statistical terms:
http://nilesonline.com/stats
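Here's a rough sketch of mine (not from that page) of how a rank within the norming group turns into a score, using the scale conventions spelled out in the next paragraph (mean 100, standard deviation 15):

    from statistics import NormalDist

    def modern_iq(rank, group_size, mean=100, sd=15):
        # Fraction of the norming group this person outscored (rank 1 = best),
        # mapped onto a normal curve with the conventional mean and deviation.
        fraction_below = 1 - (rank - 0.5) / group_size
        return round(NormalDist(mean, sd).inv_cdf(fraction_below))

    # Someone who outscores about 49 of every 50 people in a 100,000-person sample:
    print(modern_iq(rank=2000, group_size=100_000))   # about 131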
An average I.Q. is 100. One standard deviation in this system is 15 I.Q. points. If you’re one standard deviation above average, you have an I.Q. of 115, and if you’re one standard deviation below the mean, you have an I.Q. of 85. Roughly, to be one standard deviation above the average (115), you need to be better than five-sixths of the people who took the test. To be two standard deviations above the mean (130), you need to be better than 49 out of 50 people who took the test. To be three standard deviations above the mean (145), you need to be better than all except 1 person in 700 of those who took the test. To be four standard deviations above the mean (160), you need to be better than all except 1 person in 30,000 of those who took the test.
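You can check those round figures with Python's standard library; this sketch assumes, as above, that scores are fitted to a normal curve with mean 100 and standard deviation 15 (the exact odds come out a little different from the rounded ones, e.g. about 1 in 44 rather than 1 in 50 at 130):

    from statistics import NormalDist

    iq_scale = NormalDist(mu=100, sigma=15)

    for iq in (115, 130, 145, 160):
        below = iq_scale.cdf(iq)   # fraction of test-takers scoring below this I.Q.
        print(f"I.Q. {iq}: better than {below:.4%} of test-takers "
              f"(about 1 in {1 / (1 - below):,.0f} score higher)")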
As it happens, an I.Q. of 200, which would be six and two-thirds standard deviations above the average, corresponds to about 1 person in 100 billion. Since 100 billion is approximately the total number of human beings who have ever lived, an I.Q. of 200 is the highest value that could even theoretically be assigned to any person. Practically, the highest value most I.Q. tests will assign is 160, since it's hard to give the test to more than 100,000 people when you're calibrating it. Even with a sample of 100,000 people, you can only expect three or four of them to score above 160 (four standard deviations above the mean).
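To see where a figure like 1 in 100 billion comes from, here's a sketch that computes the tail odds with the complementary error function (erfc avoids the precision loss you'd get from 1 minus the cdf this far out in the tail); the exact answer at 200 lands around 1 in 77 billion, the same ballpark:

    import math

    def rarity(iq, mean=100, sd=15):
        # Odds (1 in N) of scoring at or above this I.Q. on a normal curve
        # with the given mean and standard deviation.
        z = (iq - mean) / sd
        tail = 0.5 * math.erfc(z / math.sqrt(2))   # P(score >= iq)
        return 1 / tail

    print(f"I.Q. 160: about 1 in {rarity(160):,.0f}")   # roughly 1 in 30,000
    print(f"I.Q. 200: about 1 in {rarity(200):,.0f}")   # roughly 1 in 77 billion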
Thus, using the modern definition of I.Q., by claiming an I.Q. of 228 Marilyn Vos Savant would be claiming to be not only the smartest person who ever lived, but the smartest being that ever lived in this galaxy.
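For the record, running 228 through the rarity sketch above gives:

    print(f"I.Q. 228: about 1 in {rarity(228):.1e}")   # roughly 1 in 10**17

That's about a million times the number of people who have ever lived, which is why 228 only makes sense as an old-style ratio I.Q.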