Why are IQ scores so asymmetric?

IQ scores are a bell curve centered at 100. But the differences in people as you move away from this center seem to be asymmetric. For example, a person with a 125 IQ is smart, but I don’t think noticeably so in an everyday conversation. But a person with a 75 IQ is in the realm of Forrest Gump: slow, unable to grasp complex ideas, etc. Go out to 150 and the person is very smart, probably noticeably so even in a short span of time. But at 50 the person has trouble taking care of him or herself.
Is this asymmetry in my head, is it a function of test scoring, or is it just simple math? By math I mean that the left-hand wall of 0 constricts variation the closer you get to it. So though the scores are a bell curve, if you charted the abilities of people at each score, it would be a curve whose slope increases as you move away from the y-axis (I’m sure there’s a mathematical term for this, but it’s been 25 years since I took a math class).

I would say that it comes down to what a person is capable of. A person whose IQ is very high is (usually) perfectly capable of relating to people of ordinary intelligence and acting in a manner similar to them, and recognizes that this is often desirable. A person whose IQ is very low may want to do such a thing, but may not be capable of it.

Also, of course, the difference between 100 and 75 is greater than the difference between 100 and 125.

100 is 33% greater than 75, but 125 is only 25% greater than 100.

An IQ of 50 is half the IQ of 100, but an IQ of 150 is only one and a half times greater.

I’m quite sure that there is no sense in which an IQ of 100 means that person is twice as smart as another with an IQ of 50. IQ scores are at best interval measures meaning that the difference between IQs of 100 and 110 is the same as the difference between an IQ of 110 and 120. This is like degrees F or C (and unlike degrees Kelvin in which ratios are meaningful).

Others claim that IQ scores are really only ordinal measures which means that the only thing you can say is that 120 is smarter than 110 which is smarter than 100. No ratios, differences, etc. have any meaning at all, and the OP’s question doesn’t really make sense as IQs are not meant to be used that way.

Of course, others argue that IQs have little meaning at all and a one dimensional number is not sufficient to capture things, but that has little to do with this question.

But personally I think that everyday experience is unlikely to give clues to much-higher-than-average IQs, while it gives many clues to lower-than-average IQs. The world, after all, is “designed” (or it might be better to say it reaches an equilibrium) so that the vast majority of people can operate efficiently in it, and efficiency probably means that many (probably most) tasks can be done without the full mental effort of the typical person.

A twofold reason: The person of exceptional intelligence is capable of expressing the behaviour of an average intelligence person. “Average” is a subset of their intellectual range.

The second part can be illustrated by the idea that an IQ 75 person does not grok the differences between themselves and average folk. They might be able to say “Wow, you are smart!” but they don’t fully grasp the implications.

The humbling grace is that as you grow more intelligent, you grow increasingly aware of how little you really know. If my intelligence is 125, and I realize I have only the tiniest iota of understanding about the universe, then someone of 100 intelligence isn’t much further behind.

Thus I have no cause for conceit.

Case in point: Someone that I know routinely asks me such gems as, “Did you see the episode where Honey Boo Boo did such and such?”. No, I did not. But this person is apparently intellectually fulfilled by that sort of entertainment.

At the same time, she is functionally successful. She owns her own home, is self employed and maintains a long term relationship.

She is certainly smarter than a 75, but perhaps a little on the low side of 100. I could tell her about the Straight Dope, but I doubt she’d be interested. She’d fail to grasp the satisfaction of coming here to have one’s adored notions torn to shreds by truly intelligent folk.

When I first came here I was actually mortified. I felt very much outclassed by many of the brilliant members. But I’ll be damned if I’ll leave here without your secrets! shakes fist

Insofar as IQ has any meaning, one can say with confidence that someone with a 120 IQ is smarter than someone with a 100 IQ, but to say that 120 is necessarily 20% smarter is to misunderstand how the number is being used. All it’s really doing is measuring a score and lining people up according to it. Even if intelligence could be measured on a single axis, lining people up this way makes no guarantee that intelligence increases at some predictable rate as the IQ score increases, only that, within some confidence interval, one person is smarter than another.

I think an analogy might help illustrate this, so let’s imagine that instead of talking about IQ we’re talking about strength, something which does have some sort of objective measure. Say you measure it with, to pick an existing example, a standard powerlifting set of bench press, squat, and deadlift, and then fit the results to a normal curve. Two people could score similarly but have fairly different strengths in particular areas if each is better at a different exercise. But to say that one person is twice as strong as another isn’t really meaningful. Consider: if one person can bench press 100 lbs once, is someone twice as strong if they can bench press it twice, or if they can bench press 200 lbs once? I can guarantee you that someone who can bench press 100 lbs twice has a max nowhere close to 200 lbs, and someone who can bench press 200 lbs can lift 100 lbs a lot of times. That is, there isn’t really a meaning to how much stronger one person is than another, only that one person is stronger than another.

So in the same way, what exactly is twice as smart? If one person scores 100 IQ and can answer 50% of the questions on a test correctly, what is twice as smart: someone who can answer 100% of the questions on the same test, or someone who can score similarly on a test that is somehow determined to be twice as hard? All you can really say is that someone who performs better on a particular test is smarter. And, similarly, someone who is “half as smart” probably wouldn’t score 25% on the first test; they’d probably only get any right through pure luck.

And all of this is relative, because even an extremely unintelligent human is still, compared to all the rest of the life we know of, remarkably intelligent. A Forrest Gump type is clearly well below the average person in intelligence, but obviously well beyond comparison to the intellectual capacity of the smartest dog. So when we’re looking at humans, we’re already looking at a skewed sample, because the zero point on the IQ scale is not an actual zero point of intelligence. It’d be like saying that it’s twice as hot when it’s 80 degrees out as when it’s 40 degrees out; that isn’t really a meaningful statement, because the zero on the scale isn’t lined up with absolute zero.
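The 80-degrees-vs-40-degrees point can be checked with quick arithmetic: the ratio between two readings depends entirely on where the scale puts its zero. A minimal sketch (the conversion formula is standard; the specific numbers are just the example above):

```python
def f_to_kelvin(f):
    """Convert degrees Fahrenheit to kelvins (an absolute scale)."""
    return (f - 32) * 5 / 9 + 273.15

# On the Fahrenheit scale, 80 looks like "twice" 40...
print(80 / 40)                             # 2.0

# ...but on an absolute scale the ratio nearly disappears,
# because Fahrenheit's zero point is arbitrary.
print(f_to_kelvin(80) / f_to_kelvin(40))   # ~1.08
```

The same arbitrariness of the zero point is exactly why ratios of IQ scores carry no meaning.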

It appears that society in general is geared toward people of average intelligence. For example, public schools in the US are notoriously geared toward the non-exceptional learner, and the truly smart don’t have a lot of opportunities to really shine in basic classes. They can get along and get that A with reasonable effort, and additional achievement goes unrewarded (the teacher only wanted a summary of the Battle of Gettysburg; he didn’t want a full tactical analysis of the Confederate and Union positions with citations to refereed journals). The less intelligent are more obvious because they have trouble meeting the standards.

My guess, similar to the above posters, is that there aren’t many social situations that really tell you much about the intelligence of other people. Most people with an IQ above, say, 80 can make small talk about the weather, and handle basic life tasks. Anyone above 95 will be smart enough to handle everyday life and conversations. Someone above 110 might be notably witty if you had long drawn-out conversations with them. Beyond that, you’d probably only notice the difference in a professional context, i.e. an IQ 110 lawyer would be a bit slow compared to IQ 120-130 lawyers. And IQ 130 theoretical rocket surgeons would be left behind by their IQ 150 colleagues.

Interesting OP. And I am happy that after reading some very well stated responses, I sorta get it.

IQ scores actually follow a bell curve (with a long tail) with a mean of 100 and a standard deviation of either 15 or 16, depending on the test in question. Thus the statistical approach is the same as for any other (roughly) normal distribution. If you have a 115, it means that you scored better than about 84% of the population; a 150 would be better than 99.96%, etc.
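Those percentile figures fall straight out of the normal CDF. A quick check using Python’s standard library, assuming the SD-15 version of the scale:

```python
from statistics import NormalDist

# Model IQ as a normal distribution: mean 100, standard deviation 15.
iq = NormalDist(mu=100, sigma=15)

for score in (85, 100, 115, 130, 145, 150):
    # cdf(x) gives the fraction of the population scoring below x.
    percentile = iq.cdf(score) * 100
    print(f"IQ {score}: better than {percentile:.2f}% of the population")
```

This reproduces the numbers above: 115 lands at roughly the 84th percentile and 150 at about the 99.96th. On an SD-16 test the same raw performance would map to slightly different scores.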

As a density function, it isn’t meant to quantify the difference between folks, but simply to rank them.

So then, explain the scarecrow in The Wizard of Oz. What happened there?

Not sure if this is a joke, but if not…

There is nothing to stop a negative IQ. The fact that we’ve never heard of one afaik says all you need to know about those with claims of IQs above 200 (apart from Cecil Zotti of course).

An IQ score is just about ensuring that performance on the test is normalized to a mean of 100, with a standard deviation representing some number of points that varies by test.

So what the OP is really asking is “why can’t I easily tell the top 1% [say] of the population from the top 5%, when I can tell the bottom 1% from the bottom 5%?”

I hope that rephrasing the question like that will be enough for the OP to understand why. If she thinks about that question for a few minutes while recalling the people she has met, I think she’ll understand pretty easily on an intuitive level! Hint: I think the American term is “short bus”.

If not then I suppose I could go into more detail…

Really? Didn’t realise it was that much. I had 155 on a WISC test when I was about 14. On the other hand since then I have ruined my brain through drugs, but it’s nice to know I was actually a super genius rather than a semi-precocious twat :smiley:

OP’s observation is not too surprising. IIRC, IQ tests were originally developed as a clinical test for mentally defective patients. They were supposed to suggest the most beneficial treatment. As in, extensive personal care can help somebody with IQ 50, but not somebody with IQ 30. Then a few doctors conned the US Army into testing WWI conscripts, and it snowballed from there.

I don’t believe that’s correct. I believe IQ scores are centered and scaled so that the mean is 100 and the standard deviation is the specified value, but they are not otherwise fit to the bell curve with nonlinear adjustments. Of course, they tend to fit the bell curve closely around 100 because, as with many measurements, the Central Limit Theorem applies. But the Central Limit Theorem will not guarantee the tail behavior.

Yeah, that’s why I used the term “roughly”. The point I was trying to get at is summarized better here:

Emphasis mine.

Colophon writes:

> Also, of course, the difference between 100 and 75 is greater than the
> difference between 100 and 125.
>
> 100 is 33% greater than 75, but 125 is only 25% greater than 100.
>
> An IQ of 50 is half the IQ of 100, but an IQ of 150 is only one and a half times
> greater.

No, I’m sorry, but this is simply incorrect. An I.Q. of 100 means that half of the population is more intelligent than you and half is less intelligent than you. If you have an I.Q. of 115, you are one standard deviation above the average. You can look up what standard deviation means, but basically it means that you are more intelligent than about 84% of the population. If you have an I.Q. of 85, you are one standard deviation below the average. If you have an I.Q. of 130, you are two standard deviations above the average. If you have an I.Q. of 145, you are three standard deviations above the average. If you have an I.Q. of 160, you are four standard deviations above the average. If you claim to have an I.Q. of 175, you’re probably a liar, since no accurate I.Q. test measures more than four standard deviations away from the average.
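The reason tests can’t reach much beyond four standard deviations is how fast the population thins out at each step. A rough illustration under a normal model (my own sketch, not from the post):

```python
from statistics import NormalDist

# Standard normal: mean 0, standard deviation 1.
std_normal = NormalDist()

for sds in range(1, 5):
    # Fraction of the population scoring above this many SDs.
    above = 1 - std_normal.cdf(sds)
    print(f"{sds} SD above the mean: about 1 in {round(1 / above):,}")
```

Roughly 1 in 6 people sit above one standard deviation, but only about 1 in 31,600 sit above four, so there is almost nobody out there against whom to norm a higher score.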

And an I.Q. means absolutely nothing more. It’s not like height or weight, where you can say something is twice as tall or twice as heavy as something else. You can’t say that at all with I.Q.s. It’s closer to temperature: 50 degrees Fahrenheit is not twice as hot as 25 degrees Fahrenheit. In the centigrade or the absolute scale, the values have a different relationship. In fact, even temperature isn’t a good analogy, because you can stick to just the absolute scale and talk about twice as hot. With I.Q., that’s simply not possible. A better example would be SAT tests, where scores are also given by standard deviations (although this is obscured by the numbering scheme). A score of 800 on, say, the verbal test doesn’t mean that you are twice as ready for college in verbal skills as someone with a score of 400. In any system that assigns numbers by standard deviations, with no other numbers to go by, it makes no sense to talk about one thing or person being twice as high on that scale as another.

Uh uh. An IQ of 100 means half the population (49.99%) is as smart as you, and approximately 25% each smarter and dumber than you. To believe that half were smarter and half were dumber than someone with an IQ of 100 would be to espouse the idea that only 1% of people have an IQ of 100 and we know that the greatest clustering is actually centered there.
As for the OP, I don’t think the differences between 100 and 75 are as pronounced as we assume they are. I know that over the past year I’ve had to radically rethink my own concept of what people who are mildly intellectually disabled are like (I now work for an org that serves dual-diagnosis IDD/MH populations). From watching videos of several people who have an IQ in the mild ID range, it’s clear to me that my former idea of what someone with an intellectual disability is like/capable of matched people in the moderate range, not mild. A person with a mild intellectual disability doesn’t necessarily strike you as dauntingly disabled, just a bit slow. They can hold jobs, live on their own, and some even hold driver’s licenses.

Citing this because it’s the most condensed. IQ follows “a bell curve with a long tail”; it does not follow the bell curve that many people think is the only one (a Gaussian distribution). Many people have never seen another one in class and think it’s the only possible one, but actually there are many situations where neither the desirable nor the real distribution is symmetrical. For example, if you’re measuring the amount of impurities in something, both the usual distribution and the desired one are skewed: you want your result to be zero, it can’t be any lower than zero, and if your process is any good you’ll get zero many times but never reach your upper acceptable limit (or higher values).

That’s a premise many people hold unconsciously which is simply wrong: that the “natural state” for a statistical distribution is symmetry. Nope, it’s not.

You’re claiming that half the population has an IQ of 100? I realize that is the mode or most likely number, but nothing near half have measured IQs of 100.

It’s perfectly reasonable to say that if you have an IQ of 100, half the people are smarter (or at least that half the people have higher IQs). Yes, there are others who also have IQs of 100, but roughly half of those would score better than you, and half worse, on a retest or on a test with a finer gradation.