Would you have a problem with this statement?

Not only were you being overly pedantic, but you exchanged one meaningless number for another. If you remove power from a processor its temperature doesn’t drop to absolute zero, it drops to ambient. Your reference relative to absolute zero, while correct from a percentage viewpoint in temperature rise, is almost completely meaningless with respect to the important issue here, which is how much heat these processors generate.

The temperature in deg C is better than the temperature in deg K because it is a heck of a lot closer to the difference in actual heat produced. So their bad number was way better than your bad number.

If you want a better number to compare, use the amount of heat generated (in watts) as a comparison. This percentage difference would be far more useful than anything you (or they) posted.

It seems to me that you’re right, so no, I wouldn’t object. But this isn’t a topic I’m familiar with and I may be misunderstanding something.

Yes, thank you, I’m aware the processor won’t drop to absolute zero. The whole point is that if you’re comparing one absolute temperature to another, the ONLY way a “% higher” figure has any actual meaning is to compare the temperatures in Kelvin.

I agree that the wattage of heat generated would be a more useful comparison. Again, I wasn’t demanding they change the article, and I definitely wasn’t saying they should have stated that the chips were 7.8% hotter. I was objecting to the use of an absolutely incorrect calculation as an illustration of the increase in heat.
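For reference, here’s a quick sketch of the arithmetic behind the numbers being argued over, using the review’s 56C and 82C readings (nothing below is a claim about the hardware, it’s just the percentages):

```python
# The review's two temperatures; everything below is just arithmetic on them.
old_c, new_c = 56.0, 82.0

celsius_pct = (new_c / old_c - 1) * 100                       # the article's "46% higher"
kelvin_pct = ((new_c + 273.15) / (old_c + 273.15) - 1) * 100  # the Kelvin-based figure
delta_t = new_c - old_c                                       # the plain difference

print(f"Ratio of Celsius readings: {celsius_pct:.1f}% higher")  # ~46.4%
print(f"Ratio of Kelvin readings:  {kelvin_pct:.1f}% higher")   # ~7.9%
print(f"Delta-T:                   {delta_t:.0f} C")            # 26 C
```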

You know those misleading bar charts where the Y axis starts a huge way up the scale, so one bar looks three times taller than another when in reality the difference is more like 10%? Now imagine someone took that chart and declared, “Look, Category A is three times larger on the chart!” Holy cow! Category A is really dominant! Wouldn’t you take issue with that? It’s exactly the same thing.

I was simply very surprised that so many people there (and apparently some here as well) were so willing to defend something blatantly wrong simply because they didn’t want to get ‘too technical.’

But, as a result, I have actually found the “yes, you’re being overly pedantic” comments in this thread to be quite useful, as they let me know that I sometimes take these things too far. I know that in casual conversation this would be taking it WAY too far, and people would start leaving and refusing to talk to me, but I didn’t think it was out of line on a forum dedicated to computer hardware. I mean…frankly, we are ALL nerds in some way there. :wink: I didn’t continue the conversation on the original board, though, as I knew it really wasn’t worth the argument. I am finding this discussion fascinating, however.

And if anyone is offended by anything I say in this thread, please know I’m just interested in the discussion, and the whole point/counterpoint thing is how I best learn new things.

Though…I am getting the feeling my next thread should be:

“Hello, my name is Jman. I’m a pedant, but I really don’t mean to be!”

Sometimes it makes me mad when I’ve got a whole reply in my head and somebody else made it every bit as well as I possibly could before I got the chance. Saved me a lot of typing though…:stuck_out_tongue:

Not necessarily.

“The old temp was 56C, the new one is 82C”

“The old temp was 56C, the new one is 46% higher than that”

“The old temp is 68% of the new one (82C)”

These are all accurate statements which give the same information (assuming you interpret them as nearly every reader would). It is not inherently misleading or meaningless to give a percentage, as these examples illustrate.
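A quick check of the arithmetic behind those three phrasings:

```python
# Same 56C-to-82C numbers as above; both percentages check out.
old_c, new_c = 56.0, 82.0

print(f"{(new_c / old_c - 1) * 100:.0f}% higher than the old reading")  # 46%
print(f"{(old_c / new_c) * 100:.0f}% of the new reading")               # 68%
```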

Now, in this particular review both the old and new temperatures were already given, so it is redundant to note the percentage increase. But sometimes people give redundant information. Maybe they thought giving the percentage as well would make them look smarter? “Hey look, I can calculate a percentage!”

I suggest typing faster.
heh heh heh

I agree with the OP. A comparison like this is meaningless unless you have an established baseline.

Suppose you assumed a baseline of -2544C. Then you could honestly say that the second processor was only one percent higher than the first one.
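A quick check of that, using the review’s 56C and 82C temps:

```python
# With a baseline of -2544C, the "percent higher" figure shrinks to about 1%.
old_c, new_c, baseline = 56.0, 82.0, -2544.0
pct = ((new_c - baseline) / (old_c - baseline) - 1) * 100
print(f"{pct:.1f}% higher")  # ~1.0% -- the figure is whatever the chosen baseline makes it
```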

I strongly agree with OP, but might not have made the comment. (Too many nits to pick, too little time.)

It reminds me a little of Feynman’s story from his days on the California Textbook Committee. One textbook problem mentioned several stars of various temperatures, then asked for the total temperature :smack: … absurd even in Kelvin.

ETA: Here’s another example of percentage confusion. During a financial crisis 15 years ago, the Thai baht fell from 26 to the dollar to 39. The Bangkok Post reported that the baht “had lost 50% of its value.” It continued to fall to 52 to the dollar which, to be consistent, would mean it had lost 100% of its value. (The Post declined to so state, however.)
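For anyone who wants to see exactly where the Post went wrong: the rate is quoted in baht per dollar, so the baht’s value is the reciprocal. A quick sketch:

```python
# The Post's method treats the change in the baht-per-dollar rate as the loss;
# the actual loss in value uses the reciprocal (dollars per baht).
for old_rate, new_rate in ((26, 39), (26, 52)):
    post_method = (new_rate - old_rate) / old_rate * 100
    actual_loss = (1 - old_rate / new_rate) * 100
    print(f"{old_rate} -> {new_rate}: Post's method {post_method:.0f}%, actual loss {actual_loss:.0f}%")
# 26 -> 39: Post's method 50%, actual loss 33%
# 26 -> 52: Post's method 100%, actual loss 50%
```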

I think you have missed the point.
The OP is not complaining about redundant information; he is complaining about the use of a relative temperature scale and the misleading ratios that result from comparing one point on the scale to another.
Google “Kelvin scale” and “absolute zero” for more information.

What the guy with the nifty weapon said… I realize that what should actually be compared is the delta-H over time, but if what you want to compare is temps, you need to use either absolutes or (more properly) delta-T.
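As a back-of-the-envelope illustration of why delta-T is the better proxy (the ambient temperature and thermal resistance below are made-up numbers for the sketch, not anything from the review):

```python
# First-order model: steady-state rise above ambient scales with power
# dissipated, delta_T ~= P * R_theta. Both constants here are assumed.
ambient_c = 23.0   # assumed ambient temperature
r_theta = 0.33     # assumed junction-to-ambient thermal resistance, C per watt

for temp_c in (56.0, 82.0):
    implied_power = (temp_c - ambient_c) / r_theta
    print(f"{temp_c:.0f} C -> roughly {implied_power:.0f} W dissipated")
# Under these assumptions the hotter chip dissipates ~79% more power -- the
# same ratio as the delta-T above ambient, and nothing like 46% or 7.9%.
```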

But I don’t expect journalists to know anything about science or technology, even if they specialize in it.

No, I understand the OP’s point. You don’t seem to understand my point though.

Imagine a unit of measurement that tells you how many degrees Celsius above 0C the temperature is. If this measurement went from 56 to 82, one might say “the measurement is 46% higher.” It would mean that the new temperature was 46% farther above 0C than the old one. Note that this would be true even if you converted to another unit, such as kelvin or Fahrenheit: F(82) is also 46% farther above F(0) than F(56) is. It is a true statement.

It would certainly not mean it was 46% “hotter”, but then, nobody said it meant that.

Is using 0C as the baseline worse than using absolute zero? Is either worse than using a typical room temperature? Maybe it should have said the new temperature was 79% farther above 23C than the old one? Maybe.
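For the record, here is how those two baselines work out with the review’s numbers:

```python
# 56C and 82C measured against the two baselines mentioned above.
old_c, new_c = 56.0, 82.0
for baseline in (0.0, 23.0):
    pct = ((new_c - baseline) / (old_c - baseline) - 1) * 100
    print(f"{pct:.0f}% farther above {baseline:g}C")
# 46% farther above 0C, 79% farther above 23C
```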

But since the reviewer gave the temperatures, and never once said either was “46% hotter”, I think it would be fair to be a little generous. At least preface the complaint with “I know you didn’t say 82C was 46% hotter than 56C, but just in case anyone gets the wrong idea I want to point out that…”

You found a blatant error in their thinking. Simple as that. They should have just stated the difference in degrees and ended it.

Wow! A surprisingly contentious thread over something the OP is clearly right about. As SenorBeef says, you can’t say today is “twice as hot” as yesterday; it’s a meaningless statement.

Comparing the temperature to ambient temperature might make at least some sense: that way something that produces more heat is described as “hotter,” etc., even if the correlation isn’t linear. And obviously for some things (although I don’t think it really applies to computer processors) it makes sense to compare the temperatures in Kelvin.

Other than that, describing a temperature as “twice as hot” is completely meaningless. It SOUNDS like it means something, but it really doesn’t. (“Twice as cold” or “twice as hot” may give an intuitive sense if the ambient temperature is near 0C or 0F, but 46% isn’t going to.)

It’s not like it’s a sloppy way of saying something where everyone knows what you mean. It DOESN’T MEAN ANYTHING. It actively OBSCURES the relevant information.

But I’m not surprised that lots of people didn’t care.

The sad result of all this is that people just don’t understand percentages. The media/marketing people have poisoned that well, with CRIME RATES UP 25% when they really mean the number of crimes has increased, or 33% WHITER. People get so used to hearing numbers as percentages that they start thinking that giving a number as a percentage makes it more authoritative.

Yeah, it’s sad because the people who are arguing with jman presumably believe they are smart, but they clearly show they don’t understand even simple physics. It’s also sad because their “you’re being too pedantic” defense boils down to “you are being too precise about my attempt to be precise.”

But, if you just give 110%, you will persevere. If not, just wash your mouth out with some of the 99 44/100% pure soap.

excavating (for a mind)

This reminds me of a time when I was on the flip side of this argument. I said you need X% more energy to ignite cold wood than warm wood. I did [X = ((flashpoint - coldtemp)/(flashpoint - warmtemp) - 1) * 100] and caught all kinds of hell for not using absolute zero and Kelvin. No manner of explanation could convince my detractors that Kelvin was unnecessary, and indeed, the answer comes out the same no matter what scale you use. They just kept going “Naw, man. My high school teacher said you always have to use Kelvin when talking about temperature percentages.”
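To illustrate with made-up numbers (a nominal ignition temperature of 300C, cold wood at 0C, warm wood at 25C; the exact figures don’t matter for the point):

```python
# The ratio of temperature *differences* is the same in any scale: the offset
# cancels out and the degree size divides out. Numbers below are illustrative.
def to_k(c): return c + 273.15
def to_f(c): return c * 9 / 5 + 32

flash_c, cold_c, warm_c = 300.0, 0.0, 25.0

for name, conv in (("Celsius", lambda c: c), ("Kelvin", to_k), ("Fahrenheit", to_f)):
    x = ((conv(flash_c) - conv(cold_c)) / (conv(flash_c) - conv(warm_c)) - 1) * 100
    print(f"{name:10s}: X = {x:.1f}% more")
# All three print X = 9.1% more -- no Kelvin required.
```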

:smack:

This was my thought. Expressing differences in temperatures in this kind of context as percentages is practically meaningless, as you either have to define your own scale (“the new temperature is X% farther above the machine’s powered-down temperature than the old one”) or express it relative to a value that’s more or less meaningless in the context of the discussion (absolute zero and the freezing point of water are both pretty unlikely to matter in talking about processor temperatures). The article should have given the difference in whatever temperature scale it was measuring in and left it at that.

You might be pedantic, but you’re literally twice as right as they are. :wink:

I get the arguments that a Kelvin-based percentage is pretty useless in this case, but the OP is spot on that relating the temperatures as a percentage of each other is completely inaccurate and deceptive, and deserves to be pointed out as such, particularly if it is being used as a relevant spec.