I’ve heard people joke about computer programs that are otherwise great, but when they divide 3 by 2, they get 1.4999999, most recently in the “chicken cross the road” thread.
What’s the deal with this?
I have an ancient (useful, but archaic) calculator: a TI 36.
This simple computer correctly calculates 3/2 as 1.5
Why can’t some computers do this?
Most software represents numbers internally in binary, using only the digits 0 and 1, and converts to decimal for display. The conversion is exact for integers, but not always for fractions, so small “round-off errors” creep in. Mostly they cancel out in a long calculation (some round up and some down), so no big deal. Your calculator is probably carrying a few extra decimal digits internally, rounding to fit the display, and then suppressing the trailing zeroes.
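You can see both effects in Python, whose floats are standard 64-bit binary doubles (a quick sketch, not specific to any calculator): 1/10 has no exact binary representation, so a tiny error creeps in, and rounding for display hides it again.

```python
# 0.1 and 0.2 cannot be represented exactly in binary floating point,
# so their sum carries a small round-off error.
total = 0.1 + 0.2
print(total)            # 0.30000000000000004 -- the error is visible at full precision
print(total == 0.3)     # False

# Rounding to fewer significant digits for display hides the error,
# much as a calculator does before printing:
print(f"{total:.10g}")  # 0.3
```

The display layer is doing exactly what the answer describes: carry extra digits internally, round for output, drop trailing zeroes.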
It’s a joke (sort of) because Intel released a batch of processors a while back that didn’t just have trouble rounding, but were simply wrong in a few cases. They tried to say “no big deal” about that too, but took a lot of abuse over it, and had to arrange replacements for some customers.
Bob the Random Expert
“If we don’t have the answer, we’ll make one up.”
You’re right, John, I was trying for a general answer, and forgot all about the specifics.
Any operation with a result that can be expressed as a finite sum of powers of 2 (e.g. 1/2, 1/4, 1/8, and so on, to as many digits as the number format can carry) will give an exact result. “1/3+1/3+1/3” fails because 1/3 can only be expressed exactly as the sum of an infinite series of powers of 2. Since the computer only carries a limited number of digits, the last few get dropped, and the result ends up a little low.
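Both halves of that claim are easy to check in Python (64-bit binary doubles). I’ll use 1/10, which, like 1/3, has no finite binary expansion; the exact outcome of any particular sum depends on how the intermediate roundings happen to fall, but the error itself is reliably there.

```python
# Exact: 3/4 = 1/2 + 1/4, a finite sum of powers of 2,
# so the arithmetic introduces no error at all.
print(0.5 + 0.25 == 0.75)      # True

# Not exact: 1/10 needs an infinite binary expansion,
# so repeated addition accumulates round-off.
print(0.1 + 0.1 + 0.1 == 0.3)  # False
print(0.1 + 0.1 + 0.1)         # 0.30000000000000004
```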
There are a few programs that handle formulas algebraically, simplifying first and doing the numerical calculation later, avoiding most of the rounding problems.
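For a taste of that approach, Python’s standard `fractions` module keeps numbers as exact ratios and only converts to a decimal at the very end (a minimal sketch; full computer-algebra systems go much further and simplify whole formulas symbolically):

```python
from fractions import Fraction

# Exact rational arithmetic: no binary round-off at any step.
third = Fraction(1, 3)
total = third + third + third
print(total)         # 1 -- exactly one, no 0.9999999
print(float(total))  # 1.0 -- convert to a float only for display
```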
Bob the Random Expert
“If we don’t have the answer, we’ll make one up.”