*Originally posted by Cumber *
**
*Originally posted by Mangetout *
**Bear in mind that the term Kilobyte was coined at a time when it was thought of as a large amount and Gigabyte was a term that we would never need because it was the stuff of SF.
So in the early days, the discrepancy was 24 bytes; big deal, they thought. **
In the days when a kilobyte was a large amount of data, 24 bytes probably would have been a big deal, or at least not negligible. In fact, I would think the choice probably had to do with small numbers of bytes being important: since computers use base 2, it's always going to be a pain to express computer-significant numbers precisely in decimal. How do you get around that problem? Base your terminology around computer-significant quantities rather than the number of fingers you've got. Close enough for people who aren't in the business anyway. **
Sure, the bytes would have been important, but the fact that the terminology was inaccurate by 2 percent or so wouldn't have been a worry. With a megabyte, though, the discrepancy rises to nearly 5 percent, with a gigabyte to more than 7 percent, and so on.
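Just to show the arithmetic behind that (a rough Python sketch, assuming the usual binary vs. decimal readings of the prefixes):

```python
# How far the binary units drift from their decimal namesakes.
units = ["kilobyte", "megabyte", "gigabyte", "terabyte"]
for n, name in enumerate(units, start=1):
    binary = 2 ** (10 * n)    # e.g. 1 kilobyte = 2^10 = 1024 bytes
    decimal = 10 ** (3 * n)   # what the metric prefix "ought" to mean, 10^3 = 1000
    drift = (binary - decimal) / decimal * 100
    print(f"{name}: {binary} vs {decimal} bytes -> {drift:.1f}% discrepancy")
```

That prints roughly 2.4% for the kilobyte, 4.9% for the megabyte, 7.4% for the gigabyte and 10% for the terabyte, so the gap keeps widening as the units get bigger.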