0.999…, obviously.
Actually, it does matter in some rather esoteric pursuits, but not for this discussion.
Besides, programming is just applied math anyway.
Not that I’m against 0.9999~=1
I just don’t feel confident in it.
Basically, my intuition says that we’re separated from 1 by some infinitely small amount.
Now I’m seeing posts saying it can’t be, because then there would have to be some number between 0.999~ and 1; that is, 1 minus 0.999~ would have to give some number. But my head keeps insisting that the difference would be 1/infinity.
I seem to remember from high school calculus that this is roughly zero…but I’m afraid that doesn’t satisfy me
Please, convince me I’m wrong so that I can finally get some sleep
Renzo - try this: algebraically simplify the expression 1 divided by 3, multiplied by 3. Then sit down with pencil and paper, work out what 1 divided by 3 is, multiply that by 3, and note what results.
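A quick sketch of that exercise, written in LaTeX just to be unambiguous; both routes compute the same quantity:

```latex
% Algebraically, the 3s cancel; by long division, 1/3 = 0.333...,
% and tripling every digit gives 0.999...
\[
  \frac{1}{3}\cdot 3 = \frac{3}{3} = 1,
  \qquad
  \frac{1}{3} = 0.333\ldots
  \;\Longrightarrow\;
  3 \cdot 0.333\ldots = 0.999\ldots
\]
% Both right-hand sides name the same number, so 0.999... = 1.
```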
[QUOTE]
*Originally posted by Exapno Mapcase *
You’re joking.
How many times must we say it in this thread and all the others?
0.333~ is precisely equal to 1/3.
Not close or near or approaching or about. Equal.
[/QUOTE]
No one is arguing that fact, but you cannot perform any simple addition with the number .333… without rounding it, at which point it stops being equal to 1/3.
Like hell we can’t. .3… + .3… = .6…
I have no problem working out the mathematics of it. I can easily get 0.99~ from 3/3, or even a proof via 1/3 + 2/3 = 0.33~ + 0.66~ = 0.99~ = 1.
It’s the intuition behind it that I’m just not satisfied with. When you look at the numbers on paper they don’t look like the same number, and in my head I still feel like they’re separated by 1/infinity. I know 1/infinity is zero for all practical purposes, yet I just feel that this logic means math is not an exact science, and for me that kind of paradigm shift is just too much to handle. (I’ll survive, but I won’t be very satisfied.)
1/infinity is not a meaningful concept.
Renzo, resign yourself to the fact that your intuition is wrong.
<…>
Not to change the subject too much, but since it has already been addressed, we can refute the claim that “1/infty” is a meaningless concept. The expression indeed represents a number in certain areas of mathematics: it’s 0. In the real numbers, though, “infty” is not a member of the set (thus division by infty is not defined), and when we think of concepts such as your 1/infty, we are usually thinking in terms of lim x→infty of 1/x, or something along those lines.
Our intuition is wrong because, well, it is often the case that our intuition is wrong when dealing with infinite concepts. .99… is an infinite concept in the sense that the string repeats infinitely.
And that’s the key. Repeats infinitely. No one disputes that if the string stopped somewhere, it would be less than 1. And that’s the point: the only way it CAN be less than 1 is if the string stops somewhere. But it doesn’t.
Turns out that descriptions such as “infinitely close, but not equal to”, in the real numbers at least, are not at all adequate descriptions (this is just another way of describing your 1/infty concept). If you take “infinitely close to 1” to be some number, albeit very close to 1, so long as it’s not equal to 1, then there are always uncountably infinitely many other numbers between them. This is a proven result in higher mathematics: the set of real numbers is dense and connected. Any “slice” of the real number line contains the same magnitude of infinity (pertaining to the number of points) as any other “slice.”
…which suggests the number you have in mind is not really “infinitely close” to 1. IOW, to say there is some real number infinitely close to, but not equal to, 1 (i.e. the difference between .99… and 1 is 1/infty, where the latter is nonzero) is saying nonsense, when you think about it.
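A minimal sketch of that density claim: if x is any real number strictly less than 1, the midpoint (x + 1)/2 is an explicit number strictly between x and 1, so x was never “infinitely close” to 1 in the first place.

```latex
% Between any real x < 1 and 1 itself there is always another real
% number, e.g. the midpoint (x + 1)/2; the reals are dense.
\[
  x < 1 \;\Longrightarrow\; x < \frac{x+1}{2} < 1
\]
```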
But I do not argue against a statement, in an appropriate context, that .99… and 1 differ by 1/infty. It’s just that in that context, 1/infty really IS 0 (ie it is defined to be 0).
Again, as I suggested in my very first post, much confusion can be avoided by understanding the meaning of the notation. .999… does NOT mean:
.9 + .09 + .009 + continue this process as long as desired, then STOP, but instead is defined to mean:
the limit of the sequence {.9, .99, .999, …}, and this limit is 1 (definition of limit omitted; you can look it up).
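Spelled out, that limit is a routine geometric-series computation:

```latex
% s_n is the n-th partial sum: n nines after the decimal point.
% Summing the geometric series gives s_n = 1 - 10^{-n} exactly.
\[
  s_n = \sum_{k=1}^{n} \frac{9}{10^k} = 1 - 10^{-n},
  \qquad
  \lim_{n \to \infty} s_n = \lim_{n \to \infty}\left(1 - 10^{-n}\right) = 1
\]
% .999... is defined as this limit, and the limit is exactly 1.
```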
It is confusing (ie goes against intuition) for lay people because the expression:
.999999999…
Looks EXTREMELY similar to the expression:
.999999999 <---- and even lay people know this is < 1.
The “…” makes all the difference.
Darrell
Could you explain which “certain areas of mathematics” consider 1 divided by infinity to be a meaningful concept equal to 0?
By the extended real number system R* we shall mean the set of real numbers R together with two symbols +∞ and −∞ which satisfy the following properties:
a) If x ∈ R, then we have …
x/+∞ = x/−∞ = 0
T. M. Apostol, Mathematical Analysis, p. 14
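So, taking that definition at face value, the “1 divided by infinity” from upthread is literally 0 in the extended reals; a one-line sketch:

```latex
% Specializing Apostol's property (a) above to x = 1:
\[
  \frac{1}{+\infty} = 0 \quad \text{in } \mathbb{R}^{*}
\]
```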
My math teacher in High School, after much badgering from me, provided an elegant proof which satisfied even me.
1 = 0.999~
  1.000~
- 0.999~
--------
  0.000~
In English, the above operation can be stated thus: “The DIFFERENCE between ‘one’ and ‘point nine repeating’ is ‘zero point zero repeating’, which is obviously zero.”
Therefore, since there is no difference, the values are the same.
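For anyone who wants the stopping-point version of that subtraction: truncate after n nines and the difference is exactly 10^-n, which you can make as small as you like by letting the nines run on.

```latex
% Truncating after n nines leaves a difference of exactly 10^{-n};
% with all the nines present, the difference is 0.
\[
  1 - \underbrace{0.9\ldots9}_{n\ \text{nines}} = 10^{-n},
  \qquad
  \lim_{n \to \infty} 10^{-n} = 0
\]
```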
Thank you Mr. Kehoe. I learned more than I thought I did from you, and likely more than you thought, too
~Wolfrick
1 = 0.9~ because it is defined that way; that is the nature of Mathematics. 0.9 recurring is a sequence, and the limit of that sequence is 1. Anything else is needlessly multiplying beyond…
Infinity can’t be expressed. We can only pretend we can express it. But it can’t be done; we can’t write out infinitely many .333’s. We can pretend for the sake of argument, but this is why mathematical problems like this come up.
Exactly. If you try to write or print out an infinite number of 3s after a decimal point, you’ll never finish the job. But if you agree that .333~ or .333… or whatever notation you choose to use represents the number meant by an infinite number of 3s after a decimal point, then you don’t have to worry about actually writing or printing out all those pesky threes, and you can get on with actually understanding the mathematical concept in question (which isn’t really all that difficult).
You don’t have to invoke that argument. 1 = 0.9~ because there exists no positive number that is smaller than the difference between 1 and 0.9~.
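In epsilon form, that claim reads roughly as follows: the difference is a nonnegative number smaller than every positive number, and the only such number is 0.

```latex
% For any epsilon > 0, choose n large enough that 10^{-n} < epsilon;
% then 0 <= 1 - 0.999... <= 10^{-n} < epsilon. A nonnegative number
% below every positive epsilon must be 0.
\[
  0 \le 1 - 0.999\ldots < \varepsilon \quad \text{for every } \varepsilon > 0
  \;\;\Longrightarrow\;\;
  1 - 0.999\ldots = 0
\]
```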
You could; either is fine.
Again, this argument is a misunderstanding of mathematical notation.
√2 is exactly equal to the square root of two. You can never write out the full decimal expression of this number, but you also never need to. √2 does it simply and neatly.
π (which is supposed to be pi, though it may not look that way on your browser) is exactly equal to the ratio of the circumference to the diameter of a circle. You cannot write out the decimal expression of this number, but you also never need to. π does it simply and neatly.
The three dots of the ellipsis (…) say that the expression repeats in the same fashion infinitely. This is the definition of the symbol. We never need to write out the full decimal expression of .999… because the … does it simply and neatly.
These conventions of mathematical notation eliminate all arguments and problems because they express perfectly the concepts behind them. If you understand the convention, there is no argument whatsoever, no confusion, no not getting it.
.333… = 1/3
.666… = 2/3
.999… = 3/3 = 1
QED