.999 = 1?

In the interests of keeping this zombie alive until, on the main General Questions list of threads, the number of replies to this one reaches ( 1 2 3 4 5 6 ) :smiley:

One of the best colloquial explanations I’ve heard is that you have to remember that .999… is not an ongoing process. The ellipsis (the dot dot dot) at the end does not mean that the nines spill off the edge of the paper, up into the sky, and that right about now, halfway to the Andromeda Galaxy, there’s still a little immortal math gremlin furiously scribbling away more nines. It means that instead of writing 1 you chose to write 0.999…. Same as if, rather than 4, you wrote 2+2.

2+2 and .999… can both be thought of as a process, but ***not*** an ongoing (i.e. infinite) one. If you went to a car shop and said, “I need two plus two tires, please,” the guy would look at you funny and reply, “You mean four?” He would never ever say three or five or three and a half, but four. He would know exactly what you meant; he just wouldn’t understand your choice of (verbal) notation.
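To put the same point in symbols (a restatement, not a new claim): the notation names one completed value,

$$0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}},$$

a single fixed number, just as 2+2 names the single fixed number 4.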

Actually, it’s equal to zero. 1 - .999… = 0. Divide that by two, still zero. Add 1, subtract .999… , and you’ve still got zero.

Understood that thinking of the sum of an infinite series may lead to the type of thinking that “hey, no matter how many 9s I put on paper or conceive of, whether instantaneously or not, never do they equal 1.” However, let’s forget about it being any type of process. The acceptance of .999… being equal to 1 in all reasonable proofs I have seen hinges on accepting the limit as being the actual value of the sum of an infinite series. The definition of the limit itself presupposes that .999… = 1 by the very definition of a limit. This assertion is to be just accepted as a given (intuitive) truth. This (intuitive) truth hinges on the fact that you accept 1/infinity is zero. It is actually written to avoid specifically saying this. This is never proven, just to be accepted. It is and always has been a means to either banish infinitesimals or at the very least ignore them. While this may be practical and even perhaps true, it is NOT proven. Many people such as myself view limits as a mathematical convenience rather than an ultimate truth.

Again, the very definition of a limit assumes .999… = 1, so it is circular to use as a proof.
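For reference, the definition I mean is the standard one: a sequence $s_N$ has limit $L$ if

$$\forall \varepsilon > 0 \;\exists N_0 \;\forall N \ge N_0: \; |s_N - L| < \varepsilon,$$

applied here to the partial sums $s_N = 0.\underbrace{9\ldots9}_{N} = 1 - 10^{-N}$.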

The convergence of an infinite series relies on the limit principle, so it is also circular.

One is free to accept a limit as an actual, perfect truth, but it is not proven to my knowledge. As a matter of practicality, limits make perfect sense.

Some people want to include limits in the very definition of .999…, saying the notation means nothing more than the limit of the sequence 0.9, 0.99, 0.999, … But that is an artificial presumption in my mind. Why does this have to be in the definition, unless you specifically don’t want to have to argue about .999… = 1?

Actually, 1 - .999… = something bigger than zero; divide that by 2 and you get something half the size.

A statement contradicting my statement is hardly an argument of reason.

It must indeed be an intoxicating thought to conclude that, of all the great minds who have considered this question, you alone have got it right. While it is, at a certain level of abstraction, possible for you to be uniquely right on this pretty basic point, you don’t seem to have a sense of how radically unlikely that is.

You seem to be approaching the real number system as though it is an object in the real world, like the Himalayas or some such, about which truths can be determined by empirical methods of discovery. The real number system consists purely of intellectual objects manipulated according to intellectual rules, all of which objects and rules only exist as a result of their a priori definitions. You can’t “discover” a number between .999… and 1.0 because a consequence of our definitions of these things is that it does not exist. Your attempt to do so is simply notational abuse. The number you have proposed simply resolves to zero, that is, (1.0-1.0)/2 + 1.0-1.0.
This is not an “assumption”, it is a consequence of definitions.
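Written out under those definitions:

$$\frac{1 - 0.999\ldots}{2} + 1 - 0.999\ldots \;=\; \frac{0}{2} + 0 \;=\; 0.$$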

I well recall as a child having division explained to me as the number of times one number will fit into another. This was a rough approximation of the principle for childish ears, but it led me to become very frustrated with teachers who would say that division by zero was undefined. It seemed intuitively obvious that zero “went into” 1 as many times as you wanted it to, in other words infinitely many times. Thus, to me, 1/0 = infinity. And ordinary algebra, naively applied, transforms this into 1/infinity = 0.

But as an adult, the childish definition of division is inadequate, and zero and infinity are not finite numbers like 6 or the other numbers that can for all purposes be plugged into algebraic equations. Thus, division by zero is not defined, because it leads to inconsistency that makes the entire enterprise valueless. 2×0 = 3×0, but applying ordinary algebraic rules and dividing both sides by zero, as if zero worked the same way as, say, 6, results in 2 = 3. The same thing can happen when you manipulate infinity that way (treating it as though it were a number of the same order as 6). For that reason, it is a mistake to think about 1/infinity as though it is meaningful. Infinity just cannot be plugged into processes like that. Put another way, if 0.999… != 1.0, then the concept of infinity would collapse to a finite number, making it incoherent, which would manifest itself in the legitimising of “proofs” like those above that 1 = 2.
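Spelled out, the bogus step looks like this:

$$2 \times 0 = 3 \times 0 \;\Rightarrow\; \frac{2 \times 0}{0} = \frac{3 \times 0}{0} \;\Rightarrow\; 2 = 3,$$

and the nonsense enters exactly where zero is treated as something you can divide by.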

And that is one among many ways of identifying your error. Buried in all of your thinking is the idea that infinity is a mathematical object in the same class as finite numbers, so that ordinary algebraic rules apply. They don’t. You can’t deal algebraically with the concept of infinity in the way you are attempting to, any more than you can divide by “red” or take the square root of “health”. That does not mean you can’t do any maths with infinity, any more than the fact that you can’t divide by zero means you can’t do anything with zero. It just means you can’t treat it as you would wish to.

Arguing the limit theorem is OK within the limits of your logical framework. The thing is not to confuse that with the more limited issue of 0.9999… being the same as one. Most of the thread arguments are happy to take 1/3 = 0.3333… and 2/3 = 0.6666…, but not to add them together. Yet any non-terminating representation is an infinite series sum and thus requires a limit. Not accepting the limit theorem requires that you also, a priori, accept that all non-terminating representations are not the same as the underlying rational, and that you cannot translate from a base where the rational is terminating to a base where it is not (this is no big thing, since the translation requires a division operation that invokes an infinite series). So fine, simply don’t accept 1/3 as being equal to 0.3333… in base 10, or 1/2 + 1/4 + 1/8 + … as being equal to 1, because you don’t accept the limit theorem. But don’t single out 0.99999… as not equalling one as special, because it isn’t.
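To make that concrete:

$$\frac{1}{3} + \frac{2}{3} = 0.333\ldots + 0.666\ldots = 0.999\ldots = 1, \qquad \sum_{n=1}^{\infty} \frac{1}{2^{n}} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1,$$

and every one of those equalities rests on exactly the same limit machinery as 0.999… = 1.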

Me: No, not at all; in fact it is actually very frustrating. :wink: And “all” of the great minds… surely you don’t suggest I am the first and only to consider and conclude .999… != 1? And nonetheless, even if I were, your argument that I am therefore wrong is nothing more than anecdotal.

Me: Whose definitions? Which ones? And how do you know those definitions make the most possible sense?

Me: Please show me this. I would be grateful to see.

Me: Tell Cantor he can’t do math with infinity as he wishes, or Euler, or many others. You make lofty statements with NO evidence to support them. You could be right, but you make no convincing arguments, just hearsay. I am not the first one taking on this fight, and I won’t be the last, I am sure.

Show me the proof that doesn’t hide the conclusion in its assertions.

I am absolutely not singling out .999… I don’t think 1/3 is exactly .333… either. If you read the whole thread you would see my point much more clearly.

RE: “So fine, simply don’t accept 1/3 as being equal to 0.3333… in base 10, or 1/2 + 1/4 + 1/8 + … as being equal to 1, because you don’t accept the limit theorem”

I accept the limit theorem as a useful approximation, and in fact in some calculus texts it is taught that way. Some people look at the limit and say it IS the answer, and others say it is an approximation (however accurate it may be).

The way it was explained to me in high school calculus was thus:

We all know that 1/0 is undefined. There is no answer. Should mathematics do nothing with it? How about we see what happens if we substitute numbers that are close to zero, to see if we can get an idea of what that would be. Okay: 1/.01 = 100. 1/.001 = 1000. 1/.0001 = 10,000.

We are starting to see a pattern. So let’s keep getting closer to zero FOREVER AND EVER (but without actually getting there, because then we are back to undefined status). As we do that, we see the value growing without bound, which is the sense in which 1/0 = infinity.
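Here is a quick numerical sketch of that pattern (illustrative only; the range of exponents is an arbitrary choice of mine):

```python
# Substitute values ever closer to zero and watch 1/x grow without bound.
# Purely illustrative; floating point eventually runs out of precision.
for k in range(2, 9):
    x = 10 ** (-k)        # 0.01, 0.001, 0.0001, ...
    print(f"1/{x:.8f} = {1 / x:,.0f}")
```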

Now, posters here are arguing that is not correct, because 1/0 is not the same as 1/.0000000…1. I think the problem comes from not understanding the “forever and ever” progressing towards zero. It doesn’t get “oh so close” and stop. The number continues eternally towards zero. It gets, and continues getting, so close to zero that, for whatever level of precision you demand for your application, it meets that and keeps going!

Does your project depend on an accuracy to the googolplexth of a millimeter? This calculation has you covered, and on into perpetuity.

So .999… is 1. It’s absolutely one. If you want to say it is something other than 1, then that number is incorrect, because .999… is STILL approaching 1 to the point that, for any purpose needed in the universe or even the supernatural, it is 1.

ETA: “Approximation” does it a serious injustice. Would you complain that your footlong sub was actually only 11.999999… inches? What property of the universe wouldn’t be served by this “approximation”?

It is well settled that 0.999… = 1. Here is a reference with a summary and a collection of links to other threads.

http://www.seiglefamily.com/gqfaq.html#one

“Put another way, if 0.999… != 1.0, then the concept of infinity would collapse to a finite number, making it incoherent, which would manifest itself in the legitimising of “proofs” like those above that 1 = 2” - Noel Prosequi

Show me one of these proofs, please. I am anxious to see it. I have actually been searching for one, but have found none.

Ah, the “it is well settled” argument… argument.

The “see this other thread” argument… argument.

I have, my friend. I don’t see it being settled or proven. Please, someone post a rigorous proof, or point me to one, that settles this without actually stating the conclusion in the premise.

There are whole subsets of mathematics that depend upon which set of axioms you accept or don’t. You would get along with the constructivists.

I would like my sub to be 12.000… thank you very much :wink:

How serious is my injustice: a level of .999…, or as great as 1?

Again I am not talking about what is absolutely a very fine way of approximating for practical purposes, I am talking pure theory here. If you don’t care about anything but the practicality of .999… = 1, that’s fine. That is not what I am talking about.

Why is mathematics not allowed to fudge for practical purposes, any more than construction work or any other science/industry/trade? As was said, 1/0 is undefined. Should we just stop there and make no practical use of it?

However, this fudge by the science of mathematics is so fine that it works for any calculation of anything in the universe. If you want to argue that .9999… is not truly, really, and absolutely 1, then I would agree, and I think posters here would as well, but the argument is so pedantic that it serves no useful purpose. It is soooo close to 1 that for all purposes it is 1, so what’s the point of not saying it’s 1?

And by sooo close, I mean infinitely close.

<NM>

Glad you agree with me jtgain. :wink:

What’s the point of saying it is 1?

It just plain doesn’t exist, which, mathematically, falls under “undefined”. As in, there is literally no definition of infinity (under our standard notion of a number system), nor of arithmetic involving quantities that have no definition. You might as well claim that 1/blue makes any sort of sense.

There are certain contexts under which we do define a number which has many of the properties we attribute to the common notion of infinity, but even there, they don’t apply to our common notion of real numbers and arithmetic on real numbers.
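One concrete example of such a context, for what it’s worth, is the extended real line:

$$\overline{\mathbb{R}} = \mathbb{R} \cup \{-\infty, +\infty\}, \qquad \tfrac{1}{+\infty} := 0, \qquad x + \infty := +\infty \ \text{for real } x, \qquad \infty - \infty \ \text{left undefined},$$

and even there $\pm\infty$ are extra symbols bolted on, not real numbers, so ordinary algebra does not carry over to them.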

Then DEFINE the system and the numbers. Why do you think mathematicians spent years formalizing our current real number system? Mathematics abhors imprecision in definition, and you seem to be handwaving away one of the most important things when discussing mathematics in the first place.

This last bit reveals a huge gap in mathematics knowledge. It’s a lot like going up to a chemist and saying “Well, clearly there must be some phlogiston around because of the open flame.”

You should definitely find an introductory book on analysis and see how numbers are developed in the first place. The reason your “beliefs” often don’t work is because an intuitive notion of how numbers work only takes you so far. At some point, you need some mathematical formalism in the definitions and basic operations themselves, which you’ve largely hand-waved away by implying or stating they were intuitive or immediately obvious (not the case).
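For a taste of that formalism: in such a development the value of an infinite decimal is typically defined as the supremum (equivalently, the limit) of its finite truncations,

$$0.d_1 d_2 d_3 \ldots \;:=\; \sup_{N \ge 1} \sum_{n=1}^{N} \frac{d_n}{10^{n}},$$

which exists because the truncations are bounded above and the reals are complete; with every $d_n = 9$ that supremum is exactly 1.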

Because they are the same number.

Again, mathematics abhors imprecision. There is no point in just “accepting” they are really close if they are actually the same number. There’s no “almost wrong” in mathematics. Making a wrong statement is wrong. And mathematics is NOT construction or engineering or such. It’s not purely a practical thing. It’s not actually a science, where there are imprecisions in the measuring tools themselves. This is another common misconception.

Could you elaborate? I don’t quite follow your long division method there.

    1
  ________
9 | 9.000…
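If I’m reading that layout right, it’s 9 divided into 9.000…. Here is a sketch (my reading, not necessarily what was meant) showing how the same long division produces 1.000… under the usual rule, but 0.999… if each step is deliberately kept from coming out exact:

```python
def long_division(numerator, denominator, n_digits, undershoot=False):
    """Decimal digits of numerator/denominator by long division.

    With undershoot=True, whenever a step would come out exact we take one
    digit less and carry the remainder forward, so 9/9 comes out as
    0.999... instead of 1.000...  (Assumes numerator < 10 * denominator.)
    """
    def pick_digit(remainder):
        d = remainder // denominator
        if undershoot and d > 0 and d * denominator == remainder:
            d -= 1                      # refuse to let the step be exact
        return d

    d = pick_digit(numerator)
    remainder = numerator - d * denominator
    digits = [str(d), "."]
    for _ in range(n_digits):
        remainder *= 10
        d = pick_digit(remainder)
        remainder -= d * denominator
        digits.append(str(d))
    return "".join(digits)

print(long_division(9, 9, 10))                   # 1.0000000000
print(long_division(9, 9, 10, undershoot=True))  # 0.9999999999
```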

To move on and prove more complex theorems that are based on the limit principle. No calculus without it.

Should Lowes have to stock thousands of different types of “2 X 4”s, saying “Well, this one’s 1.50003 X 3.49765. This stack is 1.5002 X 3.500002,” etc.? Of course not. In the construction trade those are all 2 X 4’s.

In mathematics, things that are .99999… are 1 because that’s good enough for anything in the universe. Is there a reason not to make the statement that .999…=1?