# .9~=1?

Ok, so Cecil did a pretty good job explaining why 0.999~=1; now I’m here to explain why it doesn’t.

First, I’m gonna give you a little proof

x=0.999~
If you multiply by 10
10x=9.999~
Subtract x from each…
9x=9 (because x=0.999~)
divide both by 9 to find x
x=1
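
The manipulation above can be run on finite truncations with exact rational arithmetic, which also shows exactly where the leftover remainder goes as the nines pile up (a sketch; `partial` is a hypothetical helper name, not from the thread):

```python
# Checking the "multiply by 10, subtract x" proof on finite truncations,
# using exact rationals so nothing is lost to floating point.
from fractions import Fraction

def partial(n):
    """Return 0.99...9 with n nines as an exact Fraction."""
    return Fraction(10**n - 1, 10**n)

for n in (1, 5, 20):
    x = partial(n)
    # 10x - x equals 9 minus a remainder of 9/10^n, which shrinks to 0
    print(n, 10 * x - x, 9 - (10 * x - x))
```

For any finite truncation, 10x − x falls short of 9 by exactly 9/10ⁿ; the dispute in this thread is entirely about what happens to that remainder in the limit.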

However, there is one problem with that proof, and that is that it treats infinity as if it is not an actual number. What we must remember is that infinity is a number, just a number we cannot express.

When we multiplied both sides by ten, what we really got was 9.999~ continuing infinitely, with one fewer nine after the decimal point. In an equation, infinity must remain the same number, so multiplying by ten would remove a decimal at the end, just as 0.99*10=9.9, NOT 9.99. Therefore, this argument is false.

Regarding fractions, 1/3 DOES NOT=0.333~. That is the best way we can put it in non-fractional form. The only true equal of 1/3 is 1/3. Saying 1/3=0.33~ is like saying 22/7=pi. It’s pretty damn close, but not quite right (perhaps the most accurate way of expressing 1/3 would be to say 0.333~+0.000~1, meaning that it ends in a four).

Anyway, I hope this has been helpful, and the original column can be found here. Now I just have to finish my explanation of Stairway to Heaven, last time the forum deleted it when I tried to post…

I’m a believer. I have no factual or mathematical basis on which to base my belief, but I just can’t accept that .999…=1. However, I do understand why the non-believers say .333…=1/3. IF .999… does equal one, then .999… divided by 3 would be equal to 1/3. But that is, of course, contingent upon .999… being equal to 1.000…

oh boy, here we go again.

not that I’m knocking your OP. Just that this has been discussed ad infinitum (snicker) here and looks like it will continue to be. Personally, I always enjoy them so please, carry on.

You are confusing equality with identity. Look at it this way. If x=0.999…, then no matter how small a quantity you name, I can show that the difference between x and 1 is smaller still. Thus there is no limit to the closeness of 0.999… and 1. That meets any reasonable definition of equality you care to propose.
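
The challenge above can be made concrete: for any tolerance you name, there is a number of 9s past which the truncation is closer to 1 than that tolerance (a sketch; `nines_needed` is a hypothetical helper name, not from the thread):

```python
# For any positive rational tolerance eps, find how many 9s make
# 1 - 0.99...9 smaller than eps. Exact Fractions avoid rounding.
from fractions import Fraction

def nines_needed(eps):
    """Smallest n such that 1 - 0.99...9 (n nines) < eps."""
    n = 1
    while Fraction(1, 10**n) >= eps:
        n += 1
    return n

for eps in (Fraction(1, 100), Fraction(1, 10**9)):
    n = nines_needed(eps)
    assert Fraction(1, 10**n) < eps   # the gap really is smaller than eps
    print(eps, "->", n, "nines")
```

The loop always terminates because 1/10ⁿ eventually drops below any fixed positive rational, which is exactly the "name any quantity, I can get closer" argument.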

If you disagree, could you please expand on why you believe infinity to be a number?

No. The decimal expansion of 1/3 does not end in a four. The decimal expansion of 1/3 does not end. Assuming that the “~” means an unterminated string of zeros, “0.000~1” is simply nonsensical. How can you tack a 1 on the end of something that doesn’t even have an end?

You’re correct in saying that 22/7 does not equal pi. In fact, I’d argue that 22/7 is nowhere near pi (although this is purely subjective). We can see that by performing the decimal expansion to a few decimal places (where “~=” means “approximately equal” - not a precise concept, admittedly, but adequate for our purposes):

pi ~= 3.1416
22/7 ~= 3.1429

However, you’re incorrect in saying that 0.333… does not equal 1/3 (assuming that “…” means something along the lines of “an unterminated string of 3s”). Saying “1/3 = 0.333…” is the same as saying “the sum of 3/10[sup]n[/sup], where n takes the value of each positive integer, is 1/3”, which can be proven to be true. I can’t be bothered proving it, but I’m certain someone else can.

0.333~ is a limit. So is 1.000~ and 0.5000~ and every other number in the set of the reals. It is not ‘infinity’ or ‘infinite’ or any other incomprehensible babble you come up with. It is a limit, and therefore it works like every other real in the set of all reals.

0.999~ is a limit as well, and so it works the same way as everything else. It obeys all the same rules without exception. One of the rules of the set of the reals is that for two real numbers to be unequal, there must exist a real number strictly between them. (That is, it must be strictly less than one of the two and strictly greater than one of the two and unequal to both of the two.) Another rule is that you can only obtain the additive identity (0.000~) from x-y if x is equal to y. So, what real number lies between 0.999~ and 1.000~?

```
 1.000~
-0.999~
-------
 0.000~
```

Well, look at that: 0.000~ is the result of the subtraction, so 0.999~ must equal 1.000~.

This is not surprising. 0.999~ can be seen as a convergent series, and a convergent series is equal to its limit. The limit of summing 9/10[sup]n[/sup] as n goes from 1 to infinity is 1.000~, so 0.999~ must equal 1.000~.
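
The series claim can be checked with the closed form for a geometric series, a/(1−r) with a = 9/10 and r = 1/10 (a sketch using exact rationals; variable names are mine, not from the thread):

```python
# The geometric series sum of 9/10^n, n = 1..infinity, via its closed form.
from fractions import Fraction

a, r = Fraction(9, 10), Fraction(1, 10)
closed_form = a / (1 - r)        # limit of the convergent series
print(closed_form)               # exactly 1

# Partial sums approach the limit from below:
s = Fraction(0)
for n in range(1, 8):
    s += Fraction(9, 10**n)
print(s, 1 - s)                  # gap after 7 terms is 1/10^7
```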

This equality is not approximate. There are no approximations in mathematics.

Maybe they need the definition of R.

The set of real numbers, R, is the unique complete ordered field extending the rational numbers. It can be constructed from the set Q as a subquotient of the set of all sequences of elements of Q.

Given a sequence {x[sub]n[/sub]}[sub]n in N[/sub], if for every e>0 there exists a k(e) such that for all n,m>k, |x[sub]n[/sub]-x[sub]m[/sub]|<e, we call the sequence Cauchy. Roughly, a sequence is Cauchy if all the numbers in the sequence get close together as n increases. Take the collection of Cauchy sequences of elements of Q.

Now, given two Cauchy sequences, {x[sub]n[/sub]}[sub]n in N[/sub] and {y[sub]n[/sub]}[sub]n in N[/sub], interleave them to define {z[sub]n[/sub]}[sub]n in N[/sub]. That is, z[sub]2k[/sub]=x[sub]k[/sub] and z[sub]2k+1[/sub]=y[sub]k[/sub]. If z is also Cauchy, we consider the sequences x and y to be equivalent. The set of Cauchy sequences of elements of Q modulo this equivalence relation is R. That is: every real number is instantiated as an equivalence class of sequences of rational numbers, such as the sequence of its decimal approximations.
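
This construction can be illustrated directly on the thread’s example: the constant sequence 1, 1, 1, … and the decimal approximations 0.9, 0.99, 0.999, … are both Cauchy, and their interleaving is Cauchy too, so they fall in the same equivalence class, i.e. they name the same real number (a sketch; the helper names are mine):

```python
# Interleave the decimal approximations of 0.999~ with the constant
# sequence 1 and check the interleaving is (numerically) Cauchy.
from fractions import Fraction

def x(n):  # decimal approximations of 0.999~: 0, 0.9, 0.99, ...
    return Fraction(10**n - 1, 10**n)

def y(n):  # constant sequence representing 1
    return Fraction(1)

def z(n):  # interleaving: z_2k = x_k, z_2k+1 = y_k
    k, odd = divmod(n, 2)
    return y(k) if odd else x(k)

# Far enough out, all interleaved terms lie within any given epsilon:
eps = Fraction(1, 1000)
tail = [z(n) for n in range(20, 30)]
assert all(abs(p - q) < eps for p in tail for q in tail)
```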

That there is a unique field structure on R extending that of Q is left as an exercise for the reader.

I think I speak for everyone when I say… huh

No, it’s not. It’s like saying that 4/1 - 4/3 + 4/5 - 4/7 + 4/9 … (and so on endlessly) equals pi. Which it does, just as .33333… equals 1/3.

As a public service:
if .999,=1 does .999,+.999,= 2 or 1.999,8?
1=.999…
.999~
1 = 999…
Why doesn’t .9999~ = 1?
Math: .99repeating = 1?
.99999 repeating = 1? Whats your opinion?
Does .99(repeating) = 1
.99999999 equal to 1 ???
.999 = 1?
Stupid math question
Is 0.999…=1 ?
And probably some other I’ve missed. Go thou, and read.

In any Archimedean field, .9… = 1. The reals are an Archimedean field, so in the reals, .9… = 1. If you don’t understand what that means, don’t try to debate it.

So, 2/6 is a different number, then? And 3/9 must be something else, and 4/12 a fourth number, and so on?

AAAAARGH!

So what is the decimal representation of 1/3 in base 12 then? And what do three of these sum to? So why do you think that the same number in different bases magically becomes different?
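
For the record, the answer to the base-12 question: 1/3 = 4/12, so in base 12 its expansion terminates as 0.4, and three copies sum to exactly 1 (a sketch; `base12_digits` is a hypothetical helper name):

```python
# Expand a rational in base 12 digit by digit, using exact Fractions.
from fractions import Fraction

def base12_digits(frac, places):
    """First `places` base-12 digits after the point, for 0 <= frac < 1."""
    digits = []
    for _ in range(places):
        frac *= 12
        d = int(frac)
        digits.append(d)
        frac -= d
    return digits

print(base12_digits(Fraction(1, 3), 4))   # [4, 0, 0, 0], i.e. 0.4 in base 12
print(3 * Fraction(1, 3))                 # exactly 1
```

The same number that has a non-terminating expansion in base 10 terminates in base 12, which is the point: the representation changes with the base, the number doesn’t.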

This has been done so MANY times before that it’s like the “drive on the parkway” question. If you’re serious about debating it, go find each and every argument in those threads and THEN come back to us with arguments against them.

Mathochist just repeated the first two sentences of Derleth’s post immediately preceding, translated into ‘mathese’. The reason he did this can be inferred from his user name.

The problem here is a difference between the theoretical and the practical.

In the theoretical realm, the mathematician has no problem dealing with infinite decimals. In the practical realm, of course, there’s no such thing. After a couple of dozen decimal places, you’re way past the point where anyone can measure or verify or validate anything, and the non-mathematical mind can’t grasp the idea of “infinite decimal.”

So, the mathematical mind thinks on a theoretic plane and obviously .9999… = 1 as .333… = 1/3. The practical mind thinks that .9999999999[a zillion more 9’s]999 stops and is obviously different from 1, as .33333333333333333333 is obviously different from 1/3.

Stop right there. This isn’t religion, it’s math. There is no room for belief here, only logic.

Now then, here’s a proof that 0.9… = 1

Let us assume that 0.9… != (not equal to) 1. Since there are an infinite number of 9s, 0.9… must then be infinitely close to 1; that is, there is no number x such that

0.9… < x < 1

Now let x = square root (0.9…)

Because 0 < 0.9… < 1, and the square root of a number strictly between 0 and 1 is greater than the number but still less than 1, therefore 0.9… < x < 1

But this contradicts the statement that there is no x such that 0.9… < x < 1, which follows directly from the assumption that 0.9… != 1

Therefore, 0.9… = 1
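
The key fact that proof leans on, that the square root of any number strictly between 0 and 1 squeezes strictly between the number and 1, is easy to check numerically (a sketch; floating point here, since no exact values are needed):

```python
# For 0 < x < 1, sqrt(x) satisfies x < sqrt(x) < 1: squaring shrinks
# numbers in (0, 1), so taking the root grows them, but not past 1.
import math

for x in (0.9, 0.99, 0.999999):
    r = math.sqrt(x)
    assert x < r < 1
    print(x, "<", r, "< 1")
```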

There is no way to understand what 0.999… means unless you understand limits of sequences. It is defined as the limit of the sequence:
{0.9, 0.99, 0.999, 0.9999, 0.99999, … }, adding an extra 9 at the end of each item in the sequence to get the next member of the sequence.

The limit of this sequence is 1, because you can get as close as you like to 1 by going far enough along the sequence, even though no member of the sequence is equal to 1.

(Similarly, the limit of the sequence:
{1/2, 1/4, 1/8, 1/16, 1/32, …}
is 0, even though no member of the sequence is equal to 0).

Of course, many sequences do not have limits: examples of sequences without a limit are:
{1, 2, 3, 4, 5, …}
{0, 1, 0, 1, 0, 1, …}

It’s fairly natural to assume that you can do the usual sort of arithmetic on these sequences, e.g.:
0.333… + 0.333… + 0.333… = 0.999…
but because we are talking about infinite sequences here, you need to do some pretty rigorous mathematics to show that operations like this are valid. They only work with sequences that have limits – but since these infinite decimal fractions always have a limit, that’s not a problem.

Did y’all see this?

All the Charter Members have seen the 0.999… does not equal 1 threads ad nauseam. So only new Members and Guests are posting this nonsense.