An Infinite Question

Regarding the ancient column (all the way back in ’03) on .999 being equal to 1, “An infinite question: Why doesn’t .999~ = 1?”, in which the answer was that it IS equal to one, I gotta say, I’m spotting some faulty logic. It was obviously written by an impostor, not Cecil. The whole argument rests on saying .333~ is equal to 1/3. If you grant that, then sure, you’re right, because as stated in the article:

.333~ = 1/3

.333~ + .333~ = .666~, or 2/3

.333~ + .333~ + .333~ (or .333~ + .666~, if you prefer) = 3/3

3/3 = 1

Now, before we go any further, I’d like to throw in the fact that more than one digit in these numbers is superfluous, because, as also stated in “Cecil’s” article, ~ means the decimal keeps repeating infinitely. But I’m splitting hairs here.

Now, why this cretin impersonating our (well, your) lord and master is wrong. It was ENTIRELY based on .333~ being equal to 1/3. Well, I gotta tell ya, it’s not. 1/3 (obviously) means the number indicated is exactly one third of a whole number.

I know what you’re thinking now. “Oh mighty General, you have truly mastered the laws of mathematics. Your wisdom is far beyond the understanding of mortal man.” And you’re right. But please, keep the praise back until I’ve arrived at my point. My point is simply this: .333~, while damn close, is NOT 1/3 of 1. I mean, sure, it’s the closest thing we’ve got, and it works for all practical intents and purposes (unless you’re way too much of a perfectionist for me to want you in my immediate area, but this article is all ABOUT perfection). .999~, while infinitesimally close to 1, will NEVER actually reach 1. Truth is, there’s no way for 1 to be divided into 3 EXACTLY equal portions. It’s just that, as I said before, .333~ is the absolute closest thing we have, and, as I also stated before, it works in every situation except this one, where we’re just splitting hairs for the fun of it, so .333~ is generally accepted as 1/3.

So in short, your calculations were entirely correct, but they were based on a widely accepted inaccuracy. Go forth and use this newfound knowledge. Probably on one of your mathematician friends who’s getting too big for his britches.

… Methinks you are not nearly as smart as you think you are…

For starters, 1 can indeed be divided exactly into three equal parts. We call these parts… 1/3.

x = 0.3~
10x = 3.3~
9x = (10x – x) = (3.3~ – 0.3~) = 3
x = 3/9 = 1/3

Did you follow that?
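
If the algebra feels like sleight of hand, here’s a quick numerical check as well (a sketch in Python with exact fractions; entirely my own illustration, not anything from the column). The finite truncations 0.3, 0.33, 0.333, … close in on 1/3, and the leftover gap after n digits is exactly 1/(3·10^n):

```python
from fractions import Fraction

third = Fraction(1, 3)
for n in (1, 2, 5, 10):
    # Truncation of .333~ to n digits: 3/10 + 3/100 + ... + 3/10^n
    truncation = sum(Fraction(3, 10**k) for k in range(1, n + 1))
    gap = third - truncation
    print(n, truncation, gap)  # gap is exactly 1/(3 * 10**n)
```

The gap isn’t some mysterious residue: it shrinks below every positive number, which is exactly what the ~ buys you.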

ETA: In your post you said 0.333~ will never “reach” exactly 1/3. But it’s a category mistake to think of 0.333~ as “reaching” anything. It’s not a process or an action. It’s a number. “0.333~” designates a numerical value, not a process for constructing numerical values.

I suppose I asked for this, entering into a perfectionfest, but that doesn’t make this any less obtuse. You know what I meant. But okay. Let’s say 1 can’t be divided into three exactly equal NUMBERS (recognizing a distinction between numbers and fractions).

Perfectly. .3~ times 9 is 3. Forgive my ignorance, but I don’t see how this means .3~ is 1/3 of 1, and thus .9~ is 1.

Ah, but it IS a process. As stated in the article, and I quote:

Granted, it is wrong on the main point: that if you never stop typing 9s, it’s equal to 1. But it’s correct in saying that an infinitely repeating decimal is a process.

No, it isn’t. That much is a mistake in the article if it gives that impression.

How about this: If an infinite series has a limit, it is equal to that limit. This means that if the partial sums of the terms a[sub]0[/sub], a[sub]1[/sub], a[sub]2[/sub], … converge to a number, the infinite series is defined to be equal to that number. This is, so far as I understand, a definition, which means that if you have a problem with it and refuse to accept it, you are no longer doing the same math as the rest of us and, therefore, your results are meaningless. That’s how an axiom system works.
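
To make that definition concrete, here’s a rough sketch (Python; my own illustration) of the partial sums of the series that the numeral 0.9~ abbreviates:

```python
from fractions import Fraction

partial_sum = Fraction(0)
for k in range(1, 8):
    partial_sum += Fraction(9, 10**k)          # add the k-th term, 9/10^k
    print(k, partial_sum, float(partial_sum))  # 9/10, 99/100, 999/1000, ...
# The series is DEFINED to equal the limit of these partial sums: 1.
```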

Yes, but by definition, an infinite series HAS no limit. As with the original article, your calculations seem to be correct, just founded badly.

I’d like to see a citation for the notion that a convergent infinite series has no limit.

More to the point, I’d like to reiterate my statement that if you wish to make statements relevant to a given axiom system, you have to accept all of the axioms in that system.

I am not sure what distinction you are trying to make. I can’t speak for the person you’re replying to, but I certainly don’t “know what you meant” when you said you can’t divide one into three parts evenly. You can. I can’t understand in what sense you are saying you can’t. I am a famously non-obtuse person, so that can’t be the problem here. :wink:

You agree that .3~ * 9 == 3.

Okay. So let’s call .3~ ‘x’. This means:

9x == 3.

(That’s what you agree with, correct?)

By simple algebra, then,

x == 3/9

And as we all know

3/9 == 1/3.

So then .3~ == 1/3.

Cleared it up for you yet?
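
One more angle, for anyone who suspects the 10x − x trick of cheating somewhere: run it on a finite truncation and watch what breaks. A sketch (Python with exact fractions; the setup is my own, not Frylock’s):

```python
from fractions import Fraction

for n in (1, 3, 6, 12):
    # x_n = 0.33...3 with n threes, as an exact fraction
    x_n = sum(Fraction(3, 10**k) for k in range(1, n + 1))
    # For a FINITE string, 10*x_n - x_n = 9*x_n = 3 - 3/10**n: not quite 3
    print(n, 9 * x_n, 3 - 9 * x_n)
```

The shortfall of exactly 3/10^n is the price of stopping after n threes; only the never-ending string makes 9x come out to 3 on the nose, which is why the trick is legitimate for .3~ itself.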

The article you’re quoting says exactly the opposite of what you think it’s saying on this point. It is talking about the view that .333~ denotes a process, as an example of wrongheaded thinking. It is not advocating the view; it’s debunking it.

Not by any definition used by any actual mathematician.

This is why my algebra teacher taught us to leave our answers in fractions.

You can slice an apple into 3 parts. Tape it back together and you have one apple.

1/3 x 3 = 1. You don’t get any inaccuracies unless you do the unneeded division.

We were taught to work with fractions. Even our final answer was a fraction reduced to lowest terms. Today’s calculator generation tries to convert to decimals, and then they introduce rounding errors into the process.

You’ve just totally destroyed yourself. As the others have noted, this is completely and absolutely wrong. It’s your fundamental misunderstanding, and it’s why you are saying something that everybody knows just ain’t so.

This problem is the same as Zeno’s Paradox. The problem Zeno had is that the Greeks had the same misunderstanding that you do. They assumed, by definition, that any infinite series must diverge, i.e., go to infinity. But that’s simply not true. Many infinite series converge. 0.9 + 0.09 + 0.009 + 0.0009 + … is an infinite series. It does not diverge. It converges. It converges, to be precise, on 1. You can show that after a sufficient finite number of terms any difference from 1 is less than any number you can name. This is the mathematical definition of equality.
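
Here’s that “less than any number you can name” test as a sketch (Python; the epsilon values are just examples of mine): whatever positive difference you propose, a finite number of terms already gets the partial sum closer to 1 than that.

```python
from fractions import Fraction

def terms_needed(epsilon):
    """Smallest n for which 1 minus the n-term partial sum of
    0.9 + 0.09 + ... is below epsilon; that gap is exactly 10**-n."""
    n = 1
    while Fraction(1, 10**n) >= epsilon:
        n += 1
    return n

for eps in (Fraction(1, 100), Fraction(1, 10**9), Fraction(1, 10**50)):
    print(eps, "->", terms_needed(eps), "terms suffice")
```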

You need to study what a limit is. Once you do, the problem disappears. 0.333~ is precisely equal to 1/3. 0.666~ is precisely equal to 2/3. 0.999~ is precisely equal to 1.

We can argue this, but you might start with this thread, which has the virtue of linking to a half dozen previous threads on this subject.

In short, give it up. You got the main point wrong and there’s no hope of you convincing anyone who understands the problem properly. If you have to go through it one more time please read the previous threads so that you don’t repeat arguments that have been endlessly and remorselessly struck down.

I really don’t understand the point of the column’s answer. Why didn’t Cecil give the standard definition of a function from calculus? A function which is the derivative at all points of an interval. Limits are used to explain the concept of a derivative.

Nitpick:

Fractions are numbers. Fractions can represent rational numbers or irrational numbers, but they are numbers. Rationals can be written as the quotient of two integers, irrationals can’t. 1/3 is a number, and it is irrational.

/Nitpick

Pure mathematics is a “perfectionist’s” discipline. People infinitely (heh) smarter than I devised a set of axioms, or self-evident statements, that number systems should follow. I’m glossing over a lot of things, but the Dedekind Cut gave us the irrational numbers for the real number system, and because of that, all kinds of results are possible, including all of calculus.

I think the mistake you’re making is in thinking that .333~ is a really long (but finite) string of digits. It’s not. It goes on forever, infinitely. As Frylock has demonstrated, taking 0.333~ to equal 1/3 is perfectly consistent with the axioms that define the real numbers. If you want to make new axioms, go ahead. But the ones we have work pretty well as it is.

Hm. I never heard that definition, but if you define equality in some strange mathematical sense of the difference being too small to name, rather than the way most people would define it, as absolutely no difference, then yes, .9~ is equal to one.

So in other words, it all depends on how equality is defined. I guess you folks can see where I made my slip, because I’m sure we can all agree equality is usually defined as “absolutely no difference between two or more things”. Just… not in math, apparently. Go forth and correct me no more.

What number is smaller than any positive number you can name? Zero. The “smaller than any number you can name” definition and the “absolutely no difference” definition are the same. If the difference is of a magnitude smaller than all positive numbers (and is not negative) then there is no difference.

Your definition and the one Exapno gave amount to exactly the same thing. Your misunderstanding doesn’t have to do with definitions of equality. Your misunderstanding has to do with the fact that you don’t know what the symbol “0.333~” actually means. You think it denotes a process. In fact it denotes a numerical value.

Yes, we could all immediately see that you, GD, knew nothing about actual mathematics.

I’m not really being snarky. It was obvious. Every statement you made has been made a dozen times before in various threads, all by people who don’t know anything about real math, always with similar misunderstandings of the actual problem, and yet all with absolute certainty that they have it right while all the mathematicians are wrong.

That’s the part that puzzles me. Math is all about the correct way to define and calculate something. You need to understand the basic definitions that math uses and how they apply to the problem at hand. And you’re still getting it wrong.

When mathematicians say that the delta (the difference) between the two things is smaller than any possible nameable number, they are indeed saying that the two things have “absolutely no difference between” them. It’s the same definition as everybody else’s, just derived in a slightly different way. This is a very basic definition that gets used all over serious math, so it is a critical one.

If you don’t know the basics or the definitions or the problems, your pronouncements of absolute certainty will fall apart like a life jacket made of Alka-Seltzer tablets: showily and disastrously. Everybody will notice.

So why didn’t you, is what my long-winded question really amounts to.

No. She did that because finite hardware, such as pencils and paper or calculators or computers, cannot represent arbitrary real numbers accurately. It’s a limit of the hardware (and, in a larger sense, the physical universe), not mathematics, and we are discussing pure mathematics only in this thread.
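
As an aside, the hardware limitation is easy to exhibit (a small Python sketch of my own; any language with floating point shows the same thing):

```python
from fractions import Fraction

x = 1 / 3                       # a 64-bit float, NOT the real number 1/3
print(x)                        # 0.3333333333333333 (already rounded)
print(0.1 + 0.2 == 0.3)         # False: classic floating-point roundoff
print(Fraction(1, 3) * 3 == 1)  # True: exact rational arithmetic
```

Pure mathematics has no such constraint: 1/3 is an exact object, and exact arithmetic on it loses nothing.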

General Derangement: Everything in mathematics depends upon definitions. Unless you state what definitions (axioms) you choose, your mathematics is of no interest to anyone else. The definitions we (everyone in this thread except you) have chosen preclude your ideas that infinite series have no limits: We have defined the cases in which they do, those definitions result in a useful and interesting mathematics, and so those definitions are the ones we’re sticking with.

It is important to note that you can choose any definitions you wish. However, you must make your choices plain up front, and you must stick with them at all times unless you make explicit that you are changing your definitions. And, of course, if you choose peculiar definitions, you cannot necessarily expect anyone else to care about them.

All that said, you might be interested in this essay on the hyperreals and nonstandard analysis. It demonstrates that sometimes, it can be very interesting to create your own definitions in order to do a new kind of math.

In what alternate reality is 1/3 irrational? :confused:

Yeah, but the number is only unnameable because of the ~ symbol, which means you never stop typing the number indicated, and so it’s really impossible to find the remainder, right? The original article, as I understood it, said the same thing. If you ever stop typing 9s, the whole .9~ = 1 thing falls apart, because THEN you can find the difference between the long string of 9s and the whole. By that logic, isn’t .9~ equal to two? Or three? Or five million? Because you never stop typing nines, so you can’t actually find the difference?

I know, I know, I sound like a thickheaded bastard who can’t give it up, but at this point I’m pretty sure I’m wrong, I’m just trying to get my head around this.

You miss the point. The ~ symbol means there IS no remainder. .9~ = 1 because there is no number you can shove in between them, no measurable distance between them numerically. They are simply different decimal representations of the same concept.
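
Put differently (a last sketch, in Python, illustrative only): hand over any candidate number supposedly sitting strictly between .9~ and 1, and some finite run of nines already climbs past it, so there is no room in between.

```python
from fractions import Fraction

def nines_past(candidate):
    """For any candidate < 1, find n such that 0.99...9 (n nines),
    which equals 1 - 10**-n exactly, already exceeds it."""
    assert candidate < 1
    n = 1
    while 1 - Fraction(1, 10**n) <= candidate:
        n += 1
    return n

for c in (Fraction(99, 100), Fraction(999999, 1000000)):
    n = nines_past(c)
    print(c, "is passed by", n, "nines:", 1 - Fraction(1, 10**n))
```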

Your concept of infinity and infinitely long needs some tweaking, I think. :slight_smile: