Well, to begin with, it does stop at 1/3. It’s the definition of 1/3.
But even if (wrongly!) you insist on it being a process, it not only gets “close enough” to 1/3, it can always get closer! No matter how small a margin of error you ask for, the partial values eventually fall within that margin. There is no number other than 1/3 that 0.333~ approaches in this way. So 1/3 is the only meaningful value to assign to the sequence.
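Here’s a quick numerical sketch of that claim in Python (the margins below are arbitrary examples; exact fractions avoid rounding noise):

```python
# For any margin of error, the truncations 0.3, 0.33, 0.333, ...
# eventually stay within that margin of 1/3.
from fractions import Fraction

def truncation(n):
    # The n-digit truncation 0.33...3 as an exact fraction.
    return Fraction(10**n - 1, 3 * 10**n)

for margin in [Fraction(1, 100), Fraction(1, 10**6), Fraction(1, 10**12)]:
    n = 1
    while Fraction(1, 3) - truncation(n) >= margin:
        n += 1
    print(f"within {margin} of 1/3 from digit {n} onward")
```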
Well, it’s quite certain that the discussion doesn’t have much point. But, hey, it’s a game we all play… One problem is that one “side” is the world’s mathematicians, and the other side is just anonymous people on BBS discussion boards. It’s as if I claimed that nucleic acids are amino acids, because they’re chemically kinda similar. Okay, nice for me, but the world’s biologists have defined the terms differently, and my revolutionary new concept isn’t going to get taught in colleges any time soon.
This came up in a previous thread on the same subject. You can make a new definition of numbers, where this actually works. It’s a little like complex numbers: every number is now defined to have a “major” part and an “infinitesimal” part – a + bt. (I would use “i” but it’s already claimed. “t” is probably also already claimed, but let me run with it…)
t is tiny. t is arbitrarily tiny. Like 10^-googolplex. So tiny, it doesn’t blessed matter. But it isn’t zero. However, t^2 is defined to vanish: being so close to zero, we ignore it.
You can add 'em. (a + bt) + (c + dt) = (a + c) + (b + d)t.
You can multiply 'em. (a + bt) * (c + dt) = ac + (ad + bc)t. (The term bd·t^2 vanishes.)
You can do most of the regular math stuff with this new notion. It has most of the properties of real numbers. You sort of have to limit how big b is allowed to be, lest the infinitesimals encroach upon the majors. But how often is that gonna happen?
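Incidentally, this construction really does exist in the literature: it’s called the dual numbers. A minimal Python sketch of the arithmetic above (the class name and layout are mine, purely illustrative):

```python
# A minimal sketch of the a + bt arithmetic above (the "dual numbers"),
# where t*t is defined to vanish.
class Dual:
    def __init__(self, major, tiny):
        self.major = major  # the "a" part
        self.tiny = tiny    # the coefficient of t

    def __add__(self, other):
        return Dual(self.major + other.major, self.tiny + other.tiny)

    def __mul__(self, other):
        # (a + bt)(c + dt) = ac + (ad + bc)t; the bd*t^2 term vanishes
        return Dual(self.major * other.major,
                    self.major * other.tiny + self.tiny * other.major)

    def __repr__(self):
        return f"{self.major} + {self.tiny}t"

print(Dual(2, 3) * Dual(4, 5))  # 8 + 22t
```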
If you absolutely insist that there is a gap, somewhere, between 0.999~ and 1, you could change your definition of number to work with this kind of notation.
Again, good luck getting the world’s mathematicians to include it in their textbooks. But it’s a free country!
No, the problem isn’t that negative integers are constructs of the human brain, and don’t actually exist in nature. The problem is that all integers, including the positive ones, are constructs of the human brain. Four sheep is not exactly twice as much as two sheep, because sheep are not all exactly the same size.
Because it was realized as early as Zeno of Elea that you can always continue to cut the length of delta t in half. Always. No matter how long you try, you can always take half of the remaining interval. You can never reach the end by this process. (It’s called the Dichotomy Paradox.) And it is exactly this never reaching the end that people like engjs object to.
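A toy Python sketch of that observation, if it helps to see both halves of it at once (the ten-step cutoff is arbitrary):

```python
# Halving the remaining interval: never finished at any finite step,
# yet the leftover shrinks below any bound you care to name.
from fractions import Fraction

remaining = Fraction(1)
for step in range(1, 11):
    remaining /= 2
    print(f"step {step:2d}: covered {1 - remaining}, remaining {remaining}")
# 'remaining' is still positive after every finite step -- that is the
# "never reaching the end" being objected to.
```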
This is precisely what Cantor set out to handle when he put infinities on a rigorous basis. He used (indeed, invented) set theory, along with a set of definitions that settled these so-called paradoxes once and for all.
That’s why people have been saying that all the mathematicians in the world agree on certain rules and procedures for dealing with infinities and have agreed upon these for 150 years. They know all the objections and know why the objections can be dismissed using standard mathematics. If you check some of the many other threads on this issue, you’ll see some of our mathematicians roll out real math to explain this, rather than the mere word games that people are playing here.
Why, then, does engjs continue to make these assertions without any use of actual math? Baffles me. You’ll notice that he studiously avoids definitions, or any discussion of how his procedures would affect other problems in math. That alone is sufficient reason to dismiss his claims instantly. But a lot of good math can come out in the process of dismissing him.
Well, no, that’s not valid, because “dog” (let’s avoid words like “sheep” whose singular and plural forms are identical, for the sake of sanity) is not defined to mean “this particular dog, and that one over there, being slightly smaller, is less than one dog.”
But I WILL grant that “four” is a construct of the human brain, because it represents one dog collected together with one dog, collected together with one dog, collected together with one dog. Given that there are reportedly societies that only manage the number “one” in their concepts, and anything else is just a collection of “one” things, pretty clearly “two”, “three”, et al. are human constructs.
Still, the “natural” numbers (as long as we let them start at 1) are natural for a reason…
ETA: And to be very blunt, taking three pencils away from me is not the same as “hand[ing] me three pencils.” Not by any measure of the English language. Don’t let yourself get caught up in the usefulness of subtraction to deal with undoing adding to the point of losing touch with plain meaning of non-mathematical language.
Actually, that’s fine. You cut the time-intervals in half, in a way that makes the “supertask” arbitrarily fast, as the number approaches 1/3 arbitrarily closely. No one can say, “This doesn’t include the billionth digit,” because, yes, that digit was added at a specific point in the sequence.
Martin Gardner played some with supertasks. Suppose you turn the light switch on at time zero, then off after half a second, then on again after a quarter of a second, etc. At the end of one full second, is the light switch on…or off? The answer can’t be known, because infinity can’t actually be reached or attained, even when the process is compressed into a finite time by an accelerating rate.
This is why it’s important to understand that 0.333~ isn’t a “process,” and not even a “superprocess.” It’s a notation, implying “threes forever.”
There is a bijection between the set of times T = { 0, 0.5, 0.75, … } and the set of partial values { 0.3, 0.33, 0.333, … }. And both sets are infinite. To me this seems well-defined. And the result of the infinite process, at t = 1, is a notation, implying “threes forever”.
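Here’s one natural way to spell out that pairing in Python (the indexing is my choice, purely illustrative):

```python
# Pair the n-th switching time 1 - 2**(1 - n) with the n-th partial
# value of 0.333...; neither list runs out before the other.
for n in range(1, 6):
    time = 1 - 2 ** (1 - n)        # 0, 0.5, 0.75, 0.875, ...
    value = "0." + "3" * n         # 0.3, 0.33, 0.333, ...
    print(f"n = {n}: time {time}, value {value}")
```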
Right. By the way, the light on/light off “supertask” cannot succeed, because the sequence light on, light off, light on, light off, … doesn’t converge to a defined value.
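In sequence terms, as a toy Python illustration:

```python
# The lamp's state alternates with every switch; the sequence of
# states never settles, so there is no limit to assign at t = 1.
states = ["on" if n % 2 == 0 else "off" for n in range(10)]
print(states)  # ['on', 'off', 'on', 'off', ...] -- never converges
```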
Agreed: it’s an amusing thought-experiment only. The halving of the time frames is a way to force it to the appearance of a conclusion, but, really, given any state (on or off) there will be a successor state (off or on).
It’s really the same as saying, “Flip the light switch forever. After that, is it off, or on?”
Norton Juster, in “The Phantom Tollbooth,” put this wonderfully: “Just follow that line forever…and when you reach the end turn left.”
(I do not blame the loyal opposition in this thread for thinking, “Aren’t the 0.999~ people making the same error? They want the list of nines to go forever…and then stop.” But, of course, no, we don’t: they go on forever and don’t ever stop.)
Absolutely correct. When I say the result of my infinite process is 0.333…, that doesn’t mean we have stopped at some stage of the infinite process. We have completed the process, because we added a 3 for every n-th digit, n ∈ ℕ. Just as the notation 0.333… is complete, because there is an n-th digit for every n ∈ ℕ.
Whenever this topic comes up — and its persistence now exceeds both the Monty Hall Problem and “Is Social Security a Ponzi Scheme?” — I like to mention Archimedes’ Axiom.
That ancient Axiom was useful to the Greek geometers: given a line ABC, can you construct, in a finite number of steps, a point D such that CD has the same length as AB? An arithmetic version of the axiom is:
Given 0 < a < b there is some integer K with aK > b.
No, Archimedes (who attributes his eponymous Axiom to an earlier Greek) didn’t need set theory, let alone the rigorous work of Weierstrass, to avoid the problem that plagues the Neo-Zenoists. 0.333… cannot be distinct from 1/3, because there is no integer K with (1/3 - 0.33333…)·K > 1; the Axiom guarantees such a K would exist if the difference were positive.
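A Python sketch of the axiom and of that argument (the function name is mine; exact fractions avoid rounding):

```python
# Archimedes' Axiom in its arithmetic form: for 0 < a < b there is
# an integer K with a*K > b.
from fractions import Fraction
import math

def archimedes_K(a, b):
    # Smallest integer K with a*K > b; exists whenever a > 0.
    return math.floor(b / a) + 1

print(archimedes_K(Fraction(1, 1000), Fraction(7)))  # 7001

# If the gap 1/3 - 0.333... were some positive number a, the axiom
# would supply a K with a*K > 1. But the gap after n digits is
# 1/(3*10**n), and it outshrinks any fixed K -- e.g. even a million
# copies of the 7-digit gap fall short of 1:
print(Fraction(1, 3 * 10**7) * 10**6 < 1)  # True
```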
And I’m always booed down when I mention Archimedes’ Axiom: “Archimedes was only the greatest genius who ever lived; why would his opinion matter?” I mention this ancient Axiom NOT to denigrate the rigorous modern works of Dedekind and Weierstrass, but just to emphasize that the 1 != 0.9999… nonsense is not some post-modern “insight”; it would have been laughed at even by the ancient Greeks.
0.999… is defined to be the limit of the sequence 0.9, 0.99, 0.999, …, and the limit of a sequence (aₙ) of real numbers is defined as the real number L for which, given any real ε > 0, there exists an N such that |L − aₙ| < ε for all n > N.
So we’re not actually performing an infinite series of additions, we’re applying a definition to 0.999…
The difference, for example, is that you may start from a set of first principles that don’t allow you to define what 0.9+0.09+0.009+… is when taken literally. But by imposing a consistent and useful definition, you sidestep the question of whether you can actually perform a sum with an infinite number of terms, and get on to questions that have definite answers.
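Concretely, here’s the definition at work for 0.999… and L = 1, as a Python sketch (the ε values are arbitrary examples):

```python
# The limit definition applied to 0.9, 0.99, 0.999, ... with L = 1:
# given epsilon, find an N past which every term is within epsilon of 1.
from fractions import Fraction

def find_N(epsilon):
    # Term n is 1 - 10**-n, so |1 - term(n)| = 10**-n; we need the
    # smallest N with 10**-n < epsilon for all n > N.
    n = 0
    while Fraction(1, 10 ** (n + 1)) >= epsilon:
        n += 1
    return n

for eps in [Fraction(1, 100), Fraction(1, 10**9)]:
    print(f"epsilon = {eps}: terms beyond N = {find_N(eps)} are within epsilon of 1")
```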
I can perform the sum 1+1. It’s equivalent to 2. How much time does it take to perform this sum? No time. There is no time in mathematics. How much time does it take to perform 0.9+0.09+0.009+…?