10/1=.999999 is a very precise answer, but not very accurate
it’s all about definition. notation is a representation of a number, not the number itself. don’t mistake numerals and notation for numbers. if you write 0.9~ and intend it to be 1, fine. if you intend it to be something less than 1, fine. if you intend it to be the number of angels that can dance on the point of a pin, that’s fine too. the notation doesn’t change the number.
This only works if you are completely consistent. If you write 0.9~ and mean it to be anything other than 1, then every single use of every single representation of a number has to follow suit or at least be totally defined and restricted in context.
You can’t say that 0.3~ is equal to 1/3 but that 0.9~ is not equal to 1, as people actually have in this thread. If you do that you are wrong. You can’t excuse it by saying you are using two different representations or interpretations. You are then internally inconsistent and we are totally justified in rejecting what you say.
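The inconsistency can be made explicit with the standard textbook algebra (nothing original here): if 0.9~ denotes any number at all under the usual rules of arithmetic, those rules force it to equal 1.

```latex
\begin{align*}
x &= 0.\overline{9} \\
10x &= 9.\overline{9} \\
10x - x &= 9.\overline{9} - 0.\overline{9} = 9 \\
x &= 1
\end{align*}
```

The same move links the two cases: multiply 1/3 = 0.3~ by 3 and you get 1 = 0.9~ directly, which is why accepting one equality while rejecting the other is internally inconsistent.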
I understand your point about conventions — but then answer me this.
When dividing 1 by 3 using long division, in base 10, we notice that the result seems to require an infinite number of threes after the decimal point. If I (playing Devil’s advocate here) deny that this represents a real number — or maybe claim it represents some real number, just not the limit of the partial sums as is the custom — then won’t I be forced to conclude that the algorithm for long division is incorrect?
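A sketch of that algorithm in Python may make the point concrete (function name and layout are mine): at each step the remainder is multiplied by 10 and divided again, and the digit stream repeats exactly when a remainder recurs. For 1 ÷ 3 the remainder is 1 at every step, so the threes never terminate.

```python
def long_divide(numerator, denominator, digits):
    """Base-10 long division: emit the integer part, then
    `digits` decimal digits, carrying the remainder each step."""
    quotient, remainder = divmod(numerator, denominator)
    decimals = []
    for _ in range(digits):
        remainder *= 10                      # "bring down" a zero
        d, remainder = divmod(remainder, denominator)
        decimals.append(str(d))
    return f"{quotient}." + "".join(decimals)

print(long_divide(1, 3, 10))   # 0.3333333333  (remainder is always 1; the 3s never stop)
print(long_divide(1, 4, 10))   # 0.2500000000  (remainder hits 0; the digits terminate)
```

Denying that the endless output names a real number, while still trusting the algorithm that produced it, is exactly the bind the Devil's advocate ends up in.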
It is more than that. Allowing just one inconsistency – any inconsistency – into a system is sufficient to prove that 1 is exactly equal to 2, or that Louis XIV was a three-headed land clam with polyester teeth, for that matter.
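This is the principle of explosion (ex falso quodlibet). As a sketch in Lean 4 syntax (theorem name is mine): once a contradiction, i.e. a proof of `False`, is in hand, any statement whatsoever follows.

```lean
-- Ex falso quodlibet: from a contradiction, anything follows.
-- Given a proof of False, we may conclude 1 = 2 (or any claim about
-- Louis XIV and land clams you care to formalize).
theorem one_eq_two (h : False) : 1 = 2 :=
  h.elim
```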
You are correct that the notation is merely a representation. But the issue here is not what meaning you or I assign to 0.9~. The question at hand is what mathematicians generally mean when applying standard notation.
The point of standard notation is communication - the attempt to communicate a particular value or idea or number. That communication only works when all parties to the exchange are using the same conventions, assumptions, definitions, etc.
If you wish to state that 0.9~ = 1946.1, you can do so, but don’t be surprised if people look at you with puzzled looks. At best, you are using a confusing, unwieldy, and inconsistent notation system.
1945.9, 1946.0, 0.9~, 1946.2, …
But that just spawns more questions, like why that one value gets a different notation.
from Irishman: The question at hand is what mathematicians generally mean when applying standard notation.
i would generally concur, and as noted before, some of the arguments here are inconsistent and nonsense. but definition makes a mess of things with the general populace. you contend that a large subset of all mathematicians agree on the definition of 0.9~, but that subset is much smaller than the subset of all people who disagree. these things come up because it’s unwieldy to precede every posit with “according to the majority of mathematicians…”. without that, the average person might revert to an unclear concept of what repeating notation means, and then, as usual, attempt to find an argument that explains their position, which then descends into semantic arguments. so i’ve found it easier to paradoxically ‘agree to disagree’ when arguments descend to that level.
from John W. Kennedy: It is more than that. Allowing just one inconsistency – any inconsistency – into a system is sufficient to prove that 1 is exactly equal to 2, or that Louis XIV was a three-headed land clam with polyester teeth, for that matter.
i originally agreed with you, but then i remembered physics. so i don’t think all inconsistencies have such dire consequences. still, in this case, inconsistency would probably make things fall apart.
No it isn’t. 1 is one particular number. One solitary value. Talking about “high values of 1” is nonsensical, because if it isn’t equal to 1 then it isn’t 1.
And yes, I know what you’re going to say. “But if you take 1.4 + 1.4 and round…”
Mathematics is a system of definitions. Period. The last century or more of formal mathematics has been one long continuing effort to make every subset of math as completely defined, internally consistent, and locked into place as possible.
There are no semantic arguments involved. By definition, semantic arguments apply to words, which do have slippery and conflicting connotations. Math relies on axioms and builds from there. There cannot be internal inconsistencies inside a mathematical system.*
For our example, the definition of integers, fractions, limits, and infinity, as well as all the processes involved, are formally and completely defined. Not by some majority of mathematicians: by all mathematicians who understand the subject. Considering that this is about as basic as formal math gets, that’s all of them. You cannot allow argument by ignorance into this. Nobody can accept “well, I don’t know enough about math to grasp even these extremely basic concepts, but I’m entitled to have my own opinion on the matter.”
Nor can you simply and arbitrarily move from math into other disciplines, even though the distinctions are pretty small. The mathematical basis of physics has no contradictions in it. Experimental results may not agree with calculations made by theory, but that’s not a contradiction, merely an indication that the experiment wasn’t accurate enough or that the theory needs adjusting. Internally, however, the theory cannot allow for contradictions. If it does, the theory is discarded as useless.
Similarly, what John W. Kennedy wrote was a trope from mathematical logic turned into semantics. Formal logic is written in symbolic notation. That is as rigorously defined as any branch of mathematics. People can then take the results and translate them into words to prove that “1 is exactly equal to 2, or that Louis XIV was a three-headed land clam with polyester teeth, for that matter” but the underlying formal logic is not word based.
There are no mathematicians who disagree with the statement that 0.9999~ = 1, unless they first preface that by noting that they are talking about a specialized branch of mathematics that the average reader would not even recognize the name of. That has not happened here. We’re dealing with the ordinary math that you learned in elementary school. That’s always true, in every one of the many threads on this. And there is no possible way to twist that so that 0.9999~ is not exactly 1, merely a different representation of the same thing.
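For the record, the ordinary definition being appealed to here is the limit of the partial sums, under which the equality is a one-line geometric-series computation:

```latex
0.\overline{9}
  \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
  \;=\; \lim_{N \to \infty} \left( 1 - 10^{-N} \right)
  \;=\; 1
```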
Your argument is wrong from start to finish. I’d like to say that about a lot of things people say here, but refrain because political or economic arguments are seldom totally wrong. This is math and the rules are different. So it’s refreshing to have a case where I can say that at the top of my voice and not fear any contradiction.
* A mathematical system can, indeed must, contain propositions that are not formally decidable within that system. That does not constitute a contradiction, however.
i don’t get it. how many times do i have to agree with you? maybe i’ve been too subtle. you are right. 0.9~ = 1. the only disagreement i have is over this:
Nobody can accept “well, I don’t know enough about math to grasp even these extremely basic concepts, but I’m entitled to have my own opinion on the matter.”
quoting somebody:
“Opinions are like assholes, everybody has one”
someone’s irrational opinion has no effect on mathematics. but everyone is entitled to their own opinion. it would be easier to determine whether the hulk could beat superman than to pit science against opinion. look at all the success people have had getting creationists to understand evolution.
That depends on whether you’re doing pure math or applied math. In pure math, “1” means 1.0000… In applied math, “1”, without any further comment, means “between 0.5 and 1.5”.
That’s circular logic. You’re just saying that the difference between 1 and .9~ is epsilon, and epsilon is the difference between 1 and .9~. That’s null content.
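The non-circular version of the epsilon argument goes the other way: for any fixed ε > 0, the truncations 0.9, 0.99, 0.999, … eventually get within ε of 1, because 1 minus the n-digit truncation is exactly 10^(−n). A quick numeric sketch (function name is mine; `Fraction` keeps the arithmetic exact, so no float rounding muddies the point):

```python
from fractions import Fraction

def nines(n):
    """The n-digit truncation 0.99...9 as an exact rational."""
    return Fraction(10**n - 1, 10**n)

# The gap between 1 and the truncation is exactly 10^(-n),
# so it drops below any fixed epsilon once n is large enough.
for n in (1, 5, 10):
    print(n, 1 - nines(n))         # 1/10, then 1/100000, then 1/10000000000

epsilon = Fraction(1, 10**6)
n = 7                              # any n with 10^(-n) < epsilon works
assert 1 - nines(n) < epsilon
```

Nothing here defines epsilon in terms of the gap or vice versa: ε is an arbitrary positive number chosen first, and the truncations beat it. That is what “the limit is 1” means.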