0.999… = 1?

We define our notation: 0.999… means the limit of Σ (n=1 to ∞) 9/10ⁿ.
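Written out in full (just the standard reading, nothing beyond what has already been said), that definition is:

```latex
% The standard reading of the notation: 0.999... names the limit of the
% partial sums, not any individual partial sum.
0.999\ldots \;:=\; \lim_{N \to \infty} \sum_{n=1}^{N} \frac{9}{10^{n}}
            \;=\; \lim_{N \to \infty} \left( 1 - \frac{1}{10^{N}} \right)
            \;=\; 1 .
```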

But you’re allergic to infinity, we get it.

I guess I’ll wade back into this mess.

Yes, there appears to be a mental block going on here, where the representation of a number is being mistaken for the number itself.

If it makes the discussion any easier (probably not), rather than sqrt(2), we can always call the number ‘meh’.

The number we refer to by ‘meh’ is the number such that if you multiply it by itself, you get 2, i.e. ‘meh’ * ‘meh’ = 2.

There. No square root functions, no decimal representations. Pure platonic concept of number.

Now, there’s the problem of how we represent this number using our conventional notation for numbers. Well, I’ll leave that as an exercise for the reader. But no cheating like saying it can’t be done. That’s simply asinine and untrue.
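If anyone actually wants to attempt the exercise, here is a rough sketch of one way (entirely my own illustration; the function name approximate_meh is made up): pin the number down purely by the property ‘meh’ * ‘meh’ = 2 and let bisection churn out as many decimal digits as you ask for. Each prefix is a representation; none of them is the number.

```python
# A rough sketch, entirely my own illustration: 'meh' is defined only by the
# property meh * meh = 2.  Bisection narrows an interval around it, so we can
# print as many correct decimal digits as we like; the representation is
# approximate, the number itself is not.
from fractions import Fraction

def approximate_meh(digits):
    """Return a rational approximation of 'meh' good to `digits` decimals."""
    lo, hi = Fraction(1), Fraction(2)              # since 1*1 < 2 < 2*2
    while hi - lo > Fraction(1, 10**(digits + 1)):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo

for d in (3, 6, 12):
    print(f"{d:2d} digits: {float(approximate_meh(d)):.{d}f}")
```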

Sorry. I can’t resist.

I submit n=23.
n+1=24. This is not less than 1.

If you don’t make sense then no one can take you seriously.
If you lack the precision of communication required for a thread of this kind then you should get out now.

If I ask you, “What number is 2 + 2?” you answer: the number 4.

If I ask, “What is sqrt(2)?” you answer: “sqrt(2)”!!

If I ask, “What is sqrt(4)?” you answer: the number 2.

By not being able to answer my question about sqrt(2), it is like you are conceding that we cannot write it in base-10, but you refuse to admit it.

Is there an exact base-10 decimal number for the “amount” sqrt(2)?

So do 0.000…1 and 1/10… mean the limit of 1/10ⁿ as n → ∞?

0.99…9 (n nines) < 0.99…9 (n+1 nines) < 1

if you don’t like that notation:

1 - 1/10ⁿ < 1 - 1/10ⁿ⁺¹ < 1 for the entire cardinality of n a member of N, which is what was explained as the same meaning as […], i.e.: n = 1, 2, 3, …
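And just to sanity-check that chain of inequalities for a few concrete n (a throwaway script of my own, using exact fractions so there are no floating-point quibbles):

```python
# Quick check of 1 - 1/10^n < 1 - 1/10^(n+1) < 1 for a few concrete n,
# using exact rational arithmetic.
from fractions import Fraction

for n in (1, 2, 5, 20):
    a = 1 - Fraction(1, 10**n)        # 0.99...9 with n nines
    b = 1 - Fraction(1, 10**(n + 1))  # 0.99...9 with n+1 nines
    print(n, a < b < 1)               # True for every finite n
```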

If you lack the intellect to follow the conversation and read a wee bit between the lines, then perhaps you should “get out now”?

I think you have a misconception of infinity here; there is a very important mistake.

The “cardinality of n a member of N” is one for any n.

That is, you picked a number from the set N, and there is only one number in the set you now have. That is what you wrote - even if it clearly isn’t what you meant.
The phrase “entire cardinality” doesn’t make sense.

Infinity is the cardinality of the set of Natural numbers. It isn’t any n you pick from the Natural numbers, it is the total count of Natural numbers.

It may interest you to note that Wikipedia agrees exactly with what naita and everyone else has been saying.

Note the following:
The uninhibited use of … to imply a limit, that is, summation to infinity
The use of = to denote an exact equality
The use of the geometric sum to infinity formula without any mention of the word limit and also stating equality
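For reference, the geometric-series evaluation being referred to there is the standard one (written out by me, not quoted from the article):

```latex
% Geometric series with first term a = 9/10 and common ratio r = 1/10, |r| < 1:
\sum_{n=1}^{\infty} \frac{9}{10^{n}}
  \;=\; \frac{a}{1-r}
  \;=\; \frac{9/10}{1 - 1/10}
  \;=\; \frac{9/10}{9/10}
  \;=\; 1 .
```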

Now I get that Wiki is not an authoritative source. But you are seriously swimming against the flow here.

As for 0.0000…1, this has no sensible meaning in the Real numbers (and as far as I know it is highly non-standard notation in any other system).
The reason? The … implies an infinity of zeros - where infinity means the cardinality of the natural numbers. And it is a property of the natural numbers that “infinity plus one” is not a meaningful statement. Therefore there is no place-value position for the digit 1, and hence it has no value.

Seriously dude, you should take up poetry and give this a rest.

Why should I have to read between the lines? Is it not sufficient in a mathematical thread to expect a poster to use explicit mathematical language in a consistent manner and in a way that their meaning is clear?

It seems you are saying the entirety of trigonometry is wrong, since no small number of these ratios are irrational. Are you saying that Leonhard Euler was completely wrong about e, because e cannot be written exactly in any base? You’ll need to abandon your use of “f(x)” and “∑”, since these notations are credited to Euler.

You should finish your education and get this falsification of trig published; the Queen of Sweden will want to meet you. I don’t mean to get snarky, but we’ve a couple millennia of the smartest people* agreeing on something, so why do you think you’ve ninja’ed them?

Moderator Warning

dropzone, you’ve been here long enough to know that insults and accusations of trolling (even oblique ones) are not permitted in this forum. If you want to insult people, take it to the Pit thread.

Colibri
General Questions Moderator

No, this kind of behavior is not accepted in this forum (although it is in the BBQ Pit). But if you have a problem with someone’s post, report it (small triangle on upper right of the post) and let the moderators deal with it instead of retaliating.

Ok, so " all n ∈ N " and "cardinality N " are the same ?

I thought it was clear that it didn’t mean “any (n)” but “all n” instead.

Wikipedia?? I know for a fact that only “those who agree” get posted there. There have been arguments on the matter by very intellectual people of the world, all of which get deleted.

I didn’t say 0.000…1 was a “number”.

It is like I have been saying all along. It was explained that 0.999… means “The limit of Σ 9/10ⁿ as n→∞” which is the number 1. So again, 1 is the number, which is L, the limit, and is also the limit of the sequence. Again, the number 1.
0.999… are the ingredients to get you to the number 1.

Your understanding of 0.000…1 is the typical one. I must tell you, the 1 exists: always.

1/10ⁿ for all n ∈ N.
I start with 1. I divide it by every number in N. The 1 doesn’t go away.

So if […] means “limit”, then 0.000…1 = 0, right?

or

1/10… = 0

I am using the same concept as 0.999…

To the extent that we are all smart enough to understand what is indicated by the non-standard notation, yes 0.000…1 = 0. Now what?
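If it helps, here is what the limit reading actually does with that trailing 1 (my own little illustration, not anyone else’s code): the would-be leftover 1/10ⁿ can be pushed below any positive tolerance you care to name, which is exactly what “the limit is 0” means.

```python
# My own illustration of the limit reading: for any tolerance eps > 0,
# the 'leftover' 1/10**n eventually drops below it and stays there.
# That is precisely the statement lim (n -> inf) 1/10**n = 0.
def n_needed(eps):
    """Smallest n with 1/10**n < eps."""
    n = 1
    while 1 / 10**n >= eps:
        n += 1
    return n

for eps in (1e-3, 1e-9, 1e-15):
    n = n_needed(eps)
    print(f"eps = {eps:g}: 1/10^{n} = {1 / 10**n:.1e} < eps")
```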

Yes, everyone else in the world does mean that 0.9999… really means Limit 0.9999…

That’s just what everyone has accepted the notation to mean. Notation is a convention: we can choose by consensus what symbols mean, and in this case the notation 0.9999… has been agreed by everyone (except you) to mean the limit of the infinite series.

Why would we leave it out? Because it’s less to write, and it’s not ambiguous to anyone. It’s not ambiguous because there is no other actual real number that 0.9999… could possibly refer to. If we’re talking about real numbers, there is no other meaning that 0.9999… could have.
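Spelling that out, since it’s the crux (my wording of the standard argument, nothing novel): any candidate value below 1 gets overtaken by the partial sums, so 1 is the only real number left.

```latex
% Take any real x < 1, set eps = 1 - x > 0, and choose N with 10^{-N} < eps.
% Then the N-th partial sum already exceeds x:
s_N \;=\; 1 - \frac{1}{10^{N}} \;>\; 1 - \varepsilon \;=\; x ,
% so no x < 1 can be the limit of the partial sums; the only candidate is 1.
```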

That’s why convention has settled on using it that way. You may as well argue with the usage of a word, and try to tell everyone else in the English-speaking world that a word they all use should instead mean something else according to schooner26.

…seeing as s(n) is rational for all n and we’ve known √2 isn’t rational for millennia, I’m going to guess “there isn’t one.” What’s your point? Keep in mind that the value of a real number is defined* as what the Cauchy sequence converges to, not as any specific number inside the sequence.

*for one definition that’s equivalent to the others used in modern math, at any rate
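To make that concrete (a sketch assuming s(n) means the n-digit decimal truncation of √2, which is the usual setup; the helper is my own): every term is rational, the terms crowd arbitrarily close together, and yet no term equals the limit.

```python
# Sketch, assuming s(n) denotes the n-digit decimal truncation of sqrt(2)
# (the usual setup; the exact choice of Cauchy sequence doesn't matter).
# Every s(n) is rational, yet no s(n) equals the irrational limit sqrt(2).
from fractions import Fraction
from math import isqrt

def s(n):
    """n-digit decimal truncation of sqrt(2), as an exact rational."""
    scale = 10**n
    return Fraction(isqrt(2 * scale * scale), scale)

for n in (1, 3, 6, 12):
    print(f"s({n:2d}) = {float(s(n)):.12f}   as a ratio of integers: {s(n)}")
```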

No, the … notation is used for recurring decimals and irrational numbers only, and not in the exact same way.

But not the same definition. There’s nothing in math that says that all “concepts” have to be extendible in all thinkable ways. Only actual definitions have to be extendible.

Still no. Cardinality means the size of the set. You don’t even get to say “all members of N”; that is a different concept. Literally, you have: a limit, where n approaches the size of the set of natural numbers.

The phrase “base-10 number” demonstrates the confusion. There is no such thing as a “base-10 number.” Base ten is not a property of a number itself, but of a particular way of writing that number.

Mathematicians have defined the real numbers in a way that makes no reference to decimal representation.

That’s why people have been saying things like

I have only a limited understanding of philosophies of mathematics, but you (schooner26) seem to believe in some form of Constructivism.