.999… = 1?

I can do anything I want with the axioms of the system in my quest to arrive at a contradiction.

If I can use them to arrive at a contradiction within the system, the system comes tumbling down.

If one day I go outside and drop an apple and it goes up instead of towards the center of the earth, either Newton was wrong or something changed cosmologically/empirically.

“neither exists”

What?

Perfect circles do not exist.

I can reference thousands of circles in day-to-day life if I try.

Oh that’s just silly.

If there are 85 billion “proofs” of a theorem and one “disproof” of that same theorem, the possibilities are:

the 85 billion “proofs” are all wrong
and/or
the one “disproof” is wrong

Which is more probable: that the one “disproof,” provided by a free-thinker who can’t produce a simple algebraic proof free of trivial errors, is wrong, or that the 85 billion “proofs,” reviewed and verified by generations of qualified mathematicians, are all wrong?

Or perhaps it’s the worst case: all 85 billion and one proofs are wrong, and we are all equally useless.

For what it’s worth, those sig fig rules you learn in high school are bullshit. For example, suppose someone tells me the first four digits of x are 1.234. I calculate four digits of 1.234^5 and get 2.861. Does that mean the first four digits of x^5 are 2.861? Not at all; it could be anywhere from 2.861 to 2.872. I only actually know the first two digits of the output.
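A quick way to check this is to push the whole interval through the calculation instead of counting digits. A minimal Haskell sketch, assuming “the first four digits of x are 1.234” means x lies in [1.234, 1.235):

```haskell
-- Propagate the interval [1.234, 1.235) through x^5, rather than
-- trusting a sig-fig digit count. (A sketch, not a rigorous interval
-- arithmetic library.)
main :: IO ()
main = do
  let lo = 1.234 ** 5 :: Double  -- lower bound of x^5
      hi = 1.235 ** 5 :: Double  -- upper bound of x^5
  putStrLn ("x^5 lies in [" ++ show lo ++ ", " ++ show hi ++ ")")
  -- Prints roughly [2.86138..., 2.87299...): only "2.8" is certain.
```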

Why should I magically identify the range of uncertainty around (the mantissa of) x with the range of uncertainty around (the mantissa of) every calculation involving x? In fact, even in the simplest, most straightforward case, this thinking goes awry: knowing the first digit of x is 1 and the first digit of y is 2 is not enough to conclude that the first digit of x + y is 1 + 2 = 3 (for example, x = 1.9 and y = 2.9 give x + y = 4.8).

The sig fig ritual is nonsense. It’s good to keep track of uncertainty in your measurements and thus in your conclusions. But those crude, misformalized rules aren’t the way to trace uncertainty through your calculations.

And as usual, you have virtually nothing to say mathematically; you just condescend from a vantage point that conveys the impression you understand everything but can’t spare so much as 10 seconds to share what you do know with anyone, even though it would help.

Gotcha

I’m still just sitting here and wondering how the NSA feels about long division and other recursive algorithms.

Any comments?

Just thinking about all that computing power that went online in 2013 and how important it might be to dissuade the general public from discussing recursive algorithms, that’s all.

It would certainly seem useful to simply circulate the meme that long division is a “broken” or “backward” process that needs to be shelved … hmm, I wonder how an “Elf on the Shelf” might view this?
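For what it’s worth, long division is recursive in a precise, unmysterious sense: each step emits a quotient digit and hands the remainder to the next step. A minimal Haskell sketch of that step (the function name digits is my own, purely illustrative):

```haskell
-- Decimal digits of n/d for integers 0 < n < d, by the classic
-- long-division step: scale the remainder by 10, emit a digit,
-- recurse on the new remainder.
digits :: Integer -> Integer -> [Integer]
digits n d = go n
  where
    go 0 = []  -- terminating decimal
    go r = (r * 10) `div` d : go ((r * 10) `mod` d)

main :: IO ()
main = print (take 12 (digits 1 7))  -- [1,4,2,8,5,7,1,4,2,8,5,7]
```

Because Haskell lists are lazy, the same definition handles non-terminating expansions like 1/7 by producing digits on demand, which is exactly the corecursive flavor that comes up later in the thread.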

I calculate a delay in responses here of something like 5 minutes every time I use the word “recursive.”

Fascinating

Indistinguishable, do you want to say something to break the silence here, since it is “near and dear to your heart”? Still waiting for an answer on your “understanding,” also.

True, if you start with a contradiction, you’ll end up with a contradiction … but that’s not proof of anything except that you started with a contradiction. With no foundation, the house comes tumbling down.
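What underwrites this is the principle of explosion (ex falso quodlibet): from a contradiction, every proposition follows, so deriving things from one proves nothing about them. A one-line Lean illustration of the standard fact (generic, not specific to this thread):

```lean
-- Ex falso quodlibet: from a proof of False, any proposition P follows.
example (P : Prop) (h : False) : P :=
  h.elim
```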

There is a non-zero probability that an apple will “fall” up … if you believe in that QM stuff.

OK, well, based on this it seems clear that there is some correlation between the use of certain words and the behavior of this forum … of course we all know that everything is being run through a certain agency’s supercomputers in Utah, so, well, the time stamps here prove it.

I don’t think you know what the NSA does … because if you did … you’d know it’s a felony to imply you do.

Not the Pit, not the Pit, not the Pit …

LOL what does this mean?

Why don’t you, for the benefit of all of us, tell us how you feel about recursion in any way you would like (i.e., mathematically, spiritually, etc.)?

You want me to say any old thing about recursion? Ok: primitive recursion is the process of constructing the unique map from the initial algebra of some type to another algebra of the same type. Primitive corecursion (typically called just “corecursion”) is the dual of this, constructing the unique map from a coalgebra of some type to the corresponding terminal coalgebra. What we consider “general recursion” in computing is actually more closely aligned with primitive corecursion than primitive recursion, though the latter ends up implementable as a special case of it.
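To make that concrete, here is a minimal Haskell sketch of the algebra/coalgebra picture; the names (ListF, Fix, cata, ana, countdown, total) are my own illustrative choices:

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- One "layer" of list structure over element type a.
data ListF a r = NilF | ConsF a r deriving Functor

-- Fix (ListF a) is the initial algebra of ListF a: finite lists.
-- (With Haskell's laziness it doubles as the terminal coalgebra,
-- i.e. possibly infinite lists.)
newtype Fix f = Fix { unFix :: f (Fix f) }

-- Primitive recursion (a catamorphism): the unique map out of the
-- initial algebra determined by an algebra  f b -> b.
cata :: Functor f => (f b -> b) -> Fix f -> b
cata alg = alg . fmap (cata alg) . unFix

-- Primitive corecursion (an anamorphism): the unique map into the
-- terminal coalgebra determined by a coalgebra  b -> f b.
ana :: Functor f => (b -> f b) -> b -> Fix f
ana coalg = Fix . fmap (ana coalg) . coalg

-- Build [n, n-1, ..., 1] by unfolding (corecursion) ...
countdown :: Int -> Fix (ListF Int)
countdown = ana step
  where
    step 0 = NilF
    step k = ConsF k (k - 1)

-- ... and sum it by folding (recursion).
total :: Fix (ListF Int) -> Int
total = cata alg
  where
    alg NilF        = 0
    alg (ConsF x s) = x + s

main :: IO ()
main = print (total (countdown 10))  -- 55
```

The “unique map” in the definitions above is visible here: once you pick the algebra (alg) or the coalgebra (step), the fold and the unfold are completely determined.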

That’s not terribly relevant to this thread at the moment, but I’m not really sure what you want me to talk about instead.

Under what Federal law?

I am not aware of a Federal law that prevents free scientific thought.
I am not referencing anything that is classified, only concepts.

So again, under what law is it illegal to talk about recursion?

Awesome!

I’d love to learn about it and hear about it … thank you. Can I reference any of these concepts on the web for further research, though?

“That’s not terribly relevant to this thread at the moment, but I’m not really sure what you want me to talk about instead.”

The only reason I am off on this tangent is that I have seen, over the years since the beginning of the Gulf wars, a major shift away from discussing these concepts on the web.
I mean, did you see what just happened there? I threw out a couple of words and the whole system froze for 5 or 10 minutes, as if someone was making an executive decision.
I can’t remember a time when scientific concepts were taboo on the web.

I’ll have to check with the NSA.

And your aforementioned “understanding” … was it based on empirical evidence that you yourself have witnessed?