Yet Another Math Question (YAMQ)

Twenty years ago, when I was still in grade school, someone mentioned something about a longish proof that there are no integers between 1 and 2. I’ve asked a lot of “math people” since then, and they’ve always deferred to the definition of an integer, instead of an actual ‘proof’ of the idea.

I can’t remember who planted this seed, back in my formative years, and it’s been a nagging question ever since. Was whoever it was messing with me, and the only ‘proof’ needed is the definition of an integer? Did I invent this ‘proof’ on my own, and give myself a “false memory?” Or is there really a big sequence of steps one can go through to prove that there are no integers between 1 and 2? None of the above?

Perhaps I will walk you through this Dave, but first, why don’t you provide us with the definition of an integer?

Hmmm. Touché, grienspace. Snide and perhaps sarcastic, but point understood, I think. Not being what I consider a “math person” myself, I’d say an integer is any number with nothing but zeroes after the decimal point. Still, this is just a guess at what the ‘formal’ definition is, and I think my questions are still valid given that lack of fundamental math knowledge.

The first thing you need to do in math is define your terms. There are several equivalent ways to define integer and order, and it would be necessary to know how you define integer to know how to proceed to prove that there are no integers between 1 and 2.

I don’t remember trying to prove there are no integers between 1 and 2, but I do remember having to prove 1 >= 0. In the system we were using you could not prove 1 was not equal to 0; it had to be assumed.

There’s no integer between 0 and 1, but of course there’s one between 3 and 4. It’s called “Bleem”. Who ever said mathematicians don’t have a sense of humor?

OK then bibliophage. Prove that bleem is greater than pi, if it is greater than pi, otherwise prove that bleem is less than pi.

Note: bleem cannot be equal to pi because it isn’t Greek.

I think a rough sketch of the proof would go something like this:

Two is defined to be the successor of one. If there were an integer Z between one and two, it would have to be the successor of one, and its successor would have to be two. But two is already the successor of one, so Z would equal two, and then Z’s successor would be two as well, making Z equal to its own successor. And the axioms don’t allow that.

This is a bit simplified, because it doesn’t account for the possibility of more than one integer between one and two, but it’s probably the right idea.

Oh, and for a basic definition of successor, think “add one”.
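Since DaveW mentioned being a programmer: the sketch above can be illustrated (not formally proved!) with a toy Python snippet. This is just my own illustration, assuming ordinary Python ints stand in for the naturals; the names `succ` and `candidates` are mine.

```python
# Toy Peano-style illustration: "succ" is the defining "add one" operation,
# and in Peano-style arithmetic, m < n is equivalent to succ(m) <= n.

def succ(n):
    """Successor: think 'add one'."""
    return n + 1

one = succ(0)
two = succ(one)  # two is *defined* as the successor of one

# If some integer z satisfied 1 < z < 2, then succ(1) <= z would give
# 2 <= z, contradicting z < 2. A brute-force check over a finite range
# illustrates that no candidate survives:
candidates = [z for z in range(0, 100) if one < z < two]
print(candidates)  # → []
```

Of course, a finite search is not a proof; the real argument is the successor/injectivity reasoning above. The snippet only makes the claim concrete.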

I suspect that you heard a slightly garbled version of the fact that in Bertrand Russell’s and Alfred North Whitehead’s Principia Mathematica, they prove that 1 + 1 = 2. See the following URL:

where it says the following about this theorem:

> This assertion may become more convincing after a look at
> the page 362 of Principia Mathematica where Russell and
> Whitehead finally proved that 1 + 1 = 2.

I think you’re right, ultrafilter, but I also think you’d have a really hard time generalizing your proof to include the possibility of multiple integers between 1 and 2. This difficulty may account for the length of the proof that DaveW remembers hearing about.

Once you’ve established some results about sequences of successorship (basically, that they aren’t cyclical), it wouldn’t be too hard. However, establishing those might be a bit tough.

Hmmm. Strike my reply to grienspace, above. I should have waited for more replies to roll in, before responding thusly:

Apologies to all who read my OP as a cry for help to actually prove that there are no integers between 1 and 2, instead of as intended: “Does such a proof exist? Is it widely known?”

Judging from the replies so far, the answers to these two questions are “maybe” and a resounding “no,” respectively. The best I was hoping for was a reply along the lines of “Oh, yeah, that’s the Fizgig/Sinkhole proof [sup]*[/sup], and can be found at

See, the details of such a proof are irrelevant to me. As I’m a computer programmer, the details have no practical use in my day-to-day life (as long as the basic rules don’t suddenly change). I was trying to test the veracity of what I was told long ago, and perhaps get a solid foundation for a piece of math trivia that’s clogged my head for two decades.

So, given all that, thanks, Wendell, I’ll bet you’re correct.

[sup]*[/sup]Further apologies to any real-life mathematicians named either Fizgig or Sinkhole.