Zero is a natural number

The report needs serious rewriting. Zero is not a positive integer! Most people do include 0 as a Natural Number. “Counting” and “Whole” numbers are useless terms since there is no consistency at all in their meaning.

If you want a notation for positive integers, just use Z[sup]+[/sup], no one’s going to quibble with that one.

I have never seen P for the Natural Numbers; is there a cite? Is it used in a particular branch of Math that I am not familiar with?

I saw somewhere that the definition of a topologist is someone who can’t tell his ass from a hole in the ground, but can tell his ass from two holes in the ground.

Chronos, I think you got it backwards. A standard hole in the ground that doesn’t come out anywhere is topologically different from the human alimentary canal. But a hole that does lead to another opening, such as a tunnel, is the same. (Ignoring “side trips” in the human system.)

Ooog. I don’t know how that one slipped by, but the report obviously does confuse the Positive and the non-Negative integers. I’ve ordered correction. Sorry.

And, ftg, I’m not sure how you surveyed “most people”, since every text that I have looked at has defined Natural Numbers the way I did (excluding zero.) And “Counting Number” is not intended as math-theoretic, but as an explanation.

Hey, look, the question is whether zero is a number. I assumed that anyone comfortable with Peano’s Axioms didn’t need an answer to that question, so I was focused on an explanation for the non-mathematicians.
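
For anyone following along who isn’t comfortable with them, here is a rough sketch of Peano’s Axioms in one common modern formulation (this version builds the naturals starting from 0; Peano’s own formulation started from 1, which is exactly the convention being argued about here):

```latex
% One common modern statement of the Peano Axioms, with S the successor map:
\begin{align*}
&\text{(P1)}\quad 0 \in \mathbb{N} \\
&\text{(P2)}\quad n \in \mathbb{N} \implies S(n) \in \mathbb{N} \\
&\text{(P3)}\quad S(n) \neq 0 \ \text{ for every } n \in \mathbb{N} \\
&\text{(P4)}\quad S(m) = S(n) \implies m = n \\
&\text{(P5)}\quad \text{if } 0 \in K \text{ and } n \in K \implies S(n) \in K, \text{ then } \mathbb{N} \subseteq K
\end{align*}
```

Everything else (addition, multiplication, order) gets built on top of S, and swapping 0 for 1 as the bottom element changes nothing structural, which is part of why the convention is so slippery.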

I call your attention to the sister Staff Report: Is zero odd or even?

:slight_smile: Actually, the homotopy group of 1 is 1, because homotopy groups are nonabelian.

I find that if someone is writing a book and they want to use the N notation, they will always explicitly state their convention. In communications briefer than a paper, people mostly just avoid using N if it matters whether 0 is included. It’s usually not worthwhile defining N every time you want to use it in something short like an email or a message board post, so people just avoid it in those contexts.

Richard Stanley, in his Enumerative Combinatorics, uses P for the Positive integers and N for the Nonnegative integers.

Olmsted, the authority I already cited, uses P for the Positive Integers.

I see that the staff report in question, as well as this thread, focuses on modern mathematics, so this may be only a side note, but I’d like to mention that Peano was far from the first mathematician to deal with this topic, only the first to give a definition that has survived to the present day. The Ancient Greeks (think Pythagoras, Thales, Aristotle, …) considered only numbers (which, it went without saying, meant 1, 2, 3, …) and their ratios (the positive rationals). They did not explicitly write down axioms like Peano’s (at least, we have no record of it), but we can infer that they had similar ideas, since some Greeks argued that 1 cannot be a number because it is the source of all numbers (an argument more philosophical than mathematical in nature, but you can see the similarity: source of the numbers / initial number).
Interesting that they argued over whether 1 was a number in much the same way we now argue over 0.

As an aside, I would never use P for the positive integers, because I have also seen it used for the primes, so this usage just asks for trouble.

The thin Book of Numbers, by Conway and Guy, packs into its few pages almost every piece of known math trivia. On page 13, under the heading Kinds of Numbers, it describes whole numbers, fractions, rationals, irrationals, algebraic numbers, zero, negative numbers, complex numbers, transcendental, infinite, cardinal, ordinal, and surreal numbers. It does not mention natural numbers.

Probably because of this issue.

No problem. Sorry for the lateness of my reply, but I thought I’d told the fora to send me emails…

I ran around my department doing an informal poll and was universally met with incredulity: “How could zero not be a natural number?” The consensus seems to be that the numbers starting with 1 are the ordinal numbers, and therein lies their naturality.

Incidentally, when I worked through Royden as an undergrad the professor specifically mentioned that definition as being “out of date”.

I’m currently a CS undergraduate at Edinburgh. We are taught that N is {0, 1, 2, …} and N+ is {1, 2, 3, …}.

Again, it seems to be purely a matter of individual choice and definition. So, as long as the Staff Report has defined what is meant, I’m not going to start any massive revisions. Especially since the whole point was the development, or broadening, from {1, 2, 3, …} to {0, 1, 2, 3, …} to integers to rationals to reals. So the symbols or names of the various lower-level sets are pretty much irrelevant.

I’ve definitely seen some authors who use N = {0, 1, 2, …} and some who use N = {1, 2, 3, …}. I finished a math major just last year, and I think that in the course of my undergraduate studies I saw each definition about as often as the other.

Personally, I just wish that when they’re teaching this stuff, professors would mention that there is some inconsistency in how N is defined. But it seems that even many professors are oblivious to the fact that their own preferred convention isn’t universally accepted. And when those who are aware of this make no mention of it, they’re just perpetuating the problem.

I should probably amend my complaint to say it’s not a problem in writing as long as N is defined when it’s introduced. The real problem comes up in spoken discussions, where most people don’t bother to say, “The natural numbers, by which I mean the positive integers and zero.” Those who aren’t aware of the conflicting definitions won’t bother to ask, and moreover might be insulted if you gave them the definition they already knew.

We’d probably be better off if we just abandoned the phrase “natural numbers” altogether in favor of “positive integers” or “non-negative integers.”

And besides, there’s something delightfully precise about saying “non-negative integers,” isn’t there? It’s a pleasure rivaled only by talk of arranging a list of numbers (with possible repetitions) in “non-descending order.” :smiley:

<< many professors are oblivious to the fact that their own preferred convention isn’t universally accepted. >>

In one of the first college math courses I took, the professor had numbered all the theorems and lemmas and whatnot. I thought that was standard terminology.

Of course, in grad school, I had a prof who always used “Theorem 17” when there was some sequence of proofs.

Getting away from the esoteric mathematical definitions, I have a problem with this in the Staff Report:

“But before you get to the Real Numbers, you probably start with the Counting Numbers or Natural Numbers, the set N = {1, 2, 3, …} in set notation. Zero isn’t a member of the set of Natural Numbers since you normally don’t start counting with zero. A primitive society developing a counting system wouldn’t think of “none” … they’d start counting with “one.” Thus, if by “number” you mean “the set of all Natural Numbers,” then zero isn’t among them.”

This is just wrong. Assume my family and I are living in a primitive society long ago. I go out and pick 10 apples. The next day, I ask my wife how many apples we have, and she says “5” because she and the kids have eaten some. Then the next day I ask the same, and she says “none”, which means the number 0. When counting upwards people may start with 1, but when counting downwards from a finite quantity the lower bound is 0, not 1. While the concept of negative numbers and such may be alien to a primitive society, having the quantity 0 of something isn’t.

Actually, what you’re thinking of are cardinal numbers: those used to measure the cardinality of sets. Closely related to N (whatever its definition), but definitely distinct.

In response, first, zero as a number is a fairly advanced concept. Ancient Hebrew and ancient Greek, for instance, don’t have a numerical representation for zero. This doesn’t mean they weren’t aware that seven less seven leaves nothing; they just didn’t consider “nothing” to be a number.

Second, if you show someone, anyone, of any age, from any society, a mass of pennies (say) and ask them to count them for you, I bet not a single one will start counting 0, 1, 2… They’ll all start 1, 2, 3…

Third, this very enumeration of points starts with “first”, not with “zeroth”.

In short, “counting” numbers excludes zero.

"Second, if you show someone, anyone, of any age, from any society, a mass of pennies (say) and ask them to count them for you, I bet not a single one will start counting 0, 1, 2… They’ll all start 1, 2, 3…

In short, “counting” numbers excludes zero."

…Unless you’re dealing with recalcitrant programmers, who tend to think entirely in offsets. In which case, you start counting at zero (meaning no offset.)

Alas, not all programmers think that way, resulting in the ever-popular off-by-one error.
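
For the non-programmers, here is a minimal sketch (Python, with made-up penny-counting names) of where the zero-counting habit comes from and how mixing it with ordinary counting produces that off-by-one error:

```python
# Counting pennies by cardinality vs. labelling them by offset.
pennies = ["penny", "penny", "penny", "penny"]

# Most humans count 1, 2, 3, 4 -- the answer is the cardinality.
count = len(pennies)
print(count)  # 4

# Zero-counting programmers label positions by offset: 0, 1, 2, 3.
for offset, coin in enumerate(pennies):
    print(offset, coin)

# The off-by-one error: confusing the largest offset (3) with the count (4).
last_coin = pennies[count - 1]   # correct: the last penny sits at offset 3
# broken = pennies[count]        # IndexError: list index out of range
```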

The zero-counting programmers I’ve talked to about the matter are quite stubborn on the subject, and seem likely to pass their habits on to future generations. (“Never mind what teacher says, honey, it’s really *zero,* one, two, three…”)

In the original post, “Is zero a number?”, the fifth paragraph (assuming we start with the 1st) contains a sentence that troubles me.

Of course, the concept of zero makes its appearance pretty early historically (the idea of using zero as a placeholder digit comes later, but that’s notational, and a different story)

It seems to me that the opposite is true. That the placeholder came first, and only gradually and grudgingly have we come to regard 0 as an actual number.

English hasn’t really caught up. Hold up 4 pencils and ask most people how many you’re holding - they’ll say 4, but hold up 0 and ask again - instead of just saying zero, they will argue with the question. “None.” “You’re not holding any.” “What do you mean?”

My wife and I have discussed this for some time and have become rather polarized.

She maintains that zero is nasty and selfish, requiring privileges, special consideration, and exceptions to otherwise universal rules and patterns; to wit: 0 to the power 0, N / 0, and especially 0 / 0 - a whole field (differential calculus) developed just to deal with it. Thus its value is 0.

In response, I have argued that 0 props up everything else (in Zen-like arguments) and, in fact, was responsible for the European Renaissance and the rise of western civilization - was the timing of Fibonacci’s math textbook merely COINCIDENCE (Probability = 0) or the spark that stimulated a zero-starved Europe into its outpouring of genius in science, music, art, architecture, etc.?

And I bought her “The Nothing That Is” - a good read by Robert Kaplan. And she still loves me; what more could a man want? 0?

Anyway, this line of discussion is more fun than who and how many adopt which convention.

“God created the natural numbers [ 1, 2, 3, … ]; all the rest was done by man.” Kronecker

That’s a little strong. Differential calculus was developed to deal with the notion of curves and rates of change. In fact, the received wisdom that it was developed for physics is overstating it: it was developed for geometry, which was then applied to physics. Newton and Leibniz actually managed to narrowly avoid dealing with 0/0, as they used infinitesimals. Only as the field was rigorized was that whole morass really settled. N/0 was effectively known to be infinite before that, and 0[sup]0[/sup] is essentially 0/0 all over again.
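
To make the 0/0 point concrete (a standard calculus illustration, not anything from the report): limits of the form 0/0 are called indeterminate precisely because different limits of that shape can come out to anything, which is the morass that rigorization eventually settled.

```latex
% Each limit below has the form 0/0, yet the values differ:
\[
\lim_{x \to 0} \frac{x^{2}}{x} = 0, \qquad
\lim_{x \to 0} \frac{x}{x} = 1, \qquad
\lim_{x \to 0} \frac{2x}{x} = 2, \qquad
\lim_{x \to 0^{+}} \frac{x}{x^{2}} = +\infty.
\]
```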

Oh, and Kronecker said integers: “Die ganzen Zahlen hat Gott gemacht, alles andere ist Menschenwerk.” (“God made the integers; all else is the work of man.”)