In school, were you taught that 1 is a prime number?

So you are saying that a certain algorithm does not work if we accept that one is prime, so we had damn well better define “prime” so as to exclude 1.

Now explain to me how that is not gerrymandering. It is exactly the sort of thing I had in mind when I used the word.

Much the same goes for the so-called “Fundamental Theorem of Arithmetic” (as if no-one could do arithmetic without knowing that :rolleyes:).

Please note that I am not saying the definition should not be gerrymandered in this way, just that there is nothing particularly intuitive or logically necessary about it. Intuitively, and with a neat, parsimonious definition, 1 is prime (as most mathematicians would have said, not so very long ago), but if that were accepted, mathematicians would have to gerrymander several other things (such as the Sieve, the “fundamental theorem”, and, no doubt, others) to make them work properly. Mathematicians have decided it is more convenient to gerrymander the definition of “prime” than to gerrymander all these other things, but that means that the non-primality of 1 is purely a pragmatic convention, not a timeless objective fact.

I’m over forty and was never taught that one is prime, though my memory of high school is dim.

I was taught that it wasn’t prime, and I’m over forty. I didn’t see that as a choice, though.

It only looks like gerrymandering because you’re only thinking about the positive integers, where there’s exactly one solution to the equation uv = 1. If you look at the set of ten by ten diagonal matrices with integer entries, there are one thousand and twenty-four solutions to that equation. If we disallow them as primes, the unique factorization theorem goes through exactly as it does for the integers. If we don’t, it gets immensely complicated. If we want to be able to talk about all these things at once, making things with reciprocals not be prime is obviously the right thing to do.
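Just to spell that count out (a quick sketch of my own, not from the thread, and assuming we only look for diagonal inverses): each diagonal entry of u needs an integer reciprocal, so it must be +1 or -1, and the total follows.

[code]
# Rough sketch (illustration only): a 10x10 diagonal integer matrix u solves
# uv = 1 for some diagonal integer matrix v exactly when every diagonal entry
# of u has an integer reciprocal, i.e. is +1 or -1.
entries_with_integer_reciprocal = [
    a for a in range(-10, 11)
    if any(a * b == 1 for b in range(-10, 11))
]
print(entries_with_integer_reciprocal)              # [-1, 1]
print(len(entries_with_integer_reciprocal) ** 10)   # 1024 solutions to uv = 1
[/code]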

Well if 1 isn’t prime it really screws up the old “all odd numbers are prime” joke.

I think most of my math classes that dealt with primes taught that 1 was prime, but I also recall at least one class (which I think was in college) where we sort of debated whether 1 was prime or not, and the conclusion was that it was ambiguous.

I don’t remember being taught it but always thought of one as prime, and I don’t understand why “every integer greater than 1 is either prime itself or is the product of prime numbers” means that one is not prime. Even phrasing it as “An integer greater than one is called a prime number if its only positive divisors (factors) are one and itself” does not mean that one is not prime; it merely states what is the case for numbers greater than one. Sure, being a unit makes it pretty useless as a prime, but it seems the definition of prime is purely “a number whose only factors are 1 and itself”, and one meets that standard. Definitions trump theorems derived from them anyway.

Older than 50.

It’s that every integer is the product of primes in exactly one way that doesn’t work if you consider 1 prime.

It’s as though, in addition to all the chemical elements (Hydrogen, Oxygen, etc.) there were also an element X, such that H[sub]2[/sub]O, and H[sub]2[/sub]OX, and H[sub]2[/sub]OX[sub]2[/sub], etc. were all the same thing (water).
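To put a number on that (a tiny illustration of my own, not from the thread): once 1 counts as a prime, 6 picks up endlessly many “different” prime factorizations.

[code]
from math import prod

# Illustration only: padding with 1s gives new "prime factorizations" of 6
# if 1 is allowed as a prime, so uniqueness is lost.
factorizations_of_6 = [
    [2, 3],        # the unique factorization when 1 is excluded
    [1, 2, 3],     # a second factorization if 1 counts as prime
    [1, 1, 2, 3],  # a third, and so on without end
]
assert all(prod(f) == 6 for f in factorizations_of_6)
[/code]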

No, that’s not what we were thinking.

At least I was thinking that the wording of the definition is arbitrary - beneficial in that it makes things easier, but arbitrary. For example, allow one to be prime by definition and restate the fundamental theorem as something like “every integer can be expressed as a product of primes in exactly one non-trivial way”, where factorizations that differ only by powers of the identity element count as trivial.

I don’t know what the fuck you are talking about, but, so far as I can see, it only reinforces my point: what would otherwise be a clean and elegant definition of “prime” is gerrymandered with a special clause excluding 1, so that special clauses are not required to make certain other (more complex and obscure) generalizations come out as true. Mathematicians have simply taken a pragmatic decision to gerrymander the definition of “prime”, spoiling what would otherwise be a pristine elegance, in order to simplify the statement of certain other generalizations. I am not saying that was not the most sensible decision in the circumstances, but I am saying that it is not a mathematical inevitability, but a convention based on pragmatic considerations.

Anyway, isn’t it already implicit in the standard definition of prime that the relevant factors should be positive integers, thus excluding your example of “ten by ten diagonal matrices” from the get-go? (The definition given in Wikipedia says “positive divisors”.) If not, let’s write it in. That will certainly be less arbitrary and inelegant than the special act of attainder that excludes 1.

Mathematicians defined the term in such a way as to minimize the number of “special clauses” necessary. That’s the opposite of gerrymandering.

And how does it make things any more elegant to take a concept that can be applied in exactly the same way to any arbitrary ring, and artificially restrict it to only apply to one particular ring (and not even a very interesting one at that)?

I remember being taught something to the effect that zero is the “additive identity” number, such that any “x” added to zero will yield x, and, by extension, one is the “identity factor”, a reasonable enough premise to exclude it from the list of primes.

Another (equivalent) definition of primality is that p is a prime iff it’s a nonunit such that p | ab implies p | a or p | b. The latter clause is automatically satisfied for all units, so it’s not a very interesting concept there. Another equivalent definition, for an arbitrary ring A, is that x is prime iff it’s a nonunit such that A/xA is a domain. If you allowed x to be a unit, then A/xA would be the zero ring, which would be problematic.
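Here’s a quick numeric check of why the nonunit clause is needed (my own sketch, not the poster’s): 1 divides everything, so the divisibility condition holds for it automatically and carries no information.

[code]
# Sketch: the clause "p | a*b implies p | a or p | b" is satisfied by p = 1
# for trivial reasons, since 1 divides every integer, so the clause alone
# cannot separate the unit 1 from a genuine prime.
def divides(p, n):
    return n % p == 0

p = 1
holds = all(
    (not divides(p, a * b)) or divides(p, a) or divides(p, b)
    for a in range(1, 50)
    for b in range(1, 50)
)
print(holds)  # True, but only trivially
[/code]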

More to the point, though, why would you want to consider units as primes? All of the interesting results involving primes (the Chinese Remainder Theorem, the fundamental theorem of arithmetic, Euler products, localization, etc.) would either hold trivially or fail for this new prime-or-unit concept. Whatever the “point” of primality is, 1 doesn’t have it. Based on a couple of centuries of number theory, it’s pretty clear that requiring that primes be nonunits is the right definition.

Just to clarify, a unit is an element u for which there’s another element v such that uv = 1. The positive integers only have one unit, but general rings can have many.
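A small sanity check of that (my sketch, not from the thread): brute force finds exactly one unit among the positive integers and two among all the integers.

[code]
# Sketch: u is a unit when some v in the same set satisfies u*v == 1.
positive_units = [u for u in range(1, 100) if any(u * v == 1 for v in range(1, 100))]
integer_units = [u for u in range(-100, 101) if any(u * v == 1 for v in range(-100, 101))]
print(positive_units)  # [1]
print(integer_units)   # [-1, 1]
[/code]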

TL;DR.

My favorite number is 17159327 – it is a prime number. I like it because it is such a lonely, ignored number in a sea of other meaningless numbers.

I’m 35 and while I’m not positive I was taught that 1 is a prime number, I know I wasn’t taught that it isn’t.

Two can be as bad as one.

Sort of, yes, but then it’s pure mathematics we’re talking about - it’s all arbitrary at some level. The issue here is that you (and evidently a lot of other people) were taught fairly early in your education that a prime number is a number which can only be exactly divided by itself and one. Such a definition would make 1 prime. But as has been pointed out, making 1 prime has undesirable consequences for the rest of mathematics. So instead we should define a prime number as a number that has exactly two distinct positive divisors. It seems to me this is just as “clean and elegant” a definition (and it also excludes 1 from the set of prime numbers), but because you have been taught it as “a prime number can only be divided by 1 and itself - except 1 itself is not prime, that doesn’t count”, you see it differently.
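For what it’s worth, the two-divisor definition is easy to write down directly; here’s a minimal sketch (mine, not the poster’s) showing it leaves 1 out without any special clause.

[code]
# Sketch: prime = "exactly two distinct positive divisors". 1 has only one
# positive divisor (itself), so it falls outside the definition automatically.
def is_prime(n: int) -> bool:
    divisors = [d for d in range(1, n + 1) if n % d == 0]
    return len(divisors) == 2

print([n for n in range(1, 20) if is_prime(n)])  # [2, 3, 5, 7, 11, 13, 17, 19]
print(is_prime(1))  # False
[/code]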

As for the poll, I couldn’t answer because I’m 27 and I was taught the former definition above at the age of about 9, I guess, and then this was corrected by the latter definition probably when I was around 12 (well before college, in other words). Naturally, this confused me for a time - it would be a lot better if everyone were taught the second definition from the outset, but I guess the first one is deemed less complicated for young kids.

It took me a while to remember for sure, but I realize I was not taught that 1 was prime, but I also was not taught that 1 was not prime. Hence, I derived for myself that 1 must be prime, based on the definition I was given.

If a prime number is just anything with exactly two distinct positive divisors, then negative numbers such as -2 would qualify, and nothing in the wording rules out nonreal or fractional candidates either. The definition needs refinement.