Is 1 prime?

I read The Music of the Primes over the past week or so, and noticed that when the author lists off the first few primes, he always starts with 2. John Allen Paulos, a more popular math writer, does the same thing. The lowest primes are listed as 2, 3, 5, 7… . Now IIRC, a prime number is a whole number that cannot be factored into two numbers smaller than itself (i.e., its only factors are 1 and the prime itself). At any rate, I just always considered 1 a prime number. Is my definition of a prime wrong, or is there some subtle difference between 1 and the other whole numbers I’m not picking up on?

thanks in advance --J

1 is defined as nonprime by most mathematicians. I don’t think it is considered to be composite either, but I could be wrong.

Why is 1 not a prime?
http://www.utm.edu/research/primes/notes/faq/one.html

A prime number has exactly 2 distinct factors: itself, and 1.

1 only has one distinct factor: 1.

Therefore, 1 is not a prime number.
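
To make that concrete, here’s a quick brute-force sketch of that counting definition (my own Python illustration; the helper names are made up):

[code]
def count_factors(n):
    """Count the distinct positive factors of n by trial division."""
    return sum(1 for d in range(1, n + 1) if n % d == 0)

def is_prime(n):
    """Prime iff n has exactly two distinct factors: 1 and itself."""
    return count_factors(n) == 2

print(count_factors(1))  # 1 -- only one factor, so 1 fails the test
print([n for n in range(1, 20) if is_prime(n)])  # [2, 3, 5, 7, 11, 13, 17, 19]
[/code]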

Only in the naturals.

The most general definition of a prime deals with elements of a ring, and goes like this: a prime is a nonzero non-unit p such that whenever p divides the product ab, p divides a or p divides b. A unit is any u for which there is a v such that uv = 1. 1 meets all the conditions to be prime except that it’s a unit, so it’s not a prime.
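
To see that divisibility condition in action, here’s a brute-force check over a finite sample of products (my own sketch, so it only suggests the property rather than proving it):

[code]
def divides(p, n):
    return n % p == 0

def looks_prime(p, limit=50):
    """Check that p | ab implies p | a or p | b, over a finite sample of a, b."""
    return all(divides(p, a) or divides(p, b)
               for a in range(1, limit) for b in range(1, limit)
               if divides(p, a * b))

print(looks_prime(5))  # True: 5 passes the divisibility test
print(looks_prime(6))  # False: 6 divides 2*3 but divides neither 2 nor 3
print(looks_prime(1))  # True -- but 1 is a unit (1*1 == 1), so it is excluded
[/code]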

Of course, definitions can’t be right or wrong, just more or less useful. So why do mathematicians consider it useful to define 1 as not prime? Because of the Prime Factorization Theorem. The PFT says that any positive integer (except, of course, 1) has exactly one prime factorization. For instance, 12 = 2·2·3, and 100 = 2·2·5·5. If you allowed 1 as a prime number, then there would be many prime factorizations for a number: 12 = 1·2·2·3, or 12 = 1·1·1·2·2·3, etc.
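
Here’s a trial-division factoring sketch (my own code, nothing canonical) that produces the unique factorization; note there’s no way for it to pad the output with 1s, precisely because 1 isn’t treated as prime:

[code]
def prime_factors(n):
    """Return the prime factorization of an integer n > 1 by trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(12))   # [2, 2, 3]
print(prime_factors(100))  # [2, 2, 5, 5]
[/code]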

Since the PFT is such an important theorem, and it’d be annoying to say “exactly one factorization into primes other than one”, we define primes to exclude 1. You may someday encounter some theorem which deals with all numbers in the set {1,2,3,5,7,11,13,…}, and for those theorems it would be more convenient to define the primes to include one, but such theorems are few and far between.

ultrafilter, would it be correct to say from that definition of units, that in the rational (or real) numbers all nonzero numbers are units? I believe that Q and R are rings, right?

Yes. Furthermore, Q and R are fields. In a field, every non-zero element is a unit.
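
For instance, every nonzero rational has a multiplicative inverse, which is easy to poke at with Python’s Fraction type (my own quick check):

[code]
from fractions import Fraction

q = Fraction(3, 7)
print(1 / q)             # 7/3, the inverse that makes q a unit
print(q * (1 / q) == 1)  # True: q times its inverse is 1
[/code]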

I’ve always wondered about the validity of 2 as a prime (I fully expect some eye rolling at this) - the lack of factors for 2 is solely a property of its diminutive value, whereas the lack of factors for all the others is (OK, except possibly 3) a property of their ‘irregular’ shape.

But IANAMathematician, so feel free to laugh gustily.

Since there are an infinite number of integers with any given number of factors (p[sup]n[/sup] has exactly n + 1 factors when p is prime), I don’t think there’s anything to that.
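
Sanity-checking that parenthetical with a quick snippet of my own:

[code]
def count_factors(n):
    return sum(1 for d in range(1, n + 1) if n % d == 0)

# p**n has exactly n + 1 factors: 1, p, p**2, ..., p**n
for p in (2, 3, 5):
    for n in (1, 2, 3, 4):
        assert count_factors(p ** n) == n + 1
print("n + 1 factors, as claimed")
[/code]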

As the wise man says, someone’s gotta be first.

Anyway, look at the grade school definition of prime. There’s no reason to drag ultrafilter’s more correct (and correspondingly more complex) definition into this; a special case will do.

A prime is any integer with exactly two integer factors: itself and 1. How many factors does 2 have? Two precisely. Which are those? 1 and 2. Therefore, 2 is prime. 2 may be even, but that’s never stopped its primality before.

Anyway, having an even prime (and the definition allows for precisely one) is essential if the Prime Factorization Theorem is to work. And, as Chronos said, the PFT is pretty big stuff.

You’re right, of course; the only point (and it is a lame one) is that there simply aren’t any candidate factors available for 2.

There’s nothing obvious in the property of a “factor” that it has to be less than the number it’s a factor of. You seem to be regarding 3 as a sort of “potential factor” of 10, but 13 as not a “potential factor” of 10, because it’s greater. This is not very helpful, though, as a number either is or isn’t a factor of 10, and there’s no realistic distinction to be made between those less than 10 and those greater than 10.

I suspect you’re thinking in terms of “If I were to write a computer program to test for primality of N, what numbers would I need to test?” At first, you realize you only need to test numbers less than N. But after more thought, you realize you only have to test numbers up to sqrt(N). So for N = 3, there’s no need to test any numbers! Does this challenge the validity of 3 as a prime? After more thought, you realize that you can stop the program if N is even. So you don’t need to test 2, because you’ve already eliminated all factors of 2. Instantly, this means that for N = 5 or N = 7, you don’t need to test any numbers.
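
Roughly what that program ends up looking like (a sketch of the trial-division idea being described, not anyone’s actual code):

[code]
import math

def is_prime(n):
    """Trial division: handle evens up front, then test odd d up to sqrt(n)."""
    if n < 2:
        return False
    if n == 2:
        return True
    if n % 2 == 0:      # stop immediately for even n > 2
        return False
    for d in range(3, math.isqrt(n) + 1, 2):
        if n % d == 0:
            return False
    return True         # for n = 3, 5, or 7 the loop never runs at all

print([n for n in range(2, 30) if is_prime(n)])
# [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
[/code]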

Depending on your algorithm, it would be possible to write a program that doesn’t need to test any numbers at all for any N. So, 2 is not unique in this regard.
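
The poster doesn’t say which algorithm they had in mind, but Wilson’s theorem is one example: it tests primality with no trial divisors at all (hideously inefficient, but it makes the point). A sketch:

[code]
import math

def is_prime_wilson(n):
    """Wilson's theorem: n > 1 is prime iff (n - 1)! is congruent to -1 mod n."""
    return n > 1 and math.factorial(n - 1) % n == n - 1

print([n for n in range(2, 30) if is_prime_wilson(n)])
# [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
[/code]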

All good points, but no; the unique thing is that even without clever tweaks to the algorithm, there is simply nothing to test for 2 (if such a methodology were favoured).

Time for bed now probably.

You seem to think that the ordering relation we have on the integers is natural, and that it’s natural to have one in the first place. From a mathematical standpoint, it’s not: there’s no reason to order the integers the way we do, or any way at all.

When mathematicians think of primality, we think of it only in terms of the ring axioms, which include no notion of order. The definition of a greatest common divisor reflects that fact: d = gcd(a, b) iff d divides a, d divides b, and if c divides both a and b, then c divides d. No references to order.
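
To illustrate, here’s a brute-force gcd for positive integers computed straight from that divisibility definition (my own sketch, cross-checked against math.gcd; the only place order sneaks in is bounding the search, which is an implementation convenience, not part of the definition):

[code]
import math

def gcd_by_definition(a, b):
    """Find d such that d | a, d | b, and every common divisor c divides d."""
    common = [c for c in range(1, min(a, b) + 1) if a % c == 0 and b % c == 0]
    for d in common:
        if all(d % c == 0 for c in common):
            return d

for a, b in [(12, 18), (100, 75), (7, 13)]:
    assert gcd_by_definition(a, b) == math.gcd(a, b)
print("matches math.gcd")
[/code]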