In school, were you taught that 1 is a prime number?

Life is full of situations where the general population is taught an approximation (simplification) that is good enough. People who specialize need to know the specific nuances.

Treating one as a prime is not a problem for most people’s everyday use of primes. If you are concerned about unique factorization, you are in a field that requires the more precise definition.

For me the issue isn’t the primality of one; that’s covered by definition. The issue is the outrage that a high school student who’ll never study higher mathematics might consider one to be prime.

If you’re not concerned about unique factorization, then why are you studying primes at all? That’s pretty much their reason for existing. Primes are important because they’re the building blocks for constructing all the other integers, at least multiplicatively.
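
To make the building-block idea concrete, here is a minimal Python sketch (the function name factorize is mine, not from the thread) that splits an integer into its prime factors by trial division; the comments note why admitting 1 as a prime would wreck the uniqueness of that splitting.

[code]
# A minimal sketch: prime factorization by trial division.
# Every integer n > 1 has exactly one multiset of prime factors,
# e.g. 12 -> [2, 2, 3]. If 1 counted as a prime, 12 could also be
# written as 1*2*2*3, 1*1*2*2*3, ... and the factorization would
# no longer be unique.

def factorize(n: int) -> list[int]:
    """Return the prime factors of n (with multiplicity), smallest first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factors.append(n)
    return factors

print(factorize(12))   # [2, 2, 3]
print(factorize(9))    # [3, 3]
[/code]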

I’m 35, and this is the first I’ve heard of it not being prime. I was taught that if it can only be divided by 1 or itself, it’s prime.

It comes off as “gerrymandering” because the definition of prime as “a number whose only factors are 1 and itself” seems to have come first. It then seems that a theorem that everyone wants to be true, and that is useful to believe is true, is not true unless we change the definition, so sure, go back and change the definition to say “except one.” It seems like cheating. It does not seem right to say “I want to deduce this from a basic set of definitions and axioms and I cannot, so I’ll just have to go back and change one of my definitions or axioms to make it work.”

But okay, if “prime” is defined instead as Wolfram does it as

then 1 is not prime.

Chronos, most people need primes for things like LCM and reducing fractions in 6th grade, and after that maybe hear a bit about how some keep searching for larger primes, how there is no pattern to the primes, and maybe that there is something about factoring primes and encryption. And fussing over whether the “fix” was placed in the definition, or by stating “not counting one” or “the identity does not count as a factor” in the theorem (which at their level they had never heard of, even if it was implied in their 6th grade class), is not given any thought.

Putting the fix in the theorem seems at least to be more playing by the rules than changing a definition to make a theorem work.

I remember now why we were taught about primes in the first place: rational fractions in the form “a/b”: to add two such fractions, you had to find the lowest common numerator, so using primes was important, both in reaching the common numerator and in reducing the sum. In such a situation, 1 obviously becomes a stopping point.

Now they just use calculators; no one bothers with fractions or knows what the C/D scale is.

The definition that a number is prime if and only if it has exactly two divisors only applies to primes among the positive integers. If you allow negative integers, primes have exactly four divisors, and in more general rings you can’t define primes in terms of the number of divisors they have. The definition that Itself gave in post #73 is the one that we actually use, and since every nonzero rational number is a unit, there are no primes in the ring of rational numbers.
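
For reference, here is a sketch of the standard general definitions being invoked here (my wording, in LaTeX; not a quote of post #73):

[code]
% Sketch of the standard ring-theoretic definitions (not quoted from post #73).
% u is a unit  <=>  it has a multiplicative inverse; p is prime  <=>  it is a
% nonzero non-unit and dividing a product forces it to divide a factor.
\[
  u \text{ is a unit} \iff \exists v \ \text{with}\ uv = 1,
  \qquad
  p \text{ is prime} \iff p \neq 0,\ p \text{ is not a unit, and }
  \bigl(p \mid ab \implies p \mid a \ \text{or}\ p \mid b\bigr).
\]
% In \(\mathbb{Z}\) the units are \(\pm 1\), so a prime p has the four divisors
% \(\pm 1, \pm p\); in \(\mathbb{Q}\) every nonzero element is a unit, so there
% are no primes at all. Since 1 is a unit, it is excluded automatically.
[/code]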

If you want to approach it from the direction of what came first, both the notion of primes and the theorems about them go back to the ancient Greeks (e.g. Euclid), who, as has been noted earlier, didn’t really consider 1 to be a number, and therefore not a prime number. Here’s Euclid’s definition:

For more historical background about whether/when 1 has been considered a prime, see What is the Smallest Prime? (PDF article).

I hope you meant “common denominator”—but yeah, the earliest context in which many people study primes is in working with fractions.
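
A minimal Python sketch of that grade-school workflow (function names are mine, not from the thread): put the two fractions over their lowest common denominator, add, then reduce the sum with the greatest common divisor.

[code]
# Adding a/b + c/d the grade-school way, then reducing the result.
from math import gcd

def lcm(x: int, y: int) -> int:
    return x * y // gcd(x, y)

def add_fractions(a: int, b: int, c: int, d: int) -> tuple[int, int]:
    """Return (numerator, denominator) of a/b + c/d in lowest terms."""
    denom = lcm(b, d)                       # lowest common denominator
    num = a * (denom // b) + c * (denom // d)
    g = gcd(num, denom)                     # reduce the sum
    return num // g, denom // g

print(add_fractions(1, 6, 1, 4))   # 1/6 + 1/4 = (5, 12)
[/code]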

That was an interesting article. Thanks.

It puts some things in context. I tend to think of mathematics as being sets of clear definitions and axioms from which theorems are deduced, and that different sorts of mathematics can be deduced by changing those definitions and axioms, hence non-Euclidean geometry. The way it has actually worked in this case, however, seems to be that within the same sort of mathematics the nature of what is and is not a number has changed, and terms have not had the agreed-upon, clearly stated definitions that I had thought were the nature of the beast. Interesting.

Which is based on unique factorization.

All of which is only relevant for very large primes, so you’re still not talking about 1.

I’m saddened by the thought of high school students trying to find the common factors of 12 and 9 by comparing 3x4x1x1x1x1 and 3x3x1x1x1x1x1x1x1x1.

Won’t someone think of the children?

FWIW, I regard the choice of definition to be relatively unimportant. Yes, making 1 non-prime shortens some statements, but the converse is also true:

Goldbach’s Conjecture asserts that every natural even number [del]greater than two[/del] is the sum of two primes.

A composite number is a natural number [del]greater than one[/del] which is not a prime.

Please note that I don’t claim the definition should be reversed or ignored. Just that the choice to define 1 as not-prime is a relatively unimportant detail.

One isn’t a prime because, if it were, you couldn’t split a number into its prime factors without 1 showing up among them (even if the number is not 1 itself), messing up the meaning of prime factors.

I believe there was a Numberphile video (a YouTube channel) that explained that, historically, 1 was prime “because its only factors are 1 and itself,” but most theorems would have a clause that said “consider all primes except 1.” Thus, they decided that if almost every theorem had that clause, they might as well just exclude 1 from the prime category to begin with. Sure, there are some theorems, then, that make you specially include 1, but they’re far fewer than the number that would have to specially exclude it.

(I understand the history has already been discussed in the thread, just adding on).

That said, I’m 23 and I’m fairly sure that except for maybe one teacher I’ve always been told 1 is not prime.

Jragon: this reminds me of my old Topology class. For the first week of the class, the teacher entertained exceptions involving the empty set. “Ah, but what you just said doesn’t apply to the empty set.” “Yes, that’s correct.”

After the first week, the teacher told us to stop it, as it wasn’t amusing any more. He told us always to be aware of such trivial exceptions, but to stop bringing them up in classroom discussions.

It’s the loneliest number since the number one.

And I am not making my point well. Which Septimus pretty much recapitulated. I don’t remember any mention of whether or not 1 is prime, and I doubt a clause excluding the number one was added to the definition. Heck, it seems that historically its definition, and even the definition of integer, has been a bit fluid. (Not something admirable for a deductive system.) Given that, if asked, which I wasn’t at the time, I would have used that definition (the one that does not explicitly say that 1 need not apply) and declared that 1 is prime by definition. But its status was something that, for math through my level (up to calc and statistics classes in early college), was not particularly important to our understanding and use of primes; whether the clause went here or there, either would have been just as arbitrary and just as okay to us, if we had thought about it at all.

And I am still not making my point well but oh well.

1 is not prime because the current definition of prime explicitly excludes it. Why? To make theorems work that would otherwise have to carry a clause stating “excluding 1.”

Pretty trite.

I’m 37, and that’s what I remember being taught. “One” would technically satisfy that definition, so I think there may also have been a caveat, something like the numbers being distinct. Or maybe it was just “obvious” that “1” didn’t count, since (as you said) you wouldn’t factor it as 1x1.

Actually, that’s not quite what you said. It’s that you wouldn’t write “1” at all when factoring anything, just like I was taught. So, I can’t remember if we were just taught that by definition 1 was a prime, or whether the word “distinct” or similar was put in there somewhere. Of course, that was more than two, two and a half decades ago, so I wouldn’t be able to remember exactly what I was taught if my life depended on it.

I’ve been away from the board for a few days, so someone may have already covered it. But the fundamental theorem of arithmetic is not about the sort of arithmetic you learned in elementary school. It’s about “arithmetic” in the sense of number theory.
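
For anyone who hasn’t seen it stated, here is the standard modern wording (a sketch in LaTeX, not a quote from any particular textbook):

[code]
% Fundamental theorem of arithmetic (standard modern statement):
% every integer greater than 1 factors into primes in essentially one way.
\[
  \forall n > 1:\quad n = p_1 \, p_2 \cdots p_k,\ \text{each } p_i \text{ prime},
  \ \text{uniquely up to the order of the factors.}
\]
% If 1 counted as a prime, uniqueness would fail:
% 6 = 2*3 = 1*2*3 = 1*1*2*3 = ...  -- hence the "excluding 1" convention.
[/code]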

Does that mean “too old, didn’t read”?