Some mathematical definitions are arbitrary. Take the “counting numbers”, for instance…
Zero used to be excluded, but now many prominent mathematicians point out that if you have five apples and a bully takes five of them away, you have ‘none’ or zero. The concept of apples doesn’t magically disappear when you don’t have any, so you must have a way to express how many you (don’t) have. I personally start counting at zero. If I have no money and you give me a dollar bill, then ask me how much I have, I take my old total and add one: $0 + $1 = $1. This is the same operation I’d do if I were starting at $10,276.
Anyway, some definitions are arbitrary: some highly respected mathematicians think one thing, some think the other, and there isn’t really a right answer. If nobody counted 13, for superstitious reasons, we could exclude it from the set of “counting numbers”.
Other definitions appear arbitrary, yet aren’t if you examine them in depth.
10^2, which is 10*10, is 100. 10^1, which is 10, is … well … 10. And 10^0, which is … is 1… How is that again?
It makes absolutely no sense if N^x just means writing ‘x’ Ns with *s between them. Yet if you understand the operation, and especially its inverse, it does make sense: each time you lower the exponent by one, you divide by N, so 10^0 is just 10^1 / 10 = 1.
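Here is a minimal sketch of that step-down pattern in Python (my choice of language here, not anything from the original argument): walking the exponent down by repeatedly dividing by the base forces 10^0 to come out as 1.

```python
# Walking the exponent down: each step divides by the base,
# so 10^0 has to be 1 for the pattern to stay consistent.
base = 10
value = base ** 3          # 1000
for exponent in range(3, -1, -1):
    print(f"{base}^{exponent} = {value}")
    value //= base         # lowering the exponent by one = dividing by the base
# Prints: 10^3 = 1000, 10^2 = 100, 10^1 = 10, 10^0 = 1
```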
So, is the definition of ‘prime numbers’ arbitrary in its exclusion of 1? 1 does fit the standard definition, having only 1 and itself as factors, so long as you don’t insist that those two factors be different.
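To make that concrete, here is a rough Python sketch of that naive reading of the definition (the function names are my own, purely for illustration); notice that 1 slips through unless you exclude it explicitly.

```python
def factors(n):
    """Every positive divisor of n."""
    return [d for d in range(1, n + 1) if n % d == 0]

def naive_is_prime(n):
    """The naive reading: 'its only factors are 1 and itself'."""
    return all(d in (1, n) for d in factors(n))

print(factors(1))          # [1] -- its only factor is 1, which is also "itself"
print(naive_is_prime(1))   # True under the naive definition
print(naive_is_prime(4))   # False: 2 is a factor besides 1 and 4
```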
Are there any tests for primality which would show that 1 is not a prime number, even though it appears to be?
Just saying that “one is the unit, and thus different” isn’t good enough.
There is one piece of evidence against 1 being prime. Take the simple method of finding primes: start at a number, cross out all of its multiples, move to the next number that isn’t crossed out, and cross out all of its multiples. When you decide to stop, look at the list of numbers you’ve passed; any that aren’t crossed out are prime.
If you start with 1, it’s trivial to see that this doesn’t work (every other number is a multiple of 1 and gets crossed out), whereas if you start at 2, it does.
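A rough Python sketch of that crossing-out method (the function name and parameters are my own) makes the difference plain: started at 2 it yields the primes, started at 1 it crosses out everything else and leaves only 1.

```python
def cross_out(limit, start):
    """Cross out multiples of each surviving number from `start` to `limit`;
    whatever is never crossed out is reported as 'prime'."""
    crossed = set()
    for n in range(start, limit + 1):
        if n in crossed:
            continue
        # cross out every multiple of n beyond n itself
        crossed.update(range(n * 2, limit + 1, n))
    return [n for n in range(start, limit + 1) if n not in crossed]

print(cross_out(20, start=2))  # [2, 3, 5, 7, 11, 13, 17, 19]
print(cross_out(20, start=1))  # [1] -- every other number is a multiple of 1
```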
But, does this mean the prime-generation method is flawed, or that one isn’t prime?
What tests for primeness, aside from a definition which arbitrarily excludes it, does 1 fail?