Math: One is NOT a Prime Number

Fact: by the standard mathematical definition, the number 1 is not a prime number.

Primes

One definition of a prime number: a positive integer that has exactly two positive integer factors, 1 and itself. The first few prime numbers are:

2, 3, 5, 7, 11, 13, 17, 19, 23, 29, …

Note that 2 is the only even prime number. It is often handy to know the first eight or ten primes by heart.

Notice that 1 does not fit this definition: it has only one positive integer factor (itself), not two. Therefore, 1 is not prime.
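To see this definition in action, here is a minimal Python sketch (the function names count_positive_factors and is_prime are ours, purely for illustration) that tests primality by literally counting positive factors:

```python
def count_positive_factors(n):
    """Count the positive integers that divide n evenly."""
    return sum(1 for d in range(1, n + 1) if n % d == 0)

def is_prime(n):
    """Prime means exactly two positive factors: 1 and itself."""
    return count_positive_factors(n) == 2

for n in [1, 2, 3, 4, 5, 6, 7]:
    print(n, count_positive_factors(n), is_prime(n))

# 1 has only one factor, so is_prime(1) is False;
# 2, 3, 5, and 7 each have exactly two factors, so they are prime,
# while 4 (three factors) and 6 (four factors) are not.
```

Counting every factor up to n is slow for large numbers, but it mirrors the definition exactly, which is the point here.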

Prime factorizations

Now, to some folks, that rule will seem like a trivial technicality. It's as if we permanently excluded Fred from the elite country club because he owns one powder-blue tie instead of two. Likewise, we permanently excluded 1 from the elite set of prime numbers because it has only one positive factor instead of two. What kind of arbitrary rip-off is that? Who can we sue?

It turns out that, as usual, mathematicians have deep reasons for the way they draw fine distinctions. Let's think about this. One major rule of arithmetic is: each positive integer greater than one has a unique prime factorization. (This rule is so important that its official name is the Fundamental Theorem of Arithmetic!) The prime factorization of a number is like its DNA: we know its exact constituents, and thus can determine every single one of its factors.
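In symbols, the theorem says that every integer n greater than 1 can be written as

n = p_1 \times p_2 \times \cdots \times p_k

where p_1 \le p_2 \le \cdots \le p_k are primes, and that there is exactly one such list of primes for each n. For example: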

36 = 2 \times 2 \times 3 \times 3

That is the unique prime factorization of 36, the only way to multiply prime numbers to get a product of 36. Now, pretend for a moment that 1 were a prime number: if we defined things that way, what would the consequences be? Instead of having a unique prime factorization, every number would have an infinite number of prime factorizations. For example, the first few prime factorizations of 36 would be:

36 = 2 \times 2 \times 3 \times 3

36 = 1 \times 2 \times 2 \times 3 \times 3 (duh!)

36 = 1 \times 1 \times 2 \times 2 \times 3 \times 3 (duh!)

36 = 1 \times 1 \times 1 \times 2 \times 2 \times 3 \times 3 (duh!)

… and so on, forever (an infinite number of duh! statements).

Not only would we demolish a perfectly good rule of arithmetic, the Fundamental Theorem of Arithmetic, but in doing so we would also gain an infinite number of absolutely useless statements. That's a lose-lose trade-off! Mathematicians, sensing this, head it off at the pass simply by declaring, by fiat, that 1 is not a prime number. That single stipulation sweeps away the troubling infinity of duh! statements, along with a whole boatload of similar problems. This is the deep reason for the rule.
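To make the DNA analogy concrete, here is a minimal Python sketch (the function name prime_factorization is ours, for illustration) that recovers the unique prime factorization of an integer greater than 1 by trial division. Notice that the algorithm never has any reason to put 1 in the result:

```python
def prime_factorization(n):
    """Return the prime factors of n > 1, smallest first.

    Trial division: repeatedly divide out the smallest divisor d >= 2.
    The smallest divisor (>= 2) of any integer is always prime, so
    only primes ever land in the list; 1 never does.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:  # whatever remains after division is itself prime
        factors.append(n)
    return factors

print(prime_factorization(36))  # [2, 2, 3, 3]
```

Because the loop always divides out the smallest remaining factor, it produces exactly the one prime factorization that the Fundamental Theorem of Arithmetic guarantees.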