OSXpert wrote:
But, yeah, it is one of those "it's true because we say it's true" things.

lieph82 wrote:
Of course, the fundamental theorem of arithmetic could easily be amended to say "primes greater than 1" if the definition of prime were changed. As you say at the end, definitions are really just a matter of convenience.

goforthetie wrote:
Yes and no. It is not just "convenience"; it is that it makes much more sense to define the set of primes so as not to include 1. For instance, we could define the term "integers" to include pi, but that would just be silly.
To include pi in the set of integers, you'd have to amend the definition of "integer" to say "a number that can be written without a fractional component, or pi." That's silly, because why include pi? What makes it more like the integers than 3.7 or phi? But the definition of "prime" is "a natural number greater than one that has no positive divisors other than one and itself," which excludes 1 not naturally but by an affixed clause. 1 satisfies the other major property of primes: it has no positive divisors other than 1 and itself (that is not true under OSXpert's definition of "prime," which I believe is a teaching tool and not the original definition).
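To make that concrete, here's a minimal Python sketch of the textbook definition (the function names are mine, purely for illustration). The divisor test and the "greater than one" clause are checked separately, and 1 passes the first but is blocked only by the second:

def has_no_proper_divisors(n: int) -> bool:
    """True if n has no positive divisors other than 1 and n itself."""
    return all(n % d != 0 for d in range(2, n))

def is_prime(n: int) -> bool:
    """The standard definition: the 'greater than one' clause AND the divisor test."""
    return n > 1 and has_no_proper_divisors(n)

print(has_no_proper_divisors(1))  # True: 1 passes the divisor test (vacuously)
print(is_prime(1))                # False: excluded only by the affixed clause
print([n for n in range(1, 20) if is_prime(n)])  # [2, 3, 5, 7, 11, 13, 17, 19]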
If we removed that affixed clause from the definition of prime, some theorems would have to be amended to say "for primes greater than 1," and some theorems that already apply to 1 as well as to the primes could be stated more simply.
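The fundamental theorem of arithmetic is the usual example of a theorem that would need amending: if 1 counted as prime, any factorization could be padded with 1s, and uniqueness would fail immediately. A quick sketch of the problem, using 6 for concreteness:

import math

# Under the standard definition, 6 has exactly one multiset of prime
# factors: {2, 3}. If 1 were prime, {1, 2, 3}, {1, 1, 2, 3}, ...
# would all count as valid "prime factorizations" of 6 as well.
for k in range(4):
    factors = [1] * k + [2, 3]
    assert math.prod(factors) == 6
    print(factors)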
We define things so that they'll be convenient to use when we need them, but definitions in math rarely change. So it's conceivable to me that excluding 1 from the primes made sense a million years ago but, with all of our new theorems, no longer does. I'm not going to look at every single theorem involving prime numbers to check, though (for the record, right now I agree with you that the current definition is probably the most sensible).