Over poker last night we got into a bit of a number theory argument. One of my friends mistakenly said "integer" when he meant "prime number," and somehow we got onto defining sets of numbers. (Yeah, I know we're dorks; whatever, I'm cool with it.)

That led to a debate about whether 0 was an integer. Adam and I said, "Yes, it is an integer, and it's a real number; you're thinking of natural numbers. Zero isn't a natural number."

But why not?

The natural numbers are often referred to as the "counting numbers," and in the traditional definition, going back to their formalization in the 19th century, they haven't included zero. But why?

As a computer scientist, it makes a ton of sense to me to include zero, since whenever I write a loop I start counting at 0.
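For instance, a typical zero-based loop in Python (just an illustration; any C-style language would look the same):

```python
items = ["a", "b", "c"]

# Counting starts at 0: the first element lives at index 0,
# and n items are indexed 0 through n - 1.
for i in range(len(items)):
    print(i, items[i])  # prints "0 a", "1 b", "2 c"
```

Counting n items with indices 0 through n - 1 only works if 0 is one of your counting numbers.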

Looking at it from set theory, it makes sense to include zero too.

Each natural number is equal to the set of all natural numbers less than it. (This is the von Neumann construction.)

Basically,

0 = {} (an empty set)

1 = {{}} (the set containing 0)

2 = {0, 1} = {{}, {{}}}

This gets ugly fast as you go further, but it makes sense.
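The construction above can be sketched in a few lines of Python. This is just an illustration, using frozenset because frozensets are hashable and can therefore nest inside each other:

```python
def succ(n):
    """Successor of n in the von Neumann construction: n ∪ {n}."""
    return n | frozenset([n])

zero = frozenset()        # 0 = {}
one = succ(zero)          # 1 = {0} = {{}}
two = succ(one)           # 2 = {0, 1} = {{}, {{}}}
three = succ(two)         # 3 = {0, 1, 2}

# Each number has exactly as many elements as its value...
assert len(zero) == 0
assert len(three) == 3

# ...and "less than" is just set membership:
assert zero in two
assert one in two
```

Notice that the whole tower is built from nothing but the empty set, which is exactly why zero has to be the starting point.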

From a set-theory or a computer-science point of view, zero should be included in the natural numbers.

In fact, the only reasons I can see not to include it are nostalgia and confusing test questions in number theory classes.

What do you think? Is it time we let zero in on all the fun of being a natural number?