November 29, 2021

Should Zero Be A Natural Number?

Over poker last night we got into a little bit of a number theory argument. One of my friends mistakenly used the word “integer” instead of “prime number,” and somehow we got onto defining sets of numbers. (Yeah, I know we’re dorks; whatever, I’m cool with it.)

That led to a debate about whether 0 was an integer. Adam and I said “yes, it is an integer, and it’s a real number. You’re thinking of natural numbers. Zero isn’t a natural number.”

But why not?

The natural numbers are often referred to as the “counting numbers,” and in many definitions dating back to the 19th century they haven’t included zero. But why?

As a computer scientist it makes a ton of sense to me to include zero, since whenever I run a loop I start counting at 0.
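For instance, the standard loop idiom in Python (like most programming languages) starts at zero, not one:

```python
# Counting in most programming languages starts at zero:
counted = list(range(5))
print(counted)  # [0, 1, 2, 3, 4]
```

Five items, counted 0 through 4. To a programmer, zero is the most natural place to start counting.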

Looking at it from set theory, it makes sense to include zero too.

In the standard (von Neumann) construction, each natural number is defined as the set of all natural numbers less than it.

Basically,

0 = {} (the empty set)
1 = {0} = {{}} (the set containing 0)
2 = {0, 1} = {{}, {{}}}

This gets very ugly as you go on further, but it makes sense.
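The construction above can be sketched in a few lines of Python, using `frozenset` to stand in for mathematical sets (a minimal illustration, not a serious formalization):

```python
def von_neumann(n):
    """Return the set representing the natural number n,
    i.e. n = {0, 1, ..., n-1} in the von Neumann construction."""
    result = frozenset()  # 0 = {} (the empty set)
    for _ in range(n):
        # successor step: S(k) = k ∪ {k}
        result = result | frozenset([result])
    return result

zero = von_neumann(0)  # frozenset() — the empty set
one = von_neumann(1)   # {0}, i.e. {{}}
two = von_neumann(2)   # {0, 1}, i.e. {{}, {{}}}

# Each number n has exactly n elements, and contains every smaller number:
assert len(von_neumann(3)) == 3
assert one in two
```

Notice that zero falls out of this construction for free: it's just the empty set, the natural starting point.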

From either a set-theory or a computer-science point of view, zero should be included in the natural numbers.

In fact, the only reasons I can see not to include it are nostalgia and confusing test questions in number theory classes.

What do you think? Is it time we let zero in on all the fun of being a natural number?

About Ryan Jones

Ryan Jones is an SEO from Detroit. By day he works as a manager of SEO & Analytics at SapientNitro, where his team performs SEO for Fortune 500 clients. By night he's either playing hockey or attempting to take over the world with his own websites - which he would have already succeeded in doing had it not been for those meddling kids and their dog. The views expressed here have not been paid for and belong only to Ryan, not any of his employers or clients. Follow Ryan on Twitter at: @RyanJones, add him on Google+ or visit his personal website: www.RyanMJones.com