Depends on where you are! In some places it is more common to say that 0 is natural, and in others not. Some argue it’s useful to have it in N; others say it makes more historical and logical sense for 0 not to be in N, and use N_0 when including it. It’s not a settled issue, it’s a matter of perspective.
Those are valid points and make some practical sense, but I’ve talked too much with mathematicians about this, so let me give you another point of view.
First of all, we do modular arithmetic with integers, not natural numbers; the same goes for all those objects you listed.
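To make that concrete, here is a minimal sketch (one small example, not tied to any particular formalism): Z/nZ is a quotient of Z, and its arithmetic needs additive inverses, which N doesn’t have.

% Z/5Z is a quotient of Z, not of N: every residue class needs an additive inverse.
\[
\mathbb{Z}/5\mathbb{Z} = \{[0],[1],[2],[3],[4]\}, \qquad
-3 \equiv 2 \pmod{5}, \qquad
2 + 3 \equiv 0 \pmod{5},
\]

So [2] is the additive inverse of [3], while the equation x + 3 = 0 has no solution in N at all.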
On the first point, we are not talking about 0 as a digit but as a number. The main argument against 0 being in N is more a philosophical one. What are we looking at when we study N? What is this set? “The integers starting from 0” seems like a bit of a weird definition. Historically, the natural numbers were always the counting numbers, and those don’t include 0 because you can’t have 0 apples; so when we talk about N, we’re talking about the counting numbers. That’s just the consensus where I’m from; if it’s more practical to include 0 in whatever you are doing, you use N_0.
Also the axiomatization of N is more natural that way IMO.
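Since I brought up the axiomatization, here is a quick sketch of the Peano axioms in the convention that starts at 1 (as I recall, Peano’s original 1889 formulation started there; replacing 1 by 0 throughout gives the other convention):

\begin{align*}
&\text{(P1)}\quad 1 \in \mathbb{N} \\
&\text{(P2)}\quad n \in \mathbb{N} \Rightarrow S(n) \in \mathbb{N} \\
&\text{(P3)}\quad S(n) = S(m) \Rightarrow n = m \\
&\text{(P4)}\quad S(n) \neq 1 \quad \text{for all } n \in \mathbb{N} \\
&\text{(P5)}\quad \bigl(1 \in A \,\wedge\, \forall n\,(n \in A \Rightarrow S(n) \in A)\bigr) \Rightarrow \mathbb{N} \subseteq A
\end{align*}

Nothing in the axioms forces the base element to be 1, of course; the set-theoretic construction via von Neumann ordinals starts at 0 (the empty set), which is part of why both conventions persist.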
I guess it depends on the place. But the arguments for not including it seem futile to me.
Of course 0 vs no 0 only matters if you actually do arithmetic with it. If you only index you could just as well start with 5.
(The only reasons I can think of to start at 1 are that 1 is then the 1st element, and that the sequence (1/n) is defined for all natural n.)
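To spell out that parenthetical with a formula: whether 0 belongs to the index set determines whether you need a side condition on the sequence.

\[
\left(\tfrac{1}{n}\right)_{n \in \mathbb{N}}
\text{ is well-defined precisely when } 0 \notin \mathbb{N};
\text{ otherwise one must write } \left(\tfrac{1}{n}\right)_{n \geq 1},
\]

since 1/0 is undefined.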