I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0, and that convention is foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.
Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate,” it’s just a different convention. There are numerous ISO standards which would be highly unusual in American academia.
FWIW I was taught that the inclusion of 0 is a French tradition.
The US is one of three countries on the planet that still stubbornly uses imperial units as its primary system. “The US doesn’t do it that way” isn’t a great argument against adopting a standard.
I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition; they may not even know it exists. Mathematicians, in my experience, are far less concerned with the terminology or symbols used to describe something as long as they’re clearly defined. In fact, they’ll probably make up their own symbology just because it’s slightly more convenient for their proof.
From what I understand, you can pay ISO to standardise anything, so it’s only useful for interoperability.
Yeah, interoperability. Like every software implementation of the natural numbers, which includes 0.
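For what it’s worth, here’s a minimal sketch (in Lean 4, with a made-up `MyNat` name so it doesn’t collide with the built-in) of the Peano-style construction that proof assistants and type theories use; 0 is the base case the whole definition rests on:

```lean
-- Peano-style naturals; `MyNat` is a hypothetical name for illustration.
inductive MyNat : Type
  | zero : MyNat            -- 0 is the base case of the induction
  | succ : MyNat → MyNat    -- every natural number has a successor

-- Lean's built-in Nat is defined the same way, so 0 is a natural by construction:
#eval (0 : Nat)  -- prints 0
```

Drop the `zero` constructor and the type has no inhabitants at all, which is a big part of why software ecosystems converged on including 0.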
How programmers use something doesn’t make it the mathematical standard. I don’t know why ISO would be a reference for this at all.
Because ISO is the International Organization for Standardization.