I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0, and that convention underlies basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.
I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition; they may not even know it exists. Mathematicians, in my experience, are far less concerned with the terminology or symbols used to describe something as long as they’re clearly defined. In fact, they’ll probably make up their own symbology just because it’s slightly more convenient for their proof.
they’ll probably make up their own symbology just because it’s slightly more convenient for their proof
I feel so thoroughly called out RN. 😂
My experience (bachelor’s in math and physics, but I went into physics) is that if you want to be clear about including zero or not, you add a subscript or superscript to specify. For the non-negative integers you write a subscript zero (ℕ_0). For the strictly positive natural numbers you can write either ℕ_1 or ℕ^+.
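To spell those conventions out explicitly, here’s a small LaTeX sketch of the notation described above (the particular macros are just one common way to typeset it):

% natural numbers including zero (the non-negative integers)
\mathbb{N}_0 = \{0, 1, 2, 3, \dots\}

% strictly positive naturals; the two notations are interchangeable
\mathbb{N}_1 = \mathbb{N}^{+} = \{1, 2, 3, \dots\}

Writing it this way sidesteps the whole “does ℕ contain 0” question, since the reader never has to guess which convention a bare ℕ is using.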