Suppose I ask you to step into the next room, count the people in there, and report the answer back to me. What is the smallest number you can report? Obviously the answer is zero, corresponding to the case when there are no people at all in the next room. Thus it is plain that zero is the smallest of the counting numbers. It is also the smallest of the unsigned measuring numbers: the least possible distance between two points is zero, when the points coincide.
These simple facts, so easily stated, did not become generally understood until modern times—well into the sixteenth century in Europe and even later elsewhere. There is nothing like the history of mathematics to leave you thinking that people in the past were less intelligent than we. That is, of course, an illusion. The correct conclusion to be drawn from Robert Kaplan’s book, which is accurately subtitled A Natural History of Zero, is that until words and symbols have evolved to denote a thing, that thing cannot be discussed or used; and that words and symbols for abstract entities evolve terribly, terribly slowly.
Mathematical entities—numbers, lines, functions—are of course the ultimate abstractions. The millennia-long struggle to get to grips with zero testifies to the deeply unnatural character of mathematical thinking. Every mathematical truth is like that: extruded after unimaginable intellectual effort and lifetimes of frustration, against all the grain of ordinary human thought and language processes. In the preface to Principia