All of our human mathematics, all of our arithmetic, goes back to a simple idea: that the world with which we concern ourselves is made up of near-identical and so countable things. One person, two people, three people … One sheep, two sheep, three sheep … One stone, two stones, three stones …
When you can see the world in such interchangeable units and read them as whole numbers, the integers, you are on your way to the arithmetical actions of addition and subtraction, multiplication and division. Three people stand on a street corner, then two more join them, and now five people stand there. I have ten sheep, but the wolf eats two of them, so now I have eight sheep. Three of my ewes each bears two lambs, and now I have six more sheep.
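The whole-number arithmetic in the examples above can be sketched in a few lines (the variable names are mine, chosen only for illustration):

```python
# Counting interchangeable units: the arithmetic of the essay's examples.
people = 3 + 2   # three on the corner, two more join them
sheep = 10 - 2   # ten sheep, minus the two the wolf eats
lambs = 3 * 2    # three ewes, two lambs each

print(people, sheep, lambs)  # prints: 5 8 6
```

The point is not the code itself but the precondition it rests on: every operand is a count of things treated as identical.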
Only after you discover the integers do you come to the fractions. I have two bags of apples, but part of one bag is rotten, so now I have, let me count them … a bag and a half. And from there, it’s just a few more steps to the concept of zero, then the negative numbers, exponents, square roots, algebra, trigonometry, analytical geometry, and calculus. Ah, calculus—which is the art of quantifying smooth curves, rates of change such as acceleration and deceleration, and other measures that vary continuously and so do not yield results in easy integers.
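The "bag and a half" example can be made exact with rational arithmetic (the quantities here are assumptions matching the essay's scenario):

```python
from fractions import Fraction

# Fractions presuppose whole units: you can only have "half a bag"
# once whole bags are countable in the first place.
whole_bags = Fraction(2)         # two bags of apples
rotten = Fraction(1, 2)          # half of one bag is rotten
remaining = whole_bags - rotten  # a bag and a half

print(remaining)  # prints: 3/2
```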
But what if our brains, our eyes, and our sense of proportion did not start out by seeing units at all? For in truth, no two things in the universe are exactly alike. One stone is not like another, because the first stone you pick up might weigh 212 grams and be made of quartz, while the next is 310 grams and made of feldspar. One sheep is not like another, because to start with, some are rams, some lambs, some ewes. Or, with a breeder’s eye, one can see that this sheep has the DNA from one genetic line, while the next sheep diverged from that line of inheritance and has brought in other gene sets in every generation.1 True, most of us don’t see such divergence so clearly in our animals, but try telling that to the people who judge dogs at the American Kennel Club!
You might say that we would still count the years, the days, and other such divisions of existence, down to the hours and minutes in a day. But remember, it was not until the invention of clocks that most human beings had any notion of passing daytime or nighttime beyond the “canonical hours” set aside for certain prayers. And in the experience of most early peoples, the division of each year into different seasons appropriate to planting and harvesting, roundup and slaughter, was more important than counting from one year to the next. From a certain point of view, any divisions of time are meaningless except for the all-important “now.”
If we lived in a world—that is, if our brains made such distinctions of precision and granularity—where we saw no two things as being alike enough to count, then human-style mathematics might never have gotten started. Of course, we would eventually have run into indistinguishable units, the groundwork of integers, in studying atomic structure. There, for all we can discover, one proton is exactly like another, one electron like another, one neutron like another, and so on through the catalog of particles in the Standard Model.2 In such a universe, one proton makes a hydrogen atom, two make up helium, three lithium, and so on through the Periodic Table. But more probably, if humans had possessed such granularized, distinction-making vision that we never discovered numbers per se, we would never have developed the theories or the instruments needed to study atoms, subatomic particles, or quantum physics. We would live in a completely different conceptual universe.
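The proton-counting idea above is the one place where nature hands us truly interchangeable units. A minimal sketch (the dictionary and function names are mine, not the essay's):

```python
# In a universe of perfectly identical protons, an element is nothing
# more than a count: the number of protons in the nucleus.
ELEMENT_BY_PROTON_COUNT = {1: "hydrogen", 2: "helium", 3: "lithium"}

def element_for(protons: int) -> str:
    """Map a proton count to its element, as the Periodic Table does."""
    return ELEMENT_BY_PROTON_COUNT.get(protons, "unknown")

print(element_for(2))  # prints: helium
```

Identity of units is what makes the lookup possible at all; a world of 212-gram quartz stones and 310-gram feldspar stones admits no such table.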
A world that was so granularized that it could not identify a likeness of things and group them would be a world without descriptive nouns, either. One could not speak generically of a “tree” but only of a “red poplar” or a “white birch.” And not even that, because those are species—a grouping of similar individuals. Instead, one could only speak of the “old, red-colored, vertical-standing life form which my grandfather planted in the north corner of the yard.”
It goes without saying that in this numberless world we would have no collective nouns, no words with which to identify a group of things, because those things would be so particular and unique that they could not be gathered into groups. It would be a world with barely any words at all. Instead, we would refer to the unique shape, composition, heritage, smell, or taste of every object, like a dolphin or a bat trying to describe its environment only through the echo-return of its own pings and squeaks.
It’s hard to imagine any thinking creature that did not or could not examine individual cases, make associations through likeness of dominant or noticeable features, and group these cases—and so create the occasions for counting and other mathematical manipulations. Certainly, a creature or a person with extremely primitive awareness, an incurious nature, or a fixed attention to a particular detail might make comparisons solely through examination of unique features. This tree is not that tree, because this one has a fuller, more rounded shape, while that one is rather spindly—like a family judging Christmas trees on a lot. Or, this tree is not that tree, because this one has green foliage, while that one is colored yellow and red—like a traveler enjoying the turning leaves on a trip through New England in the autumn. An amoeba, with no thinking apparatus at all, might consider every crumb of food that it encounters—or every change in the temperature, salinity, or acidity in the water surrounding it—to be a unique and nonrecurring event.
You could almost say that the act of noting and comparing differences and likenesses, cataloging traits, generalizing about them, and making associations is the origin not only of numbers but also of words. Numbers and words are the beginning of our thinking processes. They are the basis of perceiving our surroundings and separating ourselves from our environment and experiences.
Without them, a person would be an automaton, reacting to each stimulus, making anew his decision in regard to every event, living in a void absent of time and memory. This is the reality of one-celled organisms, vegetables, filter-feeding mollusks, and insufficiently programmed artificial intelligences. If we had no numbers, no groups, no names for things as classes rather than as unique instances—we would not be human. And if we were to encounter such creatures, say, out among the stars, we would not be able to communicate with them, much less share in their viewpoint and experiences.
We already suspect that life—if it does exist elsewhere in the universe—will come in many strange and wonderful forms. The same must be said for the kinds of intelligence that life spawns.
1. Ah, but if we want things to count, you say, we can count generations! But can we, really, when a mother can have a child every year or so? To which generation, then, does the first child in the family belong, compared to the last who comes a dozen years later? The situation is even more complicated when different fathers may be involved. You can make a distinction that the children are all related genetically in being the immediate offspring of one mother, rather than being among her grandchildren or great-grandchildren. But on an individual level, the concept of “a generation”—the age cohort or time period into which I was born—becomes granularized. I may have been born to the Baby Boom generation, but because my parents delayed the birthing experience longer than most, they were older and more established than the parents of many of my peers, and so they died sooner than did the parents of most of my friends. And if you view an extended family of aunts and uncles, nieces and nephews, and unrelated grandparents and in-laws, generational lines can become quite blurred.
2. And is not the Standard Model becoming so varied and fragmented that we might eventually be left with a universe of strangely unique particles and essences?