Sunday, December 27, 2015

My Problem with Math

All right, I was never very good at math. But I do have a pretty good sense of timing and proportion. When I set my alarm clock at night, I tend to wake up one minute before it goes off in the morning. I also have a good sense of physical spacing and, when I was working with layouts and typography, I could spot a misalignment on the order of one point—about a third of a millimeter.1 In high school, I was good at geometry, which involves a mixture of spatial sense and deductive logic. But algebra and its equations were mostly a puzzle for me, and quadratic equations were my downfall.

Still, I can appreciate mathematics. And I understand that the difference between our lives today and the world and its technologies in ancient and medieval times has a lot to do with applications of trigonometry, analytical geometry, and calculus. But sometimes I think our popular notions of logical proof and certainty go overboard when it comes to mathematics.2

First, mathematics is a human language. But it is unlike other human languages, which can be innately fuzzy. Spoken language is always changing: evolving its pronunciations, as in the Great Vowel Shift in English; contracting long words into shorter ones by dropping vowels, consonants, and whole syllables; trying to group word forms together to capture new ideas—the Germans are good at this—before chopping them down again; and tinkering with a word’s basic meaning and sense.3 By comparison, the unspoken and more academically based language of mathematics is precise and relatively unchanging.

But math is still just a language, useful for describing things and stating ideas. It helps to state an object’s velocity as so many miles per hour or feet per second, rather than just saying “pretty fast” or “very fast.” Mathematical equations are used to describe motions, transformations, and the actions of large groups, and so they are suited to express thoughts in physics and chemistry. More recently, mathematics has come to dominate certain biological studies such as genetics, cytology, and epidemiology. Most recently, mathematics in the form of numerical models—which are simply a series of equations joined together and set in motion over some arbitrary time scale—has come to describe and predict the behavior of complex processes such as macroeconomics and weather.

As a language, mathematics is merely the expression of an underlying human thought. The language of mathematics may accurately describe a situation that would be hard to express accurately and unequivocally in English or any other spoken language. It might be harder still to hold that thought immobile in any spoken language over a span of several generations, given the tendency of languages to evolve. Anyone who has grappled with the word choices and shades of meaning in Shakespeare or the King James Version knows this.

But expressing a thought or a relationship or a transformation in the language of mathematics does not prove the truth of the underlying idea. Languages—including mathematics—can express and describe, they can invoke and use logic. But by themselves description and logic are not a proof of anything.

Second, mathematics and its underlying logic may be misapplied. It is possible to make mathematical statements that may be arithmetically and perhaps even logically true, but that do not describe anything in the real world.

I can write a series of logical syllogisms or propositions about black swans and white swans that may be rational and lucid and have nothing to do with real birds. I can describe my trip to Los Angeles, not in so many miles, but in gas stops and bathroom breaks, or in the number of Starbucks® Frappuccinos® consumed between them. I can then perform complex transformations with these coffee drinks, but the computations still won’t get you anywhere. Lewis Carroll had exquisite fun with these kinds of cogitations in Alice’s Adventures in Wonderland and Through the Looking-Glass.

A great deal of modern physics—from the calculations underlying fusion experiments to the mathematical models underlying much of quantum mechanics, string theory, and cosmology—smacks of this sort of dealing. Schrödinger can describe the state of his cat in a box with a wave function showing the cat to be both alive and dead until the box is opened and the wave resolved. It’s a perfect metaphor—a language equivalent—for the unresolved and unknown state of a particle traveling through space, which can only be resolved and known by physical observation. But for anything so large and complex as a cat in a box with a bottle of cyanide, applying such equations is ludicrous. The cat is either alive or dead—and in the former case, it knows itself to be so. The underlying idea of a cat actually suspended between life and death, although elegantly expressed in mathematics, is nonsense.

Similarly, in string theory, the unification of the four basic forces of physics requires that reality in its smallest measures—too small to be noticed in everyday life or detected directly by observation—be composed of multiple dimensions wrapped around themselves. It’s an elegant proposition and mathematically impeccable, but it’s also unprovable by physical experiment. The underlying idea, expressed in mathematics, leads nowhere.

Third, the application of a mathematical idea may not in itself be complete. A system of equations that tries to express very complex systems, like weather or currency movements or animal migrations, must make choices involving the number of variables to include and how to interpret and weight their actions and effects. The main choice is always between completeness—and so an increasing complexity that may approach an infinity of variables—and manageability and comprehensibility.

Here I am reminded of another type of modeling: the choice that a war game designer must make between playability and producing realistic feeling and results. Simple games with few exceptions and die-roll modifiers are the most playable, but they can become so abstract that one loses the feeling of battle and gets all-too-predictable results. Complex games that include many calculations for actual battle conditions—unit morale, lines of communication, visibility and cover, state of supply, weapons calibers and penetration values, windage, time of day, etc., etc.—are elegant and yet unplayable. A three-minute battle turn can take half an hour or more to play. But, in the other direction, consider chess, with its simple rule set, as a model of diplomacy and warfare: the play becomes so abstract that it no longer feels like either diplomatic maneuvering or war.

Fourth and finally, math itself is still a human creation. Some of its principles—the order of the integers, the proportions of the fractions, and oddities like pi and the golden ratio—would seem to be ordained by a non-human hand and descended from nature. But other principles, used in everyday equations, are the invention of human minds working against human-imposed limitations. The invention of imaginary numbers—based on the square root of a negative number, which cannot exist so long as arithmetic defines the product of two negatives to be positive—is an attempt to climb over one such limitation. And to label a proportion that cannot be expressed as a ratio of two whole numbers, like pi or the square root of two, as “irrational” is just a species of fussy bookkeeping. The human mind wants the world to be flat and all fractions to express themselves simply—but it isn’t, and they don’t.

In the same way, the application of mathematics—especially when it comes to creating equations that map and study problems in economics and meteorology—is still a human activity. In that way, math is an expression of human ideas—not much different from creating a story or myth where all the parts come together neatly to satisfy the human psyche. But how many economic models have accurately predicted the state of the stock market or the national money supply at the end of the next quarter? How many weather models—absent satellite observations that can be measured and tracked on a map—can tell you exactly when the rain will start falling?

While mathematics may follow fixed, discoverable, and rational rules, its applications do not. Human beings can still fool themselves and tell half-truths and whole lies in any language.

“My equations show”—
be they ever so precise—
does not make it so.

1. And I’m a demon—practically OCD—about straightening pictures, rugs, placemats, and other things that can go crooked. I believe this comes of having a landscape architect for a mother and a mechanical engineer for a father. They ran a ruler-straight home with everything spaced just so.

2. This is a theme carried forward from past blogs. See, for example, Fun With Numbers (I) and (II) from 2010, as well as Fun with (Negative) Numbers from November 3, 2013.

3. By “sense” I mean the basic nature of the word. A word can go from ameliorative (tending to praise or find the good in something) to pejorative (tending to fault or demean) over time. An example is our English word “nice,” which originally meant “foolish” or “ignorant” and now means “pleasing” or “charming.” It can also mean “precise” or “perfect,” as in “That was a nice golf shot.” When my family moved to New England, I learned that “wicked,” rather than meaning “evil,” could be used to mean “extremely” or “cool,” as in “That was a wicked fast ball.” And so, of course, a wicked fast ball can also be a nice fast ball.

Sunday, December 20, 2015

Needs vs. Wants II1

I love what I call “cleavage questions”: those that split the issue down the center, like a diamond cutter, and expose its heart. And I think I’ve found one. What does each side in our political spectrum aim to achieve? People on the left side of the aisle aim to provide everyone with what he or she needs. People on the right side aim for everyone to have the opportunity to get what he or she wants.

And that leads immediately to the question of methods.

To provide everyone with what he or she needs, you must first determine what those needs are. Unfortunately, any mass of individuals so large, diverse, and unwieldy as a city, state, or nation cannot express any simple hierarchy of needs that will apply to every member.2 Even if you could query each member individually and create a huge database of individual needs, then fulfilling them in a timely fashion would be next to impossible. We can barely keep up—either administratively or financially—with providing the barest necessities to people in need under existing blanket programs like Social Security, Medicaid, food stamps, or general welfare. To go into those systems and reorganize their allocations based on individual preferences would be a bureaucratic nightmare.

So the definition of need is left to public advocates. These may be the administrators of existing fulfillment programs like Social Security, or the congressional representatives in charge of mandates and funding for these programs, or self-appointed public advocates who attempt to speak for the recipients of the programs. Given that all of these advocates are also human beings—and not artificially intelligent, imagination-neutral robots—they will tend to temper their recommendations with their own ideas of human nature, adequate sustenance, sustainability, reciprocity, and fairness. They are in a position to decide what other people need—and so they will decide. And self-appointed public advocates, whose pronouncements focus solely on the beneficiaries, are even free to ignore constraints like financial resources, program sustainability, and moral or economic parity.

It’s too easy, also, for advocates to attribute to the individual men and women in their charge the kind of brute stupidity and errant foolishness that only shows up in the movements of large, mindless groups. And so the public policy mavens will tend to override a simple statement of wants, such as “I like peanut butter and jelly on white bread,” with their own views about the allergy potential of peanuts, the sugar content of jelly, and the benefits of whole wheat bread. When you speak for others, it is easy to replace living, breathing people like yourself and those you hold near and dear with a kind of fiction: gray, faceless, nominal “people,” not too bright, with poor insight, limited education, bad habits, and no self-respect. This vision comes with many names: the “booboisie,” “hoi polloi,” “Joe Sixpack,” and “the people.” When your business is taking a statistical average among people you don’t know—and without doing any rigorous statistical work to begin with—it’s almost impossible not to aim low.

And so your recommendations as a public advocate tend to be colorless, drab, unexciting, and devoid of spirit. In this I am reminded of why I loathe the work of Consumers Union® and the evaluations and recommendations in their publications. All of their choices are for basic, no-frills, least-cost options that put a premium on safety, reliability, and sustainability and care not a fig for innovation, style, or charm. They would have every car be as exciting and intriguing as a refrigerator—if you didn’t particularly like your refrigerator and didn’t care for optional functions like automatic defrosting and ice making. Consumer Reports is a guide for people who buy things they need but don’t want and will use but not enjoy. These are products for people who’ve had the nerves connected to excitement, adventure, love, and occasionally coping with the quirky defects of innovative products burned right out of them.3

Further, when you have this low opinion of human nature based on an imagined “Joe Sixpack” or “the booboisie,” the temptation exists to try to fit all the square people into your rounded hole. You tend to teach to the lowest common denominator, inform and entertain at the lowest acceptable levels, and expect the people you serve to exhibit passive conformity instead of active individuality.4

On the other side of the aisle, when your aim is to provide people with the opportunities to get what they want, your methods immediately change. You are no longer in the business of polling everyone about their desires, because it is not your responsibility—from the standpoint of a single entity, organization, or government body—to provide for that galaxy of divergent wants.

You also assume that other people are not unlike yourself, in that they may have different interests, requirements of taste and color sense, thresholds of boredom and excitement, appreciative abilities, and imaginations. As you prefer pistachio ice cream over plain vanilla or chocolate, and would rather drive a fast and expensive car over a staid and practical one—all without thinking of yourself as either a weirdo or a jerk—you can imagine that other people will have desires and ambitions that lie outside the mainstream as well as outside your personal preferences. And you can accept their differences of taste and perception without pausing to consider these others as weirdos and jerks.

From this viewpoint, it doesn’t matter if people are innately smart or stupid. They are as they have been brought up to be, and their life situation is not your immediate concern. You are free to view people as varied and diverse: some smart, some stupid, some foolhardy and likely to lead short lives, some prudent and likely to live long, some elegant, and some slovenly. You are also free to pick and choose your friends and associates, and to hope for the best for your children. You will take people as you find them and not as you wish them to be.

This is because you do not adopt the mantle of satisfying either their needs or their wants. When you trust that other people may be about as smart as you are, then you trust that they will know the difference between a basic necessity and a needless frill. You trust that they will see to their own needs and those of their families first, and then spend their resources either on novelties and excitements or on saving for the future. Which they choose is their lookout and not your immediate business.

From this viewpoint, you know that people are also ambitious and creative—or at least a large fraction of them are. With those innate drives, and with a random mixture of skills and talents in the basic population, you can trust that people and organizations will arise to provide for needs and desires. Some will cater to the bland and boring basics, like whole wheat bread and bran muffins, while others will serve niche markets in artisanal jellies and Viennese pastries. Who arises to provide which need or desire is still not your business or responsibility.

Your business is to provide opportunity to access the goods and services available in a free market—not to provide the good or service itself. Your responsibility as an entity, organization, or government is then to guard against things that fall under the rubric of “restraint of trade.” In this role, you will certainly want to guard against monopoly power, because monopolists tend to ignore market signals and promote decreasing quality while increasing price. You guard against confidence tricksters and scam artists who sell promises not backed by actual goods and services. But it is not your business to sample and approve all goods or license all levels of service, because you know that the market will eliminate those who do not provide quality commensurate with price. And believing in the innate sense of your fellow human beings, you trust them to be a little bit suspicious and to subscribe to “buyer beware” as an economic principle.

When your aim is to provide for needs, you tend to believe in conformity and sameness. When your aim is merely to guarantee opportunity, you tend to believe in freedom and possibility. As someone with no faith in Utopia and with a basic belief that the world will spin as it spins in good times and bad, I cheerfully opt for the latter.

1. For parallel thoughts on this same topic, see Needs vs. Wants from April 5, 2015.

2. Some attempt at this was made in the 1940s with the Second Bill of Rights proposed by President Franklin D. Roosevelt. His “economic bill of rights” included employment and the right to work; food, clothing, and leisure; a farmer’s right to a fair income; freedom from unfair competition and monopolies; housing; medical care; social security; and education. Many of these “rights” have become the basis for later federal programs and departments of the executive branch.
       As others have already pointed out, the original Bill of Rights attached to the U.S. Constitution was about people being left alone to live as they wanted—free to worship, free to speak, to defend themselves, to be protected against unreasonable searches and seizures, against being compelled to bear witness against themselves, and so on—which are rights that cost nothing to provide, except in terms of placing limitations on government prerogative. Roosevelt’s Second Bill of Rights, however, is about providing tangible assets, goods, and services—paid employment, food, clothing, a house, doctor services, insurance, teacher services—which all require someone else to provide them at either personal or government expense.
       And still these needs may not satisfy everyone. For example, my definition of employment might include a level of interest, achievement, and compensation—for example, I’d really like to be paid a million dollars to write an epic poem about the Apollo space program—that others are not prepared to provide under any conditions. And don’t even start asking about my needs for food, clothing, and leisure activities!

3. I am reminded here about the comment from Democratic presidential candidate and U.S. Senator from Vermont Bernie Sanders, that we don’t need twenty-three different brands of deodorant. Of course, we don’t “need” them if we all don’t care how we smell or if we all have the same skin conditions. The fact that twenty-three brands are viable in the marketplace should tell you something about human nature.

4. So the Soviet Union thought to engender a new race of Homo sovieticus, a conforming human devoid of initiative, ambition, desire, inquiry, or family feeling. A drab people to drive cars not even as exciting as refrigerators.

Sunday, December 13, 2015

After the Wheel

A Facebook friend recently posted a short video showing a team of horses pulling a stalled tanker truck out of a snowbank. The four horses were harnessed in parallel to two doubletrees, an elaborate amount of tack to have on hand for an emergency procedure. And the animals pulled well together, suggesting they were a team sent to the accident site from a nearby farm. The video was shot in Central Pennsylvania, which is small-farm country, but I would also have guessed somewhere in Eastern Europe—anywhere people might still work with trained horse teams. Certainly, our factory farms out West use tractors, and fairly large, high-tech, GPS-guided ones at that.

But all of this is reflection after the fact. What struck me in the moment of watching those four horses pull the front of the truck back around on the snow- and ice-glazed road was how surefooted they were. The horses didn’t need snow tires or chains, or salted roads, or even a smooth road surface, as our wheeled vehicles require. And that gave me a vision of the future—or one possible future.

The origin of the wheel as a transportation device dates back to Mesopotamia in about 3,500 B.C. This makes it a fairly late invention, compared to the domestication of crops and farm animals, and about contemporaneous with smelting of copper, one of humanity’s first metals. As suggested at the link above, inventing the wheel required a good set of metal tools to work wood, because you’re not going to make useful wheels out of quarried stone you’ve hammered into shape with some harder kind of rock. You need chisels and planes to fit and join sections of plank into a stable, round form, because you can’t just saw off the end of a log and get four wheels of almost identical dimensions, or not dependably. And you need drills to bore a hole for the axle, and a draw knife to shape the end of the axle to fit inside the hole. You also need to invent and affix a locking nut or cross peg to the outer end of the axle, in order to keep the wheel from working off and rolling away. That’s a lot more sophisticated than just shaping some round logs for rolling one of Pharaoh’s stone building blocks across a pathway of smoothed, compacted sand.

But the wheel you make also needs a fairly level, smooth surface to roll on—a flat riverbed or graded road. As any Jeep owner will tell you, driving across open country littered with chuck holes, downed logs, and medium-sized boulders is hell on wheel rims and tires. In desert conditions, you can probably stick to level ground and the occasional dry wadi. In the more hilly and wooded parts of the Mediterranean and Europe, the Romans built elaborate, paved roads so they could move marching men and supply wagons quickly to access conquered countries. In America, we pour millions of cubic yards of concrete and roll out tons of asphalt to maintain our roads, which lace together every household, town, city, and state. We nail or bolt millions of miles of steel ribbons onto wooden or concrete crossties embedded in gravel to guide the wheels of our railroads. We put a lot of work into all this infrastructure to keep those wheels of all dimensions and materials turning.

Interestingly, the indigenous peoples of North and South America—those who walked across the Bering land bridge during the last Ice Age—never invented the wheel and so had no need of roads. The Incas created a fairly sophisticated empire based on footpaths through the mountains. Other cultures used dogs for carrying and hauling their burdens—that is, until the Spanish brought horses into the Western Hemisphere and let enough of them escape to form wild herds that the natives could catch and tame.

The wheel is considered one of the six “simple machines,” along with the lever and the wedge.1 It has also forced our modern technology into a very specific pattern. The earliest source of non-animal, non-muscle mechanical power was the water wheel—which was the thinking man’s adaptation of the wagon wheel. The thinking must have been something like: “If the wheel turns when it travels along the road, what would happen if the road were to run under the wheel? Like that stream there, where the water runs down the valley?” That and the windmill—another adaptation of the wheel—meant that our earliest machines imparted a circular motion to all our tasks. The first grain mills not powered by human muscles rubbing the kernels between one stone and another used water or wind power to drive large stone grinding wheels. When steam was first applied to pumping out mines and then to driving riverboats, the up-and-down, in-and-out action of the cylinder and its piston had to be converted to the more familiar circular motion through a crankshaft and a series of gears.

The first electric motors mounted magnets in a circle around a shaft and spun them inside an armature of other magnets. But, as demonstrated by magneto-hydrodynamic (MHD) generators for making electric current, and linear induction motors (LIM) and solenoids for applying it, spinning shafts are not the only way to use electricity. However, the concept of an “engine” was so firmly fixed on the idea of circular motion and rotating shafts that almost all our inventive effort went into making these things go round. And certainly, a spinning crankshaft or rotor mated well with turning the wheels of our “horseless carriages.”

Perhaps things would have come out differently if those indigenous American cultures had advanced far enough, without early influence from the Old World, to discover and make use of electricity. Then, inspired by human and animal muscles, the leverage in elbows and knees, and their experience of grinding corn and cocoa beans under oblong stones going back and forth on the flat-rock metate, the first electricity use might have been linear instead of circular.

So the vision I had, watching those horses drag the truck out of the snowbank, was that we may one day return to that linear motive force—but not by reverting to the use of animal muscles. With our advances in computer technology and sophisticated control systems, people are now working on robotics and other automata—things that move by themselves. And right now, the science seems to be at a fork in the road.

Many robots roll around on wheels, but they all require smooth surfaces, flat floors, and inclined ramps or elevators instead of stairs to go from one floor to the next. True, there are neat ways to make a trio of wheels rotating around a common axle climb a set of stairs—but that’s a one-time application, designed for a stair step with particular dimensions of elevation and depth. That triple wheel totally breaks down if the gadget encounters a step that’s too tall or has to move across a boulder field. The Mars rovers use cleated wheels, but they must stick to the wind-swept plains and the fortuitous ancient streambed.

Various laboratories are now working on walking robots, which use linear motors based on pistons, pulleys, tension cables, and other straight-line drives to work the joints in hypothetical hips, knees, and ankles. They require a sense of balance—usually through gyroscopes and strain gauges—to remain upright. And they must be coupled with sophisticated control systems, usually combining visual and tactile sensors, to place their feet accurately and in proper order. From what I can see in the video clips that occasionally get out to the public, these robots can walk or run on level surfaces and up modest inclines—about the same achievement as a wheeled vehicle.

One of my articles of faith is that we are not at the end of our discoveries in any field. In many applications, linear motors, computer controls, and sensor systems to emulate animal models are lagging behind rotary motors driving shafts and wheels. But we are still in the early stages of discovery and development in robotics. I can envision a future—not perhaps in the next hundred years, but certainly within the next thousand—where cyber-controlled devices will be so ubiquitous and the use of electro-mechanically adaptive materials that emulate muscles will be so common, that the most useful vehicle for human and cargo transport will not be wheeled. Instead, when we must travel over the ground, instead of flying, we will use multi-legged transports. Being as surefooted as a horse or mule, they won’t need a prior investment in the infrastructure of flattened surfaces, graded roadbeds, or steel rails. Riding on or inside one of these cyber-creatures would mean that your route and destination are not limited to a layout of prepared pathways. You could travel cross-country, or even climb the side of a mountain, if you chose.

Of course, by then we humans will probably have paved the surface of the Earth and even extended our roads a good ways out beyond the low-tide mark. But on the other planets we likely will not have—or take the time to make—such conveniences. A mechanical horse would go a long way on Mars or Titan. And its only disadvantage would be that, unlike a real horse, you can’t eat it if your supply of food packs happens to run out.

1. Classically, these six machines are the lever, wheel and axle, pulley, inclined plane, wedge, and screw. The definition of a simple machine is a device that changes the direction or magnitude of an applied force. But if you want to get mystical about it, the wheel and pulley are just different applications of the same invention, as are the wedge and the inclined plane. So definitions of the earliest inventions will vary.

Sunday, December 6, 2015

The First Great Revolution

One of the great dividing lines in early evolution on this planet—the first great revolution, if you will—was the move from single-celled organisms like bacteria to multi-celled organisms like plants, animals, and the rest of us. That is, in the terms that biologists use, to go from prokaryotes to eukaryotes.1

The main difference in cellular structure between prokaryotes and eukaryotes is the presence of a nucleus. Bacteria have none. Their DNA is loosely scattered throughout the cellular protoplasm. This means their DNA gets randomly and continuously transcribed and translated as part of their basic lifestyle. All of their DNA, all the time, is templating the proteins needed to make up and maintain the bacterium. When times are good and the bacterium can absorb nutrients, it feeds and grows. And when the amount of proteinaceous material made by the translated DNA becomes too great for the cellular membrane to contain, the bacterium copies those loose strands of DNA, divides the membrane down the middle, and becomes two new cells.2

What the multi-celled eukaryotes had to learn through the random mutations of evolution—in order to solve a problem that the prokaryotes never encountered—was how to regulate all that transcribing and translating of internal DNA as their various cells acquired different functions and became different types of cells. The first question was how to keep order among the cells in a multi-celled organism, so that each one doesn’t keep swelling and dividing, swelling and dividing, until the whole organism becomes too big and ungainly to feed itself, manage its environment, and maintain its basic functions—and then swells up, chokes, and dies. The second question was how to use the encyclopedic DNA that comes down from the organism’s parents, as contributed by both egg and sperm and combined in the zygote, to create different types of cells in different types of tissues. Loose strands of DNA getting endlessly translated into all the potentially available proteins just doesn’t work in a multi-cellular context.3

But, almost from the beginning, it appears that geneticists never stopped to think about these problems. Early on, the discipline adopted the “Central Dogma” of genetics: that DNA transcribes to RNA, which codes for proteins. The code written in deoxyribonucleic acid (DNA) inside the cell nucleus gets copied—almost at random, it would seem—into a similar type of coding in ribonucleic acid (RNA), which goes out into the cell body as messenger RNA (mRNA) to be captured and read by the ribosome—a structure made of proteins and strands of RNA itself—which thumbs through the messenger RNA code and assembles proteins out of amino acids, small carbon-based molecules free-floating in the cell. How the transcribing of all this nuclear DNA was controlled, and how segments of it were chosen to produce only those proteins that the cell type needed—those were questions that the Central Dogma either failed to consider or put aside for later.
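The Central Dogma’s pipeline—DNA transcribed into messenger RNA, then read three bases at a time into amino acids—can be mimicked in a few lines of Python. This is only a toy: the codon table here is a deliberately tiny slice of the real sixty-four-entry table, and the sequence is invented.

```python
# A deliberately tiny slice of the real 64-entry codon table.
CODON_TABLE = {
    "AUG": "Met", "AAA": "Lys", "UUU": "Phe",
    "UAA": "STOP", "UGA": "STOP", "UAG": "STOP",
}

def transcribe(dna):
    """DNA to mRNA: the same sequence, with thymine (T) read as uracil (U)."""
    return dna.replace("T", "U")

def translate(mrna):
    """Read the message three bases at a time, as a ribosome does,
    collecting amino acids until a stop codon appears."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        codon = mrna[i:i+3]
        amino = CODON_TABLE.get(codon, "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

mrna = transcribe("ATGAAATTTTAA")
print(translate(mrna))  # → ['Met', 'Lys', 'Phe']
```

The real machinery, of course, works on molecules rather than strings, but the information flow is exactly this simple—and that simplicity is what made the regulation question so easy to set aside.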

Now it’s later, and we have the answer. It’s an answer we probably could not have discovered without sequencing the entire human genome, whose first working draft was completed only in the year 2000. And it would have taken longer if the Human Genome Project had stuck with its initial approach of fishing out genes by searching through the genome for the start codon that is common to every protein-coding gene, represented by the bases for adenine, thymine, and guanine, in that order, ATG, which also codes for the amino acid methionine. The approach was simple: look for an ATG codon anywhere in the genome and sequence from there until you reach one of the “stop codons”—in DNA notation, TAA, TGA, or TAG.
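That gene-fishing procedure is simple enough to sketch in a few lines of Python. This is a toy illustration with an invented sequence—real gene finding must also handle all three reading frames and both strands of the double helix, which I ignore here.

```python
def find_orfs(dna):
    """Scan a DNA string for open reading frames: start at any ATG,
    read codon by codon until a stop codon (TAA, TGA, or TAG)."""
    stops = {"TAA", "TGA", "TAG"}
    orfs = []
    for i in range(len(dna) - 2):
        if dna[i:i+3] != "ATG":
            continue
        # Read forward in steps of three bases until a stop codon appears.
        for j in range(i + 3, len(dna) - 2, 3):
            if dna[j:j+3] in stops:
                orfs.append(dna[i:j+3])  # include the stop codon
                break
    return orfs

# A toy sequence: one open reading frame buried in arbitrary flanking bases.
seq = "CCATGAAATTTGGGTAACC"
print(find_orfs(seq))  # → ['ATGAAATTTGGGTAA']
```

Run this over three billion bases and you can see both the appeal of the method and its limits: it finds candidates one at a time, and it finds only what starts with an ATG it happens to hit.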

That method would have been fruitful, except it would have run out of gene candidates early on. A few years after the Human Genome Project started, however, genetic scientist Craig Venter proposed another approach, which he called “shotgun sequencing.” His idea was to chop the entire human genome up into random strings of a couple hundred bases each, sequence them all, then toss the newly discovered A’s, C’s, G’s, and T’s into a supercomputer and let it sort them out. The computer did this by finding identical sequences, treating them as possible overlaps, linking the overlapping strings together, and putting the whole thing into one long, coherent order. Venter’s rationale was, why fish for the genes individually when you could just drain the lake and pick up all the fish? The head of my former company, Applied Biosystems, maker of the dominant form of gene-sequencing equipment, backed Venter in this approach, creating a company called Celera—from the Latin for “fast”—to make its own run at the human genome. They sped up the process so much that the other labs in the Human Genome Project soon adopted shotgun sequencing and finished the program years ahead of their originally projected end date.
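Venter’s drain-the-lake approach can be caricatured in code, too. Here is a minimal greedy-overlap sketch in Python—invented reads, no sequencing errors, and nothing like the scale or sophistication of the real Celera assembler:

```python
def best_overlap(a, b, min_len=3):
    """Length of the longest suffix of read a matching a prefix of read b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(reads, min_len=3):
    """Repeatedly merge the pair of reads with the largest overlap,
    until no pair overlaps by at least min_len bases."""
    reads = list(reads)
    while len(reads) > 1:
        best = (0, None, None)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    n = best_overlap(a, b, min_len)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        if n == 0:
            break  # no overlaps left; the remaining reads stay separate
        merged = reads[i] + reads[j][n:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)]
        reads.append(merged)
    return reads

# Three overlapping fragments of the sequence "ATGCGTACGTTAG".
fragments = ["ATGCGTAC", "GTACGTT", "CGTTAG"]
print(greedy_assemble(fragments))  # → ['ATGCGTACGTTAG']
```

The real problem is enormously harder—tens of millions of reads, sequencing errors, and long stretches of repeated sequence that make overlaps ambiguous—which is why it took a supercomputer.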

What they discovered then was that, of the three billion base pairs in the genome—all those sequenced A’s, C’s, G’s, and T’s—only about ten percent coded for proteins according to the Central Dogma. And the use that these protein-coding genes made of each sequence was highly complex, involving patches of expressed sequences (called “exons”) and patches of intervening sequences (called “introns”) that allowed each gene to be spliced together and interpreted in alternate ways to make different but related proteins. That was a really clever trick of evolution. But the rest of the genome, the other ninety percent, seemed kind of dumb: just nonsense coding. Scientists started calling it “junk DNA,” and imagined it represented genes that we humans no longer used—from our early heritage first as fish, then amphibians, then reptiles, and finally mammals. And now those discarded gene sequences were slowly mutating into nonsense.
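The exon-and-intron trick can be illustrated with a toy as well: a “gene” whose introns are always discarded and whose internal exons may be kept or skipped, yielding different but related products. Every segment and sequence below is invented, and real splicing (with its donor and acceptor sites) is far more constrained than this.

```python
from itertools import combinations

# A toy gene: alternating exons (kept) and introns (spliced out).
# The labels and sequences here are invented for illustration.
segments = [("exon", "ATG"), ("intron", "GTAAG"),
            ("exon", "GCC"), ("intron", "GTCAG"),
            ("exon", "TAA")]

exons = [seq for kind, seq in segments if kind == "exon"]

def splice_variants(exons):
    """All transcripts that keep the first and last exon but may
    skip any subset of the internal ones (exon skipping)."""
    first, *middle, last = exons
    variants = []
    for r in range(len(middle) + 1):
        for kept in combinations(middle, r):
            variants.append(first + "".join(kept) + last)
    return variants

print(splice_variants(exons))  # → ['ATGTAA', 'ATGGCCTAA']
```

With even a handful of internal exons, the number of possible combinations grows quickly—which is how one gene can yield a family of related proteins.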

One day in 2002 or 2003, however, I was crossing the Applied Biosystems campus with one of our scientists and she told me, “I don’t believe in junk DNA.” Her reasoning was that copying all that excess DNA each time a cell divides consumes a lot of energy. The backbone of the DNA molecule is, after all, a series of phosphate bonds. These bonds are also part of the cell’s energy molecule, adenosine triphosphate. Since phosphorus is a relatively rare element in the human body, it makes no sense that we would store a lot of it in DNA sequences that have no meaning.

Along about this time, also, genetic scientists began isolating and studying short strands of RNA, only about twenty to twenty-five bases long, called “microRNAs.” These were found in the cell nucleus, never seemed to go out into the cell body for protein coding or any other work, and appeared to be associated with “gene silencing” in a process called “RNA interference.” At first, these microRNAs looked like some kind of accident that could occasionally turn off a gene. One of the earliest incidents studied was microRNA interference with a petunia’s ability to produce purple pigment, so that the flowers turned out white.

And finally, about 2004, I heard a presentation at Applied Biosystems by Eric Davidson from the California Institute of Technology.4 He and his colleagues had been studying sea urchin embryos—because, as he explained, they could start 10,000 embryos in the morning and sacrifice them in the afternoon, and nobody was likely to complain. You can’t do that with puppies or human babies.

What Davidson and his group were learning was that shortly after the fertilized egg begins dividing and the cells form a spherical structure called a blastula, the cells start differentiating based on their position within that sphere. Some become a bony spine that goes on to form the urchin’s skeleton. Others become skin or gut or nerve tissue. By sacrificing those 10,000 embryos in tightly spaced time frames—on the order of ten to fifteen minutes—and examining their nuclear DNA, Davidson and his colleagues discovered the mechanism whereby one patch of DNA begins to transcribe a bit of microRNA that stays inside the nucleus, finds a complementary promoter sequence somewhere else in the urchin’s genome, and triggers the transcription of another bit of microRNA, which settles in yet another place—and so on, starting a cascade of differentiation effects. From the first transcription, the course of this cascade will alter depending on what quadrant of the blastula the cell originally occupied and the elapsed time since the union of egg and sperm.
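As a purely illustrative toy—every name below is invented, and the real regulatory network is vastly larger and more tangled—that kind of position-dependent cascade might be sketched like this in Python:

```python
# A toy cascade in the spirit of a gene regulatory network: each
# regulatory element, once transcribed, activates a successor that
# depends on where in the embryo the cell sits. All names invented.
cascade = {
    "mir_start": {"animal_pole": "mir_skel", "vegetal_pole": "mir_gut"},
    "mir_skel":  {"animal_pole": "skeleton_program"},
    "mir_gut":   {"vegetal_pole": "gut_program"},
}

def differentiate(position, regulator="mir_start"):
    """Follow the cascade from the initial regulator until a terminal
    program (one with no successor for this position) is reached."""
    path = [regulator]
    while regulator in cascade and position in cascade[regulator]:
        regulator = cascade[regulator][position]
        path.append(regulator)
    return path

print(differentiate("animal_pole"))   # → ['mir_start', 'mir_skel', 'skeleton_program']
print(differentiate("vegetal_pole"))  # → ['mir_start', 'mir_gut', 'gut_program']
```

The same starting regulator leads to different terminal programs depending only on position—which is the essential point Davidson’s embryos demonstrated, with time since fertilization as a second input the toy leaves out.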

By studying a similar process in other animals that haven’t shared a common ancestor with the sea urchin in hundreds of millions of years, the Davidson group determined that this DNA to microRNA to DNA to microRNA action is highly conserved, which suggests that something very similar happens in other animals and in humans as well. In short, the Caltech lab uncovered a gene regulatory network that controls the development of each cell and its differentiation from other cells in the organism. It is why some cells become liver cells and produce only the proteins needed for liver cell functions, while others become skin, brain, or bone cells and produce only the proteins needed for their own functions. And, in the case of stem cells, the process goes only so far and lets the partially differentiated cell remain in an immature state until needed to repair damaged tissues—at which time the cell completes its differentiation.5

While the ten percent of the genome that codes for proteins is the body’s parts list, that other ninety percent—once thought to be “junk”—is actually the instruction manual for the body’s self-assembly. And that’s an even more clever trick of evolution.

This great revolution came in several parts, all of which more or less depended on each other. First, the DNA in the cell body had to be gathered inside a new structure called the nucleus. That way, the creation and deployment of these regulatory microRNAs could be contained and focused. Second, a whole new regimen of non-protein-coding sequences had to develop to govern gene promotion and regulation. These non-coding sequences had to become important to, and transmitted along with, the DNA load in each cell. And third, a new type of cell, the gamete, which carries only half the organism’s full DNA complement—the normal cell having two homologous copies of each chromosome—had to develop, produced by a new cell-division process called meiosis that halves the chromosome content, and then be released through egg and sperm to form the next generation of offspring.6

That’s a lot of systems and structures to ask a one-celled creature to develop in short order. I would almost think the origination of all this complexity would need a divine hand or intelligent guidance—except that I can appreciate how evolution works. Mutations are always happening. Some are bad and kill their subject either immediately or soon after inception or maturation. And those mutations that aren’t lethal might actually be beneficial in forming new proteins that can take advantage of a changing environment or other molecular opportunities. But, more likely, any one mutation just hangs around in the organism’s genome until it’s either eliminated by accident or becomes combined with some other mutation, either in the same protein string or in another protein. Eventually, the modified protein may find a purpose. And if that happens often enough, the whole organism changes slowly over generations.

They say “rust never sleeps,” and neither does evolution. Its actions and effects are constantly shuffling genes and proteins in millions of individuals, in every species, and from one generation to the next. And remember that one-celled organisms go through hundreds of generations in the time that human babies are just forming in the womb, getting born, and starting to grow. The process pushes forward in all tissues, in all sorts of configurations, during every moment of time over a scale that goes back three and a half billion years, almost to the beginnings of this planet’s formation in a bombardment of hot rocks.

Seen that way, without regard for individual fortunes or for the thousands, nay, millions of embryos that Mother Nature herself hatches and sacrifices in every generation, you can imagine some pretty remarkable things occurring. Like a bacterium with its loosely organized DNA strands growing up to become all sorts of highly organized and differentiated creatures—including us humans with the organized brains to discover and start to think about this process.

1. Of course, there are single-celled eukaryotes, too. They are generally in the kingdom Protista and include many forms of algae and single-celled protozoa like the paramecium.

2. And when times aren’t good, when the environment becomes too cold or dry, or the food source goes away, then the bacterium stops processing its DNA, loses its internal water content, shrinks down to a hard little spore, and prepares for a long winter.

3. What follows is a story I’ve told before—see, for example, Gene Deserts and the Future of Medicine from December 5, 2010; The Flowering of Life from August 25, 2013; The Chemistry of Control from May 11, 2014; and Continuing Mysteries of the Genome from October 12, 2014. But the story bears repeating in this context.

4. Eric Davidson died this past September from a heart attack. We lost a great mind there.

5. And you’d better believe that genetics labs all over the world are searching for and studying those regulatory networks, trying to figure out how they can induce stem cells to grow outside the body and become complete new organs that we can use for implantation.

6. Of course, some eukaryotes like the paramecium reproduce asexually, without sharing genetic material with another organism. Most cases of this kind are found in one-celled protists, although in higher animals the process is called “parthenogenesis,” or virgin birth.