Sunday, July 15, 2018

Shortcuts to Reality

Robot juggling

Sometimes we don’t see life when it’s right in front of our noses. That’s part of the way our minds work. And combating this loss of perception is one of the goals of Zen mindfulness: to enable us to confront reality as we experience it, not brush past it with mind tricks and traps.

One of the mind traps is the human tendency to develop daily routines. Routines like shaving, brushing teeth, washing dishes, and so on—necessary business that we all just have to get done—help us streamline our lives. The eyes move, the hands move, and the work proceeds without our having to think about and plan for each separate action. It’s an efficient way to move through the day, but in the exercise of these routines, we become more like “meat robots” than perceiving human beings.

Sometimes, when I’m brushing my teeth or doing another daily routine, I actually lose track of time. I use an electric toothbrush, which fortunately has a thirty-second cycle and beeps at me. This reminds me to move from one side of my mouth to the other, then from the lower jaw to the upper: the same pattern, timed to the beep, morning, noon, and night. If the thing didn’t make that noise, then I wouldn’t know how long I might brush the same set of teeth, mechanically, blindly, without thinking about it, or perhaps thinking about something else entirely. I might also forget and leave one part of my mouth not brushed at all.

I can lose track of time while driving, too. The motions are automatic: watch the road and center the car in the lane; locate other cars in the pattern all around me; scan the mirrors left, right, and center; watch the road; locate cars, for mile after mile. The routine of driving on the highway, without the distractions of having to look for a street sign or watch for an upcoming exit, can bring on “highway hypnosis,” where the mind is lost to reality. Sometimes I can become so fixed in the mechanics that I become separated from the very things I’m supposed to be watching for: the car next to me that is actually moving into my lane; the light up ahead that hasn’t seemed to change for a few moments, was green the last time I looked, and—holy cow!—it now is red, not green!

Even routines that are supposed to be Zen-like and to free the mind, like doing karate exercises, can become perceptual traps. I’ve been doing the same Isshinryu katas for almost fifty years now. What I’m doing at this late age is not so much learning the moves and committing them to somatic memory as keeping my joints limber, my balance stable, and my muscles supple and strong. If I ever need to actually fight someone, I’m pretty sure I will execute the punch or kick correctly per the forms. In the meantime, I proceed through the motions, the same motions, the patterns I learned back in college, whole regimented sets of them in the same order, during workout sessions two or three times a week.

Lately, I have noticed that I will start a kata and then begin thinking about something else: a plot point in the book I’m working on, how I’m going to react in an interpersonal situation, or some decision I have to make. My body will still be moving, but I won’t be aware of it. And then ten or twenty seconds later I will “wake up,” having mentally come to a decision on the issue occupying my mind, and realize that I’m ten or fifteen moves further into the exercise—or approaching the end—with no awareness of whether I have performed the intervening moves correctly, made the right number of repeats and variations, or anything that’s been going on in the room for those passing seconds. The routine that is supposed to heighten awareness of reality has actually dulled it through repetition.

Another mind trap is the labels we use in our daily lives in place of active and mindful attention to what we see, hear, think, and feel. The human mind cannot actually survive without using labels in place of their more complicated referents, at least in some cases. But depending on them too heavily can insulate us within our own minds and separate us from life.

The sciences have a rich history of assigning labels to new phenomena and processes—so much so that some people think the study of biology, chemistry, and physics is nothing more than an exercise in label manipulation. Because I try to keep up with the fields that interest me, I subscribe to Science and Nature. But I freely admit that some of the article titles—and even the abstracts, which are supposed to offer a higher-level view and be more reader-friendly—baffle me. “Multivalent counterions diminish the lubricity of polyelectrolyte brushes.” “Second Chern number of quantum-simulated non-Abelian Yang monopole.” “Enantioselective remote meta-C-H arylation and alkylation via a chiral transient mediator.” I am not making these up: they are three article titles from recent issues. Even if I recognize some of the words, I can guess that they are not being used in the way that, say, an English major would understand them. Sometimes I can only guess the field of science they are discussing. But what is life without mysteries?

Actually, the process of learning anything is a matter of, first, understanding the underlying nature of a principle, object, event, or process—the referent—and second, assigning proper terms and labels to those concrete understandings so that we can communicate about them. Otherwise, we end up talking about “the thing that does the thing to the thing”—or words to that effect. First you understand the ideas of dichotomy and duality, and then you assign the labels “two” and “twain” to the things they represent.

But the more you bandy these labels about, the more risk you run of losing sight of the wonder you felt when you first understood the thing itself. The shortcut does not lead you toward reality but away from it.1 Sometimes you think you know the thing when you only know the label. The name is not the reality, in the same way that following a daily routine is not really living.

One of the differences between human beings and the artificial intelligences, robots, and automated systems that we are starting to build today—and which will become ever more important in years to come—is this access to reality. Humans can take in a wide range of sense impressions and put them together in novel ways. Having that “Aha!” moment of clarity, the epiphany, the sudden understanding, is a uniquely human thing. Robots and software systems don’t perceive reality except as it affects or interferes with their programming. They are focused on the parameters and processes for which they were designed. That design may encompass a wide field of view and a breathtaking array of sensory inputs and programmed contingencies. But it is still a focus, a built-in routine, and a label for which there may not be an understood referent. The robot does what it was designed to do. The automated system processes the parameters that are given to it, or for which it has cameras, microphones, haptics, and strain gauges designed to receive certain signals.

A robot brain is not designed to hear a rustle in the grass and suspect it may be a tiger about to pounce. A mechanical brain is not designed to read meaning into patterns, like the sodden tea leaves in a cup or the glints of candlelight in a crystal. A robot is not susceptible to the wonder and mystery of the life around it. But we are.

1. And sometimes that is intentional. There are scientists in any field who speak in code words simply for the delight of sounding more sophisticated and knowledgeable about the subject than those who speak clearly. On the other hand, there are subjects that can’t be approached without a knowledge of the nomenclature. You can imagine trying to discuss quantum mechanics and the discovery of the Higgs boson if you don’t have a reference for the nature of subatomic particles, concepts about mass, and the theories of this Higgs fellow.

Sunday, July 8, 2018

Still Questioning Gravity

General relativity

With the usual caveats,1 and at risk of showing my great ignorance, I still don’t understand how gravity works. I’ve written about this before,2 and I read every popular explanation I can find, because the math-dense version is generally beyond me in all sorts of dimensions. And yet … some things about general relativity and gravity just don’t make sense to me.

Aristotle, the polymath and chief explainer of things scientific in the Greek golden age, thought gravity was simply the way that things find their own natural place. It’s linked to the concept expressed when we say that water seeks its lowest level. That is, gravity and its action on objects like draining pipes, falling stones, and stumbling people is merely a characteristic of the object itself. Water flows downhill, rocks tumble off cliffs, and you fall on your face if you trip, because that’s where the water, the rock, and you actually want to belong. That is, it’s the water’s and the rock’s fault—and yours.

Isaac Newton, who was the premier polymath of the 17th century, thought gravity was a force. Something about massive objects like the Sun and the Earth exerts a force to attract all lesser bodies, such as falling apples and those same stumbling humans. That is, gravity is a characteristic of the ground, not of the falling object itself. That is, it’s the Earth’s fault.

Newton’s concept of gravity worked well for a couple of hundred years and satisfied most of the observations of astronomy, governing the motions of stars and planets. The one problem was that Newton’s force of gravity was thought to be instantaneous: it was action at a distance not governed by time. So, if the Sun were to explode—no, that would still leave the equivalent mass in rapidly dispersing hydrogen, helium, and fusion products at the center of the solar system—rather, if the Sun could be magically “disappeared” from its central position, then the Earth and the other planets would immediately head off in straight lines tangent to their normal orbits, which gravity had previously shaped. In reality, at the speed of light—the limit governing all actions in the universe—the effects of any such instantaneous removal would take the same eight minutes to be felt at Earth’s orbit that it normally takes light from the Sun to reach us.
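
To put a number on those eight minutes: taking the mean Earth–Sun distance as about 150 million kilometers and the speed of light as 300,000 kilometers per second,

t = d ÷ c ≈ 150,000,000 km ÷ 300,000 km/s ≈ 500 seconds ≈ 8.3 minutes

which is the familiar figure for how long sunlight, or any change in the Sun’s gravity, takes to reach us.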

Albert Einstein, the polymath of the 20th century, rejected the idea of a “force” and, through his theory of general relativity, defined the effects of gravity as being a curvature in space and time. That is, massive objects bend space and slow down time. And the more massive the object, the more the surrounding space and time—which Einstein conceived as simply different dimensions of the same reality and called altogether “spacetime”—are curved. That is, it’s the fault of the geometry of space and time themselves.

In this conception, the idea of force and how quickly it might act or react is irrelevant: the curvature exists so long as the mass is present. And, of course, while the star might explode and scatter its mass, nothing known to physics is going to remove that mass, magically or otherwise, at any speed greater than, or in any timeframe shorter than, the speed of light, c.

As every science popularizer is quick to point out, Einstein’s concept of general relativity didn’t prove Newton “wrong.” Einstein’s concept of spacetime curvature and the mathematics to support it were just a more refined approach to the problem than a generalized force representing gravity. While Newton’s math worked for most problems in planetary astronomy—being useful, for instance, in calculating a near-Earth orbit or plotting a trip to the Moon—Einstein’s equations gave better answers to more decimal places. Einstein’s math, for example, accounted for the precession of Mercury’s orbit around the Sun to within the roughly 43 seconds of arc per century that Newton’s equations could not explain.

Still, and mathematics aside, Einstein and Newton offer very different and irreconcilable conceptions: Newton postulates a force whereby one body acts upon another, like a pitcher hurling a baseball;3 while Einstein postulates the effect that a massive body has on its surroundings, and that effect is present regardless of whether any second body is around to experience it.

For ease of visualization by the layperson, illustrators show the curvature effects of gravity under general relativity with something like a bowling ball sitting on a trampoline and creating a curved depression in the surface—like the illustration here. The trampoline is supposed to represent the “fabric of space.” Of course, the curve is not in just the two dimensions shown for this flat surface but in all three dimensions of space plus a commensurate slowing of time.

I have always had a problem with this usage, even as an analogy, of the word “fabric” to refer to space and time. Space, in all other contexts, is generally accepted as simply being empty. If it has a structure, an internal component that can be bent or warped, then space is not just a form of emptiness but instead is something all its own and separate from the protons, neutrons, electrons, photons, and other particles that exist within it and pass through it. Similarly, if time can be made to slow down, that implies some structure or medium that a nearby mass somehow manipulates. Time is not just the measured passage of events but a thing all its own, separate from those events.

In quantum mechanics, particles have associated fields, and these fields guide the motions of the particles within them. The photon, for example, is not just a tiny, massless “thing”; it also establishes the electromagnetic field that guides the movement of light and the properties of magnetism. Supposedly, in quantum mechanics, there must exist a particle called a “graviton” with an associated field governing what we experience as gravity. But such a particle has never been discovered.4 If it were, we could hope to make our cars fly by blocking the exchange of gravitons with the Earth beneath them. So far, though, no one has been able to reconcile the concepts and mathematics of quantum mechanics with general relativity. Big is big, small is small, and they don’t seem to play by the same rule sets as currently conceived by the best human minds.

The confusion I have with general relativity and the curvature of spacetime is this: I can understand how curved space and time might alter the direction of a body that is already in motion, such as planet Earth wanting to move in a straight line (like all good inertial objects) but being forced into an elliptical orbit by the mass of the Sun. But what about a body that is not moving with respect to that center of mass? Just sitting or standing on the surface of the Earth, I am—according to general relativity—accelerating toward the center of the planet. But I am not moving with respect to that center. I never get any closer to the center, although I am accelerating toward it at a rate of 9.8 meters—more than 32 feet—per second per second. That would be a pretty good clip if I were moving across the surface of the planet and going faster and faster with every second.
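
For readers who think in feet rather than meters, that figure is just a unit conversion:

9.8 m/s² × 3.28 ft per m ≈ 32.2 ft/s²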

Sure, the analogy with a trampoline shows a depression that I might be sliding into, like a kid on a sled sliding “down” a hill. But if I am at rest with respect to the center of the planet or another nearby mass, why would I be moving toward it at all? Even if that surrounding space is curved, what … forces me to move down the curve?5

I’ve read explanations that all of this has to do with different and higher orders of geometry. Also, that objects existing in a faster timeframe, such as in the less-curved spacetime further away from a planet or star, will seek to move toward the slower timeframe created by the mass of a large nearby body. Perhaps it all works out with elegant mathematics. But that still leaves the common-sense question of why an object would prefer, and naturally move toward, a slower timeframe. Isn’t that just a version of Aristotle’s definition of gravity: that things just try to find their natural place?

I don’t mind if there’s math involved. That doesn’t insult or frighten me. But I do mind if the concept is solely based on mathematical equations. If the underpinnings of the universe cannot be explained except through a set of equations, then we run the risk of the ever-inventive and fertile human mind creating an equation that describes a situation without actually explaining it.6

You can write any number of equations, and they may make mathematically perfect sense. I can measure the distance across the continental United States in terms of gallons of milkshake consumed at Dairy Queens along the way. I can relate this function to a traveling body’s metabolism and the ambient temperature, and then link that intake to toilet flushes in restrooms further down the road. I can create elaborate mathematical structures related to distance and dairy products. But they won’t explain anything.

I still don’t understand gravity. And given that we have to fudge around with concepts like “dark matter” to reconcile current conceptions of gravity with the observed motion of stars in galaxies, and with “dark energy” to relate the motions of those galaxies with the size and scale of the universe itself … I don’t think anyone else does, either.

1. I was an English major in college with a minor in karate. The highest level of math I took in high school was Algebra II and Geometry, and I satisfied my college math requirement, as did so many other liberal arts students, with Philosophy I (aka Logic). But since then I’ve been reading continuously in the sciences, particularly physics and astronomy, to support my science fiction writing. My professional life over the years has been to explain the work of engineers and scientists for the lay reader. So, while I am math-challenged, I am neither ignorant of nor uninterested in the subject.

2. See Three Things We Don’t Know about Physics (I) from December 30, 2012, and (II) from January 6, 2013.

3. A force is represented by the most basic equation in physics, f=ma, or “force equals mass times acceleration.” The pitcher’s arm muscles accelerate the 142-gram mass of a baseball from, say, zero miles per hour in his set position to, say, 90 miles per hour—or 132 feet per second—for a fastball at the full extension of his arm at release, which occurs about half a second later. That’s an acceleration—not an exit velocity, but the acceleration needed to achieve it—of 264 feet per second per second. Mass times acceleration. Einstein used a variation of this physics equation to come up with his famous statement about the energy content of matter itself, e=mc².
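
To check that arithmetic, assuming (unrealistically) a constant acceleration over the half second:

a = Δv ÷ Δt = 132 ft/s ÷ 0.5 s = 264 ft/s², or about 80.5 m/s²
f = ma = 0.142 kg × 80.5 m/s² ≈ 11.4 newtons, or roughly 2.6 pounds of force at the pitcher’s fingertips

A real pitching delivery is nothing like constant acceleration, of course, so treat this as an order-of-magnitude sketch.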

4. Recently, there was much ado when the Large Hadron Collider at CERN identified the previously theoretical Higgs boson. This heavy particle, which is not normally found in nature, is supposed to give matter its mass. This is a different particle from, but might be a kind of precursor to, a graviton. We still have much to learn.

5. I used this question to create a fantasy mechanism for time travel in The Children of Possibility.

6. I can define gravity as the hand of an ever-watchful and invisible little god, call him “Mr. G.” He watches me and every other thing in the universe. If I am sitting, he presses gently on my lap so that I don’t float away from the seat of the chair. If I am walking and careful about my steps, he has a hand on my shoulder to keep me in contact with the ground. But if I stumble, he flicks the back of my head with his finger, pushes me over, and presses me down. If I jump, he lets me rise only so far, consistent with my muscle tone, and then pushes me back down to the floor. And if I take a capsule into orbit, he watches my direction and speed, and at the appropriate time he stops pushing down on me so that I can float freely around the cabin. There! I have a working concept of gravity that fits all observations. I could even write out Mr. G’s influence in the form of a set of equations. But is this what’s actually going on in the universe?
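
Just to show how easily it could be done (this is my own illustrative sketch, not anything from a physics text), Mr. G’s downward push on a body of mass m at distance r from the center of a planet of mass M could be written:

F(Mr. G) = G × M × m ÷ r², directed toward the planet’s center

The reader will notice that this is exactly Newton’s law of universal gravitation with a new name painted on it. The equation fits the observations either way; it just doesn’t tell us who, or what, is doing the pushing.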

Sunday, July 1, 2018

Distrust of Government

Minute Man

I’ve written before1 about how for the past four centuries America, and the New World in general, simply by existing became an escape valve for Europe’s population of disenchanted individualists. And now, by extension, we have become the magnet for people from all over the world who want more freedom, greater opportunity, and a better life. This drive for freedom and what my mother used to call “inde-goddamn-pendence” is not just a casual or passing attitude; it’s written into our genes from ancestors who voted with their feet long ago—or maybe in just the previous generation.

Our founding fathers, the authors of the American Revolution, also known as the American War for Independence, had a profound distrust of government. It wasn’t just distrust of a distant and unresponsive king and parliament, “taxation without representation,” and the economic strictures and political disadvantages imposed on the thirteen colonies because they were, after all, possessions and not the same as English counties and boroughs with direct representation in Parliament and ancient rights under English law. It wasn’t a bad experience or two with the occupying force of British redcoats, having to quarter them in civilian homes, or enduring the Boston Massacre, and later having to fight a war in which the might of the British nation—or as much as it could spare at the time—came down on ragtag bands of freedom fighters and a woefully underfunded and ill-equipped Continental Army.

The distrust was in large part the heritage of dissenters, deportees, transportees, indentured servants—and later freed slaves—who had seen the iron rule of law at work in the hands of men grown too well-stuffed and powerful to care about their neighbor’s plight. Of people who wanted a place less crowded, less restricted, less governed, in which they could live where and how they wanted. A large measure of this dissatisfaction was also religious—carried by people with different ideas who were escaping an established Church of England that poorly tolerated unconventional practices and viewpoints—and gave rise to local congregations and enclaves of Puritans, Calvinists, Quakers, the Anabaptist Amish, and later the Mormons or Latter-day Saints. But the distrust went beyond religion to any established institution that would impose that iron rule with no easy or direct line of escape for the freethinker.

Distrust of government as an institution is written into the U.S. Constitution. The basic structure is arranged to provide those famous “checks and balances.” The Congress, however structured and elected, can only write the laws. The President, however supported by cabinet and other administrative positions, can only enforce the laws as written. And the Supreme Court, whose members are nominated by the President but must be confirmed by the Senate, can only rule on the soundness of the law in practice, once someone has brought a case contesting its actual application. No one branch of government is meant to be all-powerful or able to take action except in the context of the other two.

Today, as in the past, various Presidents have sought to bypass Congress through “executive orders.” While the Constitution makes no specific reference to executive orders, they are usually justified as part of the broad powers that the Constitution gives the President as chief executive and Commander in Chief. Still, they are not meant to supersede the power of Congress to make law.

Similarly, the Constitution has no provision for the vast federal bureaucracy that has grown up around the President’s cabinet posts and its various departments like Agriculture, Commerce, Education, Energy, Health and Human Services, Housing and Urban Development, Interior, Labor, Transportation, and so on. Defense and Homeland Security would appear to be the only posts necessary to the President’s role as head of the armed forces. State and the Treasury would also appear necessary to the chief executive’s function as representative to other nations of the world. But the rest of the cabinet has grown up over the years—mostly during the 20th century—to become interpreters and implementers of the laws passed by Congress.

These days we have the spectacle of laws passed with ever more pages of detail, requiring ever more interpretation by the executive branch. Simple laws that can fit on a page or two and be easily read and understood by the average citizen are a thing of the past. Our country’s body of administrative law, the Code of Federal Regulations, grows by way of the Federal Register, which now runs about 80,000 pages a year. It’s a commonplace thought that everyone, without doing anything out of the ordinary or intentionally criminal, is guilty of something under current federal law. All the more to put the average citizen in his or her place.

I believe the founding fathers would regret this state of affairs.

In part their distrust of government was based on the founders’ own experience with what they called “factions,” which today we would call “parties” or “partisanship.” Each branch of government is set as a check and balance on the other two not only as a matter of design but also as a prevention against one group gaining control of the levers of power and using them without fear of obstruction, impediment, or retaliation. The members of Congress are—or were—supposed to be impermanent, serving for terms of two or six years, and capable of being voted out if they failed to do the job the public wanted. The President nominates members of the cabinet, but they must be confirmed by the Senate, as are the heads of major bureaucracies like the Central Intelligence Agency. The U.S. Civil Service, representing non-appointed, non-military civilian government employees, was only established by law in 1871. But these positions have traditionally been and are supposed to be filled by competitive hiring based on personal merit—and not, as in the conspicuous case of corruption in New York’s Tammany Hall, as a reward for partisan support.

The founders respected majority opinion, but they also looked out for the rights of the minority. People and political positions that lose a legislative battle by a vote of 49% to 51% are not to be automatically ground under, hunted down, or led to the guillotine.2 And important votes, such as overriding a President’s veto, have to be settled by more than a simple majority. The Constitution also allows each body to set its own rules for operation, and the Senate early on—that is, from about the 1850s—allowed minority objectors to a piece of legislation to filibuster it, or hold the floor and delay the vote for as long as their legs and their breath held out.

And finally, the Constitution’s own Article VII allows for its ratification by the states. That is, the new government under the Constitution could not simply impose itself on states that did not want to be ruled by this document. They had to choose to abide by its conditions.

Distrust of government is thickly strewn through the Bill of Rights, too. These first ten amendments to the Constitution were proposed after the battles for ratification in the late 1780s and specified federal guarantees to individual citizens. The people could speak their minds and worship how they pleased; defend themselves against tyranny; refuse to house soldiers except as prescribed by law; be secure in their persons, houses, papers, and effects from unreasonable searches and seizures; be free from double jeopardy and self-incrimination; enjoy the right to a speedy public trial before an impartial jury and to confront their accuser; be free of excessive bail and fines, and from cruel and unusual punishments; and enjoy all the rights and powers not enumerated in the Constitution.

The Bill of Rights staked out the ground where the new government could operate—quite narrowly, in fact, when compared with the old laws of Europe. These rights were designed to say that people, on their own as individuals and without the consent of a king or parliament, or even of their own elected government, had worth and stature. It was really meant to be a government of, by, and for the people, and not government for its own sake or as a convenience to those who held temporary power.

In short, the founders considered a national government, state government, or any formal control over the freedom of the individual as a necessary evil—not as a good thing in and of itself.

There are people and parties in this country today who would like to bring back the old European ideals: that the government grants rights and sets limits for the individual; that the products of an orderly society should be uniformly shared, even if that means giving up individual freedoms; that the average person is too willful, reckless, or stupid to make reasonable, intelligent decisions for him- or herself; and that to protect the rest of society, the “best and brightest” must step forward to direct the common citizen.3 These people want a more orderly, controlled—and controlling—state to define the limits of human existence.

And there are people and parties in this country today who say to that: “Been there. Done that. No thanks.”

1. See We Get the Smart Ones from November 28, 2010.

2. Thomas Jefferson, in his 1801 inaugural address, interpreted the Constitution thus: “All … will bear in mind this sacred principle, that though the will of the majority is in all cases to prevail, that will to be rightful must be reasonable; that the minority possess their equal rights, which equal law must protect and to violate would be oppression.”

3. This was the essence of Plato’s “philosopher kings” in The Republic. But remember that Plato and his crowd were admirers of the rigid Spartan regime, which was a closely held oligarchy and not an open society of equal individuals. His ideas were notable in Athens not because they were revered but because they were antithetical to Athenian democracy. Or else why was Plato’s annoying mentor and protagonist Socrates forced to drink poison?

Sunday, June 24, 2018

Contamination Everywhere

Petri dish

I have had a fascinating career as a technical writer and internal communicator. One of my most interesting jobs was editing procedures and batch records1 at a pharmaceutical company that used recombinant DNA to make its most advanced products. Later I wrote operating procedures at a company that made genetic analysis equipment and reagents and was hoping to bring its documentation up to the standards required by U.S. Food and Drug Administration regulations. This was the only way they could transition their products from research use only, good for serving academic and industrial laboratories, to diagnostic use with actual patients in clinical settings.

In working under FDA regulations, one of the things you learn is to believe in things you cannot see. The filters in the ductwork that conditions the atmosphere of a clean room are either high-efficiency particulate air (HEPA) or ultra-low particulate (or penetration) air (ULPA) filters (see the specifications at IEEE Engineering 360). The first removes 99.97% of particles 0.3 μm (micrometers, or millionths of a meter) in diameter from the airstream; the second removes 99.999% of particles 0.12 μm in diameter—about a third the size of the particles the HEPA filter is rated to catch. This is dust you cannot see. These filters will catch pollen, water vapor, bacteria and their spores, most kinds of smoke, and sometimes even an odor in the air—everything but small virus particles, which generally fall below 0.1 μm.2
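
Put in terms of what gets through, and assuming the filters perform exactly at their rated efficiencies, those percentages work out to:

HEPA: 1 − 0.9997 = 0.0003, or about 300 of every million 0.3-μm particles passing through
ULPA: 1 − 0.99999 = 0.00001, or about 10 of every million 0.12-μm particles

That thirtyfold difference in penetration, on even smaller particles, is what you are paying for with the tighter filter.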

For those virus particles—and for anything that happens to drift in through the airlock (although clean room suites are kept under positive pressure, so that any dirt inside will move outward, toward the unqualified parts of the building) or that rides in on the operator’s clothing after meticulous gowning—the regulations require rigorous cleaning. Since most forms of contamination in the pharmaceutical world involve active microorganisms, like those viruses, the specific cleaning agent is sodium hypochlorite (NaOCl), familiar to every householder as chlorine bleach. It not only removes stains, but its chemical action specifically destroys the long-chain polymers of DNA and RNA, effectively killing bacteria and neutralizing viruses.3 Sodium hypochlorite is highly alkaline—the chemical opposite of acidic—and so it also chemically attacks most surfaces, such as plastics and metals.4

Alternatively, surfaces and instruments might be washed with 70% ethyl alcohol, which kills bacteria by evaporating so quickly that it dries out the cell membrane before the microorganism can sporulate to protect itself. But not too quickly. Some people thought that if 70% alcohol was effective, 90% would be even better. But the more concentrated alcohol evaporates even before it can do its job, leaving the bacteria unaffected. Yeah, and perhaps even mildly drunk.

When working with these disinfectants, the clean room operator uses a wiping material—something like a paper towel, but denser and less prone to linting—under a procedure called “work and turn.” The operator saturates the towel, folds it a certain way, and makes one stroke across the surface to be cleaned. He or she then refolds the towel to expose a new, untouched side, and makes another stroke. The process continues until no unexposed parts of the towel remain, and then the operator discards it and starts with another. The motions for cleaning the surface are prescribed, too. The operator doesn’t just rub the saturated towel around in a circle, like a bartender wiping down the bar. Instead, he or she makes defined, overlapping strokes and never backtracks to cover an already cleaned part of the surface with a section of towel that has already been used. Cleaning a work surface requires diligence and concentration.

If you think this attention to detail is a trifle excessive, neurotic, or obsessive-compulsive, know that the pharmaceuticals this site was making were parenterals—that is, drugs that would eventually be injected into patients’ veins or muscles. Everyone on site repeated the mantra, almost daily, “We make drugs that go into people’s arms, so we have to be clean.”

In the FDA-regulated world, the word “contamination” doesn’t just apply to particles, bacteria, and viruses—dirt you cannot see—but also to the condition of the product’s being exposed, even potentially exposed, to dirt or some other kind of danger. So a batch of product, or an intermediate step in its production, that has inadequate documentation or has acquired some other defect at some point in the operation is labeled “contaminated.” If you don’t know and can’t prove whether the product is pure or not, then it’s not, and it must be discarded or “dispositioned.”5

What conclusions do I draw from all this experience with the finicky, precise, and sometimes wacky obsession with contamination in the pharmaceutical world?

First, our drugs are well made. This care in manufacturing—along with review and oversight of the initial development process, and double-blind testing for safety and efficacy before releasing a new drug to the public—means there’s not a medication made in this country that I would not willingly take on my doctor’s advice. Similar enforcement regimes are practiced in the rest of the developed world. I don’t know that I can say the same for medications made in less cautious countries.

Second, and despite the theme and message hammered home to audiences through three generations of science fiction and horror movies, products made with recombinant DNA and other advanced biological techniques are not going to get out into the environment, mutate beyond all imagining, and take over the world.

Recombinant DNA is simply the technique of taking a gene that exists in nature, perhaps even in the human genome; isolating it from its chromosome and the embedded system of promoter regions that allow it to function inside the cell’s nucleus; looping it into a plasmid, or circle of double-stranded DNA; and inserting it into the cell body—not the nucleus—of a compatible host cell. There, the host cell’s mechanisms for transcribing DNA into RNA and then translating RNA into proteins work on the foreign plasmid just as if it were part of the cell’s normal genome. Host cells can be yeasts, bacteria like E. coli, or certain mammalian cells that have long lives and can replicate freely.

The recombinant cells are put into a closed vessel called a “fermenter” or bioreactor, fed a growth medium plus oxygen and other supplements, and allowed to grow. If the protein produced from the plasmid is supposed to be secreted from the cell—such as the human clotting factor produced at our site—then it enters the liquid in the reactor and can be periodically siphoned off and purified as a biological agent. If the protein is normally held within the cell, then the reaction campaign is stopped after a specified period, the cells are extracted and split open, or “lysed,” and the protein is purified from the organic debris.

Fermenter campaigns are a delicate thing. Get the mixture wrong in the growth medium, add too much oxygen, fail to draw off enough of the resulting carbon dioxide, let the temperature vary by a couple of degrees—any number of maladjustments can cause the cells in the bioreactor to die. Oh, and allowing outside bacteria or another cell type to invade the mixture will contaminate the process, too. Other cells are not only a danger to the identity, safety, quality, and purity of the final product, but they also compete with the host cells for the fermenter’s calibrated resources. Since the host cells are carrying that extra DNA and the burden of making all those copies of a foreign protein, they don’t compete very well.

If recombinant DNA host cells need to be pampered that much inside a bioreactor, which is the safest of all possible environments for them, imagine how vulnerable they must be out in the real world. A bit of used media that carries a few live host cells would pose no real threat if it ever got dumped down a drain—although no pharmaceutical company would be so careless. In an environment crawling with every kind of bacteria, fungi, and other microorganisms, a cruelly burdened strain of E. coli or a baby mouse kidney cell would stand no chance of survival, let alone of attaining wild and uncontrolled growth.

And my third conclusion is that the environment outside the laboratory is a really rough place. I was working at the genetic analysis company after the bioterrorism scare with weaponized anthrax bacilli in the Senate building, and we were tasked with developing a genetic method for detecting the spores in the U.S. Postal Service’s bulk mail centers. As part of that program, we convened a meeting with a number of influential microbiologists to explore what other biological vectors might be weaponized by terrorists and so would need screening.

Most of these experts had just come off a congressional hearing about the recent outbreaks of bird flu, and when we put this question to them, they just laughed. “Mother Nature is the greatest terrorist of them all,” one of the experts said. What he meant was, for every human attempt to culture, refine, and package an infectious agent, the environment itself is inventing a thousand different ways to kill us, like avian influenza viruses, bacteria, molds, and fungal spores. Evolution is at work all the time and, in the aggregate, is a lot more powerful than any human ingenuity. What can get into the lab and ruin your experiment or your production run is far more dangerous than anything that can get out of the lab and into the water supply.

Contamination is everywhere. Your immune system and that of every other human being on the planet is working hard just to keep up. And eternal vigilance is the price of safe drugs and a healthy food supply.

1. In the pharmaceutical business, a “batch record” is the procedure for making, storing, and packaging a product at every step. Unlike, say, a “standard operating procedure,” which simply tells you how to operate a piece of equipment or perform a task in abstract, the batch record includes checks and signoffs at each step in the manufacturing process. It provides written proof that the steps were performed correctly and in the proper order; that the results of every measurement were noted; that critical steps and measurements were also observed and confirmed by a second operator; and that the entire document was reviewed by the department supervisor at the time of production and by a representative from quality assurance before product release.
       Batch records apply not just to the product itself but also to any part of the plant and its operation that might affect the product’s identity, safety, quality, and purity. So the operators will complete a batch record for cleaning tanks, sterilizing hoses and utensils, calibrating equipment, and even mopping the floor in the production suite.

2. Of course, if a virus gets airborne at all, it will probably be riding on a water droplet from a sneeze or a dust particle derived from a flake of skin.

3. Because a virus is not technically alive outside a host cell, you can’t “kill” it. But you can stop it from doing whatever it’s going to do, which is the same thing.

4. But that’s not the worst hazard in the manufacturing suite. At the pharmaceutical company, we used to clean mixing tanks with heated sprays that alternated hydrochloric acid (HCl) with sodium hydroxide (NaOH)—an acid followed by a base. This would scour out any protein residue left over from the manufacturing process. And then the procedure followed it all with pressurized steam at 122°C. You might worry about putting these acidic and caustic flushes down the sewer drain, but first they were sent to a holding tank, where the two chemicals neutralized each other, producing salt water.
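
The chemistry of that neutralization step is the classic acid-base reaction:

HCl + NaOH → NaCl + H2O

hydrochloric acid plus sodium hydroxide yielding ordinary table salt and water.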

5. And don’t believe that means just thrown away. A dispositioned batch has a documentation process all its own, so there is no possibility that it might get back into the product stream.

Sunday, June 17, 2018

Everyday Miracles

Yeast cells

My mother wanted her sons to be strong and self-reliant. That was why she started us early doing household chores like vacuuming, dusting, cleaning bathrooms, doing laundry, and washing dishes—or in our time, loading the dishwasher. She also taught us the rudiments of cooking: how to measure and pour, boil an egg, and fry bacon without splattering ourselves or setting the kitchen on fire, all so that we could at least survive without the contemporary equivalent of McDonald’s or Taco Bell.

Because her own mother had been an excellent baker, and she was a fair hand at it herself, she taught me how to make bread and scones. Sure, you can buy these things in the store readymade, but she believed in knowing how to rustle up a meal from the basics you have on hand. And with kneading dough and watching it rise, I discovered the miracle of yeast.

Yeast—Saccharomyces cerevisiae, for the most common species, used in baking and brewing—is one of the most complex of all single-celled microorganisms. Yeast is actually part of the fungus kingdom. Unlike bacteria, yeast cells hold their genetic complement inside a nucleus, as do the cells of plants and animals, rather than letting it float freely inside the cell body, where it can transcribe and translate willy-nilly. Thus yeasts are eukaryotes, just as we are, because secluding the genome inside a nucleus is the first step toward developing a multicellular organism, capable of differentiated and specialized tissue types. In fact, yeasts are believed to have evolved from multicellular ancestors. But now yeasts reproduce asexually, by budding new cells off existing parent cells.

The yeast genome (see the Saccharomyces Genome Database) has 6,275 genes—of which about 5,800 are thought to be functional—in more than 12 million base pairs. These genes are packed on 16 chromosomes, which doesn’t compare badly with the human’s three billion base pairs on 23 chromosomes. In fact, about a third of yeast genes are shared with the human genome. For comparison, most bacteria have just one or two chromosomes, looped in a circle for easy transcription—plus, often, smaller free-floating loops of DNA called plasmids—and containing about 1,500 genes.
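
Run the division on those numbers and the compactness of the yeast genome shows up clearly:

12,000,000 base pairs ÷ 6,275 genes ≈ one gene per 1,900 base pairs

which leaves very little room between genes, a much denser packing than in the human genome, where genes sit among long stretches of noncoding DNA.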

Because of the antiquity of bread and beer making, anthropologists believe yeast was humankind’s first domesticated species, predating wheat or rice, cows or sheep, and perhaps even the dog. It certainly came into our lives after the hunter-gatherer stage, when we settled down in one place long enough to brew up a pot of beer. It also must have come sometime after the discovery and taming of fire, because you can’t bake bread on a flat, sun-heated rock.

Yeast is not hard to get. In the second novel of my time-travel series, The House at the Crossroads, a young woman from the far future learns the basic skills of a medieval housewife: “Dame Agnes also taught her how to isolate and nurture the yeast cultures she would need—both for fermenting and to make her dough rise—by putting fruit skins and vegetable peels in a jar with water and leaving them in a warm, dark place overnight.”

The basic function of yeast—that is, for human purposes—is to eat up sugars and starches, known as carbohydrates, and excrete ethyl alcohol and carbon dioxide. For making beer and wine, we capture the alcohol and let the carbon dioxide waft away. For making bread rise, we let the carbon dioxide pump up the dough—which is why bread has tiny holes throughout the loaf, while a cracker does not—and let the alcohol evaporate in the baking.
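
The textbook summary of that transaction, for the simple sugar glucose, is:

C6H12O6 → 2 C2H5OH + 2 CO2

one molecule of glucose yielding two molecules of ethyl alcohol and two of carbon dioxide.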

Anyone who has seen a lump of dough, kept for an hour under a tea towel on the back of the stove, rise into a glorious dome twice the size of the original lump will know what I mean by a miracle.1 It’s a form of magic to see this grainy, yellowish powder—store-bought yeast—which becomes a gray muck when mixed with warm water, turn out anything as sweet and pleasant smelling as good bread. And yeah, that smell is the alcohol. The same miracle occurs when a bushel of crushed-up grapes, or sprouted and dried barley, or even mashed-up potatoes plus water turns into wine, beer, or vodka.

Christ’s miracle of turning water into wine at the wedding in Cana just speeded up the process. He might as easily, although less dramatically, have made the balls of dough for the feast’s bread pop up before being put into the oven. These are miracles of time, not necessarily of substance, nor of reversing the normal course of events—such as the raising of Lazarus.

Cooking has other small miracles, too. There is the moment when you are mixing the dough in the first place, and the flour, water, and other ingredients go from a soggy mass to a plastic lump. Or when you’re making gravy, and the isolated streams of beef drippings, water or wine, and those little clumps of flour come together into a smooth paste and then a glistening liquid. Or the moment when an egg beaten with milk and seasonings and poured into a hot pan simmering with olive oil or melted butter turns from a runny yellow liquid into light and fluffy solids—or into a foamy custard, if you have more patience and you’re trying to make an omelet.

A young man who thinks food comes from the kitchen, or prefers to spend his money at McDonald’s or Taco Bell, never gets to see these things. And that’s a pity.

1. For certain applications, such as biscuits and scones, you can also use baking powder. This is a mixture of the alkaline sodium bicarbonate and a weak acid like potassium bitartrate, or cream of tartar. It releases carbon dioxide through an acid-base reaction once you get it wet. You can also use sodium bicarbonate to make soda bread. These chemicals are easier to handle than yeast, which can be killed with too much heat during the rising process or an unsatisfactory ingredient in the bread making—such as the time I tried to make Jim Beam bread, don’t ask. Dead yeast leaves you with a flat loaf like a paving stone.
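
For the chemically curious, the leavening reaction when the powder gets wet runs:

NaHCO3 + KHC4H4O6 → KNaC4H4O6 + H2O + CO2

sodium bicarbonate plus cream of tartar yielding potassium sodium tartrate (Rochelle salt), water, and the carbon dioxide that raises the dough.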

Sunday, June 10, 2018

Fear Itself

Steelcraft cabin cruiser

Over the years, I’ve found something strange about myself. I may sometimes become nervous or fretful about future events, such as the ordering of steps to complete a project under deadline, or the progress of negotiations on a contract whose terms might eventually end up in court. A lot of this agitation has to do with the clock that’s continually running inside my head and my attempts to keep things functioning smoothly. Sometimes, also, I succumb to existential fears: the things unseen in the darkness but felt in the imagination that can spook anybody.

But from years of riding a motorcycle, I’ve noticed that my fear of actual and immediately present dangers seems to be muted—if not entirely absent. Looking back, I attribute this to an event that happened in 1954, when I was just six years old. I’ve told this story before, but it bears repeating in terms of personal fear.1

My father loved boats and boating. From the early 1950s, he was a member of the Power Squadron, an organization of small-boat owners formed to teach seamanship and good boat-handling practices. We lived on Long Island, just northeast of New York City, and he bought a twenty-six-foot cabin cruiser. It was built by Steelcraft and thus was unusual for having a steel hull instead of marine plywood or, more recently, fiberglass. The design was originally for a boatyard launch, because the hull was so small, but various entrepreneurs had fitted them out with cabins in the same way that people these days convert delivery vans into rolling living rooms. Because of the hull material, my father named the boat Rusty. And that was apt because, although the steel was painted, it still bubbled up with blisters of rust in the salt water of the Sound. Every spring he and my mother would spend a couple of weekends chipping the hull and bilges and repainting them while my brother, who was just eight years old, and I played in the sand nearby and occasionally tried to help out.

My family kept the boat at an anchorage on Manhasset Bay, and every weekend during the summer when the weather was clear we would take it up the Sound. We only went an hour or two, east to Lloyd’s Neck or Eaton’s Neck along the north shore of Long Island. There we would anchor in the cove, picnic on the beach, dig for clams at low tide, and sleep over Saturday night. It was our version of having a cabin on the lake.

In the summer of 1954, my father planned an extended trip to coincide with his two-week vacation. He would take our family in the boat around New York City to the Hudson River, up to Albany, through the Champlain Canal and the Federal Locks, and into Lake Champlain itself on the New York–Vermont border. He figured he could make it as far as Burlington, Vermont, at the center of the long lake, in the first week and return in the second week.

It was an idyllic trip. I remember long days on the river and canal, seeing West Point and Storm King Mountain, the paper mills around Albany and the pulpwood barges coming down from the plantations in Quebec to feed them, and Schuylerville—which I only learned later was the site of the Battle of Saratoga in the American Revolution. My father, who had grown up in the Hudson Valley, played tour guide during the day, and my brother and I scrambled around the marinas where we tied up in the evening.

The turning point, both literally and figuratively, came in Burlington. On the morning we were supposed to begin the return trip to New York City, the people at the dock warned my father that a storm was coming and this was no day to be out on the lake. My father was a good sailor, and he listened to the weather reports, too. But he also had a schedule to keep. And the sun was shining that morning when we cast off and headed south.

By late morning, the clouds had rolled in and the waves started to build up. What we didn’t understand but discovered later was that Hurricane Carol, which was churning up the East Coast and aiming for Boston,2 had calved a secondary storm, not quite a hurricane but with high winds, that had gone up the Hudson Valley. We got caught in that storm.

Lake Champlain is not very deep, averaging sixty-four feet—although some spots go down four hundred feet. A relatively shallow lake can kick up some real waves in high winds. And, of course, we had the winds themselves to deal with and the sheets of rain that came with them. Champlain is also a relatively narrow lake, averaging about fourteen miles across, so my father didn’t have a lot of leeway—which is a nautical term for how much you can let yourself drift downwind before running aground—in which to maneuver.

I was standing in the main cabin with my butt and shoulders pressed against the door that led out to the back deck. My father was at the lower helm—he also had a steering position on a flying bridge mounted on the cabintop, but that was no place to be right then. My mother was helping him by turning the hand-operated windshield wiper—a relic from early automotive days—and wiping condensation from the inside of the glass. Both of them were too busy to bother about me.

The wind was coming from astern, and the door at my back was banging and rattling until it seemed about to blow in. I imagined a furious imp stood outside, pounding on it. When I looked through the window, though, I couldn’t see anything except the bare deck, with a little toy sailboat of ours that had been pushed into the scupper, and the waves piling up on either side of the boat. If the Rusty had a freeboard of about three feet, waterline to rail, then those waves must have been ten or twelve feet tall, and maybe more. They certainly towered above the boat to my inexperienced eye.

Inside the cabin, everything was chaos. My mother and father were fully occupied keeping the boat on course and hitting the waves at the right angles. So they had no time for anything or anyone else. The coffee pot with some of the cold morning coffee slid across the high dinette table and crashed into the chart bin next to me. Some of those charts carried brown spots and streaks for years afterward. The portable radio, which was our main source of news and weather, flew after it and shattered on the steel bulkhead. I watched all this and could do nothing, because I was fighting that imp.

We stayed in the storm—but off the shore, and afloat—until late afternoon. Then my father could round the point of Fort Ticonderoga, which sheltered us from the winds. My mother went out through the bow hatch—which meant she was standing on the closed lid of the head, or toilet—to unclip the anchor, let it go, and pay out line while my father handled the engine throttle to put the boat in reverse and drag the anchor along the bottom until it could catch.

When we were safely anchored, my mother came back into the lower cabin. Even though she was wearing hooded rain gear, the wind had driven the rain into her face and hair and down her collar. At the same time, my father brought me forward and told her, “This little fellow needs some dry clothes.” My mother shrieked, “He needs dry clothes?” because the fronts of my shirt and shorts were bone dry. Then my dad turned me around, and I was as soaked as she was from rain that had blown in around the edges of the door.

And my brother? He slept through the whole experience in the lower bunk and only came awake when an onion from the galley rolled up against his nose. Or that’s what he always said. Not until forty years later did he confess that he was awake the whole time and terrified.

My parents spent the rest of the evening cleaning up the cabin, watching their bearings to see if the anchor was holding or had torn loose, and trying to nurse the shattered radio—luckily none of the tubes had broken, only the plastic case—to give them a weather report. By about seven o’clock the storm had abated enough that we could cross the narrow stretch of water to a town on the east side of the lake and get some dinner at a rustic local restaurant.

That day on Lake Champlain the entire Thomas family, parents and children, might have disappeared without leaving a bit of wreckage on the face of the water. We might have departed Burlington in the morning and never reached land, and no one would know. So it was a miracle—or good seamanship on my father’s part in dire straits—that saved us. But those five or six hours at the cabin door, fighting the imp and watching the waves roll past, high above my head, changed me forever.

At the age of six, I learned that you could think you were going to die, and you wouldn’t. You could hold out at your self-assigned station, bracing that door, fighting that imp, for longer than you thought you ever could. You could do this because it was your job, your part to play, your place in the family. And there would be no point in giving in to fear, because at the moment of crisis everyone else is busy and no one is going to turn around and take care of you.

That’s a tough thing for a child to learn, but I thank whatever gods may be that I learned it young.

1. See the third footnote to Son of a Mechanical Engineer from March 31, 2013.

2. Hurricane Carol was famous for blowing down the steeple of Old North Church in Boston. The original steeple—this one was a replacement—was the site of the lanterns signaling “one if by land, two if by sea” in the ride of Paul Revere. History is everywhere in the East—and everywhere else, I guess.

Sunday, June 3, 2018

Keeping an Even Temper

Roman mask

Maybe it’s just me. Maybe it’s the place I live, the San Francisco Bay Area. But it seems that too many of the people I meet publicly these days are innately hostile. Like a case of walking road rage. You smile at them and get a glare or a blank stare in return. You ask a question, and you get a reply that is either tinged with scorn—like “Shouldn’t you already know the answer?”—or cold indifference—like “Why don’t you just go jump off the Earth?”

Were people always this rude? I don’t remember this kind of reaction when I was growing up in the East. Sure, some people are grumpy—some of them perpetually. And some people are too busy to talk or pay attention to those around them. That’s always been true. But it seems more and more people in society today are either aggrieved or battened down. It seems as if the social glue that holds us all together has dried out.

If the problem is the place I live, then I have a counterexample. Some years ago, we traveled to Austin, Texas, to visit friends. In our journeys around town and across the state to see various sights and attractions, I encountered a pretty good slice of average Texans. I remember seeing and noting many more smiles, friendly greetings, and cordial responses than I’d been getting in California lately. As one example, I was turning a corner in the corridor to the men’s room in one museum and almost collided with an older man, a short fellow half my size wearing a business suit and a Stetson hat. As we mutually retreated, he tilted his head back and said in the cheeriest way, “Howdy!”1

If that had been in San Francisco or Oakland, he would have pushed past me and growled, “Get the [expletive] out of my way!” When I first came to Berkeley, forty years ago, I was standing in line at the cash register in a stationery store. The woman ahead of me completed her purchase, turned quickly, was surprised to find me there, and said, “What are you doing here? You’re blocking my way. Get out!” Yes, ma’am, right away, and sorry to be breathing your air.

This was not how my brother and I—and my spiritual sisters and my cousins, whom I reckon up by dozens—were brought up. My mother was constantly telling us to put a smile on our faces, and not just so that we would be pleasanter company around the house. We were supposed to be nice to the people we met, nod to the people we knew, hold doors for the people coming behind us, pick up our own litter—and sometimes other people’s—and go find a trash can, and answer respectfully and cheerfully when asked a question.

Being polite is not just good manners but a survival strategy. If you meet the world with a frown or a glare, you’re going to attract the attention of psychopaths. It’s just not healthy living in a state that perpetually provokes people. Incidents of road rage—even of the walking variety—begin with the first honk, the first snarl, the first rude gesture.

I’m also surprised at how casually these people will disrespect me. I stand six foot six and broad in proportion, usually move briskly about my business, and am not apparently decrepit—or not yet anyway. I try not to be menacing in my demeanor, appearance, and body language, consciously do not invade other people’s personal space, and back off a step in any encounter just to be polite. A sensible person could see that I am a healthy male who outweighs them, has a longer reach with more leverage, and could mow them down in any physical clash.2 And yet many smaller, weaker, less equipped people mouth off to a man my size as if they were surrounded by the invisible force field of protection that once was provided by the decorous traditions of a Western civilization in which they apparently no longer believe.

As a result, I walk around with a fixed, sometimes slightly dazed, smile on my face that is only beginning to crack at the corners. And still, as I encounter people out on the street, I am prepared for the next rude look or snarling reply. I am not really happy about it.

There is an art in this world that has to do with empathy: taking the other person’s views and feelings into account, considering their situation, and trying not to make them feel bad—not making them look and feel like fools. My mother taught me this as a kind of protective coloration. “If you don’t move your hand as if you were striking at the dog, you’re less likely to get nipped,” she would say. She taught the art of moving through the world without riling people and without attracting the psychopaths you might encounter.

It’s also a better way to get what you want. During forty years in businesses ranging from publishing to engineering, a public utility, and various biotech companies, I saw plenty of people fail in their missions and goals—proposals crushed, days gone wrong—because they met the world, and the people whose cooperation they needed, with a hard word and the presumption that they were dealing with fools.

Being polite and friendly and perhaps cracking a smile and a joke—“Howdy!”—also gets you better service in restaurants and other retail encounters. It makes other people, unless they are snarling under their own dark cloud, want to do the little bit extra that makes for good, friction-free exchanges.

Perhaps these angry people feel entitled. Many in the Bay Area do, because after all we live in Nirvana, the utopia that is now, and the utopia that is yet to come. Our views are correct, our politics impeccable, and our lifestyle and livelihoods sustainable. Or perhaps these people have had their expectations crushed once too often in this best of all possible worlds. And maybe they are just perpetually grumpy and busy.

But I would share with them my mother’s secret of keeping an even temper, putting a smile on their faces, meeting the world halfway, and taking a moment to make the people around them feel good.

1. I also remember seeing little blue signs along the Texas highways: “Drive friendly.” For whatever reason, that makes me feel good inside.

2. Well, at the age of seventy, I probably still could, being an old black belt who runs through the karate katas every other morning as a form of exercise (see Isshinryu Karate). I would take my licks in a fight against a younger man with any street experience, but I am not exactly feeble or harmless myself.

Sunday, May 27, 2018

Predicting the Future

The Fool
The Magician

Back at the university, I took a course on the future. It was the late ’60s; I had three elective credits to spend; and my mentor in the English Department1 was teaching this course as part of a general broadening of the Liberal Arts curriculum, which the student body was demanding. Because I wanted to be a writer of creative fiction, specializing in science fiction, learning about predicting the future seemed to be a fit. And, hey, the New Age movement was just taking off, and the subject would also include some of the less scientific, more mystical approaches to divination, such as the I Ching and Tarot cards.

We read various authors on their predictions, including Robert L. Heilbroner’s The Future as History. We studied issues related to observation and probability. And yes, I wrote a paper on the Tarot, more as a matter of comparative literature than as a useful means of prediction. Finally, we examined various predictive strategies related to economics and similar fields, along with their fallacies.

One of my key takeaways from the course is that the future is in flux. One fallacy human beings routinely fall into is the “if this goes on …” style of prediction. You see it endlessly played out among trend spotters. “If housing prices keep rising, soon no one here will be able to afford a house.” The same thoughts and fears have driven the value of tech stocks and dot-com stocks in the past, along with the price of tulip bulbs and shares in the South Sea Company. One curve drives all. We see this in economics all the time, too, where one theory or another follows a trend until it flies right out the window.

I saw this tendency demonstrated personally when I worked at the biotech company about fifteen years ago. Our genetic sequencing equipment and reagents had been used by both the Human Genome Project’s sequencing centers and our own sister company, Celera, which had introduced “shotgun sequencing” to speed up the first draft of the genome.2 After that draft had been released, the vice president of our division, which had made and was continuing to make those reagents, showed his team a sales chart of the past couple of quarters with its bold upward trend. He predicted even greater sales in this product line for years to come.

In one sense, he was right. The success of the Human Genome Project might well have led to more and varied sequencing efforts in other laboratories. The inventory of other useful genomes—from mice and chimps as comparative human models, to thousands of different bacteria, plants, and pests as targets for finding genetic strengths and weaknesses—existed in the real world and would demand study now that we had broken through with the complete human genome. In another sense, however, he was wrong. The ready market for sequencing using these high-cost, first-generation machines was already saturated. Continued study of parallel genomes would demand lower-cost, faster, and more versatile next-generation machines—which our company was already working on.

This vice president had fallen into the “if this goes on” fallacy. Yes, if past sales were to keep up, the future for these instruments looked rosy. But other factors were in play. Predicting the future is a mix of having the right model—in this case, a simple tracking of previous orders against the company’s productive capacity at various levels of investment and expansion—and thought experiments to learn what is actually happening in the world and what other factors might affect the model. From there, you get into analysis of perturbations, positive and negative feedback loops, the model’s sensitivity to certain data sets, and other necessary adjustments.
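
To make the fallacy concrete, here is a toy sketch—my own, with invented numbers, not anything from that course or that company. A straight line fitted to a few quarters of rising sales and a market that saturates both agree with the early data, then part ways:

```python
# Toy illustration of the "if this goes on" fallacy. Fit a straight line
# to four quarters of rising sales, then compare it with a market that
# saturates. Every number here is invented for illustration.
import math

quarters = [1, 2, 3, 4]
sales = [100, 150, 210, 260]   # early data: looks like steady growth

# Least-squares line through the early data (closed form).
n = len(quarters)
mq = sum(quarters) / n
ms = sum(sales) / n
slope = sum((q - mq) * (s - ms) for q, s in zip(quarters, sales)) / \
        sum((q - mq) ** 2 for q in quarters)
intercept = ms - slope * mq

def if_this_goes_on(q):
    """Naive straight-line extrapolation: the trend just continues."""
    return intercept + slope * q

def saturating_market(q, ceiling=400.0):
    """Same early growth, but total demand tops out at a ceiling."""
    return ceiling / (1 + math.exp(-0.9 * (q - 3.5)))

for q in (4, 8, 12):
    print(q, round(if_this_goes_on(q)), round(saturating_market(q)))
# The fitted line climbs forever (261, 477, 693), while the saturating
# market flattens out near its ceiling (244, 393, 400).
```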

Something similar seems to be going on with current questions in climate science and the political issue of anthropogenic global warming.3 For reasons of proprietary intellectual property, the models that various scientists are using to predict temperature rise around the world are closed to general view. That means their feedback and sensitivity settings may not be right, but nobody can say for certain. What we do know is that, while the models may be aligned with some anecdotal observations from around the world, and have been used to confirm trends seen in certain habitats, they have had trouble predicting past warming or cooling trends from previously available data. So how much are the modelers’ predictions predicated on the “if this goes on” fallacy?

Beyond learning to be wary of scientific predictions about the future, I also learned about the Tarot. This is a fascinating subject, because not only are the four suits—Swords, Wands, Coins, and Cups4—related to the suits of modern playing cards, but the structure of ten numbered cards plus a group of face cards—Kings, Queens, Knights, Pages—corresponds to our modern King, Queen, and Jack. With four suits of fourteen cards, the basic Tarot deck numbers fifty-six cards of what is called the Minor Arcana. The Major Arcana is a series of twenty-two cards—or more accurately, cards numbered zero to twenty-one, the way computer programmers count—depicting archetypal figures. These figures include The Fool and The Magician—who are the seeker’s beginning and end states in his or her journey among the arcana in search of knowledge—and other characters such as the Emperor, the Hierophant, and the Lovers, and figurative states such as Justice, Temperance, Death, and the Devil.
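
Because the deck’s structure is pure counting, a minimal sketch (mine, purely illustrative) makes the arithmetic explicit—and, fittingly, a programmer’s zero-based count runs exactly from zero to twenty-one:

```python
# The Tarot deck as plain counting: 4 suits x 14 ranks = 56 cards of
# the Minor Arcana, plus 22 Major Arcana numbered 0 through 21,
# for 78 cards in all.

suits = ["Swords", "Wands", "Coins", "Cups"]
ranks = [str(n) for n in range(1, 11)] + ["Page", "Knight", "Queen", "King"]

minor_arcana = [f"{rank} of {suit}" for suit in suits for rank in ranks]

# Zero-based, the way computer programmers count: 0 is The Fool and
# 1 is The Magician; the rest are the named trumps in sequence.
named = {0: "The Fool", 1: "The Magician"}
major_arcana = [named.get(n, f"Trump {n}") for n in range(22)]

print(len(minor_arcana), len(major_arcana),
      len(minor_arcana) + len(major_arcana))   # 56 22 78
```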

The Tarot supposedly originated with the Gypsies, the Roma, and thereby hangs a tale. As I learned much later from a book by Isabel Fonseca, Bury Me Standing: The Gypsies and Their Journey, these mysterious people didn’t come from Egypt or Romania at all, but had their origins in northwestern India in the early Middle Ages. They were wandering musicians and performers who migrated gradually through the Middle East and Eastern Europe. They brought with them one book, the seventy-eight cards of the Tarot deck. It was their explanation of the forces and personalities that drive human nature.

If you like, the Tarot is not so much a device of divination as a readymade collection of visual images going back to Indo-European myth and psychology. They remind the reader or seeker of important relationships worked out centuries and millennia ago. They are the dark soul of the ancient people from whom we sprang. They are to Europeans an underlying story—standing outside the Judeo-Christian tradition—in the same way that the etched symbols on turtle shells became the sixty-four hexagrams of the I Ching.

What else did I learn from that course on predicting the future? Not, after all, that you can’t do it. You certainly can predict the future in any number of ways—you just can’t be sure of the accuracy of your predictions. But the surest method is to pick a direction, the place you want to be or the end-state you wish to achieve, and start walking and working toward it.

You don’t predict the future. You make it happen yourself.

1. This was Philip Klass, who wrote delightful novels and short stories under the pen name William Tenn. He is not to be confused with Philip J. Klass, the aviation journalist and UFO debunker.

2. See Continuing Mysteries of the Genome from October 12, 2014.

3. See Science and Computer Modeling from June 29, 2014.

4. The Wands are sometimes called batons, rods, or staves, and the Coins are sometimes pentacles or disks.

Sunday, May 20, 2018

The Moon and Beyond

Full Moon

We humans are a migrant species—at least most of us. Out of Africa, across the world. There and back again. We have itchy feet and restless natures. That’s what comes of having a big brain, inventive ideas, and a general dissatisfaction with the status quo.

Of course, over the ages pockets of people have settled down and remained content. Consider the West Africans at the dawn of humanity, who found rich valleys around the Congo and Volta rivers and did not follow the rest of humankind out of East Africa’s stark Rift Valley and into the wider world. And since the dawn of agriculture we have seen the rise of various empires based on water or some other natural resource: the Mesopotamians, the Egyptians, the Qin and Han Dynasties in China, the Mayans and the Incas in the Americas. Once you build infrastructure around a resource, like an irrigation system beside a broad river in a fertile plain, some people will stay put to harvest and use it.

But for the most part, humanity has been on the move ever since we learned to walk. We are one species that has adapted itself, through its brains, muscles, imagination, and courage, to environments as difficult and varied as desert oases, rainforest jungles, and Arctic permafrost.

The story of Europe—to take just a small corner of the globe—has been one of successive overruns from outside. From back before the beginning of recorded history, we have tantalizing pockets of unrelated languages in the continent’s far corners: Finno-Ugric in the far north, the Ural Mountains, and the Hungarian Plain; Pictish at the northern end of the British Isles; and Basque in the northern mountains of the Iberian Peninsula, in the awkward corner between France and Spain. These are mostly places out of the way of regular migration routes. For the rest of Europe—and strangely, parts of northern India—we find a common root language, Indo-European, which is the father of the Norse, Germanic, Greek, and Romance languages.

I attribute this spread of common language to what I call a “people pump” operating out of the Caucasus Mountains. Since long before recorded history, it has fed restless groups of people north onto the steppes. There they got up on horses and rode west into Europe and east into the Indus and Ganges valleys. The history of the Greek peninsula and Asia Minor, or modern Turkey, is the story of invasion by the Dorians, Ionians, and the mysterious Sea Peoples, who got moving about the time of the Trojan War. The story of the Mediterranean as a whole is the movement west by Phoenicians, Greeks, and perhaps those misplaced Trojans, who fetched up in Etruscan Italy to become Romans. While the Romans were building their empire, the Celts spread from their Alpine heartland in Austria through Germany and northern France into Britain. And as the Romans were losing their empire, the Goths and Vandals moved out of the Baltic region and Poland to pass through southern France and Spain and sack Rome itself. The story of the British Isles is the invasion of Celtic lands by Frisians and Saxons, Danes, and finally by those Vikings who had settled in Normandy, became Frenchmen themselves, and then went off north to conquer England.

Europe is a restless place. The movements appeared to subside in the Dark Ages after the collapse of Rome, and it looked like people were finally settling down. But then the art of building seaworthy ships—thanks in large part to the Vikings—caught up with people’s yearning to travel, and Europeans braved the Atlantic Ocean starting in the 15th century. Da Gama went south around Africa to find a route to India and its riches. Magellan went south around South America, through the strait that now bears his name, to find a route to Asia. And Columbus, funded by the Spanish crown, sailed due west and discovered the richest prize of all.1

And the migration has continued ever since. Millions of Europeans have left the Old World for the New one across the Atlantic Ocean, starting almost as soon as the first colonies were established in the 16th century. And in later centuries they “discovered” and occupied large parts of Africa, India, and Australia and built enclaves and empires throughout the old, established empires of Asia.

But that doesn’t mean the rest of the world is full of pleasant, peaceable homebodies. The story of China has been one of repeated invasions from the north—the whole purpose of their Great Wall. And their Mongol neighbors conquered and briefly held the largest land empire in history. The Arabs followed the instructions of their Prophet and invaded Europe through North Africa and Spain, moved into Central Asia along the Silk Road, and entered India—and centuries later the Ottoman Turks pushed through the Balkans up to the gates of Vienna. Everybody steps on their neighbors at some point. In the 17th and 18th centuries, the Iroquois of what would become Upstate New York fought the Hurons and Algonquians. And before that the Aztecs tried to conquer the Tlaxcalans, among other groups, in modern-day Mexico. Everybody invades. Everybody fights.

What does all this have to do with the Moon? Simply that we are a restless people by nature. When one place becomes too settled, too predictable, too bound by property rights and rules, too hemmed in with political alliances and charitable organizations, a certain percentage of the people are going to rebel. Some will opt for revolution and social upheaval, but many will just light out for the new territory, the next frontier, the land beyond the mountains.

In the 1960s, we Americans went to the Moon. It was the capstone of a space program begun in the Eisenhower Administration as a response to Russian rocketry and then promoted by President John F. Kennedy—“not because they are easy, but because they are hard.” The Apollo Program was a science experiment, a seed crystal for developing new technologies focused on outer space. In that sense, it was not a migration or colonization effort. It was in the nature of Da Gama’s and Magellan’s voyages: go there, prove it can be done, come back.

Since then, we have sent robot probes all around the Solar System and even out beyond the heliopause to interstellar space. We have focused our human presence and efforts on science experiments and scientific and commercial satellites in Earth orbit. But most people, at least in the developed countries, believe we will go back to the Moon and travel to Mars—not just as an experiment or to gather data, but to colonize.

I am one of those people. Whether it’s a government program or funded by private entrepreneurs like SpaceX and Virgin Galactic, and whether it’s a base on the Moon or a colony on Mars, those are details. The Moon is nearby and completely airless, washed by the harsh radiation of the solar wind. Mars is farther away and has more available resources, including an atmosphere rich in carbon dioxide2 and possibly water in the form of ice, but it still gets a hard blast of radiation because Mars’s core is dead and no longer generating a magnetic field.

Either choice will be hard and will launch us on a new wave of technological discovery. Given the logistics and the ambient environment associated with either place, it would be easier to build a five-star hotel with an Olympic-sized swimming pool on the peak of Mount Everest—or, say, at Camp 4 on the South Col of the mountain, which approaches the “death zone” and its lack of breathable oxygen. Or you could build the same resort 500 meters (1,640 feet) down in the Red Sea. That would probably be easier, because years of submarine building have taught us how to handle water pressure at those depths.
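
For a rough sense of the underwater number—my own back-of-the-envelope arithmetic, assuming a typical figure for seawater density—the hydrostatic pressure 500 meters down comes to about fifty atmospheres:

```python
# Gauge pressure 500 m down: p = rho * g * h, with an assumed
# seawater density of about 1025 kg/m^3.
rho = 1025.0     # kg/m^3 (assumed seawater density)
g = 9.81         # m/s^2
h = 500.0        # depth in meters
p = rho * g * h  # pascals
print(f"{p / 1e6:.1f} MPa, about {p / 101_325:.0f} atmospheres")
# -> 5.0 MPa, about 50 atmospheres
```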

But we will go, if not in this century, then in the next. Once we were a land-wandering people who only looked out on the deep blue with longing, until we acquired the technology to cross the oceans. Now we are an ocean-faring people—a people who routinely fly over the ocean’s vast barrier—who look at the deep black among the stars with longing.

One day, we will go there. And then it will be easy.

1. Except for the Vikings, who had ventured out long before and discovered and settled Iceland, Greenland, and—so rumor has it—Newfoundland. Of course, the greatest migration into the Americas came at the end of the last Ice Age, when Siberian hunters crossed the land bridge that is now the Bering Strait and flooded both the northern and southern continents.

2. Mars’s atmosphere, however, with a pressure less than one percent that of Earth’s, would qualify as a good laboratory vacuum with trace gases.

Sunday, May 13, 2018

The Original Jedi Mind Trick

Volcanic opening

Supposedly, in the Star Wars universe, the Jedi knights could control the thoughts and perceptions of other people in order to slip through the world without conflict or incident: “These are not the droids you’re looking for.” Whether they used telepathy or simply changed the appearance of the world and the other person’s apprehension of it—rippling the Force to their own advantage—it was a neat trick.

My parents taught me something similar, except it didn’t work on other people. It was a form of mind control directed at yourself. This is nothing new or exotic: we see posters all the time, more than ever on social media like Facebook, advising that you can’t change what happens to you, but you can change how you feel about and react to it. Like the Jedi Mind Trick, it’s a Zen thing.

A story from Zen Flesh Zen Bones concerns two monks walking down the sidewalk in the rain. They come to a corner where a beautiful geisha in her fine silk kimono is dithering about having to cross the muddy street. The older monk says, “Come on, darling,” picks her up, and carries her across. This horrifies the younger monk, who fumes about it as they walk along the next block. Finally, he cannot contain himself. “You know we’re not supposed to have anything to do with women, let alone geishas. Yet you handled her in a very familiar way.” The old monk turns to him in surprise. “Are you still carrying her? I put her down back at the corner.”

The world may exist in itself—objective data and incidents do exist outside your field of perception—but how you perceive it, what you make of it, and how it affects you is the Jedi Mind Trick. You can stare into the open caldera of an active volcano, or walk the steaming lava fields of Kilauea, fear fire and death, and become paralyzed. Or you can experience these things and see their wonder and beauty. Your response shapes the world.

When I worked in the Kaiser organization, one of the many stories about its founder, Henry J. Kaiser, came from the end of World War II. He heard at a dinner party that the U.S. government was putting up for sale some aluminum smelters it had built along the Columbia River to supply metal for manufacturing aircraft as part of the war effort. The war was over and the smelters were being sold as surplus. Now Kaiser knew nothing about aluminum. But when he got home that night he called his vice president in the iron and steel business, Tom Price, and asked for a report on the aluminum business. Kaiser wanted it on his desk by eight o’clock the next morning.

Price didn’t know anything about aluminum, either. So, according to the story, he went to his children’s encyclopedia and read up on the business: mining bauxite (essentially a form of dirt rich in aluminum-bearing minerals), chemically refining that dirt into pure aluminum oxide powder (Al2O3, also known as alumina), and electrolytically smelting that powder into aluminum metal. Clearly, just owning the smelters was not the whole business; you needed facilities in two or three areas. For example, the smelters had to be near a ready source of electricity, which the dams of the Columbia River were already supplying, while the mines might be a continent away on ground rich in bauxite, and the chemical plants could be anywhere in between where it was profitable to operate them. Tom Price copied this all down in a couple of handwritten pages. Kaiser read them and bought the smelters the next morning.

The difference between what the government was going to let go as surplus and what Kaiser wanted to buy as the core of his new business was vision. They were the same smelters either way. But the government was through building airplanes for the war, didn’t want to pay to run the smelters anymore, and was willing to let them go for scrap. Kaiser saw how this lightweight but strong metal had served in one application—becoming fighters and bombers—and was willing to bet that it could be useful in any number of other applications, from lawn furniture to house siding to soda cans and the trays for TV dinners. He wasn’t the only one to see this business, but he saw the opportunity and was willing to act on it fast.1

Henry Kaiser had a positive outlook on life. When he was in the cement business, he wanted to paint his trucks pink, even when other officers in his company suggested a more sedate gray-and-green pattern. “Pink is a happy color,” Kaiser responded. People also said that his negotiating style was that of the “happy elephant”: when confronted with opposition, he would just lean and smile, lean and smile, until he got his way. That man understood the Jedi Mind Trick; it just took longer than waving your fingers and speaking in a reassuring voice.

Another aspect of the Mind Trick is not letting personal hurts get to you. A scene in the movie Lawrence of Arabia has Lawrence demonstrate to some young officers how he puts out a match with his fingertips. When one of the others tries it, he exclaims, “It damn well hurts!” Lawrence smiles and replies, “The trick, William Potter, is not minding that it hurts.”

The world is full of burning matches and a lot worse. One is reminded of Hamlet’s “slings and arrows of outrageous fortune.” As fully functioning human beings, we can either dwell upon them, take offense, file a grievance, and nurse a grudge,2 or we can accept that being alive in the world comes with an infinite number of bumps and stings, hard looks and rude responses, and we can let them roll off as if we were personally coated in Teflon.

And when we die, as we all must, we can look back on that life as we pass out of this world. Whether you believe that you will go to some mystical elsewhere, a heaven or a hell, or that you will simply go out, like Buddha’s candle flame or Lawrence’s match, you can bet that you yourself will be beyond caring, and probably beyond even knowing, what effect you had in life and whether it was positive or negative. In that situation, your life as you live it here and now in this world can either be a futile waste, just one more surplus human being taking up space and consuming value, like those government smelters, or you can see the same sort of opportunities for a better future that Henry J. Kaiser saw all around him. You can make your space in the world as big and happy, as pink and elephantlike, as your imagination allows.

The trick, as Lawrence would say, is not seeing fiery death in the volcano but seeing the beauty of nature that surrounds you. The rest is simply walking the path that you see.

1. Another Kaiser venture after the war, when he tried to turn the business of making Jeeps into a car company to go up against General Motors, Ford, and Chrysler, didn’t work out so well. But then, Kaiser also knew the motto of every venture capitalist: “You pay your money and you take your chance.”

2. Or to quote the painter Paul Gauguin: “Life being what it is, one dreams of revenge—and has to content oneself with dreaming.”

Sunday, May 6, 2018

Biological Nanotech

Algae making biofuel

I can remember, oh, twenty years ago and maybe more, seeing on television the microscopic image of what was supposed to be the world’s smallest electric motor. It showed a rotor that had been cut—inexactly, so that it was not a perfectly round circle—from some kind of metal. It spun—not fast and not smoothly—against a stator plate made of some other metal. It wasn’t good for much more than the gee-whiz factor, but it was a motor smaller than, say, the period at the end of this sentence. That motor was probably the beginning of humankind’s dreams of nanotechnology.

The world of tiny motors has gotten a lot smaller since then. What is now supposed to be the smallest on record is a slim fraction of the width of a human hair, and the current effort aims at a rotor that is just one molecule—not a part one molecule wide or thick or high, but a whole rotor composed of a single molecule. That makes the manufacturing process more a matter of chemistry than metalwork.

The idea behind nanotechnology is to design machines that work at the submicroscopic level, down at the scale of micrometers (millionths of a meter) and more likely nanometers (billionths of a meter).1 At the nano level, we’re not talking about mere active dust—these machines are more like tiny mites, compared to which a dust grain is a boulder the size of a house. What any of these machines might do is in the nature of “If you build it, someone will find a use for it.” And that may be why the whole enterprise has been so slow to start: it is a world of theory looking for a purpose, rather than, as Henry J. Kaiser used to say, “Find a need and fill it.”

One thing is certain, though: nobody is going to build just one of these nanites or nanobes or whatever you call them and expect to accomplish much of anything. One of them would be a technological wonder, which might be examined with a scanning or tunneling electron microscope, applauded, and then dismissed with a shrug. To achieve any real effect at the quantum level—which these machines are approaching—you have to make and launch thousands or rather millions of them and then rely on statistical measurement to observe their effects. That is, however many of the nanites or nanobes you make, a certain percentage will be defective and not work at all; a larger percentage will technically work but may never find the “shop floor” on which they are supposed to operate; and an even larger percentage will work for a while and then hit an air pocket or a vacuole or some other dry spot or barrier and wander away. This is like counting the number of molecules of acetylsalicylic acid in an aspirin tablet and asking how many of them you actually need to relieve a headache: as many of them as find the right nerves.
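
To see how those statistics play out, here is a toy yield model—the fractions are invented for illustration, not anyone’s real data:

```python
# Toy yield model for a swarm of nanomachines. Assume (invented
# fractions) 5% are duds, 25% never find the "shop floor," and 40%
# work briefly and wander off; only the rest does measurable work.
import random

random.seed(1)
launched = 1_000_000
p_failure = 0.05 + 0.25 + 0.40   # dud + lost + early quitter

working = sum(1 for _ in range(launched) if random.random() > p_failure)
print(f"{working:,} of {launched:,} still on the job")   # about 30%
```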

On this basis, with the machines so tiny and their singular effects so negligible, I can’t imagine that anyone is seriously going to try making them using the traditional methods of materials processing. That is, nobody will be buying raw materials, molding and cutting individual pieces and parts (like that tiny metal motor), and then assembling these components in the same way Ford puts together the chassis, engine, wheels, and doors to make an automobile in Dearborn. Nobody is going to drop a molecular motor into a molecular framework—not even with a tiny molecular eyedropper2—and hook it up to molecular axles and wheels.

Down among the microbes and the nanobes, you have to stop thinking of this technology as some kind of machine. You have to treat it as a life form. Why would you try to design and fabricate metal wires, springs, and motors, manually pack them into tiny plastic shells and metal frames, and hope to have everything work at the molecular level, the nano-scale? It would be so much simpler to program these components in DNA and grow electro-chemical control circuits with actual nerves, achieve motor function with the elastic expansions and compressions of muscle fibers and proteins, and house everything in shells made of cellulose or keratin and frames made of calcium.3

When I was working at the biotech company, I heard about Craig Venter sending his 95-foot sloop Sorcerer II around the Sargasso Sea, then the Baltic and the Mediterranean seas, to sample the world’s oceans. He wasn’t looking for new sea creatures, although his team did discover that what we normally think of as isolated plankton species are usually whole genera that evolve and change every twenty miles or so. No, he was looking for novel proteins, in novel combinations, and with novel functions, along with the DNA genes and promoters that would code for them. His idea was to find ways to change the life-cycle, the operation, and the metabolic inputs and outputs of existing microbes to make them more useful to human beings.

For example, adding the right set of new genes might give algae a way to turn their photosynthetic processes to making lipids—fatty liquids with properties similar to crude oil—and then secrete them through their cell walls, so that each cell can go on making this oil substitute without becoming engorged and either stopping production or exploding. Such an algae cell—or a whole pond full of billions of them—would lie there in the sunlight and produce a form of oil that could be siphoned off the surface and refined to make gasoline. And then, with a bit of chemical tinkering, the cell might even be coaxed into making gasoline itself, if the stuff weren’t so toxic.

Of course, these would be cells that have been modified with the DNA sequences, proteins, and functional relationships between proteins that are already present in nature.

All such DNA is currently purposed to design and repair living creatures. Any adaptations either serve to improve the living body or else they become discarded over time—immediately if they are lethal to necessary functions. But the principles of coding and self-assembly might easily be adapted to small machines that operate in the submicroscopic environment, like single-celled creatures, for other purposes, ones designed by human beings. It would, after all, be easier to grow a microprocessor as a network of neurons than to etch one in silicon at the nanoscale, install it inside a mechanism, and wire it into sensory and motor systems.

Purposefully designing DNA to create new nanomachines might even employ metals and other materials we don’t currently think of as organic. For example, the epithelial cells in the mammalian jaw that form tooth buds secrete a mineral called hydroxyapatite, a crystalline form of calcium phosphate, which becomes the enamel surface of our teeth. Enamel is the hardest substance in the human body, and it contains the highest percentage of minerals. With a bit of chemical tinkering, such cells might be taught to absorb—from the managed environment of a bioreactor—and secrete other minerals and compounds. A pure structure or surface of, say, vanadium steel is not likely, or not at first. But hard parts made of bonelike and stonelike materials should be possible. And of course, making anything with polymers and resins, like plastics, should be a DNA-coding snap.

Nanomachines—or certainly micrometer-scale machines—might be made by groups of preprogrammed cells. Like tooth buds, or the embryos of living beings, they would form a cocoon of tissue that produced each part in place and then would be programmed to die and wash away,4 leaving the new micromachine in place and ready to operate.

And what would the new machine do? Well … it’s hardly likely we’ll need anything that small to pave roads or drive on them, or to manufacture complex machinery like automobiles or kitchen blenders. And we already have little cellular machines that can make usable oils and even drinkable beer and wine; they’re called seeds and yeasts. Complex little machines might be designed to repair the human body, or even repair and resurface the bodywork on your car. It will all depend on what you want.

Find a need and fill it.

1. Remember that a meter is just over a yard, 39.3701 inches. So a millionth or a billionth of that length is vanishingly small.

2. For which the technical term in biochemistry is “pipette,” and those things can only be accurately calibrated at a scale of about a milliliter, or a thousandth of a liter—and a liter itself is about a quart.

3. Of course, one-celled animals already have chemical motors that can whip around in circles, powering flagella for their movement through the liquid medium; so even the electric motor can be replaced with a living example from the biological world. That might be the easiest part of the machine to design, because the prototype already exists in nature. But circular motion has limited use in the submicroscopic environment. We use round wheels at the human scale mostly to propel loads over level terrain, and when the going gets too rough we revert to horses or mules. Motive power from limbs articulated by mechanical joints and muscle fibers might be more useful in the world of the really tiny. Other wheel-like functions, such as the gears in clocks, can be achieved in other ways.

4. The technical term for “programmed cell death” is apoptosis, and the word for “wash away” is lysis, involving the chemical dissolution and destruction of the cell membrane and its contents.