Sunday, May 20, 2018

The Moon and Beyond

Full Moon

We humans are a migrant species—at least most of us. Out of Africa, across the world. There and back again. We have itchy feet and restless natures. That’s what comes of having a big brain, inventive ideas, and a general dissatisfaction with the status quo.

Of course, over the ages pockets of people have settled down and remained content. Consider the West Africans at the dawn of humanity, who found rich valleys around the Congo and Volta rivers and did not follow the rest of humankind out of East Africa’s stark Rift Valley and into the wider world. And since the dawn of agriculture we have seen the rise of various empires based on water or some other natural resource: the Mesopotamians, the Egyptians, the Qin and Han Dynasties in China, the Mayans and the Incas in the Americas. Once you build infrastructure around a resource, like an irrigation system beside a broad river in a fertile plain, some people will stay put to harvest and use it.

But for the most part, humanity has been on the move ever since we learned to walk. We are one species that has adapted itself, through its brains, muscles, imagination, and courage, to environments as difficult and varied as desert oases, rainforest jungles, and Arctic permafrost.

The story of Europe—to take just a small corner of the globe—has been one of successive overruns from outside. From back before the beginning of recorded history, we have tantalizing pockets of unrelated languages in the continent’s far corners: Finno-Ugric in the far north, the Ural Mountains, and the Hungarian Plain; Pictish at the northern end of the British Isles; and Basque in the northern mountains of the Iberian Peninsula, in the awkward corner between France and Spain. These are mostly places out of the way of regular migration routes. For the rest of Europe—and strangely, parts of northern India—we find a common root language, Indo-European, which is the father of the Norse, Germanic, Greek, and Romance languages.

I attribute this spread of common language to what I call a “people pump” operating out of the Caucasus Mountains. Since before antiquity it has fed restless groups of people north onto the steppes. There they got up on horses and rode west into Europe and east into the Indus and Ganges valleys. The history of the Greek peninsula and Asia Minor, or modern Turkey, is the story of invasion by the Dorians, Ionians, and the mysterious Sea Peoples, who got moving about the time of the Trojan War. The story of the Mediterranean as a whole is the movement west by Phoenicians, Greeks, and perhaps those misplaced Trojans, who fetched up in Etruscan Italy to become Romans. While the Romans were building their empire, the Celts spread from their heartland around Austria through Germany and northern France into Britain. And as the Romans were losing their empire, the Goths and Vandals moved out of the Baltic region and Poland to pass through southern France and Spain and sack Rome itself. The story of the British Isles is the invasion of Celtic lands by Frisians and Saxons, Danes, and finally by those Vikings who had settled in Normandy, became Frenchmen themselves, and then went off north to conquer England.

Europe is a restless place. The movements appeared to subside in the Dark Ages after the collapse of Rome, and it looked like people were finally settling down. But then the art of building seaworthy ships—thanks in large part to the Vikings—caught up with people’s yearning to travel, and Europeans braved the Atlantic Ocean starting in the 15th century. Da Gama went south around Africa to find a route to India and its riches. Magellan went south through the strait at the tip of South America that now bears his name to find a route to Asia. And Columbus, funded by the Spanish crown, sailed due west and discovered the richest prize of all.1

And the migration has continued ever since. Millions of Europeans have left the Old World for the New one across the Atlantic Ocean, starting almost as soon as the first colonies were established in the 16th century. And in later centuries they “discovered” and occupied large parts of Africa, India, and Australia and built enclaves and empires throughout the old, established empires of Asia.

But that doesn’t mean the rest of the world is full of pleasant, peaceable homebodies. The story of China has been one of repeated invasions from the north—the whole purpose of their Great Wall. And their Mongol neighbors conquered and briefly held the largest land empire in history. The Arabs followed the instructions of their Prophet and invaded Europe through North Africa and Spain, and through the Balkans up to Vienna. They moved into Central Asia along the Silk Road and entered India. Everybody steps on their neighbors at some point. In the 17th and 18th centuries, Iroquois of what would become Upstate New York fought the Hurons and Algonquians. And before that the Aztecs tried to conquer the Tlaxcalans, among other groups, in modern-day Mexico. Everybody invades. Everybody fights.

What does all this have to do with the Moon? Simply that we are a restless people by nature. When one place becomes too settled, too predictable, too bound by property rights and rules, too hemmed in with political alliances and charitable organizations, a certain percentage of the people are going to rebel. Some will opt for revolution and social upheaval, but many will just light out for the new territory, the next frontier, the land beyond the mountains.

In the 1960s, we Americans went to the Moon. It was the capstone of a space program begun in the Eisenhower Administration as a response to Russian rocketry and then promoted by President John F. Kennedy—“not because they are easy, but because they are hard.” The Apollo Program was a science experiment, a seed crystal for developing new technologies focused on outer space. In that sense, it was not a migration or colonization effort. It was in the nature of Da Gama’s and Magellan’s voyages: go there, prove it can be done, come back.

Since then, we have sent robot probes all around the Solar System and even out beyond the heliopause to interstellar space. We have focused our human presence and efforts on science experiments and scientific and commercial satellites in Earth orbit. But most people, at least in the developed countries, believe we will go back to the Moon and travel to Mars—not just as an experiment or to gather data, but to colonize.

I am one of those people. Whether it’s a government program or funded by private ventures like SpaceX and Virgin Galactic, and whether it’s a base on the Moon or a colony on Mars, those are details. The Moon is nearby and completely airless, washed by the harsh radiation of the solar wind. Mars is farther away and has more available resources, including an atmosphere rich in carbon dioxide2 and possibly water in the form of ice, but it still gets a hard blast of radiation because Mars’s core is dead and no longer generating a magnetic field.

Either choice will be hard and will launch us on a new wave of technological discovery. Given the logistics and the ambient environment associated with either place, it would be easier to build a five-star hotel with an Olympic-sized swimming pool on the peak of Mount Everest—or, say, at Camp 4 on the South Col of the mountain, which approaches the “death zone” and its lack of breathable oxygen. Or you could build the same resort 500 meters (1,640 feet) down in the Red Sea. That would probably be easier, because years of submarine building have taught us how to handle water pressure at those depths.

But we will go, if not in this century, then in the next. Once we were a land-wandering people who only looked out on the deep blue with longing, until we acquired the technology to cross the oceans. Now we are an ocean-faring people—a people who routinely fly over the ocean’s vast barrier—who look at the deep black among the stars with longing.

One day, we will go there. And then it will be easy.

1. Except for the Vikings, who had ventured out long before and discovered and settled Iceland, Greenland, and—so rumor has it—Newfoundland. Of course, the greatest migration into the Americas came at the end of the last Ice Age, when Siberian hunters crossed the land bridge that is now the Bering Strait and flooded both the northern and southern continents.

2. Mars’s atmosphere, however, with a pressure less than one percent that of Earth’s, would qualify as a good laboratory vacuum with trace gases.
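The arithmetic is rough but easy to check: Mars’s mean surface pressure runs about 6 millibars against roughly 1,013 millibars at sea level on Earth, so

$$ \frac{P_{\text{Mars}}}{P_{\text{Earth}}} \approx \frac{6\ \text{mbar}}{1013\ \text{mbar}} \approx 0.6\% $$

which is how I get “less than one percent.”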

Sunday, May 13, 2018

The Original Jedi Mind Trick

Volcanic opening

Supposedly, in the Star Wars universe, the Jedi knights could control the thoughts and perceptions of other people in order to slip through the world without conflict or incident: “These are not the droids you’re looking for.” Whether they used telepathy or simply changed the appearance of the world and the other person’s apprehension of it—rippling the Force to their own advantage—it was a neat trick.

My parents taught me something similar, except it didn’t work on other people. It was a form of mind control directed at yourself. This is nothing new or exotic: we see posters all the time, more than ever on social media like Facebook, advising that you can’t change what happens to you, but you can change how you feel about and react to it. Like the Jedi Mind Trick, it’s a Zen thing.

A story from Zen Flesh Zen Bones concerns two monks walking down the sidewalk in the rain. They come to a corner where a beautiful geisha in her fine silk kimono is dithering about having to cross the muddy street. The older monk says, “Come on, darling,” picks her up, and carries her across. This horrifies the younger monk, who fumes about it as they walk along the next block. Finally, he cannot contain himself. “You know we’re not supposed to have anything to do with women, let alone geishas. Yet you handled her in a very familiar way.” The old monk turns to him in surprise. “Are you still carrying her? I put her down back at the corner.”

The world may exist in itself—objective data and incidents do exist outside your field of perception—but how you perceive it, what you make of it, and how it affects you is the Jedi Mind Trick. You can stare into the open caldera of an active volcano, or walk the steaming lava fields of Kilauea, fear fire and death, and become paralyzed. Or you can experience these things and see their wonder and beauty. Your response shapes the world.

When I worked in the Kaiser organization, one of the many stories about its founder, Henry J. Kaiser, came from the end of World War II. He heard at a dinner party that the U.S. government was putting up for sale some aluminum smelters it had built along the Columbia River to supply metal for manufacturing aircraft as part of the war effort. The war was over and the smelters were being sold as surplus. Now Kaiser knew nothing about aluminum. But when he got home that night he called his vice president in the iron and steel business, Tom Price, and asked for a report on the aluminum business. Kaiser wanted it on his desk by eight o’clock the next morning.

Price didn’t know anything about aluminum, either. So, according to the story, he went to his children’s encyclopedia and looked up mining bauxite (which is just a form of dirt that concentrates a common metal, aluminum), then chemically processing that dirt into pure aluminum oxide powder (Al2O3, also known as alumina), and electrolytically smelting that powder into aluminum metal. Clearly, just owning the smelters was not the whole business; you needed facilities in two or three areas. For example, the smelters had to be near a ready source of electricity, which the dams of the Columbia River were already supplying, while the mines might be a continent away on ground rich in bauxite, and the chemical plants could be anywhere in between where it was profitable to operate them. Tom Price copied this all down in a couple of handwritten pages. Kaiser read them and bought the smelters the next morning.
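For readers who want the chemistry in shorthand, the two processing steps Price was summarizing are, in simplified form, the Bayer process (dissolving the aluminum minerals out of the bauxite with caustic soda, then precipitating and roasting them into alumina) and the Hall-Héroult process (smelting that alumina electrolytically in a molten bath against carbon anodes):

$$ \mathrm{Al(OH)_3 + NaOH \rightarrow NaAlO_2 + 2\,H_2O} \qquad \text{(digestion)} $$

$$ \mathrm{2\,Al(OH)_3 \rightarrow Al_2O_3 + 3\,H_2O} \qquad \text{(calcination to alumina)} $$

$$ \mathrm{2\,Al_2O_3 + 3\,C \rightarrow 4\,Al + 3\,CO_2} \qquad \text{(electrolytic smelting)} $$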

The difference between what the government was going to let go as surplus and what Kaiser wanted to buy as the core of his new business was vision. They were the same smelters either way. But the government was through building airplanes for the war, didn’t want to pay to run the smelters anymore, and was willing to let them go for scrap. Kaiser saw how this lightweight but strong metal had served in one application—becoming fighters and bombers—and was willing to bet that it could be useful in any number of other applications, from lawn furniture to house siding to soda cans and the trays for TV dinners. He wasn’t the only one to see this business, but he saw the opportunity and was willing to act on it fast.1

Henry Kaiser had a positive outlook on life. When he was in the cement business, he wanted to paint his trucks pink, even when other officers in his company suggested a more sedate gray-and-green pattern. “Pink is a happy color,” Kaiser responded. People also said that his negotiating style was that of the “happy elephant”: when confronted with opposition, he would just lean and smile, lean and smile, until he got his way. That man understood the Jedi Mind Trick; it just took longer than waving your fingers and speaking in a reassuring voice.

Another aspect of the Mind Trick is not letting personal hurts get to you. A scene in the movie Lawrence of Arabia has Lawrence demonstrate to some young officers how he puts out a match with his fingertips. When one of the others tries it, he exclaims, “It damn well hurts!” Lawrence smiles and replies, “The trick, William Potter, is not minding that it hurts.”

The world is full of burning matches and a lot worse. One is reminded of Hamlet’s “slings and arrows of outrageous fortune.” As fully functioning human beings, we can either dwell upon them, take offense, file a grievance, and nurse a grudge,2 or we can accept that being alive in the world comes with an infinite number of bumps and stings, hard looks and rude responses, and we can let them roll off as if we were personally coated in Teflon.

And when we die, as we all must, we can look back on that life as we pass out of this world. Whether you believe that you will go to some mystical place elsewhere, a heaven or hell, or that you will simply go out, like Buddha’s candle flame or Lawrence’s match, you can bet that you yourself will definitely be beyond caring, and probably beyond even knowing, what effect you had in life and whether it was positive or negative. In that situation, you can either treat your life as you live it here and now in this world as a futile waste, just one more surplus human being taking up space and consuming value, like those government smelters, or you can see in it the same sort of opportunities for a better future that Henry J. Kaiser saw all around him. You can make your space in the world as big and happy, as pink and elephantlike, as your imagination allows.

The trick, as Lawrence would say, is not seeing fiery death in the volcano but seeing the beauty of nature that surrounds you. The rest is simply walking the path that you see.

1. Another Kaiser venture after the war, when he tried to turn the business of making Jeeps into a car company to go up against General Motors, Ford, and Chrysler, didn’t work out so well. But then, Kaiser also knew the motto of every venture capitalist: “You pay your money and you take your chance.”

2. Or to quote the painter Paul Gauguin: “Life being what it is, one dreams of revenge—and has to content oneself with dreaming.”

Sunday, May 6, 2018

Biological Nanotech

Algae making biofuel

I can remember, oh, twenty years ago and maybe more, seeing on television the microscopic image of what was supposed to be the world’s smallest electric motor. It showed a rotor that had been cut—inexactly, so that it was not a perfect circle—from some kind of metal. It spun—not fast and not smoothly—against a stator plate made of some other metal. It wasn’t good for much beyond the gee-whiz factor, but it was a motor smaller than, say, the period at the end of this sentence. That motor was probably the beginning of humankind’s dreams of nanotechnology.

The world of tiny motors has gotten a lot smaller since then. What is now supposed to be the smallest on record is a slim fraction of the width of a human hair, and the current effort is supposed to have a rotor that is just one molecule. Not a material one molecule wide or thick or high, but the whole rotor is composed of a single molecule. That makes the manufacturing process more a matter of chemistry than metalwork.

The idea behind nanotechnology is to design machines that work at the submicroscopic level, down at the scale of micrometers (millionths of a meter) and more likely nanometers (billionths of a meter).1 At the nano level, we’re not just talking about active dust—more like tiny mites compared to which dust is a boulder the size of a house. What any of these machines might do is in the nature of “If you build it, someone will find a use for it.” And that may be why the whole enterprise has been so slow to start: it is a world of theory looking for a purpose, rather than, as Henry J. Kaiser used to say, “Find a need and fill it.”

One thing is certain, though: nobody is going to build just one of these nanites or nanobes or whatever you call them and expect to accomplish much of anything. One of them would be a technological wonder, which might be examined with a scanning or tunneling electron microscope, applauded, and then dismissed with a shrug. To achieve any real effect at the quantum level—which these machines are approaching—you have to make and launch thousands or, rather, millions of them and then rely on statistical measurement to observe their effects. That is, however many of the nanites or nanobes you make, a certain percentage will be defective and not work at all; a larger percentage will technically work but may never find the “shop floor” on which they are supposed to operate; and an even larger percentage will work for a while and then hit an air pocket or a vacuole or some other dry spot or barrier and wander away. This is like counting the number of molecules of acetylsalicylic acid in an aspirin tablet and asking how many of them you actually need to relieve a headache: as many of them as find the right nerves.
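To put some made-up numbers on that, here is a minimal back-of-the-envelope sketch in Python. The attrition rates are pure invention on my part, just to show why you launch millions of these things and then measure the swarm statistically rather than counting on any individual machine:

```python
# Back-of-the-envelope attrition model for a swarm of nanomachines.
# Every rate below is invented for illustration, not a measured figure.

launched = 10_000_000        # machines fabricated and released

defective_rate = 0.10        # never work at all
lost_rate = 0.40             # work, but never find the "shop floor"
stranded_rate = 0.30         # work a while, then hit a dry spot and wander off

working = launched * (1 - defective_rate)
on_site = working * (1 - lost_rate)
effective = on_site * (1 - stranded_rate)

print(f"launched:  {launched:>12,}")
print(f"working:   {int(working):>12,}")
print(f"on site:   {int(on_site):>12,}")
print(f"effective: {int(effective):>12,}")
# With these rates, only about 38 percent of the swarm -- some 3.78 million
# machines -- ends up doing useful work, and only the aggregate effect is
# measurable.
```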

On this basis, with the machines so tiny and their singular effects so negligible, I can’t imagine that anyone is seriously going to try making them using the traditional methods of materials processing. That is, nobody will be buying raw materials, molding and cutting individual pieces and parts (like that tiny metal motor), and then assembling these components in the same way Ford puts together the chassis, engine, wheels, and doors to make an automobile in Dearborn. Nobody is going to drop a molecular motor into a molecular framework—not even with a tiny molecular eyedropper2—and hook it up to molecular axles and wheels.

Down among the microbes and the nanobes, you have to stop thinking of this technology as some kind of machine. You have to treat it as a life form. Why would you try to design and fabricate metal wires, springs, and motors, manually pack them into tiny plastic shells and metal frames, and hope to have everything work at the molecular level, the nano-scale? It would be so much simpler to program these components in DNA and grow electro-chemical control circuits with actual nerves, achieve motor function with the elastic expansions and compressions of muscle fibers and proteins, and house everything in shells made of cellulose or keratin and frames made of calcium.3

When I was working at the biotech company, I heard about Craig Venter sending his 95-foot sloop Sorcerer II around the Sargasso Sea, then the Baltic and the Mediterranean seas, to sample the world’s oceans. He wasn’t looking for new sea creatures, although his team did discover that what we normally think of as isolated plankton species are usually whole genera that evolve and change every twenty miles or so. No, he was looking for novel proteins, in novel combinations, and with novel functions, along with the DNA genes and promoters that would code for them. His idea was to find ways to change the life-cycle, the operation, and the metabolic inputs and outputs of existing microbes to make them more useful to human beings.

For example, adding the right set of new genes might give algae a way to turn their photosynthetic processes to making lipids—fatty liquids with properties similar to crude oil—and then secrete them through their cell walls, so that each cell can go on making this oil substitute without becoming engorged and either stopping production or exploding. Such an algae cell—or a whole pond full of billions of them—would lie there in the sunlight and produce a form of oil that could be siphoned off the surface and refined to make gasoline. And then, with a bit of chemical tinkering, the cell might even be coaxed into making gasoline itself, if the stuff weren’t so toxic.

Of course, these would be cells that have been modified with the DNA sequences, proteins, and functional relationships between proteins that are already present in nature.

All such DNA currently serves to build and repair living creatures. Any adaptations either improve the living body or are discarded over time—immediately if they are lethal to necessary functions. But the principles of coding and self-assembly might easily be adapted to small machines that operate in the submicroscopic environment, like single-celled creatures, for other purposes, ones designed by human beings. It would, after all, be easier to grow a microprocessor as a network of neurons than to etch one in silicon at the nanoscale, install it inside a mechanism, and wire it into sensory and motor systems.

Purposefully designing DNA to create new nanomachines might even employ metals and other materials we don’t currently think of as organic. For example, the epithelial cells in the mammalian jaw that form tooth buds secrete a mineral called hydroxyapatite, a crystalline form of calcium phosphate, which becomes the enamel surface of our teeth. Enamel is the hardest substance in the human body, and it contains the highest percentage of minerals. With a bit of chemical tinkering, such cells might be taught to absorb—from the managed environment of a bioreactor—and secrete other minerals and compounds. A pure structure or surface of, say, vanadium steel is not likely, or not at first. But hard parts made of bonelike and stonelike materials should be possible. And of course, making anything with polymers and resins, like plastics, should be a DNA-coding snap.
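For the record, hydroxyapatite’s formula is usually written

$$ \mathrm{Ca_{10}(PO_4)_6(OH)_2} $$

which is just calcium, phosphate, and hydroxide ions locked into a crystal lattice, all of them ingredients a cell can pull out of its surroundings.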

Nanomachines—or certainly micrometer-scale machines—might be made by groups of preprogrammed cells. Like tooth buds, or the embryos of living beings, they would form a cocoon of tissue that produced each part in place and then would be programmed to die and wash away,4 leaving the new micromachine in place and ready to operate.

And what would the new machine do? Well … it’s hardly likely we’ll need anything that small to pave roads or drive on them, or to manufacture complex machinery like automobiles or kitchen blenders. And we already have little cellular machines that can make usable oils and even drinkable beer and wine; they’re called seeds and yeasts. Complex little machines might be designed to repair the human body, or even repair and resurface the bodywork on your car. It will all depend on what you want.

Find a need and fill it.

1. Remember that a meter is just over a yard, 39.3701 inches. So a millionth or a billionth of that length is a significantly reduced measurement.

2. For which the technical term in biochemistry is “pipette,” and those things can only be accurately calibrated at a scale of about a milliliter, or a thousandth of a liter—and a liter itself is about a quart.

3. Of course, one-celled animals already have chemical motors that can whip around in circles, powering flagella for their movement through the liquid medium; so even the electric motor can be replaced with a living example from the biological world. That might be the easiest part of the machine to design, because the prototype already exists in nature. But circular motion has limited use in the submicroscopic environment. We use round wheels at the human scale mostly to propel loads over level terrain, and when the going gets too rough we revert to horses or mules. Motive power from limbs articulated by mechanical joints and muscle fibers might be more useful in the world of the really tiny. Other wheel-like functions, such as the gears in clocks, can be achieved in other ways.

4. The technical term for “programmed cell death” is apoptosis, and the word for “wash away” is lysis, involving the chemical dissolution and destruction of the cell membrane and its contents.

Sunday, April 29, 2018

The Immediacy of Life

Motorcycle on a curve

I’ve been riding a motorcycle—on and off but more on than off—for forty-four years. It is a whole body experience, much more than driving a car or even riding a bicycle. A motorcycle is like Harry Potter’s broom, moving with the speed of thought and requiring you to gauge every curve, every obstacle in the road, the nearness of every car, the gap between fenders if you are splitting lanes, and a dozen other things that happen practically all at once.

One of the things motorcycle riding teaches you is to perceive and move now.1 When you are approaching a curve at fifty miles an hour, and your balance—your very life—depends on making the correct adjustments to throttle speed, front and rear braking, handlebar position and force inputs, distribution of your body weight, choice of apex and your route through the curve to stay in your lane and avoid obstacles like gravel, potholes, and dead animals … you don’t have time to sit down and figure things out. In three seconds or less, you are going to be committed to making the maneuver. If you try to stop, jam on the brakes, or in any way reject the oncoming maneuver, you are going to be in a much worse place, with possibly catastrophic results, than if you miscalculate or misapply any of the above variables.
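Run the rough numbers and you can see how little room there is:

$$ 50\ \text{mph} = \frac{50 \times 5280\ \text{ft}}{3600\ \text{s}} \approx 73\ \text{ft/s}, \qquad 73\ \text{ft/s} \times 3\ \text{s} \approx 220\ \text{ft} $$

Call it 220 feet, roughly three-quarters the length of a football field, between you and the commitment point.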

In such a position, you cannot wish that you were anywhere else, or that the choices before you were something other than what they are. You don’t have the luxury of time to decide. You don’t have the option of a do-over, or taking a mulligan, or apologizing to the gods of speed, balance, gravity, and tire adhesion. You are in the maneuver, and all you can do is face it, make your best approach, and ride it through.

This is one of the things that makes motorcycle riding so exhilarating, so much more a confirmation of life and skill than driving a car or riding a bicycle. You are in the moment, your life is on the line—figuratively and literally—and you have no choice but to succeed. If you stopped and thought about it, paused to dwell upon the consequences of a bad maneuver in terms of broken bones and abraded skin, if not concussion and death, you would probably not be there. But since you are there, you must put those thoughts and fears aside, swallow your heart, and ride it through. The moment is now, and the action is irretrievable.

There must be other experiences that match this: perhaps landing an airplane, hitting a fastball, throwing a long pass, or blocking a punch. These actions all have points in time that require you to commit to your skill and follow through. But even in these moments, you know that you have options. The airplane pilot can always apply power at the last second before the wheels touch down, fly out of a bad landing, and go around for another pass. The hitter with fewer than two strikes can usually let the fastball go by and wait for another pitch that might be better. The quarterback can fake the pass and try a running play. The fighter can duck his head or even let the blow land, because a punch to the face is not likely to be lethal.

But life itself offers us many immediate and irredeemable moments. These are sudden choices, like rounding a curve at speed on a motorcycle, like facing a collapsing bulkhead in Lord Jim, where we have only a second or two to make a decision, with consequences that may affect and alter the rest of our lives, and then ride it through.

In the moment that you are offered a deal—like landing a new job that will require you to relocate, or a contract whose fulfillment will call for more effort or time or money than you were prepared to give, or a new house that is more space and upkeep than you planned for, or a mortgage that is more in monthly payments than your budget calls for—you have a sudden decision to make. Sometimes you can say, “Let me think about it,” and go away to discuss the situation with your business partner, your family, your wife. And you believe you can trust that, when the offering party says, “Okay, you think about it,” the parameters of the deal, or the deal itself, won’t change while you’re doing your thinking.

This is the “fish or cut bait” moment. This is the point of commitment. And an honorable person knows that if he or she commits to the deal, there can be no backing out. A person’s word, even if not backed up by oaths, is a bond; a signature is binding, and so is a handshake. The coward or the games player thinks of after-the-fact alternatives: that if things don’t work out, he or she can always get a lawyer, invoke an escape clause, annul the marriage, or just walk away from the obligations and the payments and let a court sort out the breakage. But the honorable person doesn’t have such thoughts. The commitment of a moment, made with a lifetime of careful planning and measuring, is binding for the term of the agreement—or until death do us part.

In this I am reminded of the assessment of managerial style by the majordomo Moneo Atreides in Frank Herbert’s God Emperor of Dune: “The difference between a good administrator and a bad one is about five heartbeats. Good administrators make immediate choices. … They usually can be made to work. A bad administrator, on the other hand, hesitates, diddles around, asks for committees, for research and reports. Eventually, he acts in ways which create serious problems. … A bad administrator is more concerned with reports than with decisions. He wants the hard record which he can display as an excuse for his errors. … Oh, they depend on verbal orders. They never lie about what they’ve done if their verbal orders cause problems, and they surround themselves with people able to act wisely on the basis of verbal orders. Often, the most important piece of information is that something has gone wrong. Bad administrators hide their mistakes until it’s too late to make corrections.”

That is, the good administrator can make a decision, ride through the curve—and have the honorable intention to accept and deal with the consequences. A bad administrator cannot make a decision and will either crash or absent himself—wish himself elsewhere—at the moment of impact.

Life itself offers all of us such moments. How we deal with them is a true test of character.

1. See SIPRE as a Way of Life from March 13, 2011.

Sunday, April 22, 2018

Life as a Mad Scramble

Word pile

Because I had already written one novel in high school—not a good one, a derivative space opera, but still it was a complete story in 472 double-spaced pages—I determined to go into English literature as my major at the university. I believed that, when I got out, I would become a novelist, a fiction writer, in order to make my living.

Most English majors around me at the university wanted to become teachers, either at the high school or college level. To teach in grade school or high school, they would need teaching credits from the College of Education, as it was then called at my university. To teach at the university level, they would need to take a master’s degree and then a doctorate to become a full professor with tenure rights. But this matter of learning English literature in order to teach English literature seemed self-perpetuating to me, like learning Japanese swordmaking in order to teach others to make Japanese swords. So long as society felt a need for students and the adults they would become to have a grounding in the literature of their civilization—or an in-depth knowledge of the artistry of swordmaking—the process might be self-sustaining. But let the faith in either literature or historic arms fade, and the teaching paradigm collapses.

But I was on a different path, learning to write novels. I actually proposed as my senior thesis to write a work of fiction instead of some scholarly dissertation, and my advisor agreed to consider it. But when the time came, I didn’t have an idea for a long story or novella. And when it came time to graduate, when in my dreams I would step away from the university and begin writing the novels that would make my reputation and earn my living, I discovered I didn’t have much of anything to say. That one novel back in high school, the story I had been dying to tell, appeared to have stood alone in my mind.

On a feverish Saturday morning in the winter term before my graduation, I actually went across the street to the building that housed the School of Journalism and found a professor in his office. I described my situation and asked about the prospects for an English major in journalism. He was kind but also, of course, amused. In his world, learning to become a journalist took years of training and practice—and this was back in the day when print ruled supreme and reporting with the written word counted for everything, while the mechanics of radio and television were taught as an interesting sideline, and the internet had not even been invented or imagined. He suggested that I should have started back in my sophomore year, switched majors, and dedicated myself to learning the reporter’s trade.1

In the end, I had the good luck of having impressed two of my professors. One recommended me to an open position at the university press, and the other immediately told me to bring my first editing assignment over to his house where, after dinner, his wife gave me a crash course in copy editing and manuscript markup.2 I held that university press job for all of six months, until the recession of the day hit home, the state university was squeezed for funding—in competition with the budget for plowing the state’s roads—and I was laid off two weeks before Christmas. But it was a start in the publishing world.

After being laid off, I came west to California, where my parents had established themselves the year before, and worked in my father’s business until I found a job editing trade books at a small press that specialized in railroad histories, western history, and Californiana. The job didn’t pay much, but it was interesting and served as a great introduction to my adopted state and a part of the country I knew only vaguely.3

From publishing, I went into technical editing at a local engineering and construction company, and that job turned into a position in their public relations department. And from there, I went to the local gas and electric utility, first as a technical writer and then in internal communications. After a stint of writing and publishing my first eight novels of science fiction—and getting caught up in the tidal wave that swept over the publishing world in the late 1980s and early ’90s (see The Future of Publishing: Welcome to Rome, 475 AD from September 9, 2012)—I went back to the working world at, first, a biotech company that made pharmaceuticals through recombinant DNA and, then, a manufacturer of genetic analysis instruments and reagents. In both cases, I started with technical writing and editing, then worked my way into internal communications, interviewing scientists and managers about new products and company issues.

At every step in this forty-year career, I had to scramble and reinvent myself. And I had to keep learning, asking questions, and absorbing new scientific and business knowledge along the way. But I never resorted to teaching professionally and I never once—in that old joke about English majors—had to ask, “Do you want fries with that?”

In my university days, I often hung out with engineering students. They would deride my English courses because, in their view, all I had to do was “bullshit” my way through a term paper, while they had to solve difficult equations and get the right number. Imagine their surprise, some years later, when I met the same kind of young engineers at work, and they were just discovering that “getting the right number” was the job of junior engineers. To advance to senior positions like project manager and vice president, they would have to write and speak well, entertain clients who were not all engineers or had grown beyond the engineering techniques they had learned so long ago, and understand a lot more about business and economics. That is, those engineering students would have benefited from a healthy dose of the humanities that any English major or liberal arts student was supposed to take along with his or her major. These talents last longer and go farther than the particular knowledge an engineer learns in school—although you have to keep up with the technical advances in any specialty.4

And when I got into the biotech world, I saw any number of postdocs who had devoted their academic lives to one submicroscopic scientific specialty. When they got into the working world, they were beginning to learn that, outside of academia or a government lab, they would need a lot more generalized understanding of both their area of science and the business principles that make a successful product function and sell. If they remained within their scientific “comfort zone,” they would limit themselves to just the one or two laboratories that did their own specialized line of inquiry.

Liberal arts education used to prepare generalists for work in management and government, where breadth of view, a knowledge of history and its patterns, and a familiarity with different viewpoints and ways of thinking—not to mention the common sense gained from this exposure—were valued more than any particular technical knowledge. Now they teach even social science and public policy as a specialty where students think they can safely ignore literature, art, and music. We are training a generation of termites for fitting into the narrow functions of the hive.

And in all of this, I am reminded of the quote from Robert A. Heinlein in Time Enough for Love: “A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.”

1. I have since learned a bit about journalism, because writing for an employee newsletter or magazine—which I had to do as my main job in internal communications—uses the forms and principles of professional reporting: the 5W lead (i.e., Who, What, Where, When, and Why, or sometimes How) and the inverted pyramid style (i.e., tell the most important facts at the top of the story and then gradually descend into more minute details—because readers may not stay with the story to the end and want to know they are not missing anything important if they stop now, and because editors are constantly repositioning the stories in their newspaper and want the convenience of shortening the text from the bottom up without having to call for a rewrite). An internal communicator, even though he or she is presenting a viewpoint congenial to the company’s interests, must also present the story objectively or risk losing the employee-reader’s trust. And then, articles in a magazine will of course have a different structure—with a traditional beginning, middle, and end—because they are not necessarily jostling for page space and the reader’s attention, time, and interest.

2. Twice since then I have paid the favor forward, taking people who had just been hired into the editing profession, showing them about copy marking and typography, and presenting them with their very own copy of The Chicago Manual of Style, the Bible of the book publishing world.

3. Because this publisher had started out as a job printer, they were one of only three houses in North America at the time that put all the elements of publishing under one roof. Right back of the office where I worked on manuscripts and read galley proofs, we had the linotype machines, which set the copy for all their books; the photography department, which made screened photo reductions and stripped the plates for the printing presses; the small four-color press for book jackets and the large sheet-fed Harris press for the interior pages; the bindery where the folios were folded, sewn, and bound; and the stock room and shipping department, where the books were stored, packaged, and sent out. Manuscripts and cartons of loose photos came in one door, and bound books went out the other. They even had a massive flat-bed letter press in the back, which printed directly from the cast type, one sheet at a time. Working there was a daily history in the civilization of the printed word: This is a Printing Office.

4. Except, perhaps, in the study of English literature itself. Since I left the university, the English Department there and everywhere has been taken over by critical analysis and the theory of deconstruction, which focuses more on the mechanics and limitations of language than the quality of literary expression. As I understand it—and I never studied the deconstructionist philosophy professionally—the approach would have you believe that no modern reader can fully understand, say, Shakespeare, because the language and the meaning of words and application of concepts have changed so much since the plays and sonnets were written. This would seem to undermine the idea that great literature presents us with themes and concepts that are universal and that resonate with human nature itself—and that’s a faith I tend to live by.

Sunday, April 15, 2018

Fake News

Total honesty

Back in the late 1960s I remember starting to hear people say that there is no such thing as absolute or objective truth. That truth could only be relative and conditional. That truth and the concept of knowing and believing something to be true applied differently in various spheres of knowledge and belief and among various peoples and societies.

Since this was also the time I was leaving the enclosed certainties of home, family, and high school and going into the wider world of the university, to this day I am not completely sure whether this variable approach to truth was a new concept in the world or just new to me in my suddenly wider reading among philosophies, religions, social theories, and the other grist processed in the mill of the humanities. The slippery nature of truth may long have been a tenet of some modern European and ancient Asian philosophies to which I was just opening my eyes.

But I’m willing to bet that, at the time, the disappearance of absolute or knowable truth was not generally accepted in the wider population. Certainly, our news reporters and commentators, the people who spoke to the public from a position of authority, acted as if the facts they were relating and analyzing descended from some knowable, supportable, and unchanging source. That they were speaking “the truth.”

But now, fifty years later, it seems that the acceptance of relative and conditional truth—as opposed to the absolute kind—has become common in the general population. Variable reality has become, in the modern usage, a “meme.”

Some of this change has been helped along by wider public knowledge of those ancient Asian philosophies. Zen Buddhism, for example, treats the world we can see, taste, smell, and touch as a sensory illusion that overlies a larger reality—or perhaps no reality at all. In the Hindu religion, out of which Buddhism originally arose as a protest against the endless cycles of return and rebirth, the world around us is maya, an illusion, a delusion, a magical trick of the demigods and not the underlying reality.

Some of the change has been adapted from the European existentialists, who taught that philosophic inquiry begins with the subjective, human viewpoint rather than from any universal, Platonic, or Aristotelian examination of an external reality. Later theorists, particularly in the areas of language and literature, brought in the concept of deconstruction, which attempted to separate the language and context of any communication from its meaning and the reader’s understanding. The assumptions of the religious, political, and emotional environment, the language, and the knowledge base of, for example, Shakespeare’s original readers and theatrical audiences are so interwoven and complex that modern readers cannot hope to thoroughly understand his work as it was intended and indeed as it is written.

And some of this rejection of absolute truth comes from a wider knowledge and acceptance of modern physics, particularly quantum mechanics. In the realm of the very small, where bits of matter sometimes act like bits of energy, exact knowledge becomes a chimera, a ghost of the old-style, mechanistic thinking of early physicists like Newton and Galileo. A quantum theorist knows that you can establish by observation either a particle’s position or its momentum and direction but not both, because to observe the one is to change the other. And, in dealing with such innumerable quantities of particles, the physicist relies on statistics and probability rather than trying to account for each photon, electron, or proton and neutron in an experiment. In the underlying philosophy of physics, reality is what you can actually detect and measure, not some presumed or imagined vision of what’s supposed to be happening.
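The textbook statement of that trade-off is Heisenberg’s uncertainty relation,

$$ \Delta x \, \Delta p \ \geq\ \frac{\hbar}{2} $$

where Δx is the spread in a particle’s measured position, Δp the spread in its momentum, and ħ the reduced Planck constant: squeeze one down and the other necessarily grows.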

All of these influences, which started to become mainstream in the intellectual, political, and moral upheaval of the late 1960s, helped erode a public belief in some kind of absolute, definable, true-for-all-people-and-times, knowledge of “the truth.”

In this country, the mainstreaming of this slippery version of reality was helped along by a growing agnosticism, acceptance of public atheism, and outright hostility to established religion. It gradually became fine for a person to be “spiritual” in the manner of the Zen Buddhists1 or Hindu mystics, but it was old-fashioned and absurd to still subscribe to the bourgeois, stifling, close-minded teachings of the Judeo-Christian religions.

And finally, the verbal tactics from the left side of the political aisle were brought to bear in this change. The National Socialists’ use of the “lie big enough, told often enough” to change public perception caught the attention of the entire twentieth century. Add Lenin’s earlier directive, “A lie told often enough becomes the truth,” which embedded the practicality of manipulating the truth in every political tactician’s mind. Challenging the apolitical teachings of the church and creating your own version of reality, especially a version based on statistics and false premises, forever changed the public trust in some externally verified, universal, and absolute truth.

So long as the politicians and their allies in the newsrooms of the big daily newspapers, broadcast television, and the professional commentariat could limit the scope of discussion, it was relatively safe to allow the public to understand that truth was relative, conditional, and negotiable. It served to undermine the old assurances, the established viewpoints, and the life lessons that Rudyard Kipling called “The Gods of the Copybook Headings.” Undermining popular assumptions and creating new myths and legends are what politicians and professional opinion leaders do.

And the process worked, after its fashion, for about forty years. But then in the past ten years or so that funny little science-and-technology-sharing computer circuit called “the internet” sprang into full flower. Not only could every person on the planet—except maybe those in censorship-heavy societies, like the People’s Republic of China—have his or her own Facebook page and Instagram messaging account without doing much more than press a few buttons, but everyone who wanted to put in the effort could publish his or her own books and pamphlets in electronic and paperback form, put up a web page that had all the visual authority of any “legitimate” news source, create podcasts that sounded exactly like radio and television interviews, and go into the news business with the same tools as the big city dailies and broadcast journalists. Add to that the quality of video imaging through your average smartphone being equal or superior to the best handheld equipment of a decade ago—and a lot more available and present on the scene—and you have a dozen eyes and a hundred voices ready to report “the news.”

Then the meme of truth being relative and conditional began to backfire. The established media, with their highly paid teams of reporters, editors, and commentators, their worldwide news bureaus, and their tested and accepted political narrative suddenly had to compete with god-knows-who from god-knows-where. The “news” and the “truth” started coming out of the woodwork and flying around the room—around the internet—like a storm of bats exiting their cavern at dusk.

The paid media can cry “Fake news!” at all the controverting crosstalk of the benighted amateurs, and the million voices of the internet can cry “Fake news!” right back at them.

In this, I am not advocating for a return to an oligopoly of opinion, held by the magnates who own and run the largest printing presses and most expensive broadcast newsrooms. That cat is out of the bag and long gone by now.2 I am not pining for the established voice of religion, or the oldest political parties, or my grandfather’s choice of books in the family library. In a way, the small-d democracy of letting every person have a voice is refreshing. The choice of which truth to believe is now the responsibility of the average citizen. The story is not handed down by the party in power or its media moguls, but flutters on the wind like a thousand little birds. Each of us must use his or her best intelligence and widest reading of history, science, and sociology to determine where an acceptable truth might lie. As Gurney Halleck said in the movie version of Dune, “Now, guard yourself for true!”

But maybe it was a bad idea, in the first place, and for political reasons, to sell the public on the belief that truth was not absolute or discernible but subject to preference, opinion, and varying conditions. In the internet-wired society, you end up in the unfortunate position where my lie is just as good as your truth. But there we are.

1. Buddhism and its offshoot Zen are not actually mystical, mysterious, or even spiritual. Buddhism is a philosophy bounded by a practical approach to life and human existence. It is more of a psychological practice than a religion. But that’s just my “truth.”

2. Those national governments that think they can control their public’s opinions and perceptions by editing, filtering, and banning websites on the internet are playing a losing game. No matter how many censors you employ, the worldwide web will have a hundred times as many chattering sources who are well versed in employing euphemisms, creating metaphors, and hacking the code.

Sunday, April 8, 2018

One True Religion

Ancient of Days by William Blake

The popular image of Muslims these days is mostly that of raging fanatics who shout “Allahu Akbar!” while crashing trucks into crowds and blowing themselves and others up with suicide vests. I have no doubt that such people exist and that they are fanatical and dangerous. But the Muslims I have met in the West have been educated, reasonable, thoughtful, middle-class-seeming people.1

I’m thinking, in particular, of the husband of one of my supervisors at the biotech company. He was an Egyptian, formerly in diplomatic service, who was well-read and courteous, with a great sense of humor. He had no more interest in punishing unbelievers and throwing bombs than my grandmother. Of course, he was also from a well-to-do background. But in the matter of Islamic fanaticism, a person’s intelligence, breeding, and background are no guarantee of their gentility. Look at Osama bin Laden. Look at the young Saudis who could afford to take piloting lessons in order to fly jetliners into buildings.

In my experience, however, anyone who is well read and liberally educated tends to become more tolerant of other religions, not less. After studying the tenets of Judaism, Christianity, Islam, Buddhism, Hinduism, the ancient Greco-Roman pantheon, and the surviving forms of animism and pantheism, it becomes impossible for an enlightened person to insist that “my reading” of the scriptures, “my sacred book” itself, and the belief system “I was taught as a child” represent the one true religion and that all the rest are pernicious poppycock.

But I understand where and how many modern Muslims have acquired their fanaticism. About 1,300 years after Christ, Christianity went through its own antiheretical phase, where unbelievers and schismatics were bullied, tortured, and sometimes burned at the stake. Much of that rock-hard belief was inspired by external pressure from the previous expansions of Islam into Spain, France, and Eastern Europe, and some of it was a response to internal pressures from the early Reformationist impulses, which arose through political manipulations and frankly heretical tendencies.

Now, about that same 1,300 years after Muhammad, Islam is going through a similar phase, where apostates are executed and unbelievers are beheaded. Islam is responding to much the same set of pressures, too: an encroaching spirit of Western culture that started with the Crusades in the Middle Ages but has vastly accelerated with the blossoming of a worldwide political, commercial, and electronic cultural tide. And Islam has always had tension among its various sects—mostly between Sunni and Shi’a—which is now being pushed by political pressure from Iran against the rest of the Middle East.

I can only hope that most Muslims will come out on the other side of this turmoil and adopt the worldview of most of today’s western Christians: “I know the Bible says I must believe this and follow that—such as the injunction about ‘not suffering a witch to live’2—but … you know … I just believe in a loving God who wants the best for all of us.” In time, their Allah may likewise become a polite, country-club gentleman who makes no life-threatening demands and creates no unpleasant waves.

Among the things that a well-read, thoughtful person is likely to dismiss from the sacred scriptures are the fantastic stories, such as that the God of all creation walked in the Garden of Eden and hobnobbed with Adam and Eve, or that He spoke the only absolute truth for all the ages into the ear of Muhammad in his dreams, or that He had a human son with the Virgin Mary in order to sacrifice him to the world. Such a well-read—and doubting—person will tend to discount these stories as literal truth and instead accept them figuratively, as metaphors for God’s loving personal relationship with all men and women, both as individuals and as a group.

When you have read enough of other religions, the best you can hope for in terms of a relationship with the one true God is a kind of Vedantic liberalism, where the godhead—however defined—is simply the ultimate conscious reality, a metaphysical all-soul that encompasses all human thought and perception, and each of us is merely a part of this reality that has broken off, is now spending time in the changing physical cosmos of body and matter, and will one day return with newfound insights to rejoin and add to the whole. That’s a bleak and not very comforting view of the situation, but it works as a kind of bedrock reality that would explain the religious impulse in human beings that we don’t find in, say, spiders or great white sharks.

What I am groping towards here is uncertainty. A well-read and thoughtful person cannot believe that every religion—in all its specific and fine-grained details—represents the ultimate truth of the whole world, let alone a universe of a hundred billion galaxies.3 Oh, the details may be true, whole, and perfect in the mind of each believer, no doubt! But is it true for the world, or for every world? Is that one story actually the way the cosmos—and whatever metaphysics lies behind all that spinning star stuff—actually work?

A thoughtful person, having been exposed to different ideas and points of view, will have doubts about the nature of singular truth. Such a person may still hold to the principles of the religion he or she was taught as a child, delight in and believe as “mostly true” the stories in his or her particular sacred book, and try to follow the precepts and guidelines of its particular morality. This is part of any person’s acculturation. But he or she will have a hard time insisting that his or her own personal religion is the one true religion, the only correct interpretation, and that people who believe otherwise are apostates, infidels, wicked, vicious, and worthy of sudden death.

Tribalism seems to be part of our basic human heritage. But to overcome it in our search for something better, truer, more real, and more universal would also seem to be part of human nature.

1. Our condo complex seems to be a landing spot for people emigrating from societies in distress and arriving in this country, because we are located in an attractive Bay Area community with a good school district. During the 1980s and ’90s, we had an influx of people from Iran and Lebanon, just as now we have Russians and Chinese. It makes life interesting.

2. Exodus 22:18.

3. In this, I am reminded of the legends of Britain’s early war chieftain, King Arthur. For a legend arising from the fifth century, it has such well-defined and specific details: the passionate attraction between Uther Pendragon and Ygraine of Cornwall, resulting in the illegitimate and orphaned Arthur; the royal sword Excalibur embedded in the stone by the wizard Merlin; the betrayals of the half-fairy Morgana and her illegitimate son Mordred; the search for the Holy Grail by the Knights of the Round Table; and the illicit love affair between Queen Guinevere and Sir Lancelot, which sundered the perfect kingdom of Camelot. It’s a rich tale that seems too culturally modern to have come down from barbarian Britain in the early years after Rome’s legions had packed up and left. And of course, it’s a tale that has been adapted and extended in the telling, both in English and French legend, reaching a final form with Thomas Malory’s Le Morte d’Arthur in the fifteenth century, further refined by T. H. White’s The Once and Future King in the twentieth century, adapted again in Mary Stewart’s The Crystal Cave series, and put on Broadway as Camelot by Lerner and Loewe. This sort of quirky specificity comes from telling and retelling the culture’s favorite story over the ages. I’m sure the same sort of expansion and enrichment happened to Homer’s Iliad and helped shape most of the stories in the Judeo-Christian Bible.

Sunday, April 1, 2018

Freedom and Compliance

Minotaur vase

When I was growing up, one of the core values of this country was freedom. We celebrated it as a citizen’s natural right, embodied in the Bill of Rights, the first ten amendments to the U.S. Constitution. While the Constitution itself is mostly procedural, describing how the government will operate, the Bill of Rights describes those rights that remain with the citizen and those actions that the government is forbidden to perform.

It’s not hard to understand why my generation would value freedom. Our fathers had just fought a long and bitter war against cultures that seemed to celebrate tyranny over the individual. And our generation would be fighting—in spirit if not on an actual battlefield—a cold war against oppressive cultures that wanted to force their values on the rest of the world. Freedom of individual conscience was considered a uniquely American value, but it was one derived from the long history of democratic government in Western Civilization, going back to the Greeks and Romans.1

But it’s a wonder that in just two generations, the forty years or so since I was a young man, the core value has shifted from freedom, which is now considered dangerous, to compliance, which is considered safe.

“Compliance” in this sense means that the individual is supposed to surrender his or her personal choices about thought and action to the direction and approval of the group. School campuses have—and the students there loudly and actively enforce—speech codes intended to prevent causing others distress or offense. They operate a separate system of justice regulating interactions between the sexes, or rather, governing how males may touch, communicate with, or even look at the females around them. They promote gun-free zones, of course, but the surrender goes even further: a student is not supposed to defend him- or herself against any physical or verbal attack but instead seek protection and arbitration from the school’s overriding authority.

And none of this—the limits on speech, action, and defense—is left to the perception, conscience, and direction of the individual. My parents raised me not to give intentional or unintentional offense to people, to think before I spoke, and to be mindful of the feelings of those around me. They taught me to be a gentleman in my dealings with women, to treat them with respect, and not to pressure or bully them with unwanted attention. They taught me to stand up for myself, to stand my ground in an attack, and to meet force with necessary force. But none of those precepts would be adequate on today’s campus.

The issue is that compliance with codes of speech and action is not self-directed. It matters not at all what I as an individual might think would give offense to another person. Nor how much I might try to be a gentleman to the ladies around me. Nor that I hold a general live-and-let-live policy, believing that the freedom of others ends at the tip of my nose. Those would be my values, not those of the larger group.

These days, what is offensive or inappropriate is not left to common sense, good personal judgment, or individual perceptions of fairness. It is governed by a narrow segment of the local society: those with the most anxiety, anger, or insecurity. And where individuals in that delicate class may be too timid to speak up, the dictums are prescribed by a political viewpoint that has a built-in bias favoring anyone who can claim a past history of oppression or present him- or herself as the underdog in the argument. Because such persons by nature tend to represent minority positions in American society—people of a race or cultural heritage not derived from the Caucasian or European, of an economic background not identified as middle class and above, or of a sexual orientation not heterosexual male or female2—this bias is dictated by and protective of a small segment of society and hostile to the larger portion of that society. The goal, whether intended or not, would seem to be making the larger part of our country feel outcast, isolated, and, in the terms of 1960s Transactional Analysis, “not okay.” As such, this is an attack on the uninvolved majority by the offended minority and—in some cases, such as the more convoluted sexual orientations—by statistical outliers.3

The existence of these codes of speech and action—and ultimately of thought—renders the individual powerless. Freedom is no longer the state of being able to evaluate and choose for oneself from a limitless variety of possibilities about what to say, do, and think. Freedom is now reduced to the meager opportunity to harm, to offend, to act out, and to violate the directives of the group. That is, to not comply.

Of course, freedom is messy. A society that lets people think for themselves will have a large fraction—perhaps even the majority—thinking and choosing wrongly and adopting absurd, ineffective, and provably false beliefs. A society where people can say anything they want will have a large number of disagreements, hurt feelings, and even a few fistfights. A society where people can do whatever they choose will have a large number of accidents and occasionally ruined lives. It takes resourcefulness, determination, skill, and guts to grow up and function in such a society. The survivor will develop a hard intellectual and emotional shell, a measure of distrust of his or her fellow human beings, and not a little cynicism.

The alternative is the beehive or the ant colony, where individuals function cooperatively, sedately, serenely, and safely because most choices have been removed from their lives and from their minds. It was the sort of society that the National Socialists and the Soviets tried to create and operate. These were societies where the only choices and opinions that mattered were those of the Führer, the Number One, the queen of the colony.

This is not a society worthy of fully functioning human beings.

1. Greeks, Romans, Western Europeans, French, and Britons—they all had their own versions of democracy. But most of them, at least at some stage in their histories, relied on the voice of the common man, the “strength of the people” (δημοκρατια or demokratia in Greek), to direct the government. In order for this voice to have meaning, the common man must have freedom of thought and action. Slaves can only support tyrants.

2. One of the great accomplishments of this political bias has been to conflate women as a group—who represent half of the population—with the minority position.

3. It matters not at all that, when parsed and defined too exhaustively, the sum of these minority positions generally constitutes a majority of the population. The proponents of this political bias tend to use the prefix “cis”—as in “cis-gendered”—in a pejorative way to mean “normal.” This is a political viewpoint that revels in the unconventional, the “other,” and the outré. It is designed, in the words of newspaperman Finley Peter Dunne, to “comfort the afflicted and afflict the comfortable.”

Sunday, March 25, 2018

Roads Not Taken

Gavel and lawbooks

When I was a young child just old enough to start having a personality and for people to begin talking about my future, the family thought that I would become a lawyer. In this, I would be following the path of my maternal grandfather, who at the time was the county judge in his rural Pennsylvania community. And my father’s sister had studied law before she found her true calling as a dog breeder.1 I think my father hoped at least one of his two boys would become an engineer, as he was a mechanical engineer and his own father a civil engineer. But I was verbally inclined and math impaired, while my brother seemed to be aimed at a life that was more philosophically and socially oriented, like becoming a psychiatrist or a minister.

This notion—it was never more than a far-off thought—that I would eventually study law persisted through my days in grade school and junior high school. I wasn’t wedded to it, didn’t think much about it, just accepted it as one possible future. And then, at about the age of twelve, I thought of writing a book. It was a wretched thing of four typewritten pages with only a beginning and no plot, which I then bound in cardboard-and-crayon covers.2 The writing bug bit lightly that time. But four years later I wrote a science fiction novel of 472 pages typed double-spaced, with carbon paper, after writing the first draft on a tablet with a fountain pen. It was a space opera with lots of battles and an historic love affair, and still a wretched thing.3 But by then the bug had hold of me, and thoughts of studying law disappeared into the cobwebs.

And yet, I wonder what my life would have been if I had majored in pre-law rather than English in college, gone on to law school if the money had been there, and started on the working life of a lawyer.

I think I might have been a good one. I have a proven facility with language, and I studied Latin in junior high and high school—a residue of early preparation for that possible career path. I learned through my coursework and the teaching of my parents to think logically, and I can follow complicated chains of reasoning—something that also helps when you’re plotting a novel that doesn’t suffer from lapses in causation and unanswered questions. I have a good memory for facts and can relate them in a meaningful way. And finally, I have the capacity for hard, sustained brain work, as shown by my years as an editor and writer, where I had to churn out masses of technically accurate copy on tight deadlines.

But I lacked some of the emotional or spiritual elements that a lawyer must have. First, I am not very competitive. Although I play wargames with my miniatures gaming group, winning is not that important to me; what I enjoy more is the camaraderie and social contact of play. And I can learn more from an intelligent loss, or from taking a risk that turned out badly, than I can from a lucky win. A good courtroom lawyer must want to win, must have a visceral need to convince a judge or jury of his or her own side of the argument, and must have an almost hormonal resistance to failure, even to giving up and letting go in the face of overwhelming loss.

Second, I am not much of a partisan. I do not naturally take one side against another but instead seek to understand what is true—that is, closely adhering to known facts—fair to all concerned, and right. Of course, “right” is a slippery concept, especially these days, but I use the word to mean consistent with the course of action that a rational and good-hearted person would take without personal bias and by adhering to a sense of conscience or honor. I will not knowingly cheat to win—although sometimes I might not fully understand the rules of the game and the etiquette of play. I won’t support my own team if my teammates cheat or bend the rules, and I feel a strong need to call out their infractions. In this, I am more interested in fairness and right action than in a certain specified outcome. But a good courtroom lawyer is supposed to adopt his or her client’s position and put aside notions of impartiality and fairness. I know that a lawyer is not, as a matter of ethics and good conscience, allowed to represent a client whom he or she believes to be guilty—or prosecute one known to be innocent—but sometimes an attorney paid to represent a side must close his or her eyes and mind to certain facts in order to defend a client or prosecute a case.

My respect for the truth—that adherence to facts—is based on my sense that the only things standing between me, my sanity, my soul, and utter chaos are my ability to see and interpret the current situation clearly, review the facts involved impartially, make a correct judgment in good conscience, and pursue the appropriate course with a strong will. If I start putting facts aside, succumbing to bias, or participating in a lie, then I will lose my mental and emotional bearings in the world. That way lies error, darkness, and confusion.

But not all lawyers work in courtrooms and do battle on such clear ground and against known opponents. Some work in a corporate capacity, researching cases and precedents, negotiating and writing contracts, filing and defending patents, and performing other work that is less adversarial. I might have been good at this form of practice, except that I have a visceral, almost hormonal, allegiance to clarity of thought and expression. As a writer, I care most for communicating the idea or concept behind an article or story in a way that the “intelligent, well-read layman” can understand. I look for a direct line through the reader’s doubts and internal objections to place that idea or concept inside his or her brain and make it come alive. I don’t set traps for the reader—not unless I am creating an intentional mystery or a surprise ending, which upon discovery is meant to be pleasurable for the reader, teasing his or her perceptions and providing the Aha! factor. After years of work, I have learned to create and manage evocative similes and metaphors, so that the reader can associate that new idea or concept with something he or she already knows and understands. I use supporting facts and arguments selectively and judiciously, rather than exhaustively, as if hoping that one will convince if another does not. And in the realm of science fiction and fantasy, my goal is to create an interesting plausibility, rather than a technically accurate but boring possibility.

This dedication to clear and effective communication can be the death of contract work, which strives instead to identify and address all the possible issues and contingencies that might hinder or harm the corporation’s interests. That sort of defensive thinking—while it does require a bit of imagination and intuition—is too narrow and not very interesting for me as a writer. I would rather be a maker of dreams than of mousetraps.

I think that if I had followed my early career plan, gone to law school, and tried to work as a lawyer, I might have been reasonably good at it. I could have learned to suppress my emotions and natural inclinations, as every adult must do at some stage in life and always in critical situations. But the effort would have gone against my grain. And at the end of my life—which I am now approaching—I believe I would have felt empty and dissatisfied, however much money I had made.

My career as a writer, editor, and communicator—and then as a creator of novel-length fiction—has not always been everything I dreamed of as a young man. But I feel that I’ve done good and honest work, for which I have been adequately paid. I haven’t had to hurt anyone along the way. And I have not been asked to lie—fail to adhere to known facts—except as any teller of stories and weaver of fictions tells lies in the service of a more important truth.

1. These days we tend to look down on the profession of dog breeding, because at one extreme we are warned against large, commercial “puppy mills” and, at the other end of the scale, we pity the ambitious family that owns a purebred animal and every year tries to sell a litter of puppies for a small profit. My aunt ran a boarding kennel outside of Philadelphia and teamed with a veterinarian on the site. She started by breeding cocker spaniels and then went on to poodles for show and sale. She had a “breeder’s eye” and could look at two candidate animals and anticipate their offspring. It was this skill that allowed our hunter-gatherer forebears to selectively modify wild animals and plants and create the new lifestyle of settled agriculture.

2. The essence of the idea was that a steam locomotive in the Old West had come out of a tunnel with everyone aboard dead. It was going to be a murder mystery—but I never could figure out what caused them all to die. Later, when I became an editor at a publisher of Western Americana, I learned that early railroad tunnels through the Sierra Nevada were so long that engine crews would become asphyxiated. This inspired the Southern Pacific to develop the cab-forward steam locomotive. But at the age of twelve I didn’t know I had identified an historical problem; I was looking for some kind of hijacking based on mass poisoning.

3. In writing, there are no overnight successes. Anyone who publishes a brilliant first novel has at least three attempted manuscripts in a drawer somewhere that never did—and never should—see the light of day. It takes time to learn this craft.

Sunday, March 18, 2018

Death Is All Around

Dead bird

The other day I was pruning back the two Ficus trees in the apartment. They tend to reach for the sun from nearby windows, which means that the branches further away lose their leaves and become dry and spindly. I had to cut back the dead branches and turn the pot a bit to even out the growth.

As I was doing this, I knew the tree’s response would be to send out new shoots from the clipped branches, fork and spread them, and continue growing. But as I made each cut, I also remembered a science fiction story from my childhood, about a man who invented an extremely sensitive microphone that could record the cries of a tree as it was being cut down or grass as it was mowed. Now, I know that in order to feel pain and cry out, an organism needs a central nervous system and a locus in the brain where pleasure and pain can be registered in consciousness. The best a tree or any other plant can do is drip some sap—a result of capillary action beginning in the roots—and let that stump of a branch wither for lack of leaves and photosynthetic nourishment. In this case, the branches were mostly dead already, although some of them had a small and inconveniently placed twig still sprouting foliage that brushed against the wall or the curtains, and I was simply performing some elementary bonsai for aesthetic purposes.

That old science fiction story, however, got me thinking. Even confirmed vegetarians—who lament the slaughter of cows and pigs for food, because of their death agonies—are comfortable with raising crops for their food value. They don’t rely on acorns that have already fallen from the tree, as Native American tribes in California did for their grain supply. And our farmers don’t pluck the ears of corn and then leave the plant standing to reproduce and sprout new ears next spring. No, we kill plants by the thousands of millions and never stop to think about their piteous cries, too subtle for our human ears to hear or for any of our instruments to register. The American farmer and all agriculturalists throughout history are wholesale merchants of death.

On this same morning, I was in the living room doing my exercise regimen when I noticed a mourning dove on the window ledge. We have doves all over the woods in back of the apartment complex, but this was the first time I had seen one so close. It was looking around, walking back and forth, and otherwise appeared nervous. Occasionally it would glance straight up into the sky. It occurred to me that we also have three red-tailed hawks living on our hill, and occasionally I find loose feathers, sometimes clumps of them, when walking in the woods. The dove might have been using the ledge to protect itself from attack from above, because the angle between the sill and the window glass was too narrow for a clean kill. Only when another dove flew down from the ledge of the apartment above mine, and the first dove took off to follow, did I realize that all of its walking back and forth had been spent waiting for some sign from its friend or mate.

While I was looking at the dove, I could see that it really was a work of natural art. Whether you believe in intelligent design by a living god or in blind creation through evolution adapting random mutations into systems that function perfectly in the real world, you have to marvel at that small package of life. It struts, it flies, it scans the skies, and it has relations with other doves. Those tiny, beadlike eyes can measure angles, calculate distances, recognize shapes, and perhaps even register some of the beauty of the world. This wondrous being begins with the fusion of sperm and egg, gestates in a shell, and is raised in a nest by parent doves; it learns to fly, finds seeds, grows to maturity, finds a mate—if it’s lucky—and eventually gets bounced from on high by a hawk and dies. All that articulation, all that recognition, that beating heart and bright eye—snuffed out in a minute by a chance encounter. Maybe some doves live to old age, get arthritis, and break their necks falling off a branch. Maybe some get cancer or another illness and die suffering in the long grass. But for most, it’s pounce and gone!

Some thirty years ago my wife brought home a box of books and papers from the library where she worked. As a result, we have developed a lingering infestation of silverfish. About every six months I will see one boldly scuttling across the hardwood floor, appearing like a moving scrap of dusty tinsel. I always smack them, and then I have a smear of paste and a scrap of damp paper. But even as I am killing it, I marvel at the articulation of this thing that is barely alive. It moves, it seeks light and shade, it knows that it’s being attacked when I miss, and sometimes it scrambles and flees successfully. It’s not the jeweled mechanism of an ant or a beetle, but it works on the same principles. I almost feel ashamed to kill them.

This planet is covered with life. I’ve said this before: everywhere you look that isn’t bare rock, dry sand, or blue sky is teeming with signs of life and the DNA that propels it. Some of this is easy to see, like forests filled with birds and deer or fields full of grasses, wildflowers, and burrowing rodents. Some you have to search out or imagine, like microbes living in the soil or plankton in the ocean. And some you can infer from its handiworks, like anthills, coral reefs, and skyscrapers. All of it is bursting with natural energy, all of it growing and reproducing. And all of it will die. Even the supposedly immortal jellyfish that are going the rounds on Facebook now will one day find a predator or a boat propeller and turn back into their component molecules.

Death is not the enemy. If none of this life ever died, by now the planet would be a hundred feet deep—or more—in struggling animals, the seas would be solid with fishes and the sky black with birds. Since all that is impossible, some natural mechanism would have intervened to end reproduction on this planet. Immortal creatures would live out their lives and never change. For more than three billion years that was indeed the state of things: bacteria growing and dividing, growing and dividing, never dying except by happenstance, and never much advancing. Then about five hundred million years ago, something happened and life exploded in thousands of multi-celled forms—the Cambrian Explosion, which laid the groundwork for what we have today. Since then, we have had periods of intense growth and diversification, followed by periodic extinction events that wipe the board clean and clear the way for life to go in a different direction.1

But all of it dies, every time. If it’s lucky, an organism gets to breed before it dies, and new life follows after it. And if it’s very lucky, that new generation carries mutations that might, just might, allow its progeny to survive when the climate or the food supply or the predator-prey balance changes and the organism’s progeny need to adapt. But that’s still a matter of chance. And for the organism that is alive right now, death is certain.

This is not a tragedy, not a thing to dread. Because death is all around, we know that the time right now is precious. Because life adapts and changes, we know we will never again see the exact mix of animals, plants, and even microbes that we can observe and catalog right now. This planet is alive because the things that make it interesting can die. And this is a blessing.2

1. Think of the extinction of the dinosaurs 65 million years ago, which cleared the path for the rise of mammals and, eventually, humans.

2. I didn’t intend this to be morbid, but it has been six months since my wife of forty wonderful years died, and I am still reverberating with the loss.

Sunday, March 11, 2018

The Meaning of Life, Again

Piano keyboard

I’ve written about this before,1 mostly from a scientific and technical perspective. Now, I’m thinking more along spiritual and/or philosophical lines.

Any question about the meaning of life is a product of our own brains, which appear to be alone among the animals—and that would suggest among all the other life forms on Earth—in having the capacity to think both abstractly and self-referentially. We can think about things that are not immediately in front of us, not cued by any sensory input or by our immediate life situation, and sometimes not even related to any of our experiences or memories, perhaps not even related to any other thing in the universe. And we can think about ourselves, examine our own motives, call up our memories at will, and even place ourselves and our personal reactions in imagined, hypothetical, and future situations.

We can live, second by second, in all three tenses—past, present, and future—and modify each of them with linguistic moods such as the subjunctive. So, instead of having to say, “I will do that,” our pan-temporal perspective allows us to say, “I would do that, if this other thing were to happen.” We can hold a thought that is concerned simultaneously with something that may occur in the future and with something hypothetical and contingent upon other factors that may or may not occur in the future. That’s pretty complex thinking. Animals don’t do this—not even our closest mammalian relatives—and that means the plants and protozoans probably don’t, either. Our thinking processes and our perspective are unique.

We humans seek a meaning to life because we are capable of thinking about and examining hypothetical alternatives. What was I like before I was born? What will I be and where will I go after I die? Why am I here? Am I living up to my personal potential? Will I ever achieve the dreams I had when I was young? Is what I’m doing now with my life important enough to satisfy the expectations of my family and friends? Will it satisfy the expectations of people I don’t know personally, the general public beyond my intimate circle, and future generations? Will I be remembered after a death that, although I don’t like to think about it, seems to be coming for everyone and may one day come for me?

Animals do not have these thoughts. All of these questions are based on hypothetical alternatives to what we can immediately sense and know. They are even outside the realm of what we can remember from past experience. My dog does not question her life. She can be disappointed if I must cut short her midday walk because I have to leave for an appointment, but after a few anxious tugs at the end of her leash and a reluctant turn toward the house—because she knows how far she wants to go right now, and that she’s being shortchanged—she finds new smells to investigate on our way to the door. By the time we’re in the hallway, her tail is up and wagging again.

Even a dog that is suffering base cruelty—whipped by an angry master, left out in the hard sun or the cold rain, shut in a small space without the society of its pack for hours or days at a time, or even starved—does not begin to question its existence. It may be depressed, with head drooping and tail down. It may feel that it has lost the love of its pack and its alpha—that formerly loving and now cruel master. The dog may assume that, as caresses and treats once came when it acted to please the alpha, it must now somehow have done something displeasing to deserve such hard treatment. A formerly loved dog that is maltreated or abandoned can recognize the change in its situation and react with confusion and despair. But even then, the dog will not ask why it was born into this life. And it will not commit suicide because life has become something different from what the dog once experienced.

Animals do not question their lives or their meaning. They do not feel they were born for a purpose; they simply live. If life has a meaning for animals—and plants and protozoans—it is written into their genes, which means it is part of the physical structure that organizes their brains—if they have any—and responds with innate drives keyed to their hormonal secretions. They eat because their stomachs are empty and chemical cues tell them they are hungry. They seek out sex—without thinking about its reproductive effects or future generations of posterity—because certain smells and pheromones stimulate their glands. They try to get out of a cage because they are used to open and familiar spaces, and the bars keep them from their known space. They resist a steel trap because the bite of the jaws is painful. And when death inevitably comes, they go quietly because they don’t think about alternatives.2

Socrates is supposed to have said at his trial, “The unexamined life is not worth living.” But Socrates was a philosopher and a human being. He lived in the dreamtime that all of us humans—and perhaps our closest primate relatives—inhabit. It is a realm of expectations and possible alternatives. It is a place that demands meaning. But life itself—as lived by every other animal, plant, and protozoan—is a chemical mystery without inherent meaning. It has its imperatives, of course: eat, move, reproduce, seek prey, evade predators, survive. But even these are unexamined premises for most of this world’s living things. They don’t have words for their drives, let alone think about them in the abstract.

We humans are the apex animal in terms of sensing, perceiving, appreciating, and examining the realms of both the abstract and our own existence. We are the first living thing in a heritage of almost four billion years—years occupied mostly by bacteria and other one-celled chemical machines—to ask that life have a meaning. We ask both from the broader perspective of the human species and from the narrow view of our own personal lives. And in both cases the answer seems to be, in the words of Colour Sergeant Bourne in the movie Zulu, “Because we’re here, lad. Nobody else. Just us.”

If life has no apparent meaning for any other species—it just is—that suggests we humans will have to make up a meaning for ourselves. If life has a purpose, other than the chemical imperatives, then we must create it.

Perhaps there really is an omniscient, omnipresent, and omnipotent God up in the sky, or somewhere beyond normal existence, who created the Heavens and the Earth and who invented life as a good idea among all that otherwise inert matter bound up in star stuff. This notion supplies a ready-made meaning. Or perhaps the simple belief in such a god—or in the nature of goodness and purpose themselves, as represented by such a belief—is enough to supply that meaning. Certainly, many people hold such beliefs and find meaning in them.

For the rest of us who don’t quite believe, and yet wonder what comes after the death that is surely awaiting us all, we are left with having to create our own meaning, both for the species and for ourselves. I tend to believe that the purpose of our big brains and their ability to sense, perceive, and wonder is to seek out and create that meaning. We are the next stage of evolution, and as the apparent inheritors of existence from all the inert matter and the non-thinking life forms in this star system, we have a duty to think up a good one.3

1. See The Meaning of Life from October 9, 2011.

2. When you take a terminally ailing dog to the vet to be put down—as we have had to do a couple of times now—it will shiver and shake. But that is not because it fears death. The animal reacts that way because the veterinary office is generally a place of painful pokes and pinches, and it smells of other fearful animals. Also, the dog senses the sorrow of its master and knows that this trip is somehow different from all others. Different is hormonally dangerous for an animal.

3. There’s a story I’ve read that says Douglas Adams, author of The Hitchhiker’s Guide to the Galaxy, had early experience as a computer programmer. In the novel, when the supercomputer Deep Thought responds to the question about “life, the universe, and everything” with the answer “42,” this is not just random nonsense. In ASCII (American Standard Code for Information Interchange) encoding, the number 42 stands for the asterisk (*), and that symbol is used as a wildcard in queries and sorts. So the Deep Thought answer was computer shorthand for “Whatever you want.”
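For the curious, here is a minimal sketch in Python (my choice of language for illustration; the original story names no particular system) that checks the two facts behind the joke: character code 42 really is the asterisk, and the asterisk really does act as a match-anything wildcard in filename-style patterns.

import fnmatch

print(chr(42))    # prints "*", the character at ASCII code 42
print(ord("*"))   # prints 42, the code for the asterisk
# In glob-style patterns, "*" matches any string at all: "whatever you want."
print(fnmatch.fnmatch("life, the universe, and everything", "*"))  # True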