Sunday, May 20, 2018

The Moon and Beyond

Full Moon

We humans are a migrant species—at least most of us. Out of Africa, across the world. There and back again. We have itchy feet and restless natures. That’s what comes of having a big brain, inventive ideas, and a general dissatisfaction with the status quo.

Of course, over the ages pockets of people have settled down and remained content. Consider the West Africans at the dawn of humanity, who found rich valleys around the Congo and Volta rivers and did not follow the rest of humankind out of East Africa’s stark Rift Valley and into the wider world. And since the dawn of agriculture we have seen the rise of various empires based on water or some other natural resource: the Mesopotamians, the Egyptians, the Qin and Han Dynasties in China, the Mayans and the Incas in the Americas. Once you build infrastructure around a resource, like an irrigation system beside a broad river in a fertile plain, some people will stay put to harvest and use it.

But for the most part, humanity has been on the move ever since we learned to walk. We are one species that has adapted itself, through its brains, muscles, imagination, and courage, to environments as difficult and varied as desert oases, rainforest jungles, and Arctic permafrost.

The story of Europe—to take just a small corner of the globe—has been one of successive overruns from outside. From back before the beginning of recorded history, we have tantalizing pockets of unrelated languages in the continent’s far corners: Finno-Ugric in the far north, the Ural Mountains, and the Hungarian Plain; Pictish at the northern end of the British Isles; and Basque in the northern mountains of the Iberian Peninsula, in the awkward corner between France and Spain. These are mostly places out of the way of regular migration routes. For the rest of Europe—and strangely, parts of northern India—we find a common root language, Indo-European, which is the father of the Norse, Germanic, Greek, and Romance languages.

I attribute this spread of common language to what I call a “people pump” operating out of the Caucasus Mountains. For ages since antiquity it has fed restless groups of people north onto the steppes. There they got up on horses and rode west into Europe and east into the Indus and Ganges valleys. The history of the Greek peninsula and Asia Minor, or modern Turkey, is the story of invasion by the Dorians, Ionians, and the mysterious Sea Peoples, who got moving about the time of the Trojan War. The story of the Mediterranean as a whole is the movement west by Phoenicians, Greeks, and perhaps those misplaced Trojans, who fetched up in Etruscan Italy to become Romans. While the Romans were building their empire, the Celts crossed from Turkey into Austria and progressed through Germany and northern France into Britain. And as the Romans were losing their empire, the Goths and Vandals moved out of the Baltic region and Poland to pass through southern France and Spain and sack Rome itself. The story of the British Isles is the invasion of Celtic lands by Frisians and Saxons, Danes, and finally by those Vikings who had settled in Normandy, became Frenchmen themselves, and then went off north to conquer England.

Europe is a restless place. The movements appeared to subside in the Dark Ages after the collapse of Rome, and it looked like people were finally settling down. But then the art of building seaworthy ships—thanks in large part to the Vikings—caught up with people’s yearning to travel, and Europeans braved the Atlantic Ocean starting in the 15th century. Da Gama went south around Africa to find a route to India and its riches. Magellan went south around the tip of South America to find a route to Asia. And Columbus, funded by the Spanish crown, sailed due west and discovered the richest prize of all.1

And the migration has continued ever since. Millions of Europeans have left the Old World for the New one across the Atlantic Ocean, starting almost as soon as the first colonies were established in the 16th century. And in later centuries they “discovered” and occupied large parts of Africa, India, and Australia and built enclaves and empires throughout the old, established empires of Asia.

But that doesn’t mean the rest of the world is full of pleasant, peaceable homebodies. The story of China has been one of repeated invasions from the north—the whole purpose of their Great Wall. And their Mongol neighbors conquered and briefly held the largest land empire in history. The Arabs followed the instructions of their Prophet and invaded Europe through North Africa and Spain, and through the Balkans up to Vienna. They moved into Central Asia along the Silk Road and entered India. Everybody steps on their neighbors at some point. In the 17th and 18th centuries, the Iroquois of what would become Upstate New York fought the Hurons and Algonquians. And before that the Aztecs tried to conquer the Tlaxcalans, among other groups, in modern-day Mexico. Everybody invades. Everybody fights.

What does all this have to do with the Moon? Simply that we are a restless people by nature. When one place becomes too settled, too predictable, too bound by property rights and rules, too hemmed in with political alliances and charitable organizations, a certain percentage of the people are going to rebel. Some will opt for revolution and social upheaval, but many will just light out for the new territory, the next frontier, the land beyond the mountains.

In the 1960s, we Americans went to the Moon. It was the capstone of a space program begun in the Eisenhower Administration as a response to Russian rocketry and then promoted by President John F. Kennedy—“not because they are easy, but because they are hard.” The Apollo Program was a science experiment, a seed crystal for developing new technologies focused on outer space. In that sense, it was not a migration or colonization effort. It was in the nature of da Gama’s and Magellan’s voyages: go there, prove it can be done, come back.

Since then, we have sent robot probes all around the Solar System and even out beyond the heliopause to interstellar space. We have focused our human presence and efforts on science experiments and scientific and commercial satellites in Earth orbit. But most people, at least in the developed countries, believe we will go back to the Moon and travel to Mars—not just as an experiment or to gather data, but to colonize.

I am one of those people. Whether it’s a government program or a venture funded by private entrepreneurs like SpaceX and Virgin Galactic, and whether it’s a base on the Moon or a colony on Mars, those are details. The Moon is nearby and completely airless, washed by the harsh radiation of the solar wind. Mars is farther away and has more available resources, including an atmosphere rich in carbon dioxide2 and possibly water in the form of ice, but it still gets a hard blast of radiation because Mars’s core is dead and no longer generating a magnetic field.

Either choice will be hard and will launch us on a new wave of technological discovery. Given the logistics and the ambient environment associated with either place, it would be easier to build a five-star hotel with an Olympic-sized swimming pool on the peak of Mount Everest—or, say, at Camp 4 on the South Col of the mountain, which approaches the “death zone” and its lack of breathable oxygen. Or you could build the same resort 500 meters (1,640 feet) down in the Red Sea. That would probably be easier, because years of submarine building have taught us how to handle water pressure at those depths.
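
For a rough sense of the pressure at that depth, here is a back-of-envelope hydrostatic calculation, assuming a seawater density of about 1,025 kilograms per cubic meter:

```latex
% Hydrostatic pressure 500 m down, assuming a seawater density of roughly 1,025 kg/m^3
P = \rho g h \approx 1025\ \mathrm{kg/m^3} \times 9.81\ \mathrm{m/s^2} \times 500\ \mathrm{m}
  \approx 5.0\times10^{6}\ \mathrm{Pa} \approx 50\ \mathrm{atm}
```

Call it fifty times the air pressure at the surface bearing on the structure, which is exactly the kind of load submarine designers have long since learned to handle.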

But we will go, if not in this century, then in the next. Once we were a land-wandering people who only looked out on the deep blue with longing, until we acquired the technology to cross the oceans. Now we are an ocean-faring people—a people who routinely fly over the ocean’s vast barrier—who look at the deep black among the stars with longing.

One day, we will go there. And then it will be easy.

1. Except for the Vikings, who had ventured out long before and discovered and settled Iceland, Greenland, and—so rumor has it—Newfoundland. Of course, the greatest migration into the Americas came at the end of the last Ice Age, when Siberian hunters crossed the land bridge that is now the Bering Strait and flooded both the northern and southern continents.

2. Mars’s atmosphere, however, with a pressure less than one percent that of Earth’s, would qualify as a good laboratory vacuum with trace gases.

Sunday, May 13, 2018

The Original Jedi Mind Trick

Volcanic opening

Supposedly, in the Star Wars universe, the Jedi knights could control the thoughts and perceptions of other people in order to slip through the world without conflict or incident: “These are not the droids you’re looking for.” Whether they used telepathy or simply changed the appearance of the world and the other person’s apprehension of it—rippling the Force to their own advantage—it was a neat trick.

My parents taught me something similar, except it didn’t work on other people. It was a form of mind control directed at yourself. This is nothing new or exotic: we see posters all the time, more than ever on social media like Facebook, advising that you can’t change what happens to you, but you can change how you feel about and react to it. Like the Jedi Mind Trick, it’s a Zen thing.

A story from Zen Flesh Zen Bones concerns two monks walking down the sidewalk in the rain. They come to a corner where a beautiful geisha in her fine silk kimono is dithering about having to cross the muddy street. The older monk says, “Come on, darling,” picks her up, and carries her across. This horrifies the younger monk, who fumes about it as they walk along the next block. Finally, he cannot contain himself. “You know we’re not supposed to have anything to do with women, let alone geishas. Yet you handled her in a very familiar way.” The old monk turns to him in surprise. “Are you still carrying her? I put her down back at the corner.”

The world may exist in itself—objective data and incidents do exist outside your field of perception—but how you perceive it, what you make of it, and how it affects you is the Jedi Mind Trick. You can stare into the open caldera of an active volcano, or walk the steaming lava fields of Kilauea, fear fire and death, and become paralyzed. Or you can experience these things and see their wonder and beauty. Your response shapes the world.

When I worked in the Kaiser organization, one of the many stories about its founder, Henry J. Kaiser, came from the end of World War II. He heard at a dinner party that the U.S. government was putting up for sale some aluminum smelters it had built along the Columbia River to supply metal for manufacturing aircraft as part of the war effort. The war was over and the smelters were being sold as surplus. Now Kaiser knew nothing about aluminum. But when he got home that night he called his vice president in the iron and steel business, Tom Price, and asked for a report on the aluminum business. Kaiser wanted it on his desk by eight o’clock the next morning.

Price didn’t know anything about aluminum, either. So, according to the story, he went to his children’s encyclopedia and read up on mining bauxite (which is just a form of dirt that concentrates common aluminum-bearing minerals), then on chemically processing that dirt into pure aluminum oxide powder (Al2O3, also known as alumina), and on electrolytically smelting that powder into aluminum metal. Clearly, just owning the smelters was not the whole business; you needed facilities in two or three areas. For example, the smelters had to be near a ready source of electricity, which the dams of the Columbia River were already supplying, while the mines might be a continent away on ground rich in bauxite, and the chemical plants could be anywhere in between where it was profitable to operate them. Tom Price copied this all down in a couple of handwritten pages. Kaiser read them and bought the smelters the next morning.
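
The chemistry Price would have found in any encyclopedia boils down to two standard industrial steps; the sketch below is general textbook chemistry, not anything specific to the Kaiser story:

```latex
% Bayer process (simplified): bauxite digested in hot caustic soda; alumina recovered
% by precipitation and calcination
\mathrm{Al_2O_3\cdot xH_2O}\ (\text{bauxite})
  \;\xrightarrow{\;\mathrm{NaOH},\ \text{heat}\;}\; \mathrm{NaAlO_2}
  \;\xrightarrow{\;\text{precipitate, calcine}\;}\; \mathrm{Al_2O_3}\ (\text{alumina})

% Hall-Heroult smelting: alumina dissolved in molten cryolite and electrolyzed
% with carbon anodes, which are consumed in the reaction
2\,\mathrm{Al_2O_3} + 3\,\mathrm{C}
  \;\xrightarrow{\;\text{electrolysis}\;}\; 4\,\mathrm{Al} + 3\,\mathrm{CO_2}
```

The electrolysis step is the power-hungry one, which is why the smelters sat beside the Columbia River dams while the bauxite mines could be half a world away.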

The difference between what the government was going to let go as surplus and what Kaiser wanted to buy as the core of his new business was vision. They were the same smelters either way. But the government was through building airplanes for the war, didn’t want to pay to run the smelters anymore, and was willing to let them go for scrap. Kaiser saw how this lightweight but strong metal had served in one application—becoming fighters and bombers—and was willing to bet that it could be useful in any number of other applications, from lawn furniture to house siding to soda cans and the trays for TV dinners. He wasn’t the only one to see this business, but he saw the opportunity and was willing to act on it fast.1

Henry Kaiser had a positive outlook on life. When he was in the cement business, he wanted to paint his trucks pink, even when other officers in his company suggested a more sedate gray-and-green pattern. “Pink is a happy color,” Kaiser responded. People also said that his negotiating style was that of the “happy elephant”: when confronted with opposition, he would just lean and smile, lean and smile, until he got his way. That man understood the Jedi Mind Trick; it just took longer than waving your fingers and speaking in a reassuring voice.

Another aspect of the Mind Trick is not letting personal hurts matter to you. A scene in the movie Lawrence of Arabia has Lawrence demonstrate to some young officers how he puts out a match with his fingertips. When one of the others tries it, he exclaims, “It damn well hurts!” Lawrence smiles and replies, “The trick, William Potter, is not minding that it hurts.”

The world is full of burning matches and a lot worse. One is reminded of Hamlet’s “slings and arrows of outrageous fortune.” As fully functioning human beings, we can either dwell upon them, take offense, file a grievance, and nurse a grudge,2 or we can accept that being alive in the world comes with an infinite number of bumps and stings, hard looks and rude responses, and we can let them roll off as if we were personally coated in Teflon.

And when we die, as we all must, we can look back on that life as we pass out of this world. Whether you believe that you will go to some mystical place elsewhere, a heaven or hell, or that you will simply go out, like Buddha’s candle flame or Lawrence’s match, you can bet that you yourself will definitely be beyond caring, and probably beyond even knowing, what effect you had in life and whether it was positive or negative. In that situation, you can either treat your life as you live it here and now in this world as a futile waste, just one more surplus human being taking up space and consuming value, like those government smelters, or you can see the same sort of opportunities for a better future that Henry J. Kaiser saw all around him. You can make your space in the world as big and happy, as pink and elephantlike, as your imagination allows.

The trick, as Lawrence would say, is not seeing fiery death in the volcano but seeing the beauty of nature that surrounds you. The rest is simply walking the path that you see.

1. Another Kaiser venture after the war, when he tried to turn the business of making Jeeps into a car company to go up against General Motors, Ford, and Chrysler, didn’t work out so well. But then, Kaiser also knew the motto of every venture capitalist: “You pay your money and you take your chance.”

2. Or to quote the painter Paul Gauguin: “Life being what it is, one dreams of revenge—and has to content oneself with dreaming.”

Sunday, May 6, 2018

Biological Nanotech

Algae making biofuel

I can remember, oh, twenty years ago and maybe more, seeing on television the microscopic image of what was supposed to be the world’s smallest electric motor. It showed a rotor that had been cut—inexactly, so that it was not a perfectly round circle—from some kind of metal. It spun—not fast and not smoothly—against a stator plate made of some other metal. It wasn’t good for much more than the gee-whiz factor, but it was a motor smaller than, say, the period at the end of this sentence. That motor was probably the beginning of humankind’s dreams of nanotechnology.

The world of tiny motors has gotten a lot smaller since then. What is now supposed to be the smallest on record is a slim fraction of the width of a human hair, and the current effort is supposed to have a rotor that is just one molecule. Not a material one molecule wide or thick or high, but the whole rotor is composed of a single molecule. That makes the manufacturing process more a matter of chemistry than metalwork.

The idea behind nanotechnology is to design machines that work at the submicroscopic level, down at the scale of micrometers (millionths of a meter) and more likely nanometers (billionths of a meter).1 At the nano level, we’re not just talking about active dust—more like tiny mites compared to which dust is a boulder the size of a house. What any of these machines might do is in the nature of “If you build it, someone will find a use for it.” And that may be why the whole enterprise has been so slow to start: it is a world of theory looking for a purpose, rather than, as Henry J. Kaiser used to say, “Find a need and fill it.”

One thing is certain, though: nobody is going to build just one of these nanites or nanobes or whatever you call them and expect to accomplish much of anything. One of them would be a technological wonder, which might be examined with a scanning or tunneling electron microscope, applauded, and then dismissed with a shrug. To achieve any real effect at the quantum level—which these machines are approaching—you have to make and launch thousands or rather millions of them and then rely on statistical measurement to observe their effects. That is, however many of the nanites or nanobes you make, a certain percentage will be defective and not work at all; a larger percentage will technically work but may never find the “shop floor” on which they are supposed to operate; and an even larger percentage will work for a while and then hit an air pocket or a vacuole or some other dry spot or barrier and wander away. This is like counting the number of molecules of acetylsalicylic acid in an aspirin tablet and asking how many of them you actually need to relieve a headache: as many of them as find the right nerves.
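
As a toy illustration of that statistical argument, here is a minimal sketch in Python; the batch size and the loss fractions are made-up numbers for illustration only:

```python
# Toy yield estimate for a swarm of hypothetical nanomachines.
# Every number here is an illustrative assumption, not a measured value.

total_built = 10_000_000       # nanites fabricated in one batch (assumed)
frac_defective = 0.10          # never work at all (assumed)
frac_never_arrive = 0.30       # work, but never find the "shop floor" (assumed)
frac_wander_off = 0.40         # work for a while, then stall or drift away (assumed)

working_fraction = 1.0 - (frac_defective + frac_never_arrive + frac_wander_off)
effective = total_built * working_fraction

print(f"Nanites doing useful work at any given time: {effective:,.0f} "
      f"({working_fraction:.0%} of the batch)")
# With these made-up fractions, only about 2,000,000 of the 10,000,000 built
# are ever on the job; the effect you measure is the aggregate of that surviving fraction.
```

The particular numbers do not matter; the shape of the argument does: the work gets done by whatever fraction survives, and you only ever observe its combined, statistical effect.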

On this basis, with the machines so tiny and their singular effects so negligible, I can’t imagine that anyone is seriously going to try making them using the traditional methods of materials processing. That is, nobody will be buying raw materials, molding and cutting individual pieces and parts (like that tiny metal motor), and then assembling these components in the same way Ford puts together the chassis, engine, wheels, and doors to make an automobile in Dearborn. Nobody is going to drop a molecular motor into a molecular framework—not even with a tiny molecular eyedropper2—and hook it up to molecular axles and wheels.

Down among the microbes and the nanobes, you have to stop thinking of this technology as some kind of machine. You have to treat it as a life form. Why would you try to design and fabricate metal wires, springs, and motors, manually pack them into tiny plastic shells and metal frames, and hope to have everything work at the molecular level, the nano-scale? It would be so much simpler to program these components in DNA and grow electro-chemical control circuits with actual nerves, achieve motor function with the elastic expansions and compressions of muscle fibers and proteins, and house everything in shells made of cellulose or keratin and frames made of calcium.3

When I was working at the biotech company, I heard about Craig Venter sending his 95-foot sloop Sorcerer II around the Sargasso Sea, then the Baltic and the Mediterranean seas, to sample the world’s oceans. He wasn’t looking for new sea creatures, although his team did discover that what we normally think of as isolated plankton species are usually whole genera that evolve and change every twenty miles or so. No, he was looking for novel proteins, in novel combinations, and with novel functions, along with the DNA genes and promoters that would code for them. His idea was to find ways to change the life-cycle, the operation, and the metabolic inputs and outputs of existing microbes to make them more useful to human beings.

For example, adding the right set of new genes might give algae a way to turn their photosynthetic processes to making lipids—fatty liquids with properties similar to crude oil—and then secrete them through their cell walls, so that each cell can go on making this oil substitute without becoming engorged and either stopping production or exploding. Such an algae cell—or a whole pond full of billions of them—would lie there in the sunlight and produce a form of oil that could be siphoned off the surface and refined to make gasoline. And then, with a bit of chemical tinkering, the cell might even be coaxed into making gasoline itself, if the stuff weren’t so toxic.

Of course, these would be cells that have been modified with the DNA sequences, proteins, and functional relationships between proteins that are already present in nature.

All such DNA is currently purposed to design and repair living creatures. Any adaptations either serve to improve the living body or else they become discarded over time—immediately if they are lethal to necessary functions. But the principles of coding and self-assembly might easily be adapted to small machines that operate in the submicroscopic environment, like single-celled creatures, for other purposes, ones designed by human beings. It would, after all, be easier to grow a microprocessor as a network of neurons than to etch one in silicon at the nanoscale, install it inside a mechanism, and wire it into sensory and motor systems.

Purposefully designing DNA to create new nanomachines might even employ metals and other materials we don’t currently think of as organic. For example, the epithelial cells in the mammalian jaw that form tooth buds secrete a mineral called hydroxyapatite, a crystalline form of calcium phosphate, which becomes the enamel surface of our teeth. Enamel is the hardest substance in the human body, and it contains the highest percentage of minerals. With a bit of chemical tinkering, such cells might be taught to absorb—from the managed environment of a bioreactor—and secrete other minerals and compounds. A pure structure or surface of, say, vanadium steel is not likely, or not at first. But hard parts made of bonelike and stonelike materials should be possible. And of course, making anything with polymers and resins, like plastics, should be a DNA-coding snap.

Nanomachines—or certainly micrometer-scale machines—might be made by groups of preprogrammed cells. Like tooth buds, or the embryos of living beings, they would form a cocoon of tissue that produced each part in place and then would be programmed to die and wash away,4 leaving the new micromachine in place and ready to operate.

And what would the new machine do? Well … it’s hardly likely we’ll need anything that small to pave roads or drive on them, or to manufacture complex machinery like automobiles or kitchen blenders. And we already have little cellular machines that can make usable oils and even drinkable beer and wine; they’re called seeds and yeasts. Complex little machines might be designed to repair the human body, or even repair and resurface the bodywork on your car. It will all depend on what you want.

Find a need and fill it.

1. Remember that a meter is just over a yard, 39.3701 inches. So a millionth of that length (a micrometer) is roughly the size of a small bacterium, and a billionth (a nanometer) spans only a few atoms.

2. For which the technical term in biochemistry is “pipette,” and those things can only be accurately calibrated at a scale of about a milliliter, or a thousandth of a liter—and a liter itself is about a quart.

3. Of course, one-celled animals already have chemical motors that can whip around in circles, powering flagella for their movement through the liquid medium; so even the electric motor can be replaced with a living example from the biological world. That might be the easiest part of the machine to design, because the prototype already exists in nature. But circular motion has limited use in the submicroscopic environment. We use round wheels at the human scale mostly to propel loads over level terrain, and when the going gets too rough we revert to horses or mules. Motive power from limbs articulated by mechanical joints and muscle fibers might be more useful in the world of the really tiny. Other wheel-like functions, such as the gears in clocks, can be achieved in other ways.

4. The technical term for “programmed cell death” is apoptosis, and the word for “wash away” is lysis, involving the chemical dissolution and destruction of the cell membrane and its contents.

Sunday, April 29, 2018

The Immediacy of Life

Motorcycle on a curve

I’ve been riding a motorcycle—on and off but more on than off—for forty-four years. It is a whole body experience, much more than driving a car or even riding a bicycle. A motorcycle is like Harry Potter’s broom, moving with the speed of thought and requiring you to gauge every curve, every obstacle in the road, the nearness of every car, the gap between fenders if you are splitting lanes, and a dozen other things that happen practically all at once.

One of the things motorcycle riding teaches you is to perceive and move now.1 When you are approaching a curve at fifty miles an hour, and your balance—your very life—depends on making the correct adjustments to throttle speed, front and rear braking, handlebar position and force inputs, distribution of your body weight, choice of apex and your route through the curve to stay in your lane and avoid obstacles like gravel, potholes, and dead animals … you don’t have time to sit down and figure things out. In three seconds or less, you are going to be committed to making the maneuver. If you try to stop, jam on the brakes, or in any way reject the oncoming maneuver, you are going to be in a much worse place, with possibly catastrophic results, than if you miscalculate or misapply any of the above variables.

In such a position, you cannot wish that you were anywhere else, or that the choices before you were something other than what they are. You don’t have the luxury of time to decide. You don’t have the option of a do-over, or taking a mulligan, or apologizing to the gods of speed, balance, gravity, and tire adhesion. You are in the maneuver, and all you can do is face it, make your best approach, and ride it through.

This is one of the things that makes motorcycle riding so exhilarating, so much more a confirmation of life and skill than driving a car or riding a bicycle. You are in the moment, your life is on the line—figuratively and literally—and you have no choice but to succeed. If you stopped and thought about it, paused to dwell upon the consequences of a bad maneuver in terms of broken bones and abraded skin, if not concussion and death, you would probably not be there. But since you are there, you must put those thoughts and fears aside, swallow your heart, and ride it through. The moment is now, and the action is irretrievable.

There must be other experiences that match this: perhaps landing an airplane, hitting a fastball, throwing a long pass, or blocking a punch. These actions all have points in time that require you to commit to your skill and follow through. But even in these moments, you know that you have options. The airplane pilot can always apply power at the last second before the wheels touch down, fly out of a bad landing, and go around for another pass. The hitter with fewer than two strikes can usually let the fastball go by and wait for another pitch that might be better. The quarterback can fake the pass and try a running play. The fighter can duck his head or even let the blow land, because a punch to the face is not likely to be lethal.

But life itself offers us many immediate and irredeemable moments. These are sudden choices, like rounding a curve at speed on a motorcycle, like facing a collapsing bulkhead in Lord Jim, where we have only a second or two to make a decision, with consequences that may affect and alter the rest of our lives, and then ride it through.

In the moment that you are offered a deal—like landing a new job that will require you to relocate, or a contract whose fulfillment will call for more effort or time or money than you were prepared to give, or a new house that is more space and upkeep than you planned for, or a mortgage that is more in monthly payments than your budget calls for—you have a sudden decision to make. Sometimes you can say, “Let me think about it,” and go away to discuss the situation with your business partner, your family, your wife. And you believe you can trust that, when the offering party says, “Okay, you think about it,” the parameters of the deal, or the deal itself, won’t change while you’re doing your thinking.

This is the “fish or cut bait” moment. This is the point of commitment. And an honorable person knows that if he or she commits to the deal, there can be no backing out. A person’s word—even if not backed up by oaths—is a bond; a signature is binding, and so is a handshake. The coward or the games player thinks of after-the-fact alternatives: that if things don’t work out, he or she can always get a lawyer, invoke an escape clause, annul the marriage, or just walk away from the obligations and the payments and let a court sort out the breakage. But the honorable person doesn’t have such thoughts. The commitment of a moment, made with a lifetime of careful planning and measuring, is binding for the term of the agreement—or until death do us part.

In this I am reminded of the assessment of managerial style by the majordomo Moneo Atreides in Frank Herbert’s God Emperor of Dune: “The difference between a good administrator and a bad one is about five heartbeats. Good administrators make immediate choices. … They usually can be made to work. A bad administrator, on the other hand, hesitates, diddles around, asks for committees, for research and reports. Eventually, he acts in ways which create serious problems. … A bad administrator is more concerned with reports than with decisions. He wants the hard record which he can display as an excuse for his errors. … Oh, they depend on verbal orders. They never lie about what they’ve done if their verbal orders cause problems, and they surround themselves with people able to act wisely on the basis of verbal orders. Often, the most important piece of information is that something has gone wrong. Bad administrators hide their mistakes until it’s too late to make corrections.”

That is, the good administrator can make a decision, ride through the curve—and have the honorable intention to accept and deal with the consequences. A bad administrator cannot make a decision and will either crash or absent himself—wish himself elsewhere—at the moment of impact.

Life itself offers all of us such moments. How we deal with them is a true test of character.

1. See SIPRE as a Way of Life from March 13, 2011.

Sunday, April 22, 2018

Life as a Mad Scramble

Word pile

Because I had already written one novel in high school—not a good one, a derivative space opera, but still it was a complete story in 472 double-spaced pages—I determined to go into English literature as my major at the university. I believed that, when I got out, I would become a novelist, a fiction writer, in order to make my living.

Most English majors around me at the university wanted to become teachers, either at the high school or college level. To teach in grade school or high school, they would need teaching credits from the College of Education, as it was then called at my university. To teach at the university level, they would need to take a master’s degree and then a doctorate to become a full professor with tenure rights. But this matter of learning English literature in order to teach English literature seemed self-perpetuating to me, like learning Japanese swordmaking in order to teach others to make Japanese swords. So long as society felt a need for students and the adults they would become to have a grounding in the literature of their civilization—or an in-depth knowledge of the artistry of swordmaking—the process might be self-sustaining. But let the faith in either literature or historic arms fade, and the teaching paradigm collapses.

But I was on a different path, learning to write novels. I actually proposed as my senior thesis to write a work of fiction instead of some scholarly dissertation, and my advisor agreed to consider it. But when the time came, I didn’t have an idea for a long story or novella. And when it came time to graduate, when in my dreams I would step away from the university and begin writing the novels that would make my reputation and earn my living, I discovered I didn’t have much of anything to say. That one novel back in high school, the story I had been dying to tell, appeared to have stood alone in my mind.

On a feverish Saturday morning in the winter term before my graduation, I actually went across the street to the building that housed the School of Journalism and found a professor in his office. I described my situation and asked about the prospects for an English major in journalism. He was kind but also, of course, amused. In his world, learning to become a journalist took years of training and practice—and this was back in the day when print ruled supreme and reporting with the written word counted for everything, while the mechanics of radio and television were taught as an interesting sideline, and the internet had not even been invented or imagined. He suggested that I should have started back in my sophomore year, switched majors, and dedicated myself to learning the reporter’s trade.1

In the end, I had the good luck of having impressed two of my professors. One recommended me to an open position at the university press, and the other immediately told me to bring my first editing assignment over to his house where, after dinner, his wife gave me a crash course in copy editing and manuscript markup.2 I held that university press job for all of six months, until the economic downturn of the day hit home, the state university was squeezed for funding—in competition with the budget for plowing the state’s roads—and I was laid off two weeks before Christmas. But it was a start in the publishing world.

After being laid off, I came west to California, where my parents had established themselves the year before, and worked in my father’s business until I found a job editing trade books at a small press that specialized in railroad histories, western history, and Californiana. The job didn’t pay much, but it was interesting and served as a great introduction to my adopted state and a part of the country I knew only vaguely.3

From publishing, I went into technical editing at a local engineering and construction company, and that job turned into a position in their public relations department. And from there, I went to the local gas and electric utility, first as a technical writer and then in internal communications. After a stint of writing and publishing my first eight novels of science fiction—and getting caught up in the tidal wave that swept over the publishing world in the late 1980s and early ’90s (see The Future of Publishing: Welcome to Rome, 475 AD from September 9, 2012)—I went back to the working world at, first, a biotech company that made pharmaceuticals through recombinant DNA and, then, a manufacturer of genetic analysis instruments and reagents. In both cases, I started with technical writing and editing, then worked my way into internal communications, interviewing scientists and managers about new products and company issues.

At every step in this forty-year career, I had to scramble and reinvent myself. And I had to keep learning, asking questions, and absorbing new scientific and business knowledge along the way. But I never resorted to teaching professionally and I never once—in that old joke about English majors—had to ask, “Do you want fries with that?”

In my university days, I often hung out with engineering students. They would deride my English courses because, in their view, all I had to do was “bullshit” my way through a term paper, while they had to solve difficult equations and get the right number. Imagine their surprise, some years later, when I met the same kind of young engineers at work, and they were just discovering that “getting the right number” was the job of junior engineers. To advance to senior positions like project manager and vice president, they would have to write and speak well, entertain clients who were not all engineers or had grown beyond the engineering techniques they had learned so long ago, and understand a lot more about business and economics. That is, those engineering students would have benefited from a healthy dose of the humanities that any English major or liberal arts student was supposed to take along with his or her major. These talents last longer and go farther than the particular knowledge an engineer learns in school—although you have to keep up with the technical advances in any specialty.4

And when I got into the biotech world, I saw any number of postdocs who had devoted their academic lives to one submicroscopic scientific specialty. When they got into the working world, they were beginning to learn that, outside of academia or a government lab, they would need a lot more generalized understanding of both their area of science and the business principles that make a successful product function and sell. If they remained within their scientific “comfort zone,” they would limit themselves to just the one or two laboratories that did their own specialized line of inquiry.

Liberal arts education used to prepare generalists for work in management and government, where breadth of view, a knowledge of history and its patterns, and a familiarity with different viewpoints and ways of thinking—not to mention the common sense gained from this exposure—were valued more than any particular technical knowledge. Now they teach even social science and public policy as a specialty where students think they can safely ignore literature, art, and music. We are training a generation of termites for fitting into the narrow functions of the hive.

And in all of this, I am reminded of the quote from Robert A. Heinlein in Time Enough for Love: “A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.”

1. I have since learned a bit about journalism, because writing for an employee newsletter or magazine—which I had to do as my main job in internal communications—uses the forms and principles of professional reporting: the 5W lead (i.e., Who, What, Where, When, and Why, or sometimes How) and the inverted pyramid style (i.e., tell the most important facts at the top of the story and then gradually descend into more minute details—because readers may not stay with the story to the end and want to know they are not missing anything important if they stop now, and because editors are constantly repositioning the stories in their newspaper and want the convenience of shortening the text from the bottom up without having to call for a rewrite). An internal communicator, even though he or she is presenting a viewpoint congenial to the company’s interests, must also present the story objectively or risk losing the employee-reader’s trust. And then, articles in a magazine will of course have a different structure—with a traditional beginning, middle, and end—because they are not necessarily jostling for page space and the reader’s attention, time, and interest.

2. Twice since then I have paid the favor forward, taking people who had just been hired into the editing profession, showing them about copy marking and typography, and presenting them with their very own copy of The Chicago Manual of Style, the Bible of the book publishing world.

3. Because this publisher had started out as a job printer, they were one of only three houses in North America at the time that put all the elements of publishing under one roof. Right back of the office where I worked on manuscripts and read galley proofs, we had the linotype machines, which set the copy for all their books; the photography department, which made screened photo reductions and stripped the plates for the printing presses; the small four-color press for book jackets and the large sheet-fed Harris press for the interior pages; the bindery where the folios were folded, sewn, and bound; and the stock room and shipping department, where the books were stored, packaged, and sent out. Manuscripts and cartons of loose photos came in one door, and bound books went out the other. They even had a massive flat-bed letter press in the back, which printed directly from the cast type, one sheet at a time. Working there was a daily history in the civilization of the printed word: This is a Printing Office.

4. Except, perhaps, in the study of English literature itself. Since I left the university, the English Department there and everywhere has been taken over by critical analysis and the theory of deconstruction, which focuses more on the mechanics and limitations of language than the quality of literary expression. As I understand it—and I never studied the deconstructionist philosophy professionally—the approach would have you believe that no modern reader can fully understand, say, Shakespeare, because the language, the meaning of words, and the application of concepts have changed so much since the plays and sonnets were written. This would seem to undermine the idea that great literature presents us with themes and concepts that are universal and that resonate with human nature itself—and that’s a faith I tend to live by.

Sunday, April 15, 2018

Fake News

Total honesty

Back in the late 1960s I remember starting to hear people say that there is no such thing as absolute or objective truth. That truth could only be relative and conditional. That truth and the concept of knowing and believing something to be true applied differently in various spheres of knowledge and belief and among various peoples and societies.

Since this was also the time I was leaving the enclosed certainties of home, family, and high school and going into the wider world of the university, to this day I am not completely sure whether this variable approach to truth was a new concept in the world or just new to me in my suddenly wider reading among philosophies, religions, social theories, and the other grist processed in the mill of the humanities. The slippery nature of truth may long have been a tenet of some modern European and ancient Asian philosophies to which I was just opening my eyes.

But I’m willing to bet that, at the time, the disappearance of absolute or knowable truth was not generally accepted in the wider population. Certainly, our news reporters and commentators, the people who spoke to the public from a position of authority, acted as if the facts they were relating and analyzing descended from some knowable, supportable, and unchanging source. That they were speaking “the truth.”

But now, fifty years later, it seems that the acceptance of relative and conditional truth—as opposed to the absolute kind—has become common in the general population. Variable reality has become, in the modern usage, a “meme.”

Some of this change has been helped along by wider public knowledge of those ancient Asian philosophies. Zen Buddhism, for example, treats the world we can see, taste, smell, and touch as a sensory illusion that overlies a larger reality—or perhaps no reality at all. In the Hindu religion, out of which Buddhism originally arose as a protest against the endless cycles of return and rebirth, the world around us is maya, an illusion, a delusion, a magical trick of the demigods and not the underlying reality.

Some of the change has been adapted from the European existentialists, who taught that philosophic inquiry begins with the subjective, human viewpoint rather than from any universal, Platonic, or Aristotelian examination of an external reality. Later theorists, particularly in the areas of language and literature, brought in the concept of deconstruction, which attempted to separate the language and context of any communication from its meaning and the reader’s understanding. The assumptions of the religious, political, and emotional environment, the language, and the knowledge base of, for example, Shakespeare’s original readers and theatrical audiences are so interwoven and complex that modern readers cannot hope to thoroughly understand his work as it was intended and indeed as it is written.

And some of this rejection of absolute truth comes from a wider knowledge and acceptance of modern physics, particularly quantum mechanics. In the realm of the very small, where bits of matter sometimes act like bits of energy, exact knowledge becomes a chimera, a ghost of the old-style, mechanistic thinking of early physicists like Newton and Galileo. A quantum theorist knows that you can establish by observation either a particle’s position or its momentum and direction but not both, because to observe the one is to change the other. And, in dealing with such innumerable quantities of particles, the physicist relies on statistics and probability rather than trying to account for each photon, electron, or proton and neutron in an experiment. In the underlying philosophy of physics, reality is what you can actually detect and measure, not some presumed or imagined vision of what’s supposed to be happening.
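
That trade-off between knowing a particle’s position and knowing its momentum is Heisenberg’s uncertainty principle; for readers who want the formula, the standard statement is:

```latex
% Heisenberg uncertainty relation: the product of the uncertainties in position and
% momentum can never be smaller than half the reduced Planck constant
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

Pin down where the particle is, and you necessarily blur the measurement of where it is going, and vice versa.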

All of these influences, which started to become mainstream in the intellectual, political, and moral upheaval of the late 1960s, helped erode a public belief in some kind of absolute, definable, true-for-all-people-and-times, knowledge of “the truth.”

In this country, the mainstreaming of this slippery version of reality was helped along by a growing agnosticism, acceptance of public atheism, and outright hostility to established religion. It gradually became fine for a person to be “spiritual” in the manner of the Zen Buddhists1 or Hindu mystics, but it was old-fashioned and absurd to still subscribe to the bourgeois, stifling, close-minded teachings of the Judeo-Christian religions.

And finally, the verbal tactics from the left side of the political aisle were brought to bear in this change. The National Socialists’ use of the “lie big enough, told often enough” to change public perception caught the attention of the entire twentieth century. Add Lenin’s earlier directive, “A lie told often enough becomes the truth,” which embedded the practicality of manipulating the truth in every political tactician’s mind. Challenging the apolitical teachings of the church and creating your own version of reality, especially a version based on statistics and false premises, forever changed the public trust in some externally verified, universal, and absolute truth.

So long as the politicians and their allies in the newsrooms of the big daily newspapers, broadcast television, and the professional commentariat could limit the scope of discussion, it was relatively safe to allow the public to understand that truth was relative, conditional, and negotiable. It served to undermine the old assurances, the established viewpoints, and the life lessons that Rudyard Kipling called “The Gods of the Copybook Headings.” Undermining popular assumptions and creating new myths and legends are what politicians and professional opinion leaders do.

And the process worked, after its fashion, for about forty years. But then in the past ten years or so that funny little science-and-technology-sharing computer circuit called “the internet” sprang into full flower. Not only could every person on the planet—except maybe those in censorship-heavy societies, like the People’s Republic of China—have his or her own Facebook page and Instagram messaging account without doing much more than press a few buttons, but everyone who wanted to put in the effort could publish his or her own books and pamphlets in electronic and paperback form, put up a web page that had all the visual authority of any “legitimate” news source, create podcasts that sounded exactly like radio and television interviews, and go into the news business with the same tools as the big city dailies and broadcast journalists. Add to that the quality of video imaging through your average smartphone being equal or superior to the best handheld equipment of a decade ago—and a lot more available and present on the scene—and you have a dozen eyes and a hundred voices ready to report “the news.”

Then the meme of truth being relative and conditional began to backfire. The established media, with their highly paid teams of reporters, editors, and commentators, their worldwide news bureaus, and their tested and accepted political narrative suddenly had to compete with god-knows-who from god-knows-where. The “news” and the “truth” started coming out of the woodwork and flying around the room—around the internet—like a storm of bats exiting their cavern at dusk.

The paid media can cry “Fake news!” at all the controverting crosstalk of the benighted amateurs, and the million voices of the internet can cry “Fake news!” right back at them.

In this, I am not advocating for a return to an oligopoly of opinion, held by the magnates who own and run the largest printing presses and most expensive broadcast newsrooms. That cat is out of the bag and long gone by now.2 I am not pining for the established voice of religion, or the oldest political parties, or my grandfather’s choice of books in the family library. In a way, the little-d democracy of letting every person have a voice is refreshing. The choice of which truth to believe is now the responsibility of the average citizen. The story is not handed down by the party in power or its media moguls, but flutters on the wind like a thousand little birds. Each of us must use his or her best intelligence and widest reading of history, science, and sociology to determine where an acceptable truth might lie. As Gurney Halleck said in the movie version of Dune, “Now, guard yourself for true!”

But maybe it was a bad idea, in the first place, and for political reasons, to sell the public on the belief that truth was not absolute or discernible but subject to preference, opinion, and varying conditions. In the internet-wired society, you end up in the unfortunate position where my lie is just as good as your truth. But there we are.

1. Buddhism and its offshoot Zen are not actually mystical, mysterious, or even spiritual. Buddhism is a philosophy bounded by a practical approach to life and human existence. It is more of a psychological practice than a religion. But that’s just my “truth.”

2. Those national governments that think they can control their public’s opinions and perceptions by editing, filtering, and banning websites on the internet are playing a losing game. No matter how many censors you employ, the worldwide web will have a hundred times as many chattering sources who are well versed in employing euphemisms, creating metaphors, and hacking the code.

Sunday, April 8, 2018

One True Religion

Ancient of Days by William Blake

The popular image of Muslims these days is mostly that of raging fanatics who shout “Allahu Akbar!” while crashing trucks into crowds and blowing themselves and others up with suicide vests. I have no doubt that such people exist and that they are fanatical and dangerous. But the Muslims I have met in the West have been educated, reasonable, thoughtful, middle-class-seeming people.1

I’m thinking, in particular, of the husband of one of my supervisors at the biotech company. He was an Egyptian, formerly in diplomatic service, who was well-read and courteous, with a great sense of humor. He had no more interest in punishing unbelievers and throwing bombs than my grandmother. Of course, he was also from a well-to-do background. But in the matter of Islamic fanaticism, a person’s intelligence, breeding, and background are no guarantee of their gentility. Look at Osama bin Laden. Look at the young Saudis who could afford to take piloting lessons in order to fly jetliners into buildings.

In my experience, however, anyone who is well read and liberally educated tends to become more tolerant of other religions, not less. After studying the tenets of Judaism, Christianity, Islam, Buddhism, Hinduism, the ancient Greco-Roman pantheon, and the surviving forms of animism and pantheism, it becomes impossible for an enlightened person to insist that “my reading” of the scriptures, “my sacred book” itself, and the belief system “I was taught as a child” represent the one true religion and that all the rest are pernicious poppycock.

But I understand where and how many modern Muslims have acquired their fanaticism. About 1,300 years after Christ, Christianity went through its own antiheretical phase, where unbelievers and schismatics were bullied, tortured, and sometimes burned at the stake. Much of that rock-hard belief was inspired by external pressure from the previous expansions of Islam into Spain, France, and Eastern Europe, and some of it was a response to internal pressures from the early Reformationist impulses, which arose through political manipulations and frankly heretical tendencies.

Now, at roughly the same point, about 1,300 years after Muhammad, Islam is going through a similar phase, where apostates are executed and unbelievers are beheaded. Islam is responding to much the same set of pressures, too: an encroaching spirit of Western culture that started with the Crusades in the Middle Ages but has vastly accelerated with the blossoming of a worldwide political, commercial, and electronic cultural tide. And Islam has always had tension among its various sects—mostly between Sunni and Shi’a—which is now being pushed by political pressure from Iran against the rest of the Middle East.

I can only hope that most Muslims will come out on the other side of this turmoil and adopt the worldview of most of today’s western Christians: “I know the Bible says I must believe this and follow that—such as the injunction about ‘not suffering a witch to live’2—but … you know … I just believe in a loving God who wants the best for all of us.” In time, their Allah may likewise become a polite, country-club gentleman who makes no life-threatening demands and creates no unpleasant waves.

Among the things that a well-read, thoughtful person is likely to dismiss from the sacred scriptures are the fantastic stories, such as that the God of all creation walked in the Garden of Eden and hobnobbed with Adam and Eve, or that He spoke the only absolute truth for all the ages into the ear of Muhammad in his dreams, or that He had a human son with the Virgin Mary in order to sacrifice him to the world. Such a well-read—and doubting—person will tend to discount these stories as literal truth and instead accept them figuratively, as metaphors for God’s loving personal relationship with all men and women, both as individuals and as a group.

When you have read enough of other religions, the best you can hope for in terms of a relationship with the one true God is a kind of Vedantic liberalism, where the godhead—however defined—is simply the ultimate conscious reality, a metaphysical all-soul that encompasses all human thought and perception, and each of us is merely a part of this reality which has broken off, is now spending time in the changing physical cosmos which is composed of body and matter, and will one day return with newfound insights to rejoin and add to the whole. That’s a bleak and not very comforting view of the situation, but it works as a kind of bedrock reality that would explain the religious impulse in human beings that we don’t find in, say, spiders or great white sharks.

What I am groping towards here is uncertainty. A well-read and thoughtful person cannot believe that every religion—in all its specific and fine-grained details—represents the ultimate truth of the whole world, let alone a universe of a hundred billion galaxies.3 Oh, the details may be true, whole, and perfect in the mind of each believer, no doubt! But is it true for the world, or for every world? Is that one story actually the way the cosmos—and whatever metaphysics lies behind all that spinning star stuff—actually work?

A thoughtful person, having been exposed to different ideas and points of view, will have doubts about the nature of singular truth. Such a person may still hold to the principles of the religion he or she was taught as a child, delight in and believe as “mostly true” the stories in his or her particular sacred book, and try to follow the precepts and guidelines of its particular morality. This is part of any person’s acculturation. But he or she will have a hard time insisting that his or her own personal religion is the one true religion, the only correct interpretation, and that people who believe otherwise are apostates, infidels, wicked, vicious, and worthy of sudden death.

Tribalism seems to be part of our basic human heritage. But to overcome it in our search for something better, truer, more real, and more universal would also seem to be part of human nature.

1. Our condo complex seems to be a landing spot for people emigrating from societies in distress and arriving in this country, because we are located in an attractive Bay Area community with a good school district. During the 1980s and ’90s, we had an influx of people from Iran and Lebanon, as now we have Russians and Chinese. It makes life interesting.

2. Exodus 22:18.

3. In this, I am reminded of the legends of Britain’s early war chieftain, King Arthur. For a legend arising from the fifth century, it has such well-defined and specific details: the passionate attraction between Uther Pendragon and Ygraine of Cornwall, resulting in the illegitimate and orphaned Arthur; the royal sword Excalibur embedded in the stone by the wizard Merlin; the betrayals of the half-fairy Morgana and her illegitimate son Mordred; the search for the Holy Grail by the Knights of the Round Table; and the illicit love affair between Queen Guinevere and Sir Lancelot, which sundered the perfect kingdom of Camelot. It’s a rich tale that seems too culturally modern to have come down from barbarian Britain in the early years after Rome’s legions had packed up and left. And of course, it’s a tale that has been adapted and extended in the telling, both in English and French legend, reaching a final form with Thomas Malory’s Le Morte d’Arthur in the fifteenth century, further refined by T. H. White’s The Once and Future King in the twentieth century, adapted again in Mary Stewart’s The Crystal Cave series, and put on Broadway as Camelot by Lerner and Loewe. This sort of quirky specificity comes from telling and retelling the culture’s favorite story over the ages. I’m sure the same sort of expansion and enrichment happened to Homer’s Iliad and helped shape most of the stories in the Judeo-Christian Bible.