Sunday, August 28, 2016

Fractured Reality

During the trip back East for my fiftieth high-school reunion this summer, I stopped in Cleveland to visit with one of my cousins. As part of our sightseeing in that lovely and too-often maligned city, we visited Lake View Cemetery, which features—along with the family plot of some early settlers my cousin was researching for a project—the Wade Memorial Chapel, with its Tiffany-designed interior, and the Garfield Mausoleum. At the latter, I learned about our twentieth president, a native son of Ohio, a scholar who raised himself out of poverty through education and strong values, and a beloved politician who was shot by a delusional office seeker just three months after his inauguration. The story of that assassination—along with much about the medicine, technology, and politics of the time—is told in Candice Millard’s Destiny of the Republic.

This current election year is a time of unusual and unbelievable surprises. One party has nominated a man with no political background whose id seems to be directly connected to his mouth. The other nominated a woman with a political history so shadowed that people casually dismiss emails suggesting she connived at rigging the primary election and sold facetime at the State Department for donations to her family’s charitable foundation. Nobody much likes either of them, but adherents cling to their party’s candidate, either out of loyalty to one side or hatred and fear of the other. In the throes of this hot mess, I found the Millard book a refreshing and surprising return to an earlier, perhaps more innocent time. It was an age very different from our own.

First, the scars of the Civil War, which had ended just sixteen years earlier, had gone very deep. Half the country, laboring under an onerous Reconstruction that rivaled the reparations to which the Allied Powers had subjected Germany after World War I, had reason to distrust and fear the other half. James A. Garfield was not anyone’s first choice for president on the Republican ticket. In fact, his mission at the convention in Chicago had been to nominate a fellow Ohioan, John Sherman, former secretary of the treasury and brother of General William T. Sherman. But Garfield—who had spent his early years as a teacher, then fought as a general himself in the war, and finally served as a quietly respected U.S. congressman and senator—was such a polished speaker that people began to think of him as a candidate. With a rich field of other candidates, however—including a lackluster politician named Chester A. Arthur, who was backed by the most powerful and corrupt patronage wielder in the party, and who would later become Garfield’s vice president—the balloting went on for days. But when Garfield was finally nominated, the country rallied to him. His message—when he chose to give it, which was seldom, because he did not think it proper to campaign actively—was full of reconciliation, good sense, and fair dealing for all citizens, including former Confederates and the recently emancipated black population.

And then, when Garfield was shot by Charles Guiteau, a man suffering from psychotic, disorganized, and delusional thinking who today would be diagnosed as schizophrenic, the nation reacted with grief and horror. When this disappointed office seeker let it be known that he supported Chester Arthur and his corrupt political backer, Roscoe Conkling, the pair had to go into seclusion to avoid being lynched. Even in a country as divided and confused as America was in the wake of bitter internal war and a failed peace effort, people could come together in a common feeling of disdain and outrage.

The bullet wound in Garfield’s back, which had avoided the spine and major organs, was not immediately fatal. In fact, many soldiers during the war had taken similar wounds and lived on, carrying the bullets or shrapnel inside their bodies for years. However, President Garfield received the best medical attention of the time, which included his physicians’ determination to find and remove the bullet. Since American doctors did not yet subscribe to the germ theory of disease and the antiseptic practices of British surgeon Joseph Lister, the president’s attendants probed his wound repeatedly with metal and ceramic rods and even with their own fingers, none of which they washed first, let alone sterilized. The wound became infected repeatedly as Garfield lingered, flat on his back in bed, for almost three months. In an interesting sidelight, telephone inventor Alexander Graham Bell tried to develop and use a device, originally created for neutralizing electrical interference on phone lines, as a means of locating the bullet, but this early form of magnetic scanning was not successful.1

As the president lay dying in the White House, then recovering, then dying again, the doctors gave out bulletins several times a day describing his progress. These were transmitted to major cities and towns and posted outside telegraph and newspaper offices, where citizens gathered by the thousands to follow the news of Garfield’s hoped-for recovery. Those who lived near Washington camped out in the park across from the White House awaiting developments.

Why do I find this story so interesting? Because it makes such a marked contrast with our own turbulent times.

In 1880 this was a country, less than twenty years removed from political secession and all-out war, that could come together both politically and emotionally. First, it would nominate and elect a relatively unknown senator to become president, responding to his good nature, positive qualities, and inspiring personal story. Then, when he was struck down, it would unite with even stronger feelings of outrage and grief.

Today, we have endured in our recent history no break or disjunction so great as the Civil War. In fact, we have much to celebrate, and we experience forces that should unify us: the end of the Cold War within most people’s living memory; unprecedented growth in scientific discoveries and inventions, and the spread of their application into people’s daily lives; and a robust economy that—despite a series of booms and busts in the past fifty years—continues to grow and expand, providing a standard of living for the average American that’s still the envy of the world. Given these good times, with no obvious, existential threats to our way of life and national security,2 one would think we might look for the gentlest, wisest, most inspiring people to lead us. Instead, we fight and tear down, and claw at the eyes of those who would try to lead—until only the toughest, most armored, eyeless creatures will choose to compete … not for the dignity of leadership, but for the opportunities of pure power.

In 1880 this was a country with a relatively primitive communications system. True, the railroads and the telegraph had been operating and opening the countryside for several decades, and the telephone had just been invented and was rapidly spreading in and between the urban centers. But people still took in their accounts of events and absorbed their political opinions through their local daily newspapers, which had seen nothing of the amalgamation of news empires that would take place in the coming decades and into the twentieth century. And yet the story of the president’s condition could disseminate rapidly and relatively uniformly from the doctors at his bedside to the posters and placards outside the telegraph office in every community. What’s surprising is that, while people might speculate about Guiteau’s motives and connections, and while Garfield’s advancing and retreating waves of infection whipsawed their hopes and fears, the stream of information seemed relatively unburdened by wild fantasies.

Today, with so much consolidation of our news feeds—but also with the dispersion of public opinion through alternative news sources and directly through social media—unfettered rumors and swirling conspiracy theories would tend to overlay such an event. A president who lingered for months at the point of death would, like the cat in Schrödinger’s box, be both alive and dead, as well as sighted simultaneously golfing on Martha’s Vineyard and whooping it up at the Bunny Ranch in Nevada. The president’s assassin would be identified with three different personalities and six different co-conspirators, all discussed as verifiable truth in the media, and he would attract flocks of supporters who would idolize him as a saint and detractors who would see in him the Antichrist. And every one of 320 million Americans, along with several billion more active viewers and listeners around the world, would have their own understanding of and opinions about the facts of the case. Everyone has a camera, an imagination, expertise with Photoshop, and access to the World Wide Web. Everyone makes up their own story.

Back in 1880, when people had to ride the rails for days to go from one side of the country to the other, and had to sit down and read page-long stories in the print media to remain informed, the country seems to have been a lot more unified. Today, with the ability to fly from breakfast in New York to lunch in San Francisco—well, a late lunch, considering the travel time plus time zone differences—and with information freely flowing in spoken words, images, and seven varieties of text through the airwaves and different networks, the country is fragmented and our reality fractured.

As a science fiction writer, I try to understand all this. Anyone looking forward from 1880, and surmising about—or even actually being told about—the advances in media technology that were coming in the twenty-first century, would predict greater public cohesion, more uniformity of thought, and greater access to and reliance on provable facts.3 Instead, we have just the opposite. Not a thousand points of light, but a thousand points of different and irreconcilable worldview.

This tells me that, with the way our technology is advancing exponentially, and despite the best analytical imagination I can supply, life and the nature of our political, economic, moral, and spiritual reality at the dawn of the twenty-second century will be essentially unknowable.

Damn! And I thought I was just getting good at this prediction thing.

1. With modern medicine, of course, the bullet would have been found by x-ray and removed in a sterile surgical procedure that first afternoon. For comparison, consider how quickly President Reagan recovered from a similar attack.

2. Yes, we have troubles: weak allies in Europe who are floundering under a broken political and economic system; a resurgent Russia flexing its muscles to the east of them and looking to resurrect the old Soviet hegemony; a resurgent China seeking to expand into its ancient cultural hegemony over all of Asia; economic stagnation in Central and South America driving waves of job-seeking immigrants across our southern border; and political chaos in Africa and the Middle East driving religious and political refugees into the modern, developed states of Europe and perhaps eventually into this country. But all this is business as usual, because the world has been in flux and turmoil ever since the rise of nation-states four thousand years ago, except for those brief periods when Rome ruled the Mediterranean or Britain governed a patchwork of colonies across the globe. None of the current turmoil spells an imminent threat to our continued existence as a nation or a society—not unless we let it.

3. Oh, for the days when a photograph was accepted as visual proof of a single, solid reality.

Sunday, August 21, 2016

Utopia and Dystopia

I grew up with the dystopian1 novels 1984 by George Orwell and Brave New World by Aldous Huxley, and in college I studied the utopian works The Republic of Plato2 and Looking Backward by Edward Bellamy, and the dystopian We by Yevgeny Zamyatin, among others. More recently, I’ve read Suzanne Collins’s dystopian Hunger Games trilogy and seen its movie adaptations, and I’ve watched the movies of Veronica Roth’s dystopian Divergent series, although I have not yet read the books.

I understand why an author would write either a utopia or a dystopia. The former tests out ideas about how the human condition and human society might be made better, if not perfect. The latter demonstrates how, in the presence of some human failing or absent some leavening force, society might become very much worse. All of this is the proper sphere of science fiction: to envision an alternate future for humanity and the mechanisms that might drive us toward its attainment. I understand the purpose of writing a utopia or dystopia, but I’ve never tried to write one myself.

One might also see in much of our current politics, especially among American Progressives and European Socialists, a real-life attempt to erect utopia here on Earth, marked by universal and equal access to education, health care, job opportunities, plentiful goods and services, and proportionate personal wealth. In similar fashion, the goal of the early Marxists was to create a workers’ paradise of individual labor freed from the strictures of market forces, return on capital, authoritarian bosses, religious coercion, and government interference. This was also the theme of the John Lennon song “Imagine”: no religion, no countries, no possessions, “… nothing to kill or die for … a brotherhood of man.” And it was the drive of every nineteenth-century American utopian commune run along socialist lines: to create a separate space of equality, friendship, love, and shared physical labor.

I can understand the motive for writing such stories and dreaming such dreams. But, for me, the books and the political schemes they represent simply do not work. I don’t think I could ever write a thoroughgoing utopia or dystopia.

These books and political programs are all based on distorting human nature. In utopias, the distortion is toward a good and positive spirit of sacrifice and selflessness, which is indeed found in pure form in some human beings, but is by no means the dominant characteristic of our species. In dystopias, the distortion is usually toward a spirit of gullibility and passivity on the part of society as a whole, and callousness, manipulation, and greed among its leaders—which again marks some humans to a high degree but is not the nature of all or even most human beings. The result is that most of these stories present societies that revolve around only one political premise, one positive or negative cultural value. And most of the characters in them are mere caricatures or cartoons, reacting to that premise, sustained or squashed by that value, and not real people with complex emotions and motivations. Paintings done in two dimensions with a limited palette are boring. Music played on a simple scale without sharps or flats, or the rudiments of harmonics, is boring.

Probably the most true-to-life—and therefore most frightening—of the dystopias is 1984. Its depiction of an all-powerful state with an ever-watchful leader feels real when compared with Nazi Germany, Soviet Russia, or Communist North Korea. The protagonist Winston Smith and his paramour Julia are small, powerless, feeble in their attempted resistance, and ultimately left naked in their defeat. The mass of society is obedient to a fault, innocently accepting when people and events are written out of the history books and sent down the “memory hole.” They try eagerly to comply with the state-sponsored changes in language which make productive thought harder and harder. There is an apparently active underground, but it turns out to be only a state-sponsored fiction created to cement state control.

You read the book and despair—until you remember that even the most draconian states, like the Nazis and Soviets, had their internal resistance groups which were not state approved. Real people have a conscience and a working memory. Yes, you can imitate public allegiance through the regulation of civic action, like forced attendance at parades and regimented salutes, and the manipulation of popular speech, like substituting “Heil Hitler” for “Good morning.” Yes, you can turn some people into frightened sheep and others into social-climbing wolves. But in the privacy of the home, in the intimacy of the family, the leavening of human nature will react with doubt and scorn. The Soviets, for all their propaganda machine, oiled with the likes of the dreary newspaper Pravda and the even drearier humor magazine Krokodil, reaped a harvest of popular samizdat, or underground publications and Western music laboriously copied out or rerecorded manually and passed along from hand to hand.

Utopias and dystopias both seem to founder on questions of scale and issues of absolutism. In Ray Bradbury’s Fahrenheit 451, the prohibition on books and reading matter was so complete that, at least in the movie adaptation, people were looking at newspapers printed as comic-book imagery with empty word balloons, and they seemed to enjoy television plays with empty drama about who gets to sleep in the Blue Room. Real people don’t put up with such nonsense.3 Presumably, then, statutes and public announcements, and technical documents such as instruction manuals and operating procedures, would all be transmitted through verbal recitations and video demonstrations rather than written text.4

The matter of absolutes comes home in the idea of the perfect society having “no possessions.” I can understand how that would mean one person not owning, as his or her sole fiefdom, a chain of factories with monopoly power over some national good or service, employing hundreds of thousands of people who respond to the owner’s instructions as if they were government edicts, and paying him or her billions of dollars in profits. I can also imagine this deprecation of the human impulse toward acquisition as applying to the farm that a family works to both provide a private food source for themselves and generate a cash crop, or the corner store that a family runs to sell food and sundries to the local neighborhood while generating work for themselves and a personal income. “No possessions” might also apply to the house a person lives in or the car he or she drives, as communal housing and public transportation can be thought more equitable and efficient. But do I get to own the pants that I wear today, wash myself tonight, and then put on again tomorrow? How about my shoes or my jacket? Or the toy my child plays with and loves to distraction? Where does one draw the line when decreeing “no possessions”?

In 1984, the telescreen in every room not only played out endless exhortations and propaganda but also watched as the occupant went about his or her private business and eavesdropped on every conversation. The slightest deviation from party discipline could presumably be detected and punished. This sort of continuous surveillance was science fiction when George Orwell wrote the novel. However, as many people discovered when the Patriot Act was approved in 2001, powerful computers and the transition of our telephone system from analog copper wires to networks switching around digital packets meant that the National Security Agency and other government bodies could sample every conversation flowing through the system, pick out key words and subversive ideas, and build a case against any citizen. This was the absolute control of the 1984 telescreen brought terrifyingly to life.

Except … it’s not. Even if a network of supercomputers could analyze, weigh, and flag subversive speech among the billions of words that 320 million Americans—and about as many more foreign nationals—speak into the telephone system every day, the government prosecutors with their limited—though still impressive—resources would go crazy trying to track down and take action on every lead. Even to become a “person of interest,” a potential assassin or saboteur needs to do more than speak a few predesignated words into a telephone. And, as we’ve seen from the news analysis of recent terrorist acts, even people of interest with proof against them tend to be vetted and dismissed into a sea of suspicious but not indictable characters. Supercomputers may do the flagging, but human beings with their limited imaginations, faulty attention spans, and imperfect understanding of every situation will still do the follow-up interviews.5

The world is neither wholly good nor bad; instead, it blends both characteristics in equal measure. Human beings are neither wholly self-sacrificing and subservient, nor selfish and grasping; instead, they are a mixture of both, in different measures at different times. Emphasizing one aspect of the world, society, or human nature might make a strong point in an interesting study, but it does not make for a good story. Real people make for good stories because they are not caricatures, not entirely predictable, and they follow a story arc that the reader or viewer can only bet on but never know for certain in advance.

1. Thomas More coined the word utopia from Greek roots meaning “no place” for the title of his 1516 book.

2. Although The Republic is commonly described as a utopia, I wouldn’t want to live there. Like Thomas More’s description of an ideal island full of selfless people farming the land and being rationally distributed, and when necessary redistributed, around the countryside in groups to maintain an unnatural state of balance, Plato’s ideal city-state treats its people more like puppets than citizens: deprived of family life and private property, subjected to an educational system strong on physical conditioning and mathematics, forbidden to read poetry and fiction, and drenched in public-spirited martial music. In either situation, I would be planning my escape … and I suppose that’s the point.

3. Even American television programs—or at least those that survive past the cancellation point of their first season—have some content or features to which a conscious, self-aware, adult human being can relate.

4. When I worked as a documentation specialist in the pharmaceutical industry, the question arose about using photos and illustrations in our operating procedures—and presumably this would extend to reliance on training videos. The U.S. Food and Drug Administration regulations require written procedures, which can be cited and specifically enforced, rather than imagery and demonstrations, which are subject to the viewer’s attention span and interpretation. The force of law will still be codified in words.

5. But with all of this, I cannot account for North Korea. There the population lives in primitive darkness—see the nighttime satellite photos of a blacked-out country—under gulag conditions, and on the edge of starvation. The Kim family and their military supporters seem to have achieved total, interpersonal, locked-down control of the country in the sixty-odd years since the Korean War ended. Perhaps you cannot breed humans into self-sacrificing sheep, but in three generations of unrelenting surveillance and punishment you can shape them into mice that hide from the daylight.

Sunday, August 14, 2016

Rational Thoughts on Suicide

Suicide—the taking of one’s own life or allowing oneself to die with or without a fight—is not always or by itself an irrational act. As a novelist, I can think of many situations where a calm and rational person might be willing to face certain death in order that others may live. This is the wounded soldier, found in many stories, who stays behind to hold off the approaching enemy while the rest of the company escapes. Or the Sydney Carton1 who offers himself in the place of another, better man.

But rational suicide might not always involve self-sacrifice. A person faced with an inevitable and painful death, such as burning alive or succumbing to a ravaging disease, might choose to accept a quicker, less painful way out of life. This is not an irrational act, although it might be a desperate and despairing one.

Our species could not be fully self-aware, or even fully human, if we could not rationally contemplate our own personal destruction, the end of a time that we must know is finite. Indeed, I have always favored the definition Robert A. Heinlein gives for an adult: someone who knows he is going to die. Once a person has come to terms with the inevitable, he or she knows what is possible, understands the value of his or her own life, and can decide how best to spend it. That is, how to weigh the potential achievements of the remaining years against the goal that is on offer now. Someone who does not know that those years are already numbered, no matter how many they may be, and that death is inevitable, whenever it comes—such a person remains a child, with the fancies and illusions of a child.

As adults, we want our lives to mean something: to serve some purpose greater than ourselves. Even if that purpose is one we have chosen for ourselves and serves some internal ideal—such as painting a beautiful picture or writing a thoughtful novel, something only we ourselves can judge and appreciate—it is still a greater purpose than satisfying our personal wants or gratifying our senses. In the same way, we want our deaths, the last act of our lives, to mean something as well, to serve a purpose greater than demonstrating our own foolish choices and carelessness.

As human beings, we strive for purpose in a world, and in an ecological niche, that does not automatically provide sense and meaning to our lives. Yes, we have the commandment, written into our genes as in our Bible,2 to go forth, be fruitful, and multiply. But this is not an individual mandate. Being one link in a chain that stretches backward to the first one-celled microbes and forward to whatever comes next in the evolution of life on this planet is simply a biological necessity. And indeed, to fail to reproduce is an act of cellular suicide all in itself. But merely having children—for most people, males especially—does not satisfy the rational part of the brain that celebrates the individual, the ego, the “I” that is not merely a collection of cells but an autonomous, free-willed being.

Nothing in life can supply the ego’s purpose from the outside. Well, except perhaps for a parent or kindly grandparent who bends the imagination of a young child toward a certain pursuit, amenable to the child’s talents, experience, and capabilities. Such a lucky child may grow up with an ingrained sense of purpose that he or she might think came out of the air, naturally, as a directive from some higher power.

But the rest of us flounder. We must decide for ourselves what our destiny and our fate will be. And many of us never rise to the awareness that this is a choice at all, that we must put thought and energy into deciding what path our lives will take and what kind of person we will become. For those who never recognize the choice and its importance, life is a matter of drifting on the currents, like a not very interesting character in a not very well written novel. For such people, suicide might come easily.

The wish to continue in life and fulfill that purpose is also a matter of projection, expectations, and the weighing of chances. For those of us who make the arts our personal goal and the focus of our extra-biological attentions—that is, aside from the daily routine of eating, sleeping, bathing, dressing, and other self-maintenance activities—the realization that our own talent may not meet expectations, that a future of study and practice won’t improve our odds of success, and that we will end in obscurity can be a crushing blow. “Ego death,” as one of my wargaming friends describes a total, ignominious defeat.

Yes, we are assured that the effort is the goal, that simply doing the work is its own reward, and that fame and fortune come to but a few. If an artist or a writer can be satisfied with his or her own work, no matter what the critics and the buying public think, then these palliatives will satisfy the demands of ego and purpose. But what happens when the creator looks at the work, the total oeuvre, and sees only trash?3 Then he or she has failed not only the expectations of the public, friends, and family, but also his or her own. And then nothing is left. Ego death for real.

Given this potential for critical self-doubt, perhaps it is better to make the personal goal simply one of offering service to others, in the manner of Mother Teresa. We can make personal meaning out of helping wherever there is a need and we can supply a willing pair of hands or a problem-solving intellect. In these cases, the overall quality of the work and the personal responsibility for the outcome are less important than the will and vigor with which the effort is made. The outcome lies in other hands, the responsibility with the fates or the gods.

And, as to ego death, people go through calamities all the time. Into each life comes the loss of a loved one, alienation from family and friends, disappearance of fortune or reputation, devastation by storm or fire with the loss of a home or property into which the person has put so much of his or her time and effort. The things we value turn to ashes and dust. Our hope for the future, of living out our lives in a time bubble where these perishable things remain forever unchanging, is dashed. And yet into the void created by such losses there sometimes seems to creep—at least for those of us who are lucky in attitude, or have learned from early training, or persist by some cellular vitality—the restless turning to other loves, other goals, other vessels for our sense of self, security, attachment to life, and hope for the future.

The lucky people can bend with misfortune, shift gears, find new roads, and move forward. In fact, they may never look far enough down the road on which they are traveling to ask what happens when it ends. They know that all roads eventually end, but that most roads also branch out, that goals are malleable, and that people—every person, regardless of past history—are capable of remaking themselves into something new. There is always something new that a human being can try. All it takes is bravery and patience.

Life is persistence. And it can be a long time until the candle finally burns out. That is also something every adult knows.4

1. From Charles Dickens’s A Tale of Two Cities.

2. Genesis 1:28 in the King James Version.

3. Public radio personality Ira Glass has an especially apt thought in this regard.

4. After reading all this, Odin asked, “Does he have any idea what’s coming?” And the Three Norns replied in unison, “Nope.”

Sunday, August 7, 2016

Sparkly Shoes

Recently in our condominium garage we came across our neighbors from down the hall, who have a little girl about three years old. She was stomping across the pavement, and with every step her tennis shoes gave off red, blue, and green sparkles. Clearly, she was delighted with the effect, and so were her parents. And that made me think …

When I was growing up, batteries were bulky things—mostly C and D cells—that tended toward fragility and leaked various corrosive liquids. Tiny, powerful, long-lived batteries based on rare minerals like lithium were decades away from commercial use. Back then, too, strain gauges were exotic devices in the hands of NASA and possibly the military. And light-emitting diodes (LEDs) were either unknown or still in deep development in the laboratory. If someone had told me that in my lifetime an entrepreneur would put them all together to make sparkly shoes for toddlers … No, that someone would think of putting these exotic and expensive devices into shoes for which there is no naturally perceived need, and that parents would buy them just to put a smile on a child’s face—well, I would have marveled at the thought.1

In another amazingly silly use of high technology, we now have millions of people all over this country using their smartphones—which have embedded applications such as timekeeping, photo imaging, global satellite positioning, and software programming—to track down and “capture” mythical Japanese pocket monsters, or “pokémons,” so that they can win non-monetary credits or kudos or some kind of recognition, even if it’s only their own self-satisfaction.

Please understand that I’m not against sparkly shoes and pokémons. In fact, as a convinced free-market capitalist, I find this frivolous use of advanced technology absolutely wonderful. We live in a world where whimsy and fun still matter. And smart entrepreneurs can still make a buck inventing clever ways to amuse other people. You might call that buck-making a cynical manipulation of people’s emotions. I call it, in the words of Henry J. Kaiser, “finding an [as yet unspoken] need and filling it.”

A socialist or communist society would never come up with these things. In such societies, the Ministry of Shoes would be dedicated to making sober, sensible, box-toed Oxfords for all the serious, pre-grownup children. And when every last child had at least one pair of regulation shoes—as if the children of America are not actually swimming in shoes—the ministry would turn its attention to other worthy causes, like preserving cattle hides, preventing deforestation, or engaging in Muslim outreach. The Ministry of Shoes would never think to develop, manufacture, and offer sparkly shoes as a secondary and delightful addition to a toddler’s wardrobe. And the Ministry of Communications would never think to put a camera, programming, or GPS function into a telephone in the first place. After all, the sober, sensible bureaucrats in charge of new product development would never let frivolity and fun enter the fixed-market equation while there were still hunger, want, and homelessness somewhere in the world.2

The amazing thing about this rise in the marketplace of sparkly shoes is that the national supply of batteries, strain gauges, and LEDs has not been in any way depleted. Neither has the playing of Pokémon Go cut into the availability of telecommunication or satellite positioning services for the rest of the country. Sure, there are children in Ethiopia and South Sudan who are deprived of their fair share of sparkly shoes—as I am sure the military establishments in those places are also suffering a dearth of batteries, strain gauges, and LEDs. But their lack was not caused by putting sparkly shoes on the feet of American toddlers. And stripping the sparkles from American sneakers would do nothing to put more shoes on the feet or food in the mouths of African children, nor would it improve their local economy or raise their educational prospects.

As I’ve noted elsewhere, the economy is not a pie. Slicing economic rewards thinner for me does not create more wealth for you, or vice versa. Rather, the economy is like a rain forest ecology: the more life there exists under its canopy—capturing the energy of sunlight and preserving it as fruits, seeds, sap, edible leaves, insects, birds, beasts, and compostable mulch3—the more niches for life there can be. The more people who are out there in the economy creating sparkly shoes and pokémon games, the more incentive there will be to demand, and more wealth to fund, the next wave of miniaturization in batteries, strain gauges, LEDs, megapixel cameras, computer controls, GPS satellites, and a host of related technologies.

This has been the story of our amazing escalation in technology since the invention of the steam engine as a coal-mine dewatering machine in the early 1700s. Someone thinks of a new application—put the engine in a boat with a paddle wheel, put it in a cart on steel rails—and soon the technology is growing and changing, becoming more ubiquitous. And, with the human capacity for learning, retaining, and sharing experiences and discoveries, the technologies usually become smaller, better, more efficient, and less expensive. If you doubt this, think back to the first cell phones in the 1970s and ’80s: usually mounted in cars, because of their bulk and power requirements, then more portable but still the size of a brick, with a Western Electric–style handset on a cord. A rich man’s toy. Now you can buy a mobile phone for the cost of a good lunch, and in some countries it’s easier to get cellular service than a landline.

Wars have sometimes helped drive the development of this technology. Certainly, World War I saw an improvement in the mechanization and automation of the battlefield, with benefits drifting over to civilian technology in the form of more robust automobiles and airplanes. World War II saw vast improvements in radio technology, radar, codes and code breaking, the first computing technology—generally associated with code breaking, artillery firing solutions, and development of the atom bomb—and large-scale production and use of aluminum in aircraft manufacturing. These advances then provided a boost to everyday civilian life in the decades that followed.

But television also came along between the wars, served no real military purpose, and advanced just as rapidly in purely civilian usage. And the discovery and manipulation of the silicon transistor—progressing from individual devices that emulated old-style vacuum tubes to integrated circuits that put a huge number of gated operations onto something the size of a postage stamp—were civilian achievements first. Sure, military technology benefited from using integrated circuits, but so did whole civilian industries of electronics applications for entertainment, automotive controls, and mobile computing. Now we are entering the biotech age, and that owes most of its advances to the sequencing of the human genome—a purely civilian project—and almost nothing to work on bioweapons.

Barring the civilizational devastation of a global economic crash, nuclear war, or asteroid strike, this advancement in technology will continue for as far as the eye can see. Some advances are predictable, and as a science fiction writer I try mightily to get ahead of them: like more convenient and personalized communications, new clothing options, transportation modes, and medical procedures, all based on computerized automation, artificial intelligence, and the linkage of systems and technologies that once operated in isolation. Some advances I defy anyone to predict or even imagine: like sparkly shoes and Pokémon Go.

The world of the next twenty years, hundred years … thousand years is going to be unrecognizable to our most modern eyes. I can hardly wait!

1. And then I would have asked, when do I get my Jetsons-style jetpack? Oh, yes, that’s almost here—and here, too.

2. Of course, in a socialist or communist society, where ever-declining government tax revenues must chase ever-increasing economic and social problems—“eventually running out of other people’s money,” in the words of Margaret Thatcher—there would never be any money to spare for frivolity and fun.

3. In this view of economics, the energy from sunlight captured with hydrocarbon compounds in the rain forest is analogous to the energy of human work and imagination captured in goods, services, and the money to pay for them in the marketplace.