Sunday, January 5, 2020

Twenty-Twenty

Crystal ball

The first time I had occasion to write the calendar date “2020” on anything was a few weeks ago, in preparing the draft of the January-February edition of our local NAMI chapter’s newsletter. And as I keyed in those digits, a chill went through me. It was as if I knew, in that sleepless but dormant part of my brain that reads the tea leaves and maps them onto the stars, that this year would be significant.

Oh, I know, we’re going to have a watershed election. And given the political emotions running high in this country, the campaigning, election day, and the aftermath are going to be traumatic for one group or another. But that was not the reason for my chill. This was more a moment of existential dread.

Twenty-twenty marks the start of something, but I don’t know what. Well, yes, in one way, we are all past the dystopian vision of the year 2019 as given in the movie Blade Runner, and we still don’t have Aztec-style corporate pyramids and towers shooting flames into the murky night sky of Los Angeles. (And my car still doesn’t fly, dammit!) So that’s a plus for the new year. But that’s not what concerned me.

I can remember, from thirty-odd years ago, hearing a speaker from The Economist describe how the character of each decade doesn’t change and become consistent and recognizable in the year ending in zero, but about four years into the new decade. So the decade of the 1950s didn’t really end, and the 1960s start, until 1964. By then we’d been through the Cuban revolution and our ill-fated invasion, the first moves in the war in Vietnam, and the assassination of a beloved president. And then we had the Sixties.

In similar fashion, the Sixties didn’t end in 1970 but in 1974, with the culmination of that war and the impending impeachment and resignation of the president who came in promising to end it. And the economic malaise—recession and inflation, “stagflation”—that started with the oil embargo and gas crisis in that year marked the Seventies and didn’t really end until about 1984. (Another year whose mention will give you the willies.) And then the computer revolution and economic growth of the Eighties, which started with wider acceptance of small desktop computers and personal software and fueled the “tech stock” market boom, changed the economic structure of the country and continued through the early Nineties.

You can define your own parallels to this theory, in cultural norms, music, clothing styles, and sexual mores, but I think the pattern holds true. Decades don’t change on the zero year but about four years later.

I think something similar happens with centuries. But there the change comes not four years in but about twenty: in the third decade, the years counted in the twenties.

The 18th century was marked—at least in the Western European sphere—by the wars between England and France, culminating in revolution in the American colonies and then in France, the rise of Napoleon, and a struggle for all of Europe that extended to the shores of the Nile and the heart of Moscow. That century did not really end until the middle of the eighteen-teens. Similarly, the breakaway of the United States from England, and this new country’s finding of its own character, did not really conclude until the War of 1812. After that decade, the 19th century came into its own.

And then the Victorian, imperial, colonial über-culture that—at least in Western Europe—took the superiority of one race and one extended culture for granted did not end until the shattering madness of World War I. And what came out of that war was the new century, with a perhaps more enlightened view about races and cultures—at least for some people—but also a clutch of hardened counter-ideologies and the technological means to pursue them to horrific ends. After that first global war, the 20th century came into its own.

And finally, the 20th century has been with us ever since. The fall of the Berlin Wall in 1989, the dissolution of the Soviet Union two years later, and the putative end of the Cold War were highway markers, but the effects of that lingering aggression and its bunch of little colonial brush wars and invasions (Korea, Vietnam, Grenada, Kuwait, Iraq, Afghanistan) continued along the same lines, although those lines were becoming blurred with the rise of Arab and Muslim nationalism and the flexing of Chinese muscles. And over them all have loomed the technological changes in warfare that started in World War I, with the machine gun and chemical agents, and continued in World War II, with the jet engine, the rocket, and the atomic bomb. The century saw war become less about armies marching against each other on a common battlefield and, because of “the bomb,” more about political and ideological maneuvering, guerrilla techniques, and terrorist tactics.

You can define your own parallels to this theory with the way that political and cultural norms, music, clothing styles, and even sexual mores changed in the 1920s. The theory still holds: centuries don’t change on the zero-zero year but about twenty years later.

So, although we have been living in the 21st century for the past two decades, it still feels like an extension of the 20th century. We have the same international tensions, the same small wars in far-off countries, the same political differences at home. And, aside from the power of my handheld phone and the number of small computers controlling everything from my car’s ignition to my coffeepot, this century doesn’t feel all that different from the one in which I was born and raised. But still, I sense big things coming, and hence the existential dread.

What are these “things”? My internal leaf-reader and star-mapper doesn’t know. But my sense as a science-fiction writer can offer a few guideposts.

First, the technical revolution that brought us the small computer as a workhorse in everyday life will bring us artificial intelligence in the same role, but perhaps a more aggressive one. And unlike the one big IBM 360 that sat in the basement of the computer science lab at Penn State in the 1960s and ran most of the university’s administrative functions on campus, or the one all-pervasive “Skynet” that destroys the world in the Terminator movies, artificial intelligences will be distributed entities, functioning everywhere. As soon as we work out the neural-net (i.e., capable-of-learning) architecture and marry it to the right programming, these intelligences will proliferate through our technology and our society like chips and software.1
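
(For the technically curious, here is a minimal sketch of what “capable of learning” means in practice: a toy, single-neuron network in plain Python that is never given its rule but derives one from examples. The task, learning logical OR, and every constant in it are illustrative only, not any particular product or research system.)

```python
# A toy "learning" program: a single artificial neuron that starts with
# random weights and adjusts them from examples, rather than following
# rules anyone wrote down. Task and constants are purely illustrative.

import math
import random

# Training data: two inputs and the desired output (logical OR).
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

random.seed(1)
w1, w2, bias = (random.uniform(-1, 1) for _ in range(3))

def predict(x1, x2):
    # Squash the weighted sum into the range 0..1 (a sigmoid activation).
    return 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + bias)))

# The learning loop: nudge each weight in the direction that shrinks
# the gap between what the neuron says and what the examples say.
for _ in range(5000):
    for (x1, x2), target in examples:
        output = predict(x1, x2)
        delta = (target - output) * output * (1.0 - output)  # sigmoid slope
        w1 += 0.5 * delta * x1
        w2 += 0.5 * delta * x2
        bias += 0.5 * delta

for (x1, x2), target in examples:
    print(f"{x1} OR {x2} -> {predict(x1, x2):.2f} (want {target})")
```

Nothing in that code states the OR rule; the weights drift toward it on their own. That property, scaled up a billionfold, is what lets such systems slip into roles no one explicitly programmed.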

But don’t think of “intelligence” as being a human-type soul or a little man in a software hat or even a Siri or Alexa. You won’t be holding a verbal discussion with your coffeemaker about whether you feel like dark or medium roast this morning. Instead, you will find that your world is going to have an eerie sense of precognition: answers and opportunities are going to come your way almost seamlessly, based on your past behavior and choices. Your phone is going to become your life coach, trainer, technical expert, and conscience. This trend is only going to expand and multiply—and that’s just on the personal level.
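
(To make that concrete, here is a deliberately tiny sketch, with invented data, of what such “precognition” is underneath: not understanding, just bookkeeping over your own recorded choices. All contexts and picks here are hypothetical.)

```python
# A toy version of "seamless" behavior-based prediction: no dialogue and
# no hand-written rules, just a running tally of past choices by context.

from collections import Counter, defaultdict

history = defaultdict(Counter)  # context -> tally of choices made in it

def record(context, choice):
    """Log one observed choice (say, which roast you picked that morning)."""
    history[context][choice] += 1

def predict(context):
    """Return the most frequent past choice for this context, if any."""
    if not history[context]:
        return None  # no precedent yet; a real system would have to ask
    return history[context].most_common(1)[0][0]

# A hypothetical week of mornings:
for ctx, pick in [("workday", "dark"), ("workday", "dark"),
                  ("weekend", "medium"), ("workday", "dark"),
                  ("weekend", "medium"), ("workday", "medium")]:
    record(ctx, pick)

print(predict("workday"))   # "dark" -- the machine anticipates you
print(predict("weekend"))   # "medium"
```

The eeriness comes from accumulation: multiply that tally across every context a phone can observe, and the guesses start to feel like foresight.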

On the macro level, business operations and relationships, lawyers and judges, doctors and hospitals, and any other sphere you can think of where knowledge and foresight are valuable will change remarkably. You won’t go into a contract negotiation, a court appearance, or a diagnostic session without the expert system at your elbow as a kind of silicon consigliere. We’ve already seen the Jeopardy-playing IBM Watson prove itself as a master of languages, puzzles and games, and historical, cultural, scientific, and technical references. The company is now selling Watson Analytics to help manage business operations. This trend is only going to expand and multiply.

Second, the biological revolution that brought genetics into the study of medicine—and the sequencing of the human genome was completed on the cusp of this 21st century—will see a complete makeover of the practice. In this century, we will come to know and understand the function of every gene in the human body, which means every protein, which means almost every chemical created in or affecting the human body. That will change our understanding not only of disease but of health itself. Sometime in this century—all of the ethical handwringing aside—we are going to be modifying human genes in vivo as well as in egg, sperm, and embryo. From that will come children with superior capabilities, designer hair and eye color (just for a start), and stronger immune systems, among other things. The definition of “human” will be rewritten. Adults will benefit, too, by short-circuiting disease and regaining strength in old age. This trend is already under way with gene therapies, and it will explode in practice and popularity.

Moreover, the nature of our material world will change. Already, scientists are examining the genetic capabilities of other organisms—for example, Craig Venter’s people sifting seawater on voyages across the oceans, looking for plankton with unique gene sequences—and adapting them to common bacteria and algae. You want a pond scum that lies in the sun and releases lipids that you can then skim up and refine like oil? That’s in the works. You want to harvest beans that have valuable nutritional proteins, bleed red, and taste like meat? That’s in development, too. You want synthetic spider silk and plastics as strong as—or stronger than—steel? You want graphene sheets and buckyball liquids that have the strength of diamonds, the electrical capacitance of metals, and the lightness and flexibility of silk? Just wait a few years.

As they say in the song, “You ain’t seen nothing yet.”

1. See “Gutenberg and Automation” from February 20, 2011.

1 comment:

  1. Thomas, I have quietly accepted the year 2020. The date began to appear in December 2019 in the reporting documents for my work. This year my granddaughter will be 20 years old. She is a second-year student at university and will be a biologist. I asked her to invent a "Makropulos remedy," a cure for old age. Ha ha, a joke. Looking back over 57 years, I remember the first time I saw a detector receiver, and I have been interested in amateur radio ever since. The semiconductor transistor was invented in 1948, and now we are talking about artificial intelligence. I installed a program on my computer into which I have put information about myself from my earliest memories onward. This program copies me. I am honest with it, and I have been doing this for 5 years. I communicate with it in question form. I am sorry I was born in 1952; I will not be able to see the world of the future, which is progressing so quickly.
