Sunday, July 4, 2021

Looking Backward, Looking Forward

 
The Fool The Magician

Time, it seems, goes in only one direction. Or that is our current understanding of the laws of physics. Of all the dimensions of this place we call the universe, the only one that is unidirectional is time. Of course, we may not understand everything yet. And perhaps our sense of “time’s arrow” is based on our perceptions of cause and effect: if one thing follows another, they appear to be ordered in time. But time is malleable, stretching and collapsing according to the theory of general relativity, although still just going in the one direction we call forward, toward the future.

I am not here for a meditation on physics, however. The human brain is capable of working time both forward and backward. We encode short- and long-term memories that capture images, words, feelings, sounds, smells, and other artifacts of experience. Some people’s brains record everything they experience, every day, every minute, going back to first awareness. For most of us, the capture is more selective, usually affixed to experiences that left us with a strong emotional response. But some memories just get stored, almost randomly, and repeat themselves at the oddest of times or in scrambled contexts.1

We also use the mechanisms of the prefrontal cortex for planning and decision making, which are both projections of today’s activities into the future. We not only try to predict the future with our plans and decisions, but we mentally inhabit a fictitious future with our hopes and daydreams—a false future that exists in our emotional lives as an alternative to the choices and possibilities that exist in the front part of our brain.

Human beings—and to some lesser extent other mammals—are the animals that can move forward and backward in time, using our brains. Our bodies, however, only travel the forward route, seeing planning and decision points pass by us like signposts along a highway.

A balanced human in an integrated life uses both functions. We look forward in expectation for the protection of ourselves and our families. We look backward in remembrance to give meaning to that expectation.

Some people live too much in the past. Those who have bleak or unknowable futures, or the very old with almost no future at all, tend to dwell on the past and their memories of a former life and loved ones as a means of sustaining their individuality. Some who have trauma or misdeeds in the past tend to dwell on them in a fretful attempt to change what has already occurred and, in their dreams, create a new current reality.

Some people live too much in the future. The very young look forward because everything that has gone before—especially if their life has been the mundane daily round of play and school, and interactions with parents and friends—is merely preparation for the life to come, creating a set of skills and responses that will carry the person into adulthood and beyond. Others more adult live for the future because their lives so far have been wasted—especially if drugs and alcohol, bad relationships, or bad choices and actions have been involved—and they can only look to the future to make amends, make a better life, or create new meaning for that life.

My own life, I realize now, toward the end of it, has always been frontally focused. I have always been considering, planning, and dreaming about the next day’s work, the next job, the next book to read or write, the next experience. I do recall the past and have pleasant memories of most of it, but I do not live there. I live in tomorrow, next month, next year. My head is always somewhere six months out.

When I was at the university and was already focused on studying English literature and a future as a writer—novels were always my first choice, although not the most lucrative part of my eventual writing career—I took a course on Predicting the Future. This was part of that academically silly season in the late 1960s, when campus radicals were demanding courses with more “relevance”—by which they meant to steer away from the traditions of Western Civilization. But my mentor, the science fiction author who wrote under the pen name William Tenn, took advantage of the opportunity to inject a bit of his favorite subject into the teaching. We read the current crop of futurist authors and a bit of predictive science fiction, and we studied the ways people have tried to know what’s coming next. For example, I wrote a term paper on Tarot cards as a method of fortunetelling.2

But now, in my seventy-third year, I find that looking forward has disturbing possibilities. There aren’t that many new experiences, possibilities, or choices out there ahead of me. Sometimes the future seems like a narrow, gray space, like the last few pages under your thumb in a book that you are reading and enjoying, whose plot you are following, and whose climactic moment has not yet come, and you’re not sure there are pages enough, time enough, to make a suitable ending. Six months out used to be a long time for me. Now, on some days, it seems to be all the time that is left—even though I am still healthy, strong, eating right, exercising, healing well, and hopeful. But just … how much more can I expect from life?

It’s not a terrifying thought … yet. But when you can sense the Great Darkness somewhere beyond that gray space, it makes you pause and consider your past life choices.

1. And, as research into “false” or altered memories has suggested, every time we recall a memory, our brain does a little editing—maybe a bit of improvement, sometimes a bit of damage—that changes the memory for future recall. Nothing is fixed in our brains, like an engraving on a steel plate. Instead, everything is more malleable, like the silver halide grains in a film emulsion, which can be affected by later exposure to light, or the digital bits in a computer memory, which get translated out of storage and then translated back into storage, with changes and degradations going both ways. Our brains are more a fluid “chemical pot” than a hard-wired “electric box.”

2. Actually, the Tarot—particularly in the 22 cards of the Major Arcana—is a story of human struggle and conflicting values that stands in opposition to the Judeo-Christian tradition. It relates the life transition of every aware soul from the insouciant and careless Fool of Card 0 to the powerful and careful Magician of Card 1. It’s a story of personal development.

Sunday, June 6, 2021

On Personal Boundaries

Teasing with apple

We all have them, boundaries—or frontiers, if you prefer—the edges of our souls, the demarcations of our “comfort zone,” the limits to which we will go, where beyond lie trespass and possibly danger.

We build these boundaries over a lifetime. Sometimes it’s by choice: we have a bad experience and say, “Not going there, never again.” Sometimes the boundary is set by habit: “I’ve never been there, but it looks dangerous, or demeaning, or ‘just not my style.’ ” And sometimes the territory beyond the boundary, the frontier, is someplace that simply lies outside our imagination; it’s not part of the image of ourselves and the world that we have built up as “the right and proper me.” Or we view it from the vantage point of imagination and decide, “That doesn’t look right.”

We build these boundaries—establish these frontiers—like an engineer building a fortress wall. Or, if you prefer a softer metaphor, like a gardener defining the region of conscious cultivation as separate from the wild lands outside, and maybe there’s a wall, too. As I say, everyone does this, because it’s part of living and deciding who we are, what we will be, which conditions of living we will accept and which, based on that personal image and experience, we reject.

And everything goes well until life rises up and smacks us in the face. It may be a new job, where we are required to expand our skills or handle crises we’ve never encountered before. Or it could be a change in life direction, like going off to college or joining the army, suddenly becoming rich or just as suddenly becoming poor. But for each of us who have the capacity, that smack in the face is sure to come when we fall in love. Then we must, simultaneously, cross our own boundaries and enter into another person’s frontiers.

It’s all well and good to imagine your “perfect woman” or “ideal man,” or your “soul mate.” But those are creations of your own imagination. They want the same things you want, share your interests and dislikes, conform to your vision of yourself, and never question or make demands about the things—thoughts, activities, sacred beliefs—you hold dear. Your soul mate is a fiction: a boundless, smooth orb that is congruent with your idea of self, with appropriate gender alterations. You don’t have to deal with your perfect woman or ideal man. You don’t have to cross any boundaries.

And the reverse is true. You don’t have to let that ideal person cross your frontiers and invade your home territory, the center of yourself, because that fictitious person already lives there, in your imagination. And just as it’s scary to go and try new things you’ve never wanted to or even imagined experiencing, because the object of your growing affection loves them or demands them, so it’s scary to let a real person cross your frontiers and learn about—and try to deal with—the real you, your likes, fears, foibles, habits, and sacred beliefs.

This is when life comes up and smacks you in the face. If you can, you then lower your defenses, cross the frontier, and try to deal with a real person with real wants and needs. Maybe the differences are too great, the new territory too unknown, or too dangerous, or just too bizarre for you to enter and be comfortable. And then maybe the object of your affection, in her or his real self, becomes less desirable to you, less possible for you to be with. Not that they simply fail in being your ideal, or that they become something opposite, all sharp corners and bad angles, nothing like congruent at all. But just some of the ground you have to cover—the experiences you have to embrace and pursue in order to be with that person—are simply impossible for you. And then you learn from that encounter, adjust your definition of boundaries, and move on.

And perhaps your boundaries or frontiers are so wide, your walls so high, that no one but the ideal soul mate can get through, because they are already inside. Then, I am sorry to say, you will never have what you want. You will be alone in your garden, safe in your castle, and never know the terrors of opening up to someone unknown and the joys of finding that you can expand.

But if you can venture outside your comfort zone, take risks, cross boundaries, and in some cases redefine yourself, you will find the happiness of discovering that you are not alone in the universe, that someone else can share your joys and burdens, and you can walk the road of life together.

That’s the cold, sober reality, and you must make the best of it.

Sunday, May 23, 2021

Nothing on My Mind, Again

Black square

It appears that the concept of zero, and the negative numbers that precede it—follow from it?—came originally from India and were brought into the Western world by Arab traders. All of this happened in the 7th century AD, long after the fall of the Western Roman Empire and the earlier eclipse of the ancient Greek scientific culture. The Greeks, who apparently based their mathematics on geometry, never considered negative numbers or zero, as all their measurements in space were positive. The Romans, who were practical engineers and not theorists, drew straight lines and simple arches with positive numbers, which they identified with alphabetic notation. So, once again, I’m wondering where all this nothingness, this absence, came from.1

Modern life is permeated by numbers. This is part of the 17th-century scientific revolution—think of the countdown to zero in a rocket launch—and also part of the social revolution in personal literacy, which began with Gutenberg-style printing in the 15th century, and the economic revolution in finance and banking, which was started by the Italians and their letters of credit at about the same time. Today, almost everyone has a bank account, a checkbook, a credit or debit card, and a line of credit. We are adding and subtracting numbers all the time, and we all—or most of us—watch as that dreaded lower limit, zero balance, or even a potential overdraft, a negative number, approaches. We also watch clocks made of numbers and count the hours negatively until quitting time.

We think easily in terms of null and negative arithmetic. “How many supermodels did you date last year?” Zero. “How are you doing at the blackjack table?” Down by five hundred bucks.

Did the ancient Greeks or Romans not have these and similar, context-sensitive conversations? Well, probably. But not in a mathematical framework. “How many sheep do you have?” None—I don’t keep sheep. “What did you win betting on the chariot race?” Oh, nothing—I lost.

In a world not so conscious of modern mathematical concepts, a person focused on what was there, in existence, in front of their face. Yes, they could do subtraction: I had five apples and gave you two; now I have three apples. But the concept of zero was simply the condition of not having, not being, not knowing. It didn’t have a number. The idea of having fewer than zero apples, because you gave away more than you had, didn’t arise very often. And if you owed a debt, you didn’t think of it as a negative number in your bank balance but instead as a positive number that you eventually had to pay to someone else.
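To see the two framings side by side, here is a trivial sketch in Python; the apple counts and the neighbor are my own invented example, not anything from the history above.

```python
# Two ways to keep the same books: a modern signed balance,
# versus the older habit of tracking only positive amounts owed.

# Modern framing: one signed number, which may dip below zero.
balance = 3 - 5            # gave away more apples than I had
print("Signed balance:", balance)   # prints -2

# Older framing: no negative numbers, just a positive debt to someone else.
apples_on_hand = 0
apples_owed_to_neighbor = 2
print("On hand:", apples_on_hand, "| owed to neighbor:", apples_owed_to_neighbor)
```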

Are we better off for being more sophisticated about all this? Certainly, our kind of mathematics has enabled us to calculate with both positive and negative forces, compare tradeoffs, and create simulations of complex systems. It was modern mathematics that took us to the Moon and Mars, and some variant will take us to the stars. Zero is a real number, and negative numbers have real meaning, when you’re making these calculations.

For the ancients, the world was made up of substances. Their elements were earth, water, air, and fire. Even though the last two are gases, and air itself is invisible, anyone who has taken a deep breath and blown it out—or blown into a trumpet or a flute—would notice air’s liquid nature. Anyone who has watched fire curl around a candle wick or tremble in the wind could see that it is also semi-liquid.

The ancient Greeks and Romans never had mountains high enough that they could notice the air getting thinner the higher you went. They would not have come to the obvious conclusion, then, that at some point such thinness might lead to nothing at all, a vacuum. For them, the space above the Earth was a series of concentric spheres that held and propelled the orbits of the Sun, Moon, planets, and stars—and there might as well be air between those spheres as nothing at all. Only with modern aircraft and rocketry do we know that the air runs out about twenty miles above the Earth’s surface and all the rest is empty space.2

For the ancients, these substances were solids, not divided into tiny bits, and then even tinier bits, until you arrive at subatomic fragments too small to see or weigh. Democritus of Abdera did theorize about atoms and empty space, but he probably thought of those atoms as jostling around each other like marbles in a bag. The idea that atoms themselves are mostly empty space occupied by subatomic particles, and that everything we can see and touch is a lot more nothing than something, is a concept out of modern physics. On both the quantum and cosmological levels, we have to get our heads around the idea of nothing, non-being, emptiness on a mind-boggling scale.3

The poor human brain evolved in a world full of—and was adapted to deal with—real things. We survived by knowing about the tangible environment and manipulating objects and forces that could hurl a spear to bring down a deer or gather and carry roots and berries to a place of familial consumption. Our peripheral vision is cued to a trembling in the bushes—even if it’s only the wind, because it just might be a predator stalking us or a human enemy trying to ambush us. We are primed to experience and work with what’s there, and not what’s not.

But nothing is on our minds now. And its reach is growing, especially since the 20th century, when existentialist philosophers began to question why humans, the world, and everything in it even exist. Apparently, for them, nothing is the default state of the universe and the only thing that doesn’t have to be explained. So, at least for the French avant garde, our thinking on being and nothingness has come full circle.

Forgive me for occasionally having nothing on my mind. It’s been a slow week, and I’ve run out of ideas.

1. See About Nothing from June 25, 2017.

2. Except that modern physics insists on filling space with somethings. For example, Stephen Hawking explained how primordial black holes could evaporate by positing the instantaneous creation and mutual destruction of particles and anti-particles in a vacuum, going on all the time, invisibly, everywhere. When one of those pairs happened to spontaneously erupt along the event horizon of a black hole, one of the paired particles fell into the hole and the other drew out a quantum of energy in response. And so micro black holes disappeared over time. … Oh, hell! Hawking might as well have said that pixies ate them.

3. How big is an atom? I’ve read somewhere that if the nucleus of a small atom like hydrogen or helium were the size of a fly inside Notre Dame cathedral, then the electron shells and their potential orbits would occupy the entire enclosed space. And all the rest would simply be empty.
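As a rough check of that analogy, here is a back-of-envelope sketch using commonly quoted round figures: a hydrogen atom radius of about 5 × 10^-11 meter, a proton radius of about 10^-15 meter, and a 5-millimeter fly, all assumed values chosen for illustration.

```python
# Back-of-envelope scale check: blow the nucleus up to fly size
# and see how wide the surrounding electron cloud becomes.

ATOM_RADIUS_M = 5e-11      # roughly the Bohr radius of hydrogen (assumed)
NUCLEUS_RADIUS_M = 1e-15   # roughly the radius of a proton (assumed)
FLY_SIZE_M = 0.005         # an assumed 5 mm fly

ratio = ATOM_RADIUS_M / NUCLEUS_RADIUS_M
scaled_atom_m = FLY_SIZE_M * ratio

print(f"The atom is about {ratio:,.0f} times wider than its nucleus.")
print(f"A fly-sized nucleus implies an 'atom' roughly {scaled_atom_m:,.0f} meters across.")
# That works out to a couple of hundred meters: cathedral scale, give or take,
# and nearly all of it empty space.
```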

Sunday, May 16, 2021

Living with Ambiguity

Girl with magic box

The quantum mechanics conundrum of Schrödinger’s Cat1 is not an actual physics experiment but a famous thought experiment about the state of human knowledge and observation. Basically, it says that the universe goes on about its business and doesn’t reveal itself unless human beings, our active intelligence at work, actually stop and look. And sometimes, at least in the subatomic realm, the mere act of observing interferes with the outcome—as when the detection of a subatomic particle in flight by an instrument using a beam of photons interferes with either that particle’s position or its direction.2 So some things, at the most remote scales, are truly unknowable.

I would posit that the ambiguity of quantum mechanics has a lot more to do with everyday life than we normally admit. We are always faced with situations where what’s going on and what we know about it are separated. For example, did I get the promotion? Somewhere on one of the upper floors someone, or a group of someones, knows who got the nod for the job, but there’s no way—at least, no ethical way—of finding out until they announce it. And if you find out before it’s announced, that will likely change the decision. Does she love me? She knows, or maybe she doesn’t know yet, but there’s no way for you to know until she declares herself by word or action. And your pestering her for an answer would change the relationship. Will the jury find me guilty or not? Again, the twelve members of the jury know, or soon will know, but you won’t find out until the verdict is read in court. And if you learned the verdict ahead of time, it would cause a mistrial.

In each of these instances, from the time the question arises until you open the lid and observe the cat, the question remains both “yes” and “no” at the same time. Both choices are in a state of superposition—at least as far as you are concerned—until you learn the answer and the two states are resolved into one. This is not a question of probability, although you can take odds or make bets with yourself about how you think the question will be resolved. But all of your weighing of factors and listing of pros and cons will not make a bit of difference when the question itself lies in the hands of others, of the management team, the girl, the jury … or the Geiger counter attached to the vial of poison.

This is the sort of ambiguity we have to live with all the time. In most cases, the superposition will resolve itself eventually. But sometimes the company’s fortunes change and the job is never awarded or announced. Sometimes the girl moves away or dies before she can accept or reject you. (And sometimes she says “yes” when what she means is “maybe” or “wait and see.”) Sometimes you get a hung jury, no verdict, or a mistrial. Some issues may never be resolved in your lifetime.

For example, I’ve always wondered about the true story of the Kennedy assassination. Did Oswald act alone out of disaffection, or was he a plant by the KGB after the embarrassments of the Cuban missile crisis and Bay of Pigs invasion? Did Jack Ruby kill Oswald out of patriotic sentiment, or was he sent in by the CIA to keep the lid on a foreign decapitation action that might have led to Congress declaring World War III? The entire Warren Commission report has been unsealed by now, years ahead of the actual date, due largely to the Freedom of Information Act. The commission’s findings suggest that Oswald and Ruby both acted alone, and supposedly there was no evidence of a coverup or international involvement. Still, I wonder. Since that’s as far as the investigation went, despite 552 witness depositions, 888 pages of documentation, and 3,100 exhibits, we will never know who, beyond the people immediately involved in the U.S., might have had a hand in it. So, in my mind, “Russian plot” and “angry gunman” remain in superposition, as do “CIA coverup” and “angry patriot.” At this point we will probably never know.

Another example of ambiguity is the mystery of the universe’s origin. When you rewind the expansion of the galaxies that we observe back over the 13.8 billion years of the universe’s calculated existence, you end up with a putative point, a tiny dense particle that exploded in the Big Bang. That is supposedly our cosmological creation story. But if you expand the observable universe from a single point to its current size, even allowing for everything to move at light speed, the calculated radius is smaller than the universe in which we actually find ourselves. This problem was supposedly corrected by the “inflationary period,” proposed by cosmologist Alan Guth in 1980, in which the whole shebang expanded enormously within a tiny fraction of a second (far less than a microsecond) after the Big Bang, so that it went from something with a radius of less than a subatomic particle to—and here various calculations give different answers—a cloud of matter somewhere between the size of a grain of sand and something on the order of nine meters in diameter. And then it all continued to expand normally from there.
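To see the mismatch in round numbers, here is a quick back-of-envelope sketch; the 13.8-billion-year age and the roughly 46-billion-light-year radius of the observable universe are the commonly quoted figures, not anything derived here.

```python
# Back-of-envelope: how far could light travel in the age of the universe,
# versus the radius cosmologists actually infer for the observable universe?

AGE_YEARS = 13.8e9              # commonly quoted age of the universe
NAIVE_RADIUS_LY = AGE_YEARS     # light-years covered at light speed, ignoring expansion
INFERRED_RADIUS_LY = 46e9       # commonly quoted comoving radius, in light-years

print(f"Naive light-travel radius: {NAIVE_RADIUS_LY / 1e9:.1f} billion light-years")
print(f"Inferred comoving radius:  {INFERRED_RADIUS_LY / 1e9:.1f} billion light-years")
print(f"Ratio: about {INFERRED_RADIUS_LY / NAIVE_RADIUS_LY:.1f} times larger")
# That factor-of-three-plus gap is the mismatch the paragraph above is pointing at.
```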

A third example of the currently unknowable—although not for lack of trying to detect it—is the relationship of matter and energy in the observable universe. From the way that the stars in spiral galaxies spin around their center—as if they were painted on a disk, rather than freely orbiting in the void—it would seem that these galaxies have more gravitationally bound material in them than the matter that shines brightly as stars. A lot more, as in several times as much. This is the “dark matter” that plagues cosmology. Either galaxies contain much more dust, gas, and both central and primordial black holes than our observations account for, or the universe is permeated by particles that affect gravity but are otherwise invisible and undetectable in every other way. And then, the universe itself is not only expanding, as if still impelled by that initial Big Bang explosion, but also its speed of expansion is accelerating at an alarming rate. So either the vacuum of space contains a mysterious force that increases with space and distance—a “dark energy” that is otherwise undetectable in our immediate neighborhood—or we don’t understand the basic structure of the universe and the real nature of the effects we observe as “space,” “time,” and “gravity.”

We can theorize about these things, but until we create better instruments and take better measurements, I think we have to live with the ambiguity of not actually understanding the universe. Many possibilities are in superposition, and not all of them can be true.

And finally, on the human scale, is the matter of human life, spirit, and what may lie on the other side of death. Is there a God or not? Do we vanish at death, like a candle flame when it’s blown out, or does some part of us—soul? ghost? brain wave? personality? memory?—exist for a time or perhaps for eternity? And there you can theorize, rationalize, believe, or doubt all you want, but only the actual experience of death will reveal the answer. And by then it may be too late to do anything about it.

Given all of this, and the example of Erwin Schrödinger’s cat to begin with, I must remain comfortable with ambiguity. I must accept that some things cannot be known until they are revealed, that others may not be revealed in my lifetime, and that some may never be revealed to any of us, no matter how long we live.

1. For those who do not know it, you imagine putting a cat into a box with a vial of cyanide and a striking mechanism that will break the vial and kill the cat when triggered by a random event, such as the decay of a radioactive element. Then you close the lid. You have no way of knowing whether the particle has decayed and the cat has been killed until you actually open the lid again. So, from your perspective, the cat is simultaneously in two different states—called a “superposition”—of being both alive and dead. This composite state is not resolved until you open the lid, and then the cat is either alive or dead. But all of this pertains only to you, as the observer; for the cat, the effects are more immediate.
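Purely to show the bookkeeping, here is a minimal sketch of the setup; the one-hour decay probability of 0.5 is an arbitrary number chosen for the toy model, and the code only captures the classical “you don’t know until you look” part of the story, not genuine quantum superposition.

```python
import random

# Toy model of the thought experiment: the trigger either fires or it doesn't,
# but the observer's record stays unresolved until the lid is opened.

DECAY_PROBABILITY = 0.5   # arbitrary chance of a decay during the hour in the box

def run_box():
    decayed = random.random() < DECAY_PROBABILITY   # what actually happens inside
    observer_view = "alive AND dead (unresolved)"   # all you can say before looking
    return decayed, observer_view

decayed, before_opening = run_box()
print("Before opening the lid:", before_opening)
print("After opening the lid: ", "dead" if decayed else "alive")
```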

2. However, this question of observational interference is not part of the Schrödinger’s Cat thought experiment.

Sunday, May 9, 2021

Predicting the Future

Immortal dream

Last week I wrote about the rising curve of human technology since the 17th century and suggested what it might mean for the future. Now I’m going to dip a toe in the perilous waters of future gazing and see where, in the short term, such technologies might lead us.

But first, a few looks back, to see how quickly things have changed.

When the Washington Monument was completed in 1884, it was capped with a small pyramidal casting made of aluminum, chosen because it was likely to be a good lightning conductor. At the time, aluminum was fairly rare as a metal. It was not more valuable than platinum, as some have suggested, but it cost about $1 per ounce, which was the typical daily wage of the average construction worker on the monument.1 Although aluminum is a fairly common element in the Earth’s surface, being bound with oxygen in crystals of alumina, Al2O3, found in the red earth bauxite, it took large amounts of electricity to drive off the oxygen—and electricity was in short supply in the late 19th century. Within fifty years, after the building of large hydroelectric dams in the U.S. West, aluminum became widely available—plentiful enough to dominate the skies with airframes made during World War II, and then cheap enough to make building siding, lawn furniture, and throwaway cans in the years after. Aluminum is the wonder metal of the 20th century, along with titanium and stainless steel.

And at the turn of that last century, according to the U.S. Department of Agriculture,2 41 percent of the American workforce was employed in farming. By 1930, that number had shrunk by about half to 21.5 percent; by 1945, to 16 percent; by 1970, to 4 percent, and by 2000, to less than 2 percent. Credit for the change goes to improved use of machinery and its overall efficiency; changes in land use, with larger “factory farms” and the loss of the picturesque but noncompetitive “family farm”; and the “green revolution” in the production and use of fertilizers and weed and pest controls, as well as the genetic modification of crops. All of those chemical and biological developments are still in play and will only increase in their use and effectiveness. So we can count on the next hundred years bringing us new hybrid crops and tougher, more robust, more nutritious food resources. The only limiting factors will be arable land and fresh water—and even the supply of those may change.

Okay, let’s start with water. Right now, humanity depends on two sources for the water it uses for drinking, bathing, toilet flushing, and irrigation: rain and snow falling from the sky and running off in the local river; and ancient rains trapped in subsurface groundwater and aquifers that are tapped by wells. The more use we make of those aquifers—which collect slowly over the ages and drain quickly under pressure and pumping—the less we have for the future. But the world’s surface is three-quarters water, though most of it is laden with mineral salts and thus undrinkable and unsuitable for farming. We have known how to filter out the salts by reverse osmosis since the middle of the 20th century. We could live by this process today, except for the fact that it’s costly. But the cost is not so much in the plants themselves and their placement as in the energy they require. (And the cost has actually dropped for flash desalination, from $30 per cubic meter in the 1960s to about $1 in 2010.) If you have abundant electric energy to run the pumps, you can have all the desalinized sea water you can drink.
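To put a rough number on the claim that the energy is the real cost, here is a small sketch; the 3.5 kilowatt-hours per cubic meter for seawater reverse osmosis, the ten-cent electricity price, and the 300 liters of daily use are assumed round figures for illustration, not numbers from any study.

```python
# Rough energy cost of desalinated water, under assumed round numbers.

KWH_PER_CUBIC_METER = 3.5    # assumed energy use for seawater reverse osmosis
DOLLARS_PER_KWH = 0.10       # assumed electricity price
DAILY_USE_LITERS = 300       # assumed per-person daily use, all purposes

energy_cost_per_m3 = KWH_PER_CUBIC_METER * DOLLARS_PER_KWH
daily_cost_per_person = energy_cost_per_m3 * (DAILY_USE_LITERS / 1000)

print(f"Energy cost per cubic meter: ${energy_cost_per_m3:.2f}")
print(f"Energy cost per person-day:  ${daily_cost_per_person:.2f}")
# At these crude numbers the energy bill is pennies per person per day;
# the real hurdle is building enough generation to supply it at scale.
```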

So the key to the future of our water supply, our agricultural irrigation, and our population growth in general is going to be energy. Right now, like it or not, our most abundant energy resource is fossil fuels: coal, oil, and gas. Coal is abundant—we have about a thousand-year supply in North America at current consumption—but bulky to move and messy to clean up. Oil and gas can be piped to their users, and while we recently thought we were running out of easily tapped reserves—the specter of “peak oil”—technological advances in the form of horizontal drilling and hydrofracturing of oil and gas shales have extended our future. But—thinking in terms of centuries rather than years or decades—one day we will run out. And, in the meantime, hydrocarbons are much more precious as a chemical feedstock than as an energy source. Wind and solar power, being diffuse resources dependent on adequate siting, will not replace hydrocarbons in our energy future—not unless we change the landscape, harvesting solar power in orbit and beaming it down to huge diode fields on the planet’s surface, as in my novel Sunflowers, or planting windmills in mechanical forests along every ridgeline, as if they were trees.

But what I call the “enterprise of science” is well aware of the energy problem. Physicists and engineers all over the developed world are studying its production and storage from many angles. Although fusion power, electricity from the deuterium in sea water plus tritium bred from lithium, always seems to be ten years off into the future, always receding from our grasp, one day we will figure out how to produce it, even if we have to invent artificial gravity to make it work. And once we have the design and formula worked out, we can adapt and scale it for efficiency. Biologists are at work, too, trying to use our newfound genetic ingenuity to manipulate algae into growing and secreting lipids, or hydrocarbon substitutes, from water and sunlight without adding to—but rather subtracting from—the atmosphere’s carbon-dioxide burden.

And about that carbon dioxide, greenhouse gases, climate change? All of that is this year’s daily fright, like Malthusian overpopulation or the collapse of “peak oil.” Listen: climate has always been changing, and people have always adapted to the new conditions. When winters came early and lasted longer at the beginning of the Dark Ages, they migrated south. When coastlines shifted, they moved inland. You can say the difference today is we have billions invested in shoreline real estate that is too valuable to lose. But in the Bay Area, where I live, a lot of that shoreline was saltwater marsh a hundred years ago; in another hundred years, it may be saltwater marsh again. Shrug. Changing climate, like most effects of weather, is an inconvenience, not a catastrophe. Consider the inconvenience of having to shovel tons of snow repeatedly every winter and what it does to the economy to dig out homes and plow the roads after every blizzard. And on the plus side, pushing the snow line further north into Siberia and the Canadian tundra will open up new lands for agriculture.3

And speaking of the Reverend Thomas Malthus and his prediction that human population would rapidly outgrow agricultural resources, leading to worldwide starvation, that didn’t happen, did it? When I was growing up, we heard dire predictions about vastly overpopulated countries like China and India, where people were regularly starving. And Africa, where apparently people are still starving—although much of the famine appears to be genocide by political manipulation of the food supply. We have seen that, as countries develop economically and technologically, with a greater proportion of their population moving into the educated and skilled classes, the population and its growth rate tend to shrink. Much of Europe—at least among the demographic that populated Europe over the past thousand years—is now reproducing below its replacement level, figured at 2.1 children per couple. Japan has been below that rate for decades, and the U.S.—again for the population mix of the last century or so—is trending that way. When people become prosperous and educated, and their medicine saves most of their babies who would otherwise disappear into the infant mortality statistic, they have fewer children and generally treat them better, so they live longer, more productive, more satisfying lives. China, India, Africa, and South America will eventually catch up with this curve before the planet implodes.

Add in the advancements that will come with the genetic revolution in biology and medicine, and most of the medical problems we see today will fade away. We will find ways to target and repair cancer cells. We will resurrect failing hearts and brains through tissue repair. Organ repair and replacement will become a matter of manipulating your own stem cells, as in my two-volume novel Coming of Age, rather than receiving an organ donation—willing or not—from another human being, with a lifetime of immune suppressants to follow. Issues of congenital and developmental conditions, susceptibility to the environmental causes of degeneration and disease, and the mystery of differentials in health and fitness among people will unravel as we analyze, predict, and eventually control all the biological processes of life. People will then live a lot longer, with even richer, more productive lives.

Normally, you would then expect the Ponzi scheme of Social Security and Medicare—where you need more and more young people working to pay for the care of ever more retired parents and grandparents—to collapse. But I don’t expect this to happen, and not because I believe the U.S. Congress will pay back the money it has drained from these funds for other uses. With more automation of materials extraction, manufacturing, and the supply chain and infrastructure that support them, the need for human hands to dig, make, and trade things is rapidly diminishing.4 That trend is only going to accelerate with artificial intelligence, 3D printing, and other physical amplifications of the computer age. The question is not whether we will have enough people to work in the economy, but how people will work to support themselves in the cornucopia of food, goods, services, and entertainments that is going to be showered down upon them.

I once thought that some form of Universal Basic Income—a global and permanent government dole—would be necessary to replace the “Protestant work ethic” with which my generation was raised (or, as my mother would say, “No work, no eat!”). But people are inventive and creative, and a life of easy handouts is not part of human nature. I think, instead, that people faced with an economy of predictable, unexciting, machine-made and -supplied goods and services will return to valuing human artistry and craftsmanship, at least in the areas that interest them. Yes, you can get a basic, particle-board-and-veneer desk at IKEA, but you’ll pay more for something hand carved with a flourish from a renowned local craftsman. And yes, you can watch computer-generated movies on the widescreen most evenings, but you will still hunger to go out and sit in a theater with other human beings to watch real actors speak from story lines reflecting varied human thought.

I am not a Pollyanna. The future won’t be all rosy. There will be dislocations, disruptions, and growing pains in the new world into which we are venturing. But we’ve had those difficulties along the way already and survived them. Our life today is unimaginable to someone living two centuries ago. If you had told a 19th-century farmer that in the 21st century one person would do the work of 100 of his kind, he would have despaired and wondered how his children would survive. For him, the change would be a huge, looming, unsolvable problem. But now we look back and we can barely comprehend the drudgery and futility of his life, laboring sunup to sundown, hacking at the land with a horse-drawn plow, hauling buckets of water from the well and manure out to the field, just to feed his family and still have something left over to sell at market.

And come what may, people will still have imaginative and enticeable human spirits. They will still be able to look at a flower in the dawn light or the sea at sunset, breathe a sigh, and find a measure of contentment in the moment.

1. See The Point of a Monument: A History of the Aluminum Cap of the Washington Monument by George Binczewski, JOM, 1995.

2. See The 20th Century Transformation of U.S. Agriculture and Farm Policy by Carolyn Dimitri, Anne Effland, and Neilson Conklin, USDA Economic Research Service.

3. And climate change may not always be toward the warming side, regardless of the CO2 burden. Consider the change in sunspot cycles over the past twenty years at spaceweather.com. The last cycle, No. 24 since the Maunder Minimum of the 17th century, was much weaker than the previous cycle that peaked around 2000. This and the even weaker cycle No. 25 that we are now entering suggest we may be headed toward another solar minimum.

4. Watch any episode of the television series How It’s Made and count the number of human hands at work vs. the number of machines. We live in a mechanized age.

Sunday, May 2, 2021

The Rising Curve

Compound steam engine
Steam turbine blades

The rates of increase of a straight slope and a rising curve have different mathematical properties. A steady slope grows arithmetically, adding the same amount at each step (1, 2, 3, 4 …), while a curve grows by squares (1, 4, 9, 16 …) or, faster still, by repeated doubling (2, 4, 8, 16 …). A parabolic curve, at least the part that proceeds upward from its low point, follows the formula y = ax² + bx + c,1 and it can rise really fast. My contention here is that our technological advancement since about the 17th century has been on a parabolic curve rather than a slope.
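To make the difference concrete, here is a small Python sketch that tabulates a steady slope, a parabola, and a doubling sequence over the same range; the coefficients are arbitrary, chosen only to show how quickly the curves pull away from the straight line.

```python
# Compare three kinds of growth over the same inputs: a steady slope
# (arithmetic), a parabola (quadratic), and a doubling sequence (exponential).

def slope(x, m=1, b=0):
    return m * x + b               # adds the same amount at every step

def parabola(x, a=1, b=0, c=0):
    return a * x**2 + b * x + c    # the quadratic y = ax^2 + bx + c

def doubling(x):
    return 2 ** x                  # multiplies by two at every step

print(f"{'x':>3} {'slope':>7} {'parabola':>9} {'doubling':>9}")
for x in range(11):
    print(f"{x:>3} {slope(x):>7} {parabola(x):>9} {doubling(x):>9}")
# By x = 10 the slope has reached only 10, the parabola 100,
# and the doubling sequence 1024.
```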

In ancient times—think of Greece and Rome from about the 8th century BC, the first Olympiad for the Greeks, or the mythical founding of their city for the Romans—there was technological advancement, but not even a slope. More of a snail’s pace. The Greeks had their mathematicians and natural and political philosophers, like Pythagoras and Aristotle, but aside from writing down complex formulas and important books which probably only a fraction of the populace bothered to read, their works did not materially improve everyday life. The Greeks never united their peninsula politically, for all their concept of democracy, remaining stuck at the tribal and city-state level of conflict. And from one century to the next, they drank from the local wells, shat in the nearby latrines, and traveled roads that washed out every year with the spring floods. They built in marble the temples of their gods, but otherwise the average people lived in houses of wood and mud brick not much different from those of their predecessors in Homeric times, five centuries earlier.

The Romans did somewhat better, being short of actual philosophers but abounding in practical engineers. They developed a democratically based political and military system that united their peninsula and went on to conquer most of their known world. They built huge aqueducts to bring fresh water into their cities from distant springs, underground sewers to take away human wastes, and roads dug many layers deep into the ground that could reliably move goods—and armies—from one end of the empire to the other. They built temples and palaces in marble laid over brick but also invented a synthetic stone, concrete, that their engineers originally made from a volcanic ash known as “pozzolana.” Common people in the city lived in apartment blocks called insulae, or “islands.” They bathed regularly and made a civic virtue of the practice. Life was better under the Romans, but technological advancement was still glacially slow.

Rome, at its fall in the Western Empire during the 5th century AD, was technologically not much different from the Rome of Julius Caesar, five centuries earlier. And that fall—due largely to climate change and the ensuing barbarian migrations—plunged Europe into a Dark Age that saw small advancement in any of the arts, although we did get some practical technologies like the wheeled plow and the stirrup. Those, along with gunpowder adopted from Chinese fireworks and movable type adapted from the Chinese by Gutenberg in the 15th century for printing bibles, carried us through to the 16th century, the time of the Tudor reign in England or the Medici in Italy.

After that, technologically speaking, all hell broke loose.

Some might credit Rene Descartes and his inventions of analytic geometry and the scientific method, based on observation and experiment; or Isaac Newton and his invention of the calculus (also developed independently by Gottfried Leibniz in Germany) and his studies of gravity and optics; or Galileo and his work in physics and astronomy. Intellectually, it was a fruitful century.

But from an exhibit that my late wife prepared at The Bancroft Library years ago, I learned that a more immediate change came about with the exploration of distant lands and when the European trading companies set up to exploit them began importing coffee and tea into the home market in the 17th century. Before then, people didn’t drink much water because of rampant contamination; so instead they drank fermented beverages—sweet wines, small beer, and ale—because alcohol helped kill the germs, although they didn’t think about it in those terms. So they would sip, sip, sip all day long, starting at breakfast, until everyone was half-plotzed all the time. But then along came coffee and tea, which were good for you because you had to boil the water to make them. Everyone brightened up and began thinking. The denizens of Lloyd’s Coffee House in London invented insurance companies to protect the sea trade, which required estimates of risk and probability, and that led to a whole new branch of mathematics and the spirit of investment banking.

Put together scientific investigation with the widespread availability of printed books and the clear minds to read them, and we’ve been on that rapidly rising parabolic curve ever since.

We are just over three hundred years from the first steam engine, patented in 1698 to draw water from flooded mines. In the time since then, the engine has gone from triple-expansion cylinders to turbine blades. And that is the least of our advances. This year, we are just two hundred years from the first primitive electric motor, built by Michael Faraday in 1821. And now we have motors both small and large driving everything from trains, elevators, and cars to vacuum cleaners and electric shavers.

In my lifetime, I have seen music go from analog grooves cut into vinyl disks and magnetic domains on paper tape to digital representations stored on a chip, and photography go from light-sensitive emulsions on film and paper to similar—but differently structured—digital sequences on chips. My electric typewriter—again driven by a small motor—has gone from impact-printing metal representations of the alphabet on a sheet of paper to storing different digital sequences on that same chip in my computer. All of this puts the stereo system, camera, and typewriter I lugged off to college fifty years ago into a single device that started out on my desktop, migrated to my laptop, then moved into my hand inside a smart phone, and now lives on my wrist instead of a watch. And the long-distance call I made every week from college to my parents at home was once a direct wire connection established by operators closing switches; it would now be a series of digitized packets sent out through the internet and assembled by computers at each end of the conversation. Gutenberg’s process for printing words on paper is now embodied in the photo-masking of electronic circuits on silicon chips. And we’re not done yet.

In 1943, the codebreakers at Bletchley Park, where Alan Turing had already mechanized the attack on the Enigma code, built Colossus, arguably the first programmable electronic computer, to crack German ciphers for the Allies. In 2011—just 68 years later—that machine’s lineal descendant, IBM’s Watson, was playing (although not consistently dominating) Jeopardy!, the trivia game based on history, culture, geography, and sports and dependent on linguistic puzzles and grammatical inversion. While that was a stunt, similar “artificially intelligent” systems based on the Watson design are now being sold to businesses to analyze and streamline operations like maintenance cycles and supply chain deliveries. They will take the human element, with its vulnerability to inattention, imagination, and corruption, out of processes like contracting and medical diagnosis. Any job that involves routine manipulation of repetitive data by well-understood formulas is vulnerable to the AI revolution.2

Add in separate but related advances in materials, such as 3D printing—especially when they learn how to make metal-resin composites as strong as steel—and you get disruption in much of manufacturing, along with the global supply chain.3

Any theory of economic value that depends on human brawn—I’m looking at you, Marxists—or now even human brains is going to be defunct in another half century. That’s going to be bad news for countries that rely on huge populations of relatively unskilled hands to make the world’s goods, like China and India.

Intelligent computers are also able to do things that human beings either cannot do or do poorly and slowly. For example, in November 2020, Nature magazine reported on an AI that can predict and analyze the 3D shapes of proteins—that is, how they fold up from their original, DNA-coded amino acid sequences—almost as well as the best efforts of humans using x-ray crystallography. And this was just 20 years after the first sequencing of the human genome using supercomputers, and only 66 years after the first glimpse of the DNA molecule itself using x-ray crystallography. Knowing the structure and thereby the function of a protein from its DNA sequence is a big deal in the life sciences. It will take us far ahead in our understanding of the chemistry of life.

Ever since the 17th century, our technology has been riding a curve that gets steeper every year. And the progress is not going to slow down but only get faster, as every government, academic institution, and industrial leader invests more and more in what I call this “enterprise of science.” Anyone who reads the magazines Science and Nature can see the process at work every week.4 We all stand on the shoulders of giants. We stand on each other’s shoulders. We build and build our understanding with each advance and article.

This rate of increase might be slowed, marginally, by a global depression. We might be set back entirely by a nuclear war, which might revert our technological level, temporarily, to that of, say, the telegraph and the steam engine. But it will only be stopped, in my estimation, by an extinction event like an unavoidable asteroid or comet strike, and then so much of life on this planet would die out that we humans might not be in a position to care.

As to where the curve will lead … I don’t think even the best science philosophers or science fiction writers really know. Certainly, I don’t—and I’m supposed to write this stuff for a living. The next fifty years will take us in perhaps predictable directions, but after that the effects on human economics, culture, and society will create an exotic land that no Asimov, Bradbury, or Heinlein ever imagined. Fasten your seat belts, folks, it’s going to be a bumpy ride!

1. That’s a quadratic equation. And no, I don’t really understand the formula’s properties myself, having nearly flunked Algebra II.

2. But no, the computer won’t be a “little man in a silicon hat,” capable of straying far outside its structural programming to ape human intellect and emotions—much as I like to imagine with my ME stories. And it won’t be a global defense computer “deciding our fate in a microsecond” and declaring war on humanity.

3. It’s become a commonplace that the U.S. lost its steelmaking industry first to the Japanese, then to the Chinese, because they were more advanced, more efficient, and cheaper. Not quite. This country no longer makes the world’s supply of bulk steel for things like pipe, sheets, beams, and such. But so what? We are still the leader in specialty steels, formulations for a particular grade of hardness, tensile strength, rust resistance, or some other quality. Steelmaking in our hands has become exquisite chemistry rather than the bulk reduction of iron ore.

4. For example, just this morning I read the abstract of an article about adapting the ancient art of origami to create inflatable, self-supporting structures that could be used for disaster relief. I read and I skim these magazines every week. And frankly, some of the articles, even their titles, are so full of references to exotic particles, or proteins, or niches of mathematics and physics that I can only guess as to their subject matter, let alone understand their importance or relation to everyday life.

Sunday, April 25, 2021

Understanding Alien Psychology

Borg Queen

I have been thinking and blogging about the potential for finding and understanding aliens a lot recently, ever since reading Avi Loeb’s book on the interstellar object ‘Oumuamua, Extraterrestrial: The First Sign of Intelligent Life Beyond Earth. Now I am heading into realms that are totally unknowable—except from the viewpoint of what we know on Earth. But bear with me …

First off, I am not too interested—well, very, but not for the purpose of this meditation—in simply finding signs of life. We’ve seen things that could be confused with fossilized cells in the surface geology of Mars, and we suspect we might have found gases that could only be created by life-as-we-know-it in the atmosphere of Venus. When we get to other planets, both in this solar system and around other stars, we may well find chemical reactions and physical structures that we, from inside the realms of earthly biology and human understanding, define as “life.” Some of it may be intelligent but a lot of it, like most of the living forms on Earth, will not be what we choose to call “intelligent” or even “sentient.” Slime molds, for instance—honking huge single cells with eukaryotic nuclei—can move toward food and away from irritants in a fashion that seems to be intelligent or at least resembles neural networking.1 But it’s not going to build a rocket and come visit us.

When I think about aliens, I imagine the kind that will leave their planet and come out among the stars, as so much of Western-civilized humanity apparently hopes to do one day. And until we go out there, we’ll just have to wait for them to come to us.

So, first question. Will they look like us? Or even come close—like the various humanoid species that populate a Star Trek episode? I don’t believe it. As Carl Sagan once said, we’d have more success mating with a petunia than with an extraterrestrial lifeform.2

Earth has a long history of large, active lifeforms that might have developed intelligence but as far as we know did not. The dinosaurs come to mind: the family Tyrannosauridae and their cousins were bipedal, oxygen-breathing hunters and perhaps also scavengers, and probably—maybe—at the top of their food chain. But we have no evidence that they exhibited any real intelligence greater than that of a lion or house cat, wolf or dog, or even a shark. And yet the dinosaurs’ distant progeny, the Corvidae family—crows and ravens—as well as many other species of birds have a kind of intelligence we cannot explain. Even octopi—what? a mollusk?—exhibit a high level of intelligence. So size, shape, and mammalian ancestry are not necessarily prerequisites for intelligence. Still, none of these animals from Earth and its history is going to build a radio or a rocket anytime soon.

But the examples from Earth also suggest that we cannot expect to find our own kind of intelligence out among the stars, even if it wears an unexpected shape or inhabits an environment—like the earthly ocean of octopi and whales, or perhaps the liquid water under Europa’s ice, or the methane seas of Titan—in which humans don’t particularly thrive. Life on Earth did not, after all, start out on the land, although that’s probably the best place to build a radio or launch a rocket above the atmosphere.

One particular axis we are likely to encounter in alien psychology is that between the individual and the group. So far, on Earth, the sort of intelligence that is likely to expand to encompass curiosity, technology, and eventually space travel is fixed in individual entities. We humans are separate and complete persons inside our own bodies. We are socialized into groups, certainly, in which we can function for enhanced performance. But we do not become lost, stricken, enfeebled, and die when separated from our group—or at least not right away. And we see this pattern not only in human tribes but also in monkey troops, wolf packs, whale pods, cattle herds, and other social groupings.

Each of us knows or tries to find our place in the group, establish a niche where our capabilities and levels of aggression or empathy best fit, and seek comfort and contentment—or at least a subdued level of rebelliousness—in that placement. We are social animals. And that is the basis of all culture and intergenerational achievement. The fabled lone wolf, the mad scientist, the antisocial genius who works alone and keeps his notes in a secret code—such beings are of interest to us as fiction but they are not the creators of lasting culture or enduring civilizations. They build no great cathedrals, establish no great cities, lead no great social or political movements—and they don’t send rockets to the Moon.

So, we think, the kinds of intelligence we will find out among the stars will be like us in that respect: socialized individuals, each with his, her, or its own personality, preferences, anxieties, and dreams.

But we have another example on Earth to draw on: the hive mind. Whether in the beehive, the anthill, or the termite colony, the individual entities—the minds inside the separate bodies—are not really individuals as we humans understand the term. They are physically adapted to their tasks and place in the hive structure, and their minds are shaped—one might say innately programmed—to perform those tasks and not question their role. Even the queen is not a ruler or leader but simply the pampered sexual progenitor, the mother of them all, that ensures the colony’s survival and renewal.

Something of this was captured in the movie Star Trek: First Contact, where we are introduced to the Borg Queen (pictured nearby with Alice Krige playing the part). But although the Borg are a collective of mechanized humanoid lifeforms whose brains are electronically networked, they are not really a hive and the queen is not their mother nor their first member. The queen speaks with a voice and persona that can call up the collective mind but can also examine it, see its options and possible choices in context, contrast their existence with what she knows of humanity, and evaluate the Borg from the outside. She is more like a leader or first speaker than the sexual progenitor of the collective.

When we look to more imaginative literature on the idea of the hive as a society, the offerings are few. To my thinking, Frank Herbert has done some of the best work on this. His novel Hellstrom’s Hive examined what it would take to change human beings into the sort of social insects that could function most efficiently in the politically denuded world. And his novel The Green Brain imagined a hive of Amazonian insects that functioned as a single conscious entity, in the same way that the cells in our human bodies work together to create the reality—or perhaps it’s just the illusion—of a single person with independent will and desire.

We might encounter some variation of this colony structure, this collective intelligence that is not separated by the strands of individual personhood, out in the universe.

The question in my mind is whether this kind of intelligence is creative or merely reactive. A colony of honeybees or ants can adapt to its environment, find flowers or other foodstuffs when the weather is right, make its nest or hive nearby, and deal effectively with changes in environment and temperature, or else swarm to find a new nesting site. But can they only think and react to present and immediate needs? Could they eventually engineer changes in that environment? Could they look beyond the immediate locale and imagine ways to make it different? Could they look out at the stars and dream of visiting them? Or are they bound to the world as they know it, in a way that human societies are not?

Every group of socialized individuals is built of leaders supported by their pack of generally submissive followers, plus the potential outsiders, the rebellious youth and the mad geniuses, who question the social order, its structure and purpose, and seek something new or just different. At least, that’s how the human tribe has functioned and flourished. That is how we broke the bonds of merely reacting to our hunter-gatherer environment, engineered a better life through agriculture, created written records to preserve intergenerational knowledge, adapted invention and technology to improve everyday life, and then looked outward to the stars.

In our experience, based on that group of socialized individuals, progress depends upon imperfections in communication, upon differences of opinion and individual dreams, upon disagreements and conflicts. That friction is the one thing that the anthill or the beehive cannot survive. The resolution of these disruptions is never pretty and neat, and it’s never complete and finalized.

But without disruption and disquiet, you have the structured cooperation, the orderly processing and virtual stagnation, of the colony animal. That, or the brainless neural-net reactivity of the slime mold. And neither of them, I warrant, will be coming here anytime soon.

1. See, for example, Mycologist Explains How a Slime Mold Can Solve Mazes, from Wired.com in 2019.

2. But maybe we are connected after all, as in the panspermia hypothesis. See, for example, my meditation on The God Molecule from May 28, 2017.