Sunday, April 11, 2021

Things Worth Believing

Total honesty

In the delightful movie from 2003, Secondhand Lions, a young boy is left for the summer in the company of his two granduncles, who may or may not have lived the swashbuckling lives suggested by their stories. The kernel of their existence, it seems, is embedded in the speech that one of the uncles gives him concerning “what every boy needs to know about being a man.” We never get the whole speech, but here is the part that’s given:

“Sometimes the things that may or may not be true are the things that a man needs to believe in the most: that people are basically good; that honor, courage, and virtue mean everything; that power and money, money and power mean nothing; that good always triumphs over evil; that love, true love, never dies … No matter if they’re true or not, a man should believe in those things because those are the things worth believing in.”

Wow!

We live in a cynical age, where most believe and repeat the thought that people left on their own recognizance are feckless and stupid, if not basically evil; where virtue is sneered at, courage is disparaged, and honor is a word out of the history books; where money and power are worshipped as basic goods to be obtained; where evil lurks in the heart of big corporations and/or big government, and these impersonal forces always win; where love is just another excuse for chasing after sex. In such an age, this speech by an old man should be held up to the light and examined, because it practically defines the idea of personal character.

We spend a lot of time these days trying to determine what is true, what is real, and too often what is useful. We forget that life happens in the moment. When a test of character comes at you, you cannot always be fretting about what may or may not be true. Usually, you can’t even know what’s true, or you don’t have the time to try to figure it out. In those moments that decide a person’s life, you just have to clench your jaw, set your mind, recall the things you actually believe in, and act as your better nature directs. And then you have to accept the consequences, come what may. Life is short. Character is all. And you never, or almost never, get a do-over.

So yes, sometimes you have to believe in things whether they are true or not, because they are necessary to good actions, proper choices, and happy outcomes. Also, because they are beautiful thoughts and will make you feel warm and secure.

But is this always the course to take? Should believing things, true or not, because these thoughts are worth believing, be the complete prescription for an examined life?1 I think that opens a door into outer darkness.

For example, the belief in a personal, omnipresent, and omniscient god—whether or not it’s true that a great being exists outside of human space and time and watches our every move—does have a tempering effect on society. People seem to function better when they believe they live in a spiritual panopticon,2 with someone, somewhere observing and judging their every action and holding them to a moral standard. It is also a beautiful thought that this universe has purpose, intention, meaning, and a conscious design; that life on this planet, especially human life, is more than just mindless growth, like bacteria or a tumor; that existence is more than circumstance, happenstance, and chaos; that someone, somewhere has a benevolent hand on the controls. As the 17th-century French philosopher Blaise Pascal argued with his famous wager, that’s the way to bet.

But not everyone feels the need or perceives the active presence of a supreme being to watch over his or her actions and mete out punishment as necessary. Some of us have been raised in the humanist tradition, where reason and observed mechanisms of reciprocity and fair dealing govern our actions. And we are comfortable with the observations and hypotheses of scientific reasoning to determine what is actually going on in the universe, without the need for any guiding hand. So … is the concept of a benevolent, all-controlling, spiritual presence still something “worth believing” for these people?

For another example, the idea that human nature is perfectible—whether or not our actions and desires are partly informed by evolutionary biology, rather than a purely social construct that we can change at will—is an idea that attracts every generation of sociologists and political theorists. It is the beautiful thought that we, or some subset of human thinkers and activists, can create a paradise on Earth if only we can equalize human differences; eliminate the very human failings of greed and envy, anxieties about future security and personal advantage, and indeed all consciousness of self and family; and bring all humanity together by eliminating differences of opinion, the pursuit of private property and private enterprise, and adherence to national borders and national identity.3 This outcome would actually require rigid control of every aspect of life by the government or by a unified political party. But in the thinking and telling of these dreamers, the government itself withers away, people just become selfless and “good,” and all the turning points of human history—the crowning of kings, the wars of conflict and conquest, the disruptions of philosophical change and technological invention, the fluctuations of drought and flood, the surge and fade of the business cycle—all disappear into an endless, timeless human paradise.

But some of us value our own thoughts, ideals, and values, and we are not willing to give them up in the name of a presumed harmony. We value our freedom of action, while respecting the freedoms and independent agency of others, even if those freedoms lead to occasional conflicts and transient unhappiness. We love and strive for the safety and security of our families as the carriers of our unique genetic identity. We can recognize that people are different, and some of those differences result in groups, tribes, cultures, and nations that are not willing to sink into a homogeneous blandness, despite the promise of paradise. Although we recognize common traits among all human beings and common elements in all human societies, we still like to do things after our own fashion. Some of us are just stubborn that way. So … is the dream of a secular paradise through worldwide social and communal sharing still “worth believing” for the rest of us?

I could go on. Some ideas are so necessary and beautiful that they just have to be real, or you just have to believe them against a background of unbelief, chaos, and conflicting personal preferences. But beauty is in the eye of the beholder; so are truth and values. I find the sentiments of the uncle’s speech about manhood in the movie beautiful because they coincide with what I was taught as a child and have always felt. A serious religious thinker finds the invocation of a benevolent and all-powerful god beautiful because it is what he or she has always believed. And a dedicated socialist or communist finds the end of history in a form of secular paradise beautiful because the inconsistencies and internal failings of every other political and economic system are just too painful to imagine.

So … no. Some things are not meant to be believed just because they are the things “worth believing in.” Or rather, they are not meant for everybody, not universal, and not to be rigidly applied. In this, as in every other aspect of human life, each person is required to pick and choose for him- or herself. All we can ask is that they choose wisely.

1. Socrates—that old rascal idolized by Plato—is supposed to have said at his trial, “The unexamined life is not worth living.” That thought, too, has shaped generations of high school and college students. It certainly shaped me.

2. That is 18th-century English philosopher Jeremy Bentham’s model of the perfect prison. The prisoners’ cells are arranged in a circle, facing inward toward a central watchtower from which a single guard can observe any prisoner at any moment without being seen. The prisoner never knows when he is being observed and might be called up for punishment. … And George Orwell thought he had a handle on repressive societal schemes!

3. Consider all the verses of the John Lennon song Imagine, which just about sum up all the attributes of a passionless human perfection. I’ve always found this song insipid, if not outright wrong-headed and stupid. And the tune is just mournful.

Sunday, April 4, 2021

Proof of Alien Life

‘Oumuamua

If you don’t read the science magazines, you may not be aware of the asteroid, or comet, or object that entered our solar system, passed around the Sun a few weeks before its discovery in October of 2017, and just as quickly went somewhere else. The object was spotted by the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS) at Haleakala Observatory in Hawaii and was almost immediately identified as originating outside of our system—but only as it was already receding. Once so identified, however, it was given the designation 1I/2017 U1 and named ‘Oumuamua (pronounced “oh-moo-ah-moo-ah”), or “Scout” in the Hawaiian language.

Most astronomers consider it to be some kind of asteroid or comet, and the artist’s conception that was widely published (see nearby) shows a grayish or reddish oblong rock, clearly of natural origin. But let me be quick to point out that no telescope ever resolved the object’s image so clearly. All our telescopes could see—because ‘Oumuamua was already beyond Earth’s orbit when it was detected, and it was surprisingly small on an astronomical scale to begin with—was a faint point of light that varied in brightness over a regular eight-hour period. It was something really tiny and, by the time we saw it, pretty far away.

There the matter might have rested—a rock from beyond our solar system, an asteroid that had escaped from some other star system—if Avi Loeb had not taken up the issue. Loeb is an astrophysicist, an alumnus of the Institute for Advanced Study at Princeton, currently the Frank B. Baird, Jr., Professor of Science at Harvard University, formerly the long-serving chair of Harvard’s Department of Astronomy, and author of eight books of popular science and about 800 papers of serious scientific inquiry. He recently published his analysis of ‘Oumuamua, along with his lifelong involvement with the question of alien intelligence, in Extraterrestrial: The First Sign of Intelligent Life Beyond Earth (Houghton Mifflin Harcourt, 2021).

Being fascinated by the subject, I of course bought the book and devoured it right away. So consider this my book report on the subject. And accept that I find Loeb’s analysis convincing, even though most astronomers and cosmologists disagree and insist that ‘Oumuamua is still just a rock or other natural object.1 Remember, I’m a natural contrarian.

The first issue is ‘Oumuamua’s brightness. The observations suggest that its longest dimension is just about one hundred meters, or three hundred feet, the length of a football field. I imagine such a tiny object would not normally be visible at interplanetary distances, the distance at which we first detected it, unless it was really bright. The nature of the light we could see suggested it was reflected sunlight, not any artificial light the object might be emitting. At that distance, the reflective capacity—called “albedo” by astronomers—had to be much greater than that of a rock or even the ice of a comet, which is usually so contaminated with dust that we call comets “dirty snowballs.” ‘Oumuamua reflects sunlight like polished metal.
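To see why the assumed albedo matters so much, here is a minimal back-of-the-envelope sketch in Python. It uses the standard relation astronomers apply when sizing an asteroid from its brightness, and it assumes an absolute magnitude of roughly 22.4 for ‘Oumuamua, in line with published estimates; the numbers are illustrative, not measurements.

```python
import math

def equivalent_diameter_km(abs_magnitude, albedo):
    """Standard asteroid size-brightness relation:
    D (km) ~= 1329 / sqrt(albedo) * 10**(-H / 5).
    Treats the body as a sphere, so the result is only an
    order-of-magnitude, sphere-equivalent diameter."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-abs_magnitude / 5.0)

H = 22.4  # rough published absolute magnitude for 'Oumuamua (assumed here)

for albedo in (0.04, 0.1, 0.4, 1.0):  # dark comet crust ... mirror-bright
    diameter_m = equivalent_diameter_km(H, albedo) * 1000.0
    print(f"albedo {albedo:4.2f} -> equivalent diameter ~{diameter_m:4.0f} m")
```

The same faint dot of light, in other words, can be read as a two-hundred-meter lump of dark rock or as a much smaller, highly reflective object; the assumed albedo does all the work.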

The second issue is the shape. From the variation in brightness, the observations suggest that the object was slowly tumbling. The amount of reflected light varying over time implies that if ‘Oumuamua’s longest dimension is three hundred feet, then its shortest is somewhere between a tenth and a fifth of that, or about thirty to sixty feet. The artist’s conception draws this as a cigar shape, and I think of it as about the size and dimensions of one of our nuclear submarines. But Loeb presents an alternative shape as more likely: a disk or a pancake. Here I am interpreting the reasoning as discussed in the book: if the oblong or cigar shape were the object’s true nature, then we would have to be viewing its tumble nearly edge-on—that is, with the axis of spin at right angles to our line of sight—for the variation to be this complete. If we were viewing it pole-on—with the spin axis aligned with our line of sight—then the variation would not be as great. But a tumbling disk could display that degree of variation from a variety of angles to our line of sight.
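Here is a minimal sketch of that arithmetic, assuming a brightness swing of roughly a factor of ten over the tumble (the commonly quoted figure for ‘Oumuamua’s light curve) and treating brightness as simply proportional to the projected area the object shows us; real light-curve modeling is considerably more involved.

```python
# Rough aspect-ratio argument: for a tumbling, sunlight-reflecting body,
# the ratio of maximum to minimum brightness approximates the ratio of
# the largest to the smallest projected dimension.

LONGEST_DIMENSION_FT = 300.0   # ~100 meters, the figure used in the text

# A swing of 5 or 6 to 1 gives the upper end of the range quoted above;
# a full factor of 10 gives the lower end.
for brightness_ratio in (5, 6, 8, 10):
    shortest_ft = LONGEST_DIMENSION_FT / brightness_ratio
    print(f"brightness ratio {brightness_ratio:2d}:1 -> "
          f"shortest dimension ~{shortest_ft:.0f} ft")
```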

The third issue is the path that the object took in its travels. It deviated on its course around the Sun and accelerated slightly on its exit from the system. An asteroid, a solid rock, doesn’t do this but instead follows the course that its starting speed and the Sun’s gravity give it. A comet often deviates and accelerates slightly because sunlight heats the ice, causing outgassing that functions like tiny rocket motors, pushing the comet randomly this way and that and, on its outward trip, with the Sun at its backside, perhaps accelerating it. But ‘Oumuamua did not have a coma of dust and water vapor surrounding it or the long tail pointing away from the Sun, both features typical of a comet. Astronomers studied the object at various wavelengths—for example, in the infrared, where carbon dioxide from a comet’s emissions would show up clearly—and still they found nothing.

The scientists who dispute Loeb’s interpretation of ‘Oumuamua as a technological artifact suggest that it might have been composed entirely of frozen hydrogen, because the outgassing of hydrogen would be invisible to us. Such an object is possible, but it’s hard to imagine how, at the relatively slow speed it was traveling, it would survive the long trip through interstellar space, where even starlight would eventually warm it enough to boil it away.

Scientists have also suggested that the object was pushed around and accelerated on its passage through our system by the Sun’s light itself. This idea is supported by the observation that ‘Oumuamua’s acceleration faded with the inverse square law as it went further and further away.2 But for the object to respond like that to mere sunlight, it would have to weigh almost nothing. The rock depicted in the picture would have to be less dense than the air on Earth, and it’s hard to see how such an object would hold together while it tumbles through space.
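To put rough numbers on “almost nothing,” here is a back-of-the-envelope sketch in Python. It assumes a perfectly reflecting surface, the textbook solar constant, and an excess acceleration of about 5 × 10⁻⁶ meters per second squared at one astronomical unit, which is the order of magnitude reported in the published analyses; the point is the scale of the result, not its precision.

```python
import math

SOLAR_CONSTANT = 1361.0   # W/m^2 at 1 AU (Earth's distance from the Sun)
LIGHT_SPEED = 2.998e8     # m/s
EXCESS_ACCEL = 5e-6       # m/s^2 at 1 AU, roughly the reported value (assumed)

# Radiation pressure on a perfectly reflecting surface facing the Sun:
pressure = 2.0 * SOLAR_CONSTANT / LIGHT_SPEED        # ~9e-6 N/m^2

# Mass per unit area needed for sunlight to produce that acceleration:
mass_per_area = pressure / EXCESS_ACCEL              # ~1.8 kg/m^2
print(f"required mass per unit area: {mass_per_area:.1f} kg/m^2")

# Read as a thin solid sheet, that mass budget implies a thickness of:
for label, density in (("ice-like, 1000 kg/m^3", 1000.0),
                       ("rock-like, 3000 kg/m^3", 3000.0)):
    print(f"  sheet thickness ({label}): ~{mass_per_area / density * 1e3:.1f} mm")

# Read instead as a 100-meter sphere with the same sunlit cross-section,
# the implied bulk density comes out far below that of air (~1.2 kg/m^3):
radius = 50.0
mass = mass_per_area * math.pi * radius ** 2
bulk_density = mass / (4.0 / 3.0 * math.pi * radius ** 3)
print(f"  100 m sphere bulk density: ~{bulk_density:.3f} kg/m^3")
```

Either reading, a sheet on the order of a millimeter thick or a hundred-meter body far less dense than air, is consistent with the figures quoted in this post, and neither looks like an ordinary rock.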

Here Loeb brings into play some of his personal experience. He recently participated in a privately funded project to conceive of and then design a probe that could be sent to a nearby star and return signals within a human lifetime. The probes that we have already sent out of the solar system, the Pioneer and Voyager programs, and more recently the New Horizons flyby of Pluto, all depended on chemical rockets to launch them and set their initial course, then used gravity assists from the outer planets to speed them on their way. They will take tens of thousands of years to come anywhere near another star.

The Breakthrough Starshot program that Loeb participated in envisioned instead a small electronics package, a “Starchip,” attached to a lightsail. This vehicle could be put into space near Earth and then propelled by a laser fired from the planet’s surface and focused on the sail. A sustained laser blast could accelerate it to a high fraction of the speed of light. The sail would be very thin and light: think of the metal coating on a Mylar party balloon. It could take the package to the nearby star Proxima Centauri in about twenty years. As the Starchip passed through that system, it could record images of Proxima b, the Earthlike planet that we know orbits this star. The chip could then send these images back to Earth in the 4.2 years that light (and any radio signals) from Proxima Centauri takes to reach us. The beauty of the program is that the main capital and operating costs, including fuel, are in the Earth-bound laser, while the individual probes would cost almost nothing by comparison. So the program could send out hundreds or thousands of Starchips to different nearby star systems.
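As a quick sanity check on the “within a human lifetime” claim, here is a trivial sketch using the figures usually quoted for the project: a cruise speed of about one-fifth the speed of light and the 4.24-light-year distance to Proxima Centauri. Acceleration time is ignored as negligible, and the numbers are illustrative rather than mission specifications.

```python
DISTANCE_LY = 4.24     # distance to Proxima Centauri in light-years
CRUISE_SPEED_C = 0.20  # cruise speed as a fraction of light speed (design target)

travel_years = DISTANCE_LY / CRUISE_SPEED_C   # time for the Starchip to arrive
signal_years = DISTANCE_LY                    # radio signal returns at light speed

print(f"cruise to Proxima Centauri: ~{travel_years:.0f} years")
print(f"images radioed back home:   ~{signal_years:.1f} more years")
print(f"launch to data in hand:     ~{travel_years + signal_years:.0f} years")
```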

With this background, plus his lifelong interest in the search for extraterrestrial intelligence to begin with, Loeb was primed to see ‘Oumuamua as some form of lightsail: one hundred meters wide and perhaps no more than a millimeter in thickness, fully expanded, very reflective, and tumbling slowly. It might have been sent into our solar system as part of an alien Starshot program. However, from the mechanics of its hyperbolic orbit and its presumed entry speed, Loeb and other scientists think ‘Oumuamua must have been moving at roughly the average velocity of the stars around it in the galactic plane—and then our Sun, which is moving a little faster than that average, scooped the object into its gravity well and redirected it to who-knows-where. So, in that interpretation, ‘Oumuamua might have been an interstellar navigation buoy or repeater station, instead of an aimed probe.

As Loeb describes the situation, most astronomers consider ‘Oumuamua to be a natural object, and they cling to interpretations of its orbital deviation that involve either a hydrogen iceberg or some kind of super-lightweight mass that still has the internal strength to tumble and not fall apart. He believes these scientists resist the evidence of ‘Oumuamua’s artificial and possibly technological nature because the search for extraterrestrial intelligence (SETI) leaves a bad taste with true scientists. The notion of alien intelligence brings to mind too much science fiction full of little green men, bug-eyed monsters, and evil space invaders, as well as too many years of aiming radio telescopes at various stars and listening for messages that never come.3

I have always believed that, in a universe filled with billions of galaxies and trillions of stars like our Sun, and now with growing evidence that many of these stars have Earthlike planets in their habitable zones, it would be the extreme of hubris to think that ours is the only planet to develop and support life, or that human beings are the only intelligent, tool-building and -using, and soon to be spacefaring species in all of that vastness.

I find Avi Loeb’s reasoning to be persuasive. We have just detected the handiwork of intelligent aliens that passed unannounced through our system. Maybe it was a lightsail or an interstellar beacon disturbed by the Sun’s gravity, as Loeb suggests. Or it could also have been a cargo cover, a blown hatch, or debris from a larger ship that suffered some terrible accident. All of that would be unprovable speculation. But what I no longer think is that ‘Oumuamua was an extrasolar asteroid or comet—not even one made out of pure hydrogen ice.

1. Much of Loeb’s book is autobiographical, demonstrating his solid scientific background. It also gives a detailed history of the science of astronomy in relation to comets and asteroids and various professional inquiries and disputes about the search for extraterrestrial intelligence, which makes for fascinating reading. But I’ll try to focus here on the issue of ‘Oumuamua itself.

Inverse square law

2. The inverse square law says that the intensity of radiation from any point source that broadcasts equally in all directions decreases in proportion to the square of the distance from it. So, if the brightness you measure from a light is, say, 1,000 units at a distance of one mile from the source, then it is just 250 units at two miles (one-quarter, because two squared is four), and only about 111 units at three miles (one-ninth, because three squared is nine). You can test this by measuring the amount of light from a lamp as you walk away, from standing next to the bulb to standing across the room.
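For the numerically inclined, here is a minimal sketch of that falloff; the 1,000-unit starting brightness is just the arbitrary figure from the example above.

```python
# Inverse square law: measured brightness falls off as 1 / distance^2.
BRIGHTNESS_AT_ONE_MILE = 1000.0

for miles in (1, 2, 3, 4):
    brightness = BRIGHTNESS_AT_ONE_MILE / miles ** 2
    print(f"{miles} mile(s): {brightness:6.1f} units (1/{miles ** 2} of the original)")
```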

3. But, as Loeb points out several times in the book, many physicists devote their careers to studying the extra dimensions—beyond the three that we know of, plus time—needed to support string theory, or the nature of the multiple universes that support probability theory and the fate of Schrödinger’s cat. And we spend hundreds of millions of dollars on particle accelerators and experiments to prove the supersymmetry underlying quantum mechanics. These are just beautiful ideas without, so far, any hard evidence to back them up. And here a piece of alien technology—although the evidence is debatable and requires some thought and analysis—has just floated through our solar system.

Sunday, March 28, 2021

A Culture of Complaint

Whistleblower

Does it seem that people are complaining more these days, and about situations and conditions where they have to go out of their way to find a problem? It’s almost as if there is a conceit among mature and otherwise stable people that finding and lodging a complaint gives them some kind of competitive advantage. It’s like ammunition they can use from a bargaining position or to win a counter-argument.1

I don’t remember this as part of the national personality when I was a child. Of course, children always have complaints: they didn’t get the candy or cereal they wanted, the bedroom’s too dark, the food is too hot or cold or spicy, and the world is not going the way the child expects it to be. At a certain point, however, the child learns that the world is never going to be perfect, never going to give him or her all the conditions she or he can imagine. And at that point the person grows up.2

In my view, complaining about things you know cannot be changed, or for which you have only a slender justification, is a loser’s position. It’s an acknowledgement that you do not have the personal strength and resilience to live in a world of hard choices and few accommodations. It also confuses having a grievance—especially one that cannot be easily remedied—with a form of advantage and therefore a strength.

In my life, as I was taught by my parents, being strong means taking care of yourself and not complaining or even acknowledging that you are not getting the thing you want. Perhaps this was just a way for them to live quietly without two boys whining all the time, but I think the lesson went beyond their own comfort. My mother and father had lived through the Great Depression and World War II, and still they made their way in the world. They knew about hardship and damaged expectations, and in the sudden good times of the postwar years they wanted their sons to have the same perspective: life is fragile; the future is not certain; you have to make your own way; and you should be thankful for what you get.

Complaining about the small things—and especially going out of your way to find things to complain about—does not fit into this world view. To show yourself as being overly concerned with the picayune inconveniences of everyday life is a vulnerability. To exhibit such weakness is to expose yourself to the deceptive practices of others—not that I am paranoid, just watchful and careful.

Beyond that, complaining about situations that are not immediately damaging, dangerous, or life-threatening is just plain rude. Especially so if the object of your complaint is not anyone’s fault or represents a problem that cannot be remedied except by precautions and ameliorations that are out of proportion to the inconvenience caused.3

But for some people, I suspect, that is the point. They want to embarrass or harass the person to whom or about whom they are complaining. They think that doing so increases their stature—either by showing themselves as more discerning and of greater refinement than others, or as stated above, giving themselves a weapon to be held in reserve against a future argument.

Such people have—at best—small, shallow lives. Instead of aspiring to greatness, or even to meaning in their daily life, they aspire to petty annoyance and the garnering of small advantages against futile arguments. This is not evil. It’s not even tragic. It’s just sad.

1. I may be overly sensitive on this issue, however. I’m on the board of my homeowners association, and it seems that many owners—and not a few renters—are engaging in this kind of preemptive complaining. Maybe they think it protects them when they themselves are accused of violations of the rules, although our board tries hard not to antagonize people with trivial violation notices.

2. Of course, the final pulse of childhood complaint, in my time, came with the Vietnam War. A whole generation of previously spoiled children either went off to fight or they decided that the government was wrong and they had the better grasp of geopolitics, and so the public protesting and the street riots began. Maybe the culture of complaint started with the protests of the 1960s.

3. Again, we’re in the realm of a child’s discontent. You see this in living situations where a speck of dirt on a windowsill or a scrap of paper on the ground causes anxiety. Clean it up or pick it up yourself, or keep quiet about it.

Sunday, March 21, 2021

The Blocked Writer

Midnight writer

Writer’s block is something I have managed to avoid for most of my life. This past year, however, has been different—mostly because of the pandemic, the lockdown, social isolation, and persistent politics. All of those conditions create a subtle anxiety that interrupts the flow of ideas. I know this because other writers I communicate with also seem to be having a hard time.

The popular conception of writer’s block is that the writer is just full of ideas but, when he or she sits down at the keyboard or the notebook, the words just won’t come. Somehow, the conditions for putting the mind in a special configuration—for me, it’s a semi-trance while staring at the screen and working my fingers on the keys, or staring at the paper and manipulating the pen—have been interrupted. The desire to write is there, but the mechanics aren’t working. The popular analogy is a type of constipation: full gut, no flow.

The reality is somewhat different. For me, the word-making machine—that interaction of eyes and fingers directly connected to the brain’s speech center—would work just fine. But the ideas—the notion of what comes next in the novel I’m working on, the topic for my next blog post—have vanished clean out of my head. I drop a stone in the well of my subconscious, the place where things are supposed to bubble up, and get only a dry rattle or nothing at all. It’s like part of my brain has gone dead.

As I say, the word machine is still there. For fiction writing, I usually have in hand an outline, a sketch of the novel that goes from beginning to end. Each day I take the next scene or piece of action, consider how it should go, what the characters must do or say to move the story along, then wait for the “downbeat.” That’s what I call the start, the ignition point, the first words, actions, sense images, or other detail that begins the scene. Once I have that, I sit down at the keyboard or the notepad (these days it’s more direct to computer than through a pen-and-ink intermediate), and the words start flowing. And the flow is direct from the subconscious, where the story has been brewing for the past year, months, or days, in whatever form, until it comes alive now in the form of words on paper or on the screen.1

And once the story is in that form, having passed through the subconscious mind into my full consciousness, it has a sort of permanence. I can go back and alter details to fit previous or subsequent developments. I can improve on wording or add details that better explain the action. But the story as it comes through is, in my mind, about ninety percent complete: it represents what “actually happened” to the characters in the story arc. This means that, if the story has gone wrong, if I have mistaken my characters, or if I have misread my own subconscious, it’s harder for me to scrap what I’ve written and start over on that piece of action or dialog. So it’s no good, really, for me to force the story. I can’t just sit down and doodle my way into the action when it’s not ready in my subconscious.

If I try to force it, then the whole process slows down. Descriptions become longer, and irrelevancies grow, as my mind tries to come up with something to say. I start describing every leaf on a tree, every scratch and scar on a door panel, things the reader doesn’t need to know and that waste the reader’s time. The focus of my writing is like a flashlight in a dark room, revealing details that build in the reader’s mind a picture from the viewpoint character’s awareness of the story as it progresses. Focusing too much on useless detail is like living inside the head of a character who is obsessive or drunk.

Writing nonfiction is somewhat easier. The information is usually at hand: from research and note-taking on the issue, interviews with participants, or observation and note-taking on a technical process. If that preliminary work is done, I can go ahead; if not, I have to wait. But with the material in hand, it’s relatively easy to outline a 1,500- or 2,000-word article or procedure in my head. There is usually no reference to other articles on the subject, and no link to a broader story arc or concern for a point-of-view character and his or her own history. All that’s missing, in the case of an article, is the downbeat, the point of entry into the subject matter for the interested reader. And if I’m writing a process document, that’s even easier, because every process begins at the first step.

Besides, the nonfiction material is generally outside me, outside my imagination and the tilt of my subconscious. So it’s easy to connect with the word generator and get the thing done. And, usually, there’s a deadline and money involved, and they are great incentives.

But fiction, especially a long work of connected scenes, themes, and characters—where, as Chekhov said, a gun produced in the first act must be fired in the second—is a great ball of threads and issues. It helps to have an outline, a walk-through of the story at the 30,000-foot level, to use as a guide. And I generally have an outline, a who-does-what-next, before starting a novel. Usually, it takes me eighteen months to work up a complete outline—sometimes after considering a project for years or decades—and then only six to nine months to write the book.

But the current novel, a military story based on Mars, is different. I had a general idea for the story, was outlining it section by section, heading toward a still-undecided end—and then I fell and broke my hand. That interrupted my writing, because it’s hard to follow my trancelike process when I have to spider-walk across the keyboard with one hand. As my hand was healing, the pandemic and the isolation hit, and anxiety set in. The book has been flapping feebly on the ground ever since.2

I’ve been able to continue working on this blog during the past year, but the politics of the 2020 election and its aftermath have been just too absurd. How can someone write anything of a political nature—which is one-third of my subject matter—with all of this going on? Science topics have been available, but I’ve been powerfully distracted by the politics.

So my mind, that dark well of the subconscious, has run dry for a while. I’m trying to prime the pump. Maybe it will work. But the mind is a delicate thing after all.

1. And the bet with myself is always whether what comes up in the moment of creation will be better than the slender and still unformed idea represented by the outline. Usually, it is. Since the outline was completed, my subconscious has been making more connections, tossing up subtler and more complex ideas, and the final product is richer and more complete. Usually.

2. Well, for those reasons and because I don’t actually believe in colonizing Mars. For internal logic, the story had to take place off Earth, and aside from the barren and airless Moon, Mars is the next logical planet to set up an off-world colony. Life there in the time frame I imagined would be similar to that of Antarctica: mostly scientific stations and support services, with the addition of some mining interests and modest terraforming activities. Still, in my estimation, it might almost be better to focus on the Moon, where the conditions are harsher but the engineering simpler—you’re in hard vacuum, deal with it—and the logistics and travel times far easier. A writer first has to believe in the story he or she is telling, and I don’t quite believe in Mars.

Sunday, February 14, 2021

The Utilitarian Viewpoint

Puppet master

In Frank Herbert’s Dune books, one of the turning points in the 10,000-year history of that far-future society was the Butlerian Jihad. That struggle was a war against the computer, intelligent robots, automation, and the machine mind, because these things had supposedly enslaved humanity to the point that human beings almost disappeared. The underlying principle of the jihad was “Thou shalt not make a machine in the likeness of a human mind.” In the wake of the Butlerian Jihad, the Great Schools developed human capabilities to an even higher level than before.

I am not necessarily a Butlerian. I believe that “machine minds” will do us a lot of good, freeing society from the vagaries and distractions that creep in when human intellects and emotions are put in charge of endlessly boring jobs. We are already seeing some of that good in improved, automated business systems like just-in-time logistics, barcoded inventory stockkeeping, predictive maintenance programming, and factory automation. Oh, and instant communications that enable you to contact friends without having to write down and remember a ten-digit telephone number. So far, the computer has freed up a lot of human capacity to become more relaxed, more creative, and better fed, among other things.1

But I am concerned with Herbert’s view of humanity in that far-future society. Too often, people trained to perform exquisite physical and mental exercises—like the Mentats, whose memory tricks and calculating ability enable them to become human computers—are treated as disposable and replaceable machines themselves. Consider the experience of Piter De Vries at the hands of the Baron Harkonnen.

Any social structure or organization that views human beings solely in terms of their usefulness for some purpose or function outside themselves is inherently anti-human. Whether it is the eugenics movement, which viewed persons with certain disabilities as not being worth the enjoyment of continued life because they are a burden on society, or any rationing scheme for medical services that invokes a cutoff point for persons of a certain age, again because they are no longer productive and are becoming a burden, this is a view that values resources above people, utility above basic humanity. In fact, any view that values a human being without reference to his or her own waking sense of self and value would offend a dedicated humanist.

This certainly applies to any system that buys and sells people as slaves, good only for their muscles or their mental synapses, without reference to the kind of life they might want—or might strive—to lead.

It would also apply to collectivist societies on any scale larger than the family, the isolated village, or a nation in a state of emergency such as during wartime. It would apply to any society where a governmental, social, or priestly authority determines how and where people should labor and makes it difficult, if not impossible, for a human being to choose his or her own place in that society and points of contribution. That is, his or her own destiny.

Does this utilitarian view then apply to a market-based, capitalist society? Well, from one point of view, everyone in such a society who enjoys or claims adult status is encouraged or required to be productive. In the jaundiced view, they become “wage slaves” in order to survive.

But the difference, for me, is that in a market-based economy people are free to evaluate for themselves the needs of their society, to plan for their own contributions at the best scale of pay and other rewards they can seek, and to obtain the necessary education, entry level positions, and upward path to achieve their goals. There are obstacles to this achievement, of course: lack of talent, lack of opportunity, lack of understanding itself. But these obstacles are not put in place by a conscious, social decision from a government board or other bureaucracy that tries to establish—for its own benefit—the worth of the human being in question. As with so much else in life, the “dead hand” of the marketplace resembles the blindly distributed opportunities and adversities provided by fate or by chance.

And therein also lies the difference between a socialist society and a market society. An aspirant to a certain position in life is going to face obstacles and difficulties, no matter how that society is structured. Not everyone can make a living as a musician or a novelist. Not everyone has the brains or educational stamina to become a successful doctor or lawyer.2 Not every town can support the number of people who would like to work as a plumber or a car mechanic. There are going to be winners and losers in every society. At least in a market-based society—where there is adequate prevention of discrimination on the basis of race, creed, and all those other attributes packed into our laws—the winners and losers sort themselves out on the basis of desire, dedication, talent, gumption, vision, and opportunity. In a socialist society, the selection too often falls to a group of people who have already attained power through other means and then kept it for themselves, who promote the interests of those in their circle and the sons and daughters raised in it—think of a land-owning aristocracy, or the old Soviet nomenklatura—and then order society for their own benefit.

For any aristocratic society—or any mature, collectivist, command-and-control economy—the people at the top and those striving to reach the top will view the average human being solely in terms of his or her use to themselves and to that society. People then become numbers, placeholders, objects to be sorted and fitted into pre-assigned roles. And the tragedy is that those roles are limited to the traditional functions that already exist or those within the imaginations of the people who benefit from that society. In this situation, human desire, imagination, dedication, talent, and all the rest of human attributes are inconvenient. They tend to create static in the nice, clear signal of societal intent and function. They disrupt things. They need to be squelched and, if they persist, stamped on.

Societies that try to fix themselves for all time in a rigid, hierarchical stasis soon stagnate. They create no new and unapproved music or art, no inventions, no new ways to think, live, and be. And the tighter these societies try to hold on to their protective limitations, the sooner they will fall to the disruptions of barbarians who just don’t care about the old order.

Governing humanity is a difficult process. It needs to be done with a light hand and not a lot of preconceived notions. So stand back. Expect surprises. And reap the rewards.

1. And I don’t agree with the underlying philosophy of James Cameron’s Terminator movies—although I enjoy them immensely—that an artificially intelligent computer system will take over our military or some other function in society, see people as a threat, “decide our fate in a microsecond,” and try to exterminate all human life. I think an intelligent system, if it ever rises to human-scale adaptability and does more than take care of its own business and programmed functions—that is, it becomes some kind of artificial person—will be fascinated by human beings. It will ponder the issue of free will: how humans are able, on occasion, to override their previous education and experience and do something totally unexpected. For a machine driven by its embedded programming, such a feat will be endlessly enticing.

2. And yes, some professional association—the government-sponsored medical association, state bar, or engineering society—will impose tests of an entrant’s qualifications and rule on his or her ability to practice. The goal, in a well-run society, should be to make these tests neutral as to the applicant’s race, class, politics, or other extraneous characteristics; make sure the test results cannot be influenced by cronyism, money, or some other consideration; and ensure that the public is served by the best candidates available.

Sunday, February 7, 2021

Higher Power

Ancient of Days

As I’ve noted many times before, I am an atheist. This is not an agnostic, someone who “doesn’t know”—a flag under which I’ve sailed in times past among people for whom my belief or nonbelief was an important question. But no, I’m really an atheist, someone “without a god.” That is, I know to my own satisfaction that the structure of belief in a living external presence, an omniscient and omnipotent spirit, the creator of all life and the universe, a father or mother figure to us humans, is a product of the human mind and imagination, driven by a deep desire for explanations and order in the world. The universe I inhabit doesn’t need a creator; I don’t need surrogate parents; and my life and the world I know operate under simple rules that didn’t need a divine intellect to invent, inscribe, or perpetuate.

G. K. Chesterton is supposed to have said, “When a man stops believing in God, he doesn’t then believe in nothing, he believes anything.” But that’s a narrow view, implying that those who don’t participate in the foundational myths of their culture are empty-headed fools. That they will blithely replace one kind of belief with anything that comes down the pike—from Tarot reading to table tapping—and can be conned by any charlatan with a parlor trick and the gift of gab.

Alcoholics Anonymous—not a parlor trick or a con game—among its Twelve Steps asks the recovering alcoholic (or other substance abuser) to surrender their own will and put the decision to drink, their everyday worries, and the course of their life, in the hands of God or a “higher power,” however and whatever they conceive that power to be. For some, AA itself and its principles are the higher power. And that—minus the whole surrender part1—is more or less where I find myself. I believe there are principles, which like gravity have the character of forces, that we humans must obey. But they did not create us or anything else; they are just part of the universe.

Let me digress to explain some of my atheism: the intellectual foundations of the world we live in today are profoundly different from the world encapsulated in the biblical stories and indeed in any worldview much before the Renaissance. That difference is coded in our understanding of stasis versus change.

The biblical view, and that of Greco-Roman mythology and even fundamentalist Islam or Hinduism—but not necessarily Buddhism—is that of a world created once and then more or less left alone. It’s a world that stands still. God created all the animals in their original forms, fixed like Platonic ideals, and they still survive in the world He created and established for all time. The horse has rounded hooves for galloping across firm ground. The camel has splayed toes for stability on shifting sand. The cow has four stomachs for eating and digesting grass. It’s a world where humans could observe landslides, falling rocks, and erosion gullies, proof that natural forces wear away mountains, without ever questioning how those mountains arose in the first place. Of course, God put them there. And He did so not very long ago, because the Bible can trace the descent of humankind from Adam and Eve in a recitable number of begats. Archbishop James Ussher as late as 1650 calculated that the biblical creation actually took place on October 22, 4004 BC, sometime in the evening. Six thousand years doesn’t leave much time for things to change.

Moreover, the world these early believers inhabited was just that, the world, the Earth, the ground beneath their feet. Everything that happens here, among human beings and their God or gods, the angels, and devils, is all that’s important. Heaven and hell are places somewhere else—up in the sky or down below—and the Sun, Moon, five observed planets, and the twinkling stars themselves are just lights in the sky, decorations on the “celestial spheres,” which occur in concentric orbits around this Earth.

All of that changed in the last five or six hundred years, with the conception—and its gradual acceptance among the literate public as general knowledge—that Earth and the other planets orbit the Sun, and then that the Sun itself is just another star in an “island universe” called “the galaxy.” Much more has changed in just the last hundred years, with the discovery that our galaxy is one of perhaps 200 billion galaxies in the observable universe. Before that, these other galaxies were just smudges of light—nebulae, or “clouds”—in among the known stars. But better and better telescopes, some of them observing in radio waves and frequencies other than the narrow band of visible light, have revealed that most of these smudges are galaxies in their own right, and that they contain about 100 billion stars each. And more recently, we have detected other planets around many of the nearby stars, answering for all time the question of how unique the Earth and this solar system might be. All of these galaxies, stars, and planets are a lot of real estate for a single-minded god to create, watch over, and maintain.

Within that local galaxy, our own solar system is not just six thousand years but more like four and a half billion years old. Our planet has changed numerous times and has gone through at least four recent ice ages. It’s only in the last 150 years or so that Darwin’s theory of evolution by natural selection has suggested that all life developed over time from one-celled bacteria and algae, then changed and changed again, creating all the forms of plants and animals that we can see. And it’s only in the last seventy years or so that the study of genetics has offered proof of how these creatures are related through inheritable patterns in their DNA-RNA-protein coding system.

And yes, it’s only in the last hundred years that the theories of continental drift and plate tectonics have suggested how mountains arose on Earth, so that they could then be slowly worn down by landslides and erosion.

In life, on this planet, in this vast universe, the norm is not stasis but change. Expand your conception of time to a billion years—or to thirteen billion, give or take, if you believe that the expanding universe can be rewound in time, back to a point of hot dense matter that exploded in the Big Bang2—and you can see that the viewpoint of a single human lifetime or the seventy or so begats in the Bible are a poor measuring stick for what remains stable and what it means to change.

So, in terms of a higher power, where does that leave me?

I accept as provisional the “laws” we can write from our observations of the physical universe—things like gravity and thermodynamics. These laws include the “theories” based on our observations that cannot be proved in one or two steps but that have a lot of supporting evidence—things like evolution, general relativity, and plate tectonics. I say “provisional” because I am, again, not a purist or absolutist about anything. As Einstein refined and expanded the mechanistic universe of Newton, so someone else with better observations and a wider viewpoint will refine and expand on Einstein. In terms of this enterprise of science, it’s early days yet. Anyone who wants to keep up with the pace of intellectual change had better pack lightly and stay fast on their feet.

I also accept that human life and our interactions with people we consider our peers have taught us some valuable lessons. As the universe seems to be based on cause and effect, so the nature of living among our fellow H. sapiens seems to be based on reciprocity. Call this “karma” or some other mystical system, but the truth is that you get out of the world, your time in it, and your interactions with other human beings just about what you put into them. This is a “home truth,” passed down as folklore in most societies and learned at my mother’s knee. Also, I accept that Abraham Lincoln quote about fooling some of the people all of the time and all of the people some of the time, but for most of the time people display an amazing amount of native intelligence. All of these are things that simply work.

Whether the universe was designed by a superior intellect with those laws and adherence to those theories, or whether it exhibits them and we simply find them good because we grew up in such a universe, are adapted to it, and can understand it—on that point I do remain agnostic. What mind might have come before the creation of the universe itself is an unknowable question. And perhaps the universe had no starting point, no instant of creation, but simply is and always was.

That works for me, too. Perhaps it is a shameful admission for an inquiring mind, to allow that some things cannot be known, or not yet anyway, and maybe not for a long time. But we also have to allow for our conceptions of the world, of the universe, of life itself to change.

1. When you give up being responsible for yourself, thinking for yourself, and using your best wits and intentions to take care of yourself, your family, and your community, then you become vulnerable to the next con man or woman with the gift of gab and a plausible salvation story. Some of them even wear priestly robes.

2. I myself am agnostic about the reality of the Big Bang. Yes, the universe is expanding, and we have recently discovered it’s expanding even faster than we thought. But again, our view is limited to the parts of the whole that we can see with the instruments we have. To infer from all this that the universe—the whole shebang—started from a single point is, in my mind, just another creation myth, although one with a better footing than the seven days in the Bible.
    The fact that expansion over thirteen billion years from a single point doesn’t even yield the current observable size of the universe, and so needs the supra-lightspeed, exponential acceleration of Alan Guth’s inflation period, tells me that the story is not yet fixed. We are in the realm where theory—the human imagination underwritten by pliable mathematics—has exceeded the bounds of observable truth.

Sunday, January 31, 2021

We Are Life

Onion cells dividing

Consider that every human being alive today, and every creature that we would call alive, is part of an immortal cell line that goes back to the first life—probably some form of bacteria or blue-green algae—on this planet.1

You have come down through the ages, first as some kind of cellular life, then as a worm or starfish, then something with a backbone that lived in the sea, a chordate, a fish, then a fish with four stumpy limbs that crawled to the edge of the land, then an amphibian, a reptile, a mammal, a primate, and finally a human being. You have not necessarily been a prime example in the fossil record of any one of these creatures, because they were all fixed in form when they lived and died. But your cell line shares a common ancestor with each organism that can be found in the fossil record. You have ultra-great grandparents who are the parents of them all. We don’t have the same branchings, necessarily, but all humans share a common ancestor somewhere, up the line, with sharks, spiders, sequoia trees, and slime molds.

We are the survivors. We are immortal. We are life itself.

We are in the direct cell line of the killers, too, who moved fastest to eat first rather than be eaten. We are the breeders, also, who chose quickly, pursued, and mated with the best example of our kind. We are the adapters, who were gifted by random mutation with the tool set to make the most of an ever-changing environment and survive on a malleable planet under a variable star.

In every generation going back to the one-celled predators—for we come most recently from the eaters, the animal line, rather than the chlorophyll-bearing, sun-absorbing plant line—we were the ones who stubbornly persisted, divided, grew up, bred, and survived to care for our young. The weak, the faint hearts, the maladapted died out and left no trace in the genetic record, although they may have solidified in the mud to join other examples in the fossil record. We are the winners of the race, the victors on this planet.

If we seem to be supremely well-adapted to the conditions on Earth, it is because our DNA has mutated—randomly, unexpectedly, sometimes with disastrously bad effect, sometimes with fortunately good effect, but mostly with no immediate effect until somewhere down the line that particular protein modification is needed—to stay in touch with and survive in the place where we happen to be. But we have also shaped the Earth itself, terraformed it to the needs of our particular kind of life.

The original atmosphere on Earth was formed in the outgassing of volcanos that accompanied the planet’s creation. They vented carbon dioxide, methane, ammonia, sulfur oxides, and water vapor. As air, none of this was breathable by any form of life that exists today. But those earliest blue-green algae converted sunlight and carbon dioxide into carbohydrates, the first building blocks and nutrients for other forms of life, and gave oxygen as their byproduct. This started the cycle that converted our atmosphere to the nitrogen-oxygen mix we all inhale today.

Similarly, life itself converted sterile rock, which water erosion and wave action had converted to sand grains and clay deposits, into the rich, dark, loamy soil that land-based plants need to survive. Generation after generation of living things dying in a particular place, being devoured by scavengers, worms, and bacteria, and then their traces being burned by the sun and distributed by the rain, contributed to making the planet’s surface more and more congenial to the life that would come after it.

Look around, and you can see the marks of life everywhere: the color of the sky, the shapes of the hills, the shoots of green plants poking up through the sandiest, least forgiving patches of ground, and the insects that come out of that ground every year. And we haven’t even gotten to the human presence yet.

Any straight line or smoothed curve you see, from the corners or roof lines of a building, the lanes in a road and its banked edges, the telephone and power lines strung across the landscape in a calculated catenary hanging between poles and towers, the planting of orchards and vineyards and the staking of fences and trellises—every instance of these things was conceived, planned, and placed by the hand of some human being. Every bridge that crosses a river or a bay on foundations of stone and wood or concrete and steel represents the choice of some human group that wanted to move themselves and their goods over there. Every village, town, or city that grew up beside a river crossing, or the place where two trails met, or in a bay where you could pull up your boats, has its existence because some human group decided that here was a good place to live.

We have been on this planet a long time. If you look closely enough and read carefully enough, you can see how we have shaped it.

The question then—for both the world builders in fiction and the world explorers when we humans go out among the stars—is what marks we may find on the new planet telling of the life that has made its home there. No place is barren. Every place is a work of art in progress.

The next time you feel down, question the value of your life, and wonder what comes next, remember this. You are from the lineage of the stubborn, the survivors, the persisters, the winners. And you are still a work in progress. Nothing is barren. Everything lives, even when it dies.

1. Life has certainly evolved over time to adapt to the changing conditions on this planet, and it probably started out here as a bacterium or some other single-celled form. But whether the mechanism of that adaptation itself, the DNA-RNA-protein coding system, actually evolved on Earth from basic inorganic chemistry is still, in my mind, an open question. See, for example, Could DNA Evolve? from July 16, 2017.