Sunday, December 23, 2018

From the Many, Wisdom

Green shoot

As noted elsewhere, I am a little-d democrat and a big-R Republican. How can I hold two such divergent views? Because I believe in the power of unfettered human imagination and, ultimately, in the wisdom of crowds over time.

By “over time,” I mean in the long term. Crowds in the short term can be fickle and stupid. Sometimes, they are subject to madness and stampedes that end up killing dozens or hundreds of people on their periphery—especially when they brush up against a stone wall or a blocked door. Sometimes, crowds turn into mobs wielding crude weapons and venting their anger and hysteria on harmless people. But that is the short term. In the long term, tribes or societies or nations can reach an equilibrium of consensus, weigh possible choices, and find a path forward that works for most of their members.

Life in the United States—and in much of the developed world—today is materially and artistically richer because we have an open economy with free markets. This is in contrast with the command-and-control economies that grew out of Marxist philosophy and socialist principles, as echoed in current progressive thought. In this alternate view, the government, or the party in power, or some other overarching group of technical specialists and social scientists holds a vision of what society should be like, what its members should need and want, and then works to provide those necessities.1

There have been notable cases of relatively small groups of experts achieving great technological innovations: Thomas A. Edison’s original Menlo Park laboratory, AT&T’s Bell Labs, the Manhattan Project, and the Palo Alto Research Center. All were places where engineers and scientists with interest in a particular specialty—electricity and practical invention, telephony and radio communications, atomic physics and energy, or computer applications—gathered to play in the field of science. Whether they were looking for a certain result, such as the first fission bomb, or just seeing what new approaches could bring to a novel technology, they all achieved great things. But these were private groups, except for the Manhattan Project, and while the work that they did was proprietary, their goal was to bring forth inventions that would be useful to society as a whole—yes, even the atomic bomb.

But except for the Manhattan Project, none of these groups claimed sole authority to delve into their particular specialty, to produce inventions that could not be challenged or surpassed by others in society, or to be the only voices heard in that specialty. Even the Defense Advanced Research Projects Agency, a function of the U.S. government, built the original internet so that scientists and engineers could share, comment, and build on technological advances. Out of that first backbone network has grown the “enterprise of science” that now girdles the developed and undeveloped world, sharing knowledge and driving a technological revolution in physics, chemistry, biology, medicine, and every other science—along with literature, law, social science, and commerce. That one technology has put us all on an express elevator to the future.

Contrast this with the sort of closed system promoted by the Soviets and the socialist Chinese as described by Aleksandr Solzhenitsyn’s In the First Circle. Scientists, engineers, and technical support people work there in closed facilities—sometimes structured as prisons for useful but otherwise untrustworthy minds—at the command of the government or party and producing only the results that the government can tolerate. It’s an extension of the gulag, even when the work is not done behind locked doors.

In a command-and-control economy, production—and, more importantly, innovation—are determined by a preselected group of specialists, technical experts, and all too often party functionaries. How much steel or aluminum will be made and shipped to fit a five-year plan written four years ago? How many shoes will be made and shipped to a city with a population of a million and a half people, whose feet fall into a statistical average of so many men and women, adults and children? How much bread does the average person need to eat—perhaps just to survive—and so factor into the annual planting for wheat production? All of these questions are left to people who decide for the nation what the nation needs and wants. A small plot of land might be left over for individual farmers to grow their own vegetables or to feed their own chickens—but nobody is invited to go into the produce or poultry business. And no one can invent a new way of making steel and set up a business to revolutionize production.

In an open market, cozy and predictable businesses like steelmaking are upset all the time. During the 19th century, the Bessemer process for turning pig iron into steel blew a blast of air up through the molten iron to burn off impurities. This process built the mills of the Ohio Valley. Then, after World War II, a Swiss engineer developed the Basic Oxygen Process, also known as the Linz-Donawitz process, after the Austrian towns where it was first adopted. LD steelmaking lowered a lance into the converter and burned off the impurities with a jet of pure oxygen. It made better steel and took over the market, and the Ohio Valley started to die.2

A command-and-control economy is basically conservative. The bureaucrats who run the five-year plan, and make investments in plant and people to meet it, naturally want to protect those assets. They have neither the interest nor the time to sample public opinion, keep on top of new inventions, and risk upending their industrial base to bring cost savings and technical improvements to their economy. An open market with a capitalist base is more adventurous. People with money to invest in the hope of multiplying it—yeah, basic greed and self-interest—don’t care what tidy national apple carts they might upset. They are attuned to public attitudes and pounce on new trends in both the public appetite and the technology that might serve it. They are investing their own money, or that of others they can convince to invest with them, and if they make a mistake and lose … well, it’s only money. Failure is a necessary part of growth—because no one has an absolutely clear crystal ball. And the damage is limited when the bets are made by small entrepreneurs instead of national governments.

If the computing and telecommunications technology of the past half century had been in the hands of a U.S. cabinet department or national institute—or even with Bell Labs—we never would have had the personal computer (two guys at Apple plus two guys at Microsoft), the personal music system (Sony and Apple again), or their synthesis in the smartphone (Apple and a lot of copycats). Such personal use of technology would not have been considered nationally important, and so these products worth billions if not trillions in today’s economy would never have seen the light of day. I would still be writing this article on a typewriter—with no place to publish it. You would still be making telephone calls from an instrument with a dial and a handset—and connected to the world with a wire.

Is the process of capital investment and free market distribution messy? Yes. Is it sometimes wasteful? Of course, both for the products it brings forth that fail and for the stable industries it disrupts and destroys. Personally, I never saw the attraction of either the Pet Rock or Pop Rocks—those hard candies that fizz in your mouth. They were short-lived innovations, actually joke products, that someone thought would make a buck. A serious Soviet economist would have dismissed them out of hand. But still, some people made a living, however briefly, by selecting smooth river stones, washing and packaging them, and selling them as a novelty item. Some people made a living, however small, by mixing sugar, flavoring, and whatever made the candy fizz, packaging and selling it, and raking in the bucks. These people got a living wage, fed their families, maybe bought a house, paid taxes, and supported the rest of us in the economy. Who is to say that this is a worse choice than investing in a twenty-fourth brand of deodorant or a steel plant that is only marginally competitive with the imported Chinese product?

The point is, a free market supported by capital investment leaves the decision-making up to the wisdom of the rest of us, rather than to some embedded expert in a command-and-control economy. If a product or innovation is useful, it will find a use, establish itself, and perhaps change lives—not because some expert knows this ahead of time, but because the wisdom of individuals acting in a group demonstrates that it is so. The expert might have a hunch or an idea or a subtle aversion, but it is patently unfair—not to say unproductive—to give this individual or preselected group the final say in how the economy will operate. The market alone, messy and fickle as it is, can establish long-term value and benefit for the rest of us.

Or so I believe.

1. This is echoed in Bernie Sanders’s comment during the 2016 election that the U.S. economy doesn’t need twenty-three brands of deodorant. His premise was that providing this level of choice to the average consumer is a waste of resources, and the money invested in developing, manufacturing, and distributing those extra products could be better spent by society—meaning, according to the plans and programs he and his party have in mind. I still ask, would he be happy using Secret—“Strong enough for a man but made for a woman”—or how about my brand, which has a sailing ship on the package? Those are both popular brands. Why shouldn’t everyone be happy with them?

2. In the story of steel, U.S. productive capacity helped the Allies win the war, and this country emerged on top of the global economy. But our steel mills were left standing with old technology: Bessemer process furnaces and finishing by casting the steel into ingots, heat-soaking them, and then rolling them out into slabs, sheets, beams, and wire coils. Meanwhile, the mills of Germany and Japan had been bombed flat during the war; so in their rebuilding during the 1950s they could start fresh with LD process mills and new finishing technologies like continuous casting of slabs and sheets from molten steel. The U.S. mills had to move quickly to scrap their old equipment and rebuild with the new processes—but not quickly enough. So the U.S. market was flooded with better, cheaper European and Asian product, and basic steelmaking declined in this country. You snooze, you lose.

Sunday, December 16, 2018

That Voice in Your Writing

Man holding a mask

Last week, I wrote about That Voice in Your Head, concerning the mental condition of hearing voices and how we all may be on a spectrum for that phenomenon. Now I want to think about how a writer—or how I do, anyway—uses the voice that speaks from the mind to create a thought, a mood, a story setting, and narrative dialogue.

First of all, every writer sooner or later finds his or her own “voice.” This is the unique, or mostly unique, or at least individual and comfortable way for the person to render prose. For most of us, it is not a conscious choice so much as “doing what comes naturally.” And this “natural” is based on the reading we’ve done, authors we’ve liked, and the authorial voices we’ve consciously tried to emulate. Because I favored English class in high school, taking two Advanced Placement courses in English rather than in history or math and science, and then chose English literature as my major in college, I had the opportunity to read across a wide field, from translations of the Greek playwrights and Italian poets to Chaucer, Shakespeare, other English poets, and novelists of the 18th and 19th centuries, as well as a handful from the 20th century. It’s also possible to study the language and focus on just American authors, from Hawthorne to Twain to Hemingway, but my courses stayed mostly with the English authors.1

What has that done to my writing, to my natural voice? Well, while I don’t favor (rather than “favour”) English spellings, I do notice a certain preference in word order, especially around adverbs.2 For example, if I am using a “helping verb”—and when you write in the past tense, as in most novels, their use abounds for shifts in temporal perspective—I am more likely to embed the adverb between the helper and the main verb than let it straggle along before or after the verb phrase. So, I will write: “He will certainly go,” rather than “He certainly will go.” That example may seem less direct, less forceful, and therefore a bit more English, but then … so be it. I will also choose a slightly more formal and structured language in general descriptions, and I will hesitate to leave out structurally necessary prepositions. For example, I will write: “all of the cats” rather than “all the cats.”

But language is flexible, speaking and writing have their own rhythms—especially when keeping to the meter of poetry—and every rule and habit was made to be broken.

When I was actively freelancing, supplementing my corporate income by writing one-off brochures and annual reports for other companies through a well-respected local communications agency, one of the principals there paid me the compliment of saying I had mastered the “business friendly” voice. I never consciously studied that voice, but I know what it means from the example of others who haven’t mastered it. The business-friendly voice is formal, although it strategically uses contractions in order not to sound too stilted. It avoids harsh, direct statements, especially if the reader might take them to heart as directed at his or her own breast. So business-friendly is cushioned with words like “sometimes” and “often,” allowing the reader to imagine that what is being described as so might occasionally not be. Business-friendly uses the subjunctive mood a lot—putting difficult concepts in the realm of “might” and “could” and “would,” instead of the simple declarative of “is” and “is not.” Above all, the writer who uses business-friendly generally likes and trusts the people he or she is addressing, encourages their dreams and desires, and forgives their foibles and mistakes. The writer tries to be someone with whom the reader would like to have coffee and conversation, perhaps even a beer.3

But these voices, whether the “natural” one that a writer acquires through varied reading or the “business friendly” voice that one puts on for the sake of diplomacy and/or sales, are the background upon which a writer works for a specific effect in creating fiction.

My natural tendency, also a derivative of English literature, is to write in compound sentences and adopt parenthetic phrasing to acknowledge the occasionally lengthy detail, necessary explanation, and honest counter-example.4 While this may seem to inflate and extend the scope of my sentences, it really is a form of economy, of condensing my thought. Imagine if each parenthesis had to be set out as a sentence of its own, with a phrase referring back to its point of departure. Imagine if I laid out each sentence as a short, direct structure in the form of subject-verb-object and had to add the explanation of how one related to the next, rather than linking them into a logical flow. The writing would then be flat, plodding, dull, and not at all engaging. My natural tendency is to spin a web within which to catch your mind.

But not all the time.

While this seemingly meandering structure, filled with oxbow bends and tributaries like a great river of thought, is adequate for most narrative purposes, there are times when it just won’t do. Descriptions of action are one of those times. Then the pace quickens. Sentences unwind. Impressions flash by. The reader’s attention spins forward. The character hardly has time to breathe.

At a deeper level, when writing from the point of view of one character or another, I not only limit my narrative to the elements of the story that this person can know directly, remember from a prior conversation, or surmise using his or her own wits; more importantly, I also try to shape the language into the character’s voice, picking up his or her cadence, sentence structure, and word choices. An abrupt or contentious character will use shorter sentences and more assertive verb forms. A studious or otherwise reflective but involved character will use more complex sentences, hypotheticals, and subjunctive verbs. A disagreeable character will express the story around himself largely in negatives; an anxious one, largely in fears. And so on. In this way, the writer adopts the skin and the mindset of the character as if speaking in the first person, but using the third person voice in an extended, almost subliminal form of indirect discourse.

And finally, when writing dialogue, one compresses and compacts these tendencies. Speakers in a hurry chop out their sentences. Ones who don’t care—or don’t want to show that they care—speak more slowly and sometimes glide around a difficult point. Sometimes, too, a speaker will have a tic, like a speech impediment, or an accent, or an overuse—or absence—of contractions and elisions. It’s not a good idea to play with accents and tics too much, though, as the reader can get tired of wading through and interpreting the resulting dialogue. And after a while all over-played accents start to sound affected and ridiculous. The writer must be subtle and throw in just enough spice to flavor the meal without overwhelming the story’s natural spirit and flow.5

But all of these techniques and tricks depend on how the writer hears that narrative voice inside his or her head. You can adopt some of them consciously, adding them like ingredients in a recipe. But it is only through practice—the essence of building your “natural” voice and its variations—that you can do the blending skillfully enough for it to fade into the background so that the story may step forth.

1. Of course, Hemingway had a big influence on my generation of writers, too. He inspired us to employ our perceptions and our language briskly, starkly, and actively. In my case, I also picked up from my reading in science fiction the cadences of Heinlein and Asimov, who always spoke in my head as a voice of reason flavored with wry humor.

2. Yes, I know, all the modern writing instructors caution you to avoid adverbs. While I try to avoid the overuse of adverbs—so that they pop out and proclaim themselves, always a bad thing—I respect all parts of speech and use them in their place.

3. Note the “with whom” there. More English formality, and it avoids the awkward, trailing thought in ending the sentence: “have coffee and conversation, perhaps even a beer, with.” Because I would rather cut off a finger than fail to close off that appositive about beer.

4. Such as the inserted “also a derivative …” in this sentence.

5. And never forget that for the span of time that the book is open on the reader’s lap and his or her eyes are scanning your prose, you are the voice speaking inside their head. If that voice is too strange or ridiculous, the spell will be broken and the reader will close the book.

Sunday, December 9, 2018

That Voice in Your Head

Man holding a mask

Recently our group at NAMI East Bay heard a panel of consumers1 discuss their experiences with hearing voices. Two of them were from the Bay Area Hearing Voices Network, an organization that helps people with the “lived experience” of “hearing voices, seeing or sensing things that others don’t, or have extreme or unusual experiences and beliefs” come to understand and explore the meaning of these phenomena.

One of the panel members, who had taken prescribed medications to subdue the voices, found the side effects so distressing and the results so problematic that he preferred instead to live with the voices. A second panel member actively interrogated the voices and tried to understand them, asking whether they were ghosts, or pure energy from another dimension, or extraterrestrials. He would reason and debate with them when they told him to harm himself. A third felt that the voices were an inspiration, and he knew they came from outside his head because he could not feel the resonance in his skull when he himself spoke aloud. This man came to trust their answers when he needed to take an examination or make a business report.

The original Hearing Voices Network in the UK was founded in part by Ron Coleman, a consumer himself who is now in recovery and works to provide “recovery centered treatment” to other consumers. The principle seems to be that these experiences are not the symptoms of an illness but real events—as indicated by two of the panel members above—that the person feels he or she should explore in a positive way and that others—loved ones and family members—should be curious and supportive about rather than judgmental. The basic good that I can see in all this is that, if the phenomenon of hearing voices cannot be treated and eliminated with either medication or psychotherapy, at least it should not become a source of fear and anxiety for the patient. Support from and discussion with others who share the experience can perhaps achieve this good result.

But I still don’t believe the voices are real—or anything more than a neurological or perceptual fault in the auditory processing centers of the brain.2

One of the panel members said that hearing voices is a common experience. He is right—in the sense that humans are a verbal species and routinely put our thoughts, however silently inside our heads, into words. We may not convert all of our sense impressions and internal thoughts into words, but we certainly try. For example, if I smell something familiar, I will usually try to identify it with a word: “This is ‘coffee.’ ” Or, “That’s ‘a rose.’ ” If I see an unusual shape, I will try to match it with a familiar shape and give it a name.

Many of the thoughts that pop into our minds are verbally arranged. For example, if I am doing something and sense it’s wrong, the thought may insert itself as a sentence: “This is a bad idea.” In the old Transactional Analysis, which was popular back in the late 1960s, the Freudian personality functions of superego, ego, and id were explained as the internalized voices of your Parent, your own Adult self, and your earlier Child self. The Adult makes rational decisions based on current needs, reason, and experience. The Parent issues decrees and warnings based on remembered authority. And the Child expresses needs and wants based on remembered emotional states. … Or something like that. The point is, these are learned and internalized reflexes that the person remembers from growing up as an immature version of self under the regime of a parent who is more mature and either guiding or punishing. Usually, these reflexes present themselves as verbal statements. When my mind generates the thought “That’s a bad idea,” it is usually in my mother’s voice.

I should note also that when I am writing, as now, the words are coming into my head as if I were speaking them aloud.3 And when I write fiction and generate dialogue, I imagine the two or three characters speaking and supplying their own favorite expressions, diction, preferred sentence structure, and even accents as they speak.

So this “common experience” of hearing voices can be pictured as a spectrum, and this matter of thinking in terms of words would be the “normal” end. It is normal in that most people are not alarmed by it, do not find it troubling, and accept it as the way their brain works. I should note that this end can have its alarming aspects. My mentor at the university, Professor Philip Klass, once told of a time he was driving faster than usual on an elevated freeway. He heard a voice in his ear say distinctly, “Slow down!” The voice was so real that he reacted instantly—and around the next turn was a wreck that, if he had not slowed, he would have plowed into. Was that voice the manifestation of a guardian angel? Or just his own mind cautioning him about driving too fast? Either way, he questioned whether the instance of hearing the voice, followed by the crash up ahead, could be mere coincidence.

Moving toward the less-normal parts of the spectrum, we have Professor Klass’s one-time warning voice, as well as the times when we hear a change of air pressure at a partially cracked window and think it’s a human moan or sigh, or the babble of the crowd in a busy restaurant suddenly resolves into an almost-familiar voice speaking our own name. Or—and I speak from experience here—sometimes a recent widower will hear a noise and imagine it’s the whisper of his dead wife. It may be imagination, but it sounds awfully real and there is a momentary pang of recognition and regret. The point is, in this part of the spectrum we are not at all sure, however briefly, whether the voice was inside our heads or not.

Toward the middle of this spectrum are the voice hearers, like the members of the panel, who hear voices that they know or believe are not their own and not coming from inside their heads. They can have it explained to them that their brains are malfunctioning and they are listening and responding to tricks of their own imaginations, but they will not believe it. The voices are too real. One of the panel members insisted that the different voices each had their own way of speaking and accents, and that was proof to him that the voice came from outside. Also, the voices often suggest something that the person would not normally do, such as inviting him or her to commit suicide. Whether the person is following the voice or resisting, he or she acts as if dealing with an alien entity.

And finally—or so I believe—the manifestation of a supposedly external voice with its own character, diction, and other qualities might become so embedded in the mind of the hearer that it develops an entirely separate personality or, in psychologists’ terms, a “dissociative identity.”4 While the causes of a brain or mind creating more than one personality or dissociating itself from the one it was born with are debatable, the condition often occurs in someone who experienced extreme trauma as a child. So, presumably, the second and other personalities develop in order to envelop and protect the tender ego. The fact that the core personality is usually not aware of these other personalities, their actions, and their intentions is where the dissociation comes into play. It appears that an alternate personality takes control of the brain and body at various times.

Normally, I would not think of putting dissociative personalities on the spectrum of hearing voices, except one of the panel members mentioned his own childhood trauma and being treated for post-traumatic stress disorder, or PTSD. So it is possible that stress and trauma play a part in the voice hearing as well as the dissociation of a personality.

On top of all this, we must remember that human beings, with our huge and vastly interconnected brains, are the “dream animal.” We live not just in the moment and inside our surroundings, as my dog does, but also in our imaginations, in our speculations, in the what-ifs and if-thens of our subjunctive language, and in the twilight realm of our dreams, where the wildest fantasies seem real and even plausible for a short period of time. Is it any surprise that this delicately balanced and incredibly complex mechanism occasionally slips a few gears and feeds us false information?

That’s just the nature of human existence.

1. “Consumer” is the new, more polite term to refer to people with a diagnosis of severe mental illness and is preferred by people in this situation to the earlier term “patient,” which implies that they have an illness. These people are consumers of mental health services.

2. One of the panel members at the meeting, who tried for a scientific understanding of the phenomenon, stated that functional MRI scans of people when they were experiencing voices showed activity in these processing centers.

3. The generation of this word stream is complex. Some of it comes from my front-of-brain thinking and deciding: here is how the article, argument, or story must go. And some of it comes from my subconscious and its intuitive sense of what the story or article might become. For more on the role of the subconscious in my writing, see Working With the Subconscious from September 30, 2012.

4. The old diagnosis of Multiple Personality Disorder has now become Dissociative Identity Disorder. Po-tay-to, po-tah-to.

Sunday, December 2, 2018

A Classic Liberal

Balloon rising

I’ve been examining my own political stance these days—especially since anyone who believes in personal freedom, personal responsibility, and free-market capitalism while being opposed to big government, statist solutions, and socialism is now considered by some to be a “racist, misogynist, homophobic Nazi.”1 I have decided that what I am, other than a stick-in-the-mud, Eisenhower-era Republican, is a classic liberal.

What are the principles of this kind of liberal, as opposed to the more modern kind?

First, I believe in your personal freedom as much as mine. Your rights to free expression, physical movement, occupation of space, and use of time are yours to exercise and govern, as mine are my own. The province of your right to these actions extends up to about an inch from the tip of my nose, or whatever else defines personal space in our culture. If you violate my space and my time, there will be consequences—and I’m prepared to initiate them. But other than that definition of pre-existing physical and temporal limits, I am not going to prejudge you or prescribe the limits to be placed on your speech, actions, and intentions. Go your way and don’t interfere with me, and we can be trading partners, potential allies, and perhaps even friends.

Second, I grant you provisional respect and allow for your personal dignity. In my heart, I want the world to be populated by—in that old phrase—“men (and women) of good will.” I want to live in a society where people can be—and do become—productive and self-sufficient in their lives, caring about their own and their families’ and their friends’ futures, and confident and comfortable in their own skins and with their current situations. This is not always possible—sometimes through personal frailty and failure, sometimes through societal lapses—but I want people to have this chance at personal happiness and dignity. And so, if I want the world to be like this, I must grant in my own mind that such people exist and that you may be one of them. I must refrain from prejudging you as a person of gnawing envy, grasping ambition, bad habits, faulty decision making, and other personal failings that can lead to chronic unhappiness. I leave it up to you to prove me wrong in this. Please don’t disappoint me.

Third, I grant your personal agency and responsibility for your current state of being. Unless I can see and detect some congenital or acquired disability in you, such as blindness, deafness, missing and frozen limbs, or—after five minutes of casual conversation—some deficit of wits, emotional stability, or active and inquiring mentation, I will presume that you are a fully functioning human being with two legs to stand on, two hands to shape the world around you, and a capable brain to guide them both. As with your right to freedom, I believe in your ability to live as you want and operate in the world. I would hope you will grant me the same and not wish to place barriers to my developing and exercising my full human potential.

You will note that these attitudes apply personally rather than to any group. I prefer to deal with people as individuals, unique beings, and not as indiscriminate members of a race, class, gender, or other aggregate. Economists and Marxists may prefer to deal with large groups—economists by their profession, and Marxists by their obsession—but I would rather follow the rule of Sergeant Buster Kilrain: “Any man who judges by the group is a pea-wit.”2 Marxists will find this attitude hopelessly bourgeois, and so be it. I was born to the middle class and raised to be private, diligent, industrious, and resilient—not a bad way to operate and view the world, in my opinion.3

Being a rabid individualist, jealous of my rights and expectations, dealing with other people as individuals, and granting them the freedom to do and become what they want, I tend to despise one-size-fits-all prescriptions and social engineering. I understand that proponents of big government and statist solutions must, as a matter of logic and fairness, strive to treat everyone equally. And socialism by design must treat all citizens as economic cogs in the great machinery of their proposed social organization—except perhaps for those enlightened experts who are doing the designing and taking control of the command-and-control economy. While I grant that some effort must be made at social cohesion if a village, a state, or a nation is going to function, I want to see the choice to join and function—and of who will be doing the deciding—made individually and democratically. Treating people as mere numbers or as “meat robots” devalues their thinking capability and their human potential.

While I believe we should all work together as a society and function in an open economic system, I take the position that I am not responsible for your health, wealth, happiness, or well-being. That is your responsibility and not mine. If you approach me as an individual and ask for help—whether you have your hand out with a cardboard sign at a street corner, or you are wandering dazed and confused after a disaster, or you are a friend or family member in need of support—it is my choice and not my responsibility to respond positively. I am the sovereign of my time, my effort, and my purse, as you are over yours. How I choose to spend them is a matter between me and my conscience or my god—if I have either one.

Note also that these are my personal and individual guidelines, attitudes, and approaches. They are not rules prescribed for me by someone else. They are subject to revision and revocation, and I can change my mind as I see fit.4 I can be flexible without worrying about my own inconsistency, based on my previous experience with similar situations and my new experience with each person. You can’t shame me by pointing a finger and exclaiming, “Aha!”

I am not a big proponent or follower of rules and regulations, policy statements, and firm positions. I deal best with people who have and practice a personal religion but who don’t make an issue of it or expect me to believe in and follow its rules myself. I am humble enough to know that I might be wrong, and not ashamed of admitting a mistake and moving along. I trust others to have the grace to do likewise. After all, life is a still-unfolding mystery. The universe is huge beyond our wildest imagining. No one has all the answers. And we are all just finding our way.

1. And the people who believe that really ought to examine their historical referents.

2. Kilrain was a fictional character in the book The Killer Angels, later made into the movie Gettysburg.

3. As opposed, I imagine, to being a member of the aristocracy, expecting undue deference, and looking down on everybody else. Or a member of the proletariat, or underclass, or whatever the opposite of an aristo is, and looking upward with hatred and envy at anyone better situated, more industrious, or better educated. The middle is not a bad place to be.

4. I live by the dictum that no rule is universal; there are always exceptions; and no one rule can be tailored to fit all situations. Our minds were given flexibility for a reason.

Sunday, November 18, 2018

National Novel Writing Month

Midnight writer

I’ve heard of this for several years now and even had friends participate. It’s a real thing, with an organization and a website, National Novel Writing Month, or NaNoWriMo. Supposedly, once you sign up and submit a profile, you can earn “badges” for various stages of accomplishment and then “win” by submitting the text of your novel of at least 50,000 words—which is more like a novella these days—and having it validated by the site. What is unclear to me—although I have not delved into the entire site to read the fine print—is what you actually win, and then what happens to the text of your novel. Does NaNoWriMo get anything out of the process, other than the glow of helping inexperienced writers with encouragement, motivation, and a deadline? Does the organization, for example, obtain the rights to the submitted work? A professional writer would be concerned about such things.1

I have never participated in NaNoWriMo, because for me every month is novel writing month. At any time of the year, I am either drafting scenes and chapters on the current book; editing, coding, and laying out the book I’ve just completed; or plotting, outlining, and generally “noodling” the next book in my lineup. This little shop remains open seven days a week, and critical plot points and bits of dialogue may occur to me even on holidays and while traveling on vacation. Novel writing is a lifetime process.

Does writing novels pay well? Not really, if at all. Some people make it big in traditional publishing. And one hears of independents who are making the rent with their self-published books. But for me, with seventeen completed novels for sale and an eighteenth in production, the proceeds amount to coffee money each month, never more than a lunch out. To really establish yourself as a writer with a national reputation, you need to produce a novel that hits the bestseller lists, either at the New York Times or in some national sales forum. Bestsellers are not a thing you can plan, because luck is a major factor: creating just the right story to fit the national zeitgeist at just the right time.2

But that doesn’t mean you shouldn’t write a novel. Writing is a dip into the human creative process. If you have a talent for some art form, then that is the thing you must do. For me, it’s creating stories about people who never lived doing things that never happened. For others, it may be making music, putting paint on canvas, sculpting clay, or cooking gourmet meals. Not everyone—indeed, only a small fraction—among the people who love to cook gets to open a restaurant. Not everyone who paints or sculpts gets a gallery exhibition. Not every musician gets to join a band or play in an orchestra, and not every composer gets to hear her song or his symphony performed. But that does not mean you don’t try. And if your work does not make it to the public, that does not mean you don’t continue. Because the work itself is good for your soul. It’s what makes us human and different from the other animals and the robots.

Writing a novel will change you.

Unlike a piece of nonfiction, where the facts speak for themselves, or a short story, which you can complete in one white-hot burst in an afternoon, a novel takes gulps of time over many writing sessions. You must develop and maintain a voice that you can use, a character viewpoint—or more than one—that you can occupy, a feel for the time and place that you have created, and a mood that extends over many sessions, sometimes for months on end. I have likened the process to renting out half my brain to a troupe of traveling actors for a year at a time. They—or really, the whims and products of my subconscious3—will try out bits of dialogue or stage business in the middle of the night. They will suggest changes to the plot while I am thinking about something else entirely. And on occasion they will refuse my direction when I sit down at the keyboard to start a scene from the outline—they just won’t let me. These actors and their internal director are a busy bunch, but they are also necessary to my writing process.

Until you have written a novel and submerged your active mind in the creation of another person in another world, you don’t know who you really are. You haven’t come face to face with the contents of your soul—which, for most of us, can stay safely buried in dreams that we forget, in daydreams we can ignore, and in random thoughts we can dismiss as bits of trivia or the whispers of the devil. A novelist has to wade into this mess and wring from the soul something that has real existence outside of his or her brain as words on paper. In the same way, a painter has to wring out a vision with a specific purpose and detail, a sculptor has to find a shape with meaning, and a composer has to find a melody with mood and coherence.

My own soul, I have found, is relatively stoic and restrained. My characters are not highly emotional people. Yes, they have their loves, ambitions, and desires, but those are usually deeply buried, giving direction to their lives but not to flights of words in their dialogue. They are too busy trying to figure out how the world around them works, and what they have to do to survive or thrive in the current situation; so they can’t spend time wishing and dreaming how the world might be different. They excel at mechanical contrivances and traps, and this reflects my own upbringing.4 Their reaction to adversity—and mine, too, on occasion—is one of amused cynicism, followed by a determination to work things out and not get crushed.5

If writing is your thing, then I would advise you to try writing a novel. It may not produce a bestseller, but that’s not the point. It will introduce you to yourself.

1. Oh, deeply concerned! Every writer who hopes to make it big—well, hoped, once upon a time—in publishing should be suspicious about the rights to the work. Once you sell or give them away, the text is no longer yours to use and publish; you have little say in how it will be used or altered, where and when it will be published, or how it otherwise passes into slavery. If your novel is like your child—and the whole process for me takes about nine months—then you become protective of its dignity and its future.

2. And a bestseller is not a thing you can emulate or achieve by riding on another’s coattails. When I was cut loose from the traditional publishing world—and that’s a story for another time—and casting about for an agent, I heard many variations of “if you could only write a story about a boy wizard with glasses, I could really sell that.” This was a reference to the phenomenal success of the Harry Potter series. Later it would have been “a story about a shy girl who meets a sadistic billionaire,” or whatever is popular right now. The trouble with chasing someone else’s bestseller with your own book is timing. If that other book is booming right now enough for you to know it, you will still need some weeks—if you are very fast—or months to write your own copycat version, more months to parade it before a string of potential agents, more months after that for the agent who accepts you to secure a sale, and then about a year for the publisher to edit, typeset, print, and promote your book with the national bookstore buyers. And in that year-plus-plus, the traveling circus of reader interest will have moved along and something else will be popular. The brass ring is for that other novelist, not for you.

3. See Working With the Subconscious from September 30, 2012.

4. See Son of a Mechanical Engineer from March 31, 2013.

5. In this way, my characters are a lot like those of traditional science fiction stories, as is my own reaction to things. Well … you are what you read.

Sunday, November 11, 2018

What Is Good?

Vitruvian blood vessels

I have never concealed the fact that I am an atheist—although I sometimes sail under the flag of agnosticism, the state of “not knowing,” in order to avoid bar fights. I do not accuse or belittle people who have had an actual religious experience, heard the voice of God or the rustle of angel wings, and are guided by the principles of their religion. Peace be unto them. But I never had the experience, and I have neither the genetic makeup nor the cerebral or psychological components necessary to perceive that unseen whisper. But at the same time, I am not, in G. K. Chesterton’s famous line, “capable of believing in anything.” I have my own principles, after all.

One of those principles is evolution. I have worked at both a manufacturer of biological pharmaceuticals and a developer of genetic analysis equipment. I know enough biology and have read enough about genetics and cladistics to appreciate that all life on Earth is related. The octopus is not an extraterrestrial alien dropped into this planet’s oceans—as some sources have recently claimed—but is cousin to the squid and the cuttlefish in the class Cephalopoda, just as human beings are cousin to the mouse and the lion in the class Mammalia.

Evolution is not just the “survival of the fittest,” as the popular saying goes. The evolution of a biological organism takes tiny changes in the genetic code, potentially effecting tiny changes in form and function, and then either expresses them immediately—especially if the change is harmful or fatal to the bearer—or holds them quietly in the genome as a recessive or alternate copy of the gene until the features it engenders can come into play. The DNA/RNA/protein coding system has many built-in safeguards that make most random changes in the code neither immediately fatal nor immediately helpful. For example, the code is read in three-base codons, each calling for one of the twenty amino acids used in assembling a protein molecule, and most amino acids are specified by several alternate codons; so a change in just one of the “letters” will often still call for the intended amino acid and produce the intended protein. The system is robust—so that we can still have viable offspring and recognize them as human—and yet just fragile enough that changes are possible over generations.
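To make that redundancy concrete, here is a minimal sketch in Python—purely illustrative, and using only a small, hand-picked excerpt of the standard codon table rather than the full code—showing how a single-letter change in a codon often still calls for the same amino acid, while some changes do not:

```python
# Illustrative excerpt of the standard genetic code: several different
# three-letter codons specify the same amino acid, so many single-letter
# mutations are "silent" and leave the resulting protein unchanged.
CODON_TABLE = {
    # Leucine (Leu) is specified by six codons ...
    "CTT": "Leu", "CTC": "Leu", "CTA": "Leu",
    "CTG": "Leu", "TTA": "Leu", "TTG": "Leu",
    # ... Proline (Pro) by four ...
    "CCT": "Pro", "CCC": "Pro", "CCA": "Pro", "CCG": "Pro",
    # ... and Methionine (Met) by only one.
    "ATG": "Met",
}

def silent_mutation(codon: str, mutated: str) -> bool:
    """Return True if the mutated codon still calls for the same amino acid."""
    return CODON_TABLE.get(codon) == CODON_TABLE.get(mutated)

if __name__ == "__main__":
    # A third-position change in a leucine codon changes nothing ...
    print(silent_mutation("CTT", "CTC"))   # True
    # ... while any change to the single methionine codon does.
    print(silent_mutation("ATG", "TTG"))   # False: TTG calls for leucine
```

The codon assignments shown are drawn from the standard genetic code, but the table, the function name, and the examples are only a toy demonstration of the principle, not a model of how mutation and selection actually play out in a genome.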

And those changes and their effects are not necessarily crude, achieving just basic survival or writing off the individual organism with a lethal deletion. The cheetah was not born to limp over the veldt in pursuit of its ambulating prey. Time and the millions of minute alterations to the genetic code governing the cheetah’s musculature, metabolism, and nervous system allow it to lope gracefully and efficiently, outrunning the swiftest antelopes and wildebeests, which are themselves adapted to run just fast enough—most of the time—to elude their predators. Evolution is not just a mechanism of survival but a mechanism of optimization, efficiency, and ultimately of temporary perfection.

I have called DNA the “god molecule,”1 but that is not because I worship it or think it has supernatural powers. The DNA/RNA/protein system is simply the instrument of evolution. It has not only created all the varied life we see on this planet but also changed the surface of our world itself, because of the impact that life has had in shaping the atmosphere, seeding the oceans with abundant life, and covering the hills with vegetation and grazing animals that change their erosion patterns. The original Earth, before the first bacteria and blue-green algae evolved to give it an oxygen-rich atmosphere, was as hostile to our kind of life as the surfaces of Venus or Mars are today.

But the principle of evolution applies to more than just organic structure and function. Most of the structure and function of human society and the approaches in any human endeavor, from technology to the arts, have advanced by a form of social evolution: small—but sometimes large—changes introduced into a complex situation, there to be either discarded, adopted, or further adapted. In rare cases, like the mathematical thinking of a Newton or an Einstein, a single person will make a significant change in human society and history. But for the most part, what one person starts another will then adapt and improve on, so that the seminal invention is lost in a continuous flow of minor and incremental developments. The stirrup and the wheeled plow, with their migration during the Middle Ages from Asia into Northern Europe, are such examples.

In the same way, many human social concepts—love, justice, honesty, reciprocity, personal freedom, and the other exchanges that we consider “good” and weave into the stories we tell—are the products of social evolution. Human families, clans, tribes, city-states, and nations learned over time, by piling one experience and its consequences on another, that certain strategies of exchange either worked or did not. For example, they settled early on the basic understanding that habitual lying is harmful both to the people who must deal with the liar and ultimately to the liar himself. That fair dealing and reciprocal trade are a better system of exchange than theft and plunder. That hereditary servitude is not proper treatment for any thinking human being, and a society that practices slavery may flourish for a time but will eventually collapse. That love is a stronger bond and lasts longer than hate. And on and on. We learned these “home truths” at our mother’s knee and passed them down through the cultural wisdom of our clan and tribe long before some prophet wrote them on tablets of stone or bronze and suggested they were the teachings of the gods.

This does not mean that dishonesty, plunder, slavery, hatred, and other injustices don’t exist in the world. Or that sometimes these strategies of exchange will not work just fine in some situations—especially if there is no one stronger around to keep you from getting away with them. Ask the Romans, or the Mongols, the Nazis, the Soviets, and any of history’s other bent and crooked societies that have made a bad name for themselves. But thinking human beings, left on their own to study and consider the situation, will conclude that these negative strategies do not work for the long haul or for the greatest good of the greatest number of people.

Not only has human society as a social construct evolved in tune with these beneficial strategies, but so has the human nervous system as a response mechanism. Try taking from a toddler the treat that its mother has given and see if that tiny human brain does not immediately register and react to the unfairness of your action. Hear children on the playground taunting each other—perhaps even with names and descriptions having a superficial gloss of truth—and see if the recipient does not explode with anger at the perceived dishonesty. We all understand how the world works and know when others are practicing falsehoods and injustices upon our person and our sense of self.

It does not take a god from a burning bush with a fiery finger to write out the rules of what is proper and good in any human exchange. We know it from before we were born, because our brains and our society had already supplied the answer, hard-wired and ready to function. In the same way, we see the world in the colors for which our eyes were adapted, breathe the air for which our lungs were optimized, and recognize the adorable cuteness of babies and puppies because it is beneficial to both that our brains release the right endorphins at the sight of them.

Evolution says that we are at home in this world because we are the products of this world. And that is enough of a natural wonder for me.

1. See, for one example among others, The God Molecule from May 28, 2017.

Sunday, November 4, 2018

The Next Civil War

War devastation

It has been suggested for some years now, at least for the past decade, that this country is in the midst of a “cold civil war.” Disagreements of both policy and principle between the progressive left and the conservative right have reached a fever pitch. Factions are marching in the streets and attacking each other with bats and chemical sprays—although the fighting hasn’t reached the stage of firearms yet. Friendships are breaking up, sides have formed, and the lines are drawn on social media. I even know of one Facebook friend who seems ready to divorce her husband over his political views.1

We’ve been here before, back in the late 1960s, when I was at the university and the young people in college and the radical activists were protesting against the Vietnam War and in favor of civil rights and free speech. Back then, we had campus demonstrations, protest gatherings on the Washington Mall, and rioting in the streets—most notably outside the 1968 Democratic Convention in Chicago. The difference between then and now is that in the ’60s the radical view and its hard-line conservative response were both on the fringes of the political spectrum, while the two main parties could still conduct business in a relatively consensual, bipartisan fashion. Today, the two parties function in lockstep with their most radical elements. Discussion and votes in the Congress and decisions on the Supreme Court are divided along party lines with almost no crossover. The White House and the top echelon of the Executive bureaucracy swing back and forth with whichever party captures the Presidency.

On the one side, we have people who want to create and celebrate a “fundamental transformation” of the country’s political, economic, social, and environmental relations according to a perceived “arc of history.” On the other are those who don’t mind moving forward into the future by evolutionary steps but resist being pushed bodily through revolutionary action. Frustrations abound on either side, and with them come name calling, social shunning, brick throwing, and tear gas.

Some people are even speculating—myself among them, and mostly since the upheavals of the 2016 election—that the cold civil war will eventually turn hot. That our political and economic differences, our social and environmental positions, will reach a point where they can no longer be resolved by discussion and bargaining, by yielding on some points and advancing on others, to arrive at a national consensus. That the political crisis will demand a clear-cut winner and loser. That internal peace will only be achieved when one side or the other can no longer stand up for its position because its politicians and their supporters have—each man and woman—been economically subdued, personally incarcerated, or rendered dead. Or when the country has been divided by physical partition and personal and familial migration, as occurred between India and Pakistan in the late 1940s, with each party maintaining its own new national government.

The first American Civil War of the 1860s was a dispute between cohesive regions, North and South, Slave State and Free. But many people think the current differing viewpoints are too intermixed for the country to break and go to war along regional lines and across state boundaries. This view says that the coming hot war will be more like the Spanish Civil War of the 1930s, with neighbor fighting neighbor for control of the cities and the countryside for each party.

I can see the reasoning for either approach. In many ways, the opposing sides in this country reflect a divergence between urban progressives and rural conservatives. We keep seeing that map comparing the votes cast in Los Angeles County—which is just the urban core of the big place we think of as “LA”—and those in the seven states of the Upper Northwest, from Idaho to Minnesota. And really, even California is not a homogeneous polity, because the feeling in the foothill communities of the Gold Country and in the Sierra is more conservative than the progressive politics of the big cities in the Central Valley and along the Coast.

But I can also see a breakup between regions. The states along the Pacific Coast, in the Northeast, and across the Upper Midwest are typically progressive, while the middle of the country is typically more conservative—with a few isolated exceptions like Colorado and New Mexico.

The question of how the country will break apart if and when war comes depends, in my mind, on what incident, what spark, finally sets it off. If the decisive point is internal, say, an election that fails to satisfy one party so greatly that it simply revolts, then we might see a piecemeal collapse as in the Spanish Civil War. But if the incident is external and the shock is to the whole country, then we might see a response that takes shape along regional and state lines.

The latter is the picture I painted as a leitmotif to my two-volume novel about life extension through stem-cell organ replacement, Coming of Age. There, the incident was the repudiation of the national debt.

When I was in college, my economics textbook said the national debt was irrelevant because it was just money that we owed to ourselves, financed by Savings Bonds held among the citizenry. No one was going to call in that debt; so the government could just keep financing it by issuing more bonds. As recently as 2014, however, almost half of our publicly held debt in the form of U.S. Treasuries, and a third of our total debt, was held by other governments and offshore banks. The biggest holders were China, Japan, Ireland, Brazil, and the Caribbean banks.

If these external holders wanted to collapse this country—which, given that our global economy is so interconnected, would be a foolish thing—they could simply sell off huge blocks of the U.S. Treasuries they now hold. The federal government would then have to scramble to make good on the sales, and so would likely impose massive economic restrictions and additional taxes on the American public. In my book, this prompts many of the states in the central part of the country—whose residents don’t feel they are well represented in the federal government’s spending decisions—to repudiate the debt and, along with it, their allegiance to the Union: either secede or go broke by staying in.

Under those conditions, many of the National Guard units would side with their home states. And many U.S. Army, Navy, and Air Force bases located in these states might weigh their allegiance to the national government against the conservative political instincts of their commanders and troops. The split would not be uniform. The choices would not be pretty. And once initial blood was spilled in the breakup, it would not be much more of a step to spill blood in establishing either national dominance or domestic partition.

In my novel, the breakup along these economic lines came in the year 2018. Of course, that year has now come and is mostly gone. But the weight of the national debt and the simmering divisions of our domestic politics still hang over us all.

I don’t look for war or want it. But my novelist’s ear listens to the rhetoric that is now splitting the country along its fracture lines, and I cannot discount the possibility of a shooting war coming to these United States sometime soon.

1. My late wife and I had opposing political views: she an old Berkeley liberal Democrat, me an unreformed Eisenhower-era conservative Republican. But we fell in love and married in an earlier time, some forty years ago, when political differences were treated in the same way as differences of religious doctrine and practice: a private, personal matter that did not touch on the essentials of what made a good person. My wife and I shared the same values about honesty, integrity, kindness, education, and fair dealing—and that was what mattered. For the rest, we joked about making sure we each went to the polls on election day so that we could cancel each other’s vote.

Sunday, October 28, 2018

Radicals and Revolution

Joseph Stalin

I am becoming more and more concerned about the direction of politics in this country.

It used to be—back when I was growing up, and for a decade or two afterward—that Democrats and Republicans could be civil to each other. They might differ on policy issues but they agreed about the principles and process of governing. Their disagreements were more about emphasis and degree than about goals and objectives. We debated a larger or smaller role for the public and private sectors, for amounts of taxation and regulation, for our stance in international relations, for the uses of military force versus diplomacy, and for other aspects of life in this country. But we could all agree that the United States was and would remain a country of individual liberties, rights, and responsibilities, functioning in open markets that supported investor capitalism, and with sovereign powers and responsibilities in the larger world, all under a government structure spelled out in the U.S. Constitution. These things were the bedrock of our society.

Yes, there were fringe groups. In the late 1960s, radicals on the New Left demonstrated not just against a war they didn’t like but actually in favor of our Cold War enemies in the Soviet Union and China, as well as for our “hot war” enemy in North Vietnam. They envisioned an end to capitalism and the introduction of a Communist-style command-and-control economy in this country. They wanted to overthrow the U.S. government in a violent revolution. The Students for a Democratic Society and their adherents marched for it, and the Beatles sang about it. But reasonable people in the mainstream parties ignored them—or called them “crazies”—and went about the business of governing.

Some of those New Left radicals grew up and matured but never abandoned their extremist principles. Saul Alinsky became a “community organizer” and wrote his signature work, Rules for Radicals. Bill Ayers, one of the founders of the Weather Underground, became an education theorist but remained in his heart a domestic terrorist. One of Alinsky’s admirers—she actually wrote her senior thesis at Wellesley College on him—was Hillary Rodham Clinton. A Democratic politician who was linked early on to Ayers in Chicago was Barack Obama. But Obama’s and Clinton’s associations can be written off as errors in judgment due to their youth.

Now, and for the past dozen years or more, the internal direction of this country has shifted. The radical viewpoint has grown from a splinter at the extreme left of the Democratic Party to a driving force in much of that party’s current policies and rhetoric. In recent years, many Democrats have expressed sympathy with “Democratic Socialism,” which joins political democracy with government ownership of production and distribution. Many Democrats now favor open borders and the surrender of national sovereignty. They want a free-form and non-originalist interpretation of the U.S. Constitution, including revision of the First Amendment, abolition of the Second Amendment, and elimination of the Electoral College. The largest unionized group in this country is now the public sector, which bargains for increased pay and protections from politicians who are not negotiating with their own money. That’s an arrangement that even so liberal a politician as Franklin Roosevelt thought disastrous and believed should be discouraged.

Since the presidential election of 2016, many Democrats have been in open revolt. Celebrities have called for bombing the White House. And seemingly rational people proclaim the current incumbent to be “Not My President.” Since the confirmation of Brett Kavanaugh to the Supreme Court went against the wishes of the Democrats in the Senate—largely because the Senate had previously invoked the “nuclear option,” first used by Democrats for lower-court nominees in 2013 and later extended by Republicans to Supreme Court nominees, making judicial confirmation a simple majority vote instead of the 60 votes formerly needed to overcome a filibuster—the party has called the legitimacy of the court itself into question.

People on the left now want the country run as a pure democracy rather than the republic of elected officials that the Constitution established. Without the Electoral College, which was put in place to give small states with scant population some parity with large states, the citizens of Los Angeles County would be able to outvote the citizens of the Upper Northwest from Minnesota to Idaho. Presidential elections are always strategic games of chess, trying to take and hold at least 270 electoral votes. The makeup of the Senate, with two Senators for every state regardless of size, protects the interests of small states that are out of sight and out of mind for large-state and big-city politicians. The Electoral College is part of the “checks and balances” put into the Constitution to protect minority interests from the whims and tyrannies of the majority.

What I fear in this new direction is what the left now seems openly to want: revolution. A country’s government is just laws and procedures, words written on pieces of paper or, worse, loosely preserved in bits and bytes somewhere in computer memory. What makes those words work is the belief of citizens and their elected politicians in the principles behind them. We agree to these things being true and necessary. We dismiss them at our peril.

The Constitution has set up a mechanism to change any part of it or add to its effective structure through the amendment process. But that process is hard: an amendment must be proposed by a two-thirds vote in both the House and the Senate (or by a convention called for by two-thirds of the state legislatures) and then ratified by three-fourths of the states. The idea is that any change has to be popular enough, and gain enough bipartisan support, to clear this hurdle. We’ve done it twenty-seven times over the years, addressing issues as important as the voting rights of former slaves, women, and people aged eighteen to twenty-one, and issues as ephemeral as the drinking habits of the average citizen.

The radical notion of changing or abolishing elements of the government, or the Constitution itself, by simple popular vote would invite government by momentary landslide. We have seen something of that kind operating in California, where initiatives are placed on the ballot by petition and voted on by the public at large. Those that pass acquire the force of law. Some initiatives have had good effects, some bad. But all are subject to the same verbal tricks and legal manipulations of any advertising campaign: claiming to support one thing while actually undermining or subverting it. Popular politics is a rough game to play.

The edifice of the U.S. government is fairly robust and can stand to have a few bricks knocked loose from time to time. But a popular onslaught in the name of “revolution”—whether meant picturesquely or in deadly earnest—could lead to a collapse. We’ve seen that happen before: France in 1789, Russia in 1917, Cambodia in 1975. The problem is that immediate and necessary changes in government always start with well-meaning people who have goals, a plan, and a vision for the future. But once the structure collapses, anything goes. The process starts with essentially kind-spirited liberals like Jean-Paul Marat or Alexander Kerensky; it ends with closed-minded tyrants willing to spill vast quantities of blood like Maximilien Robespierre and Joseph Stalin.

My fear is that the people so innocently dreaming of and calling for revolution today are the nation’s Marats and Kerenskys. If they get what they want, these people will go up against a wall within about six months. Waiting in the wings are the Stalins and the Pol Pots. And they will not be so liberal or kind-hearted.

Sunday, October 21, 2018

Mind Games

Subatomic particle

I am just finishing up Adam Becker’s book What Is Real? about the relationship between quantum physics and the real world it is supposed to represent. Becker tells a good story, especially as an introduction to the world of quantum physics, the players over the years, and the intellectual principles involved. His basic premise is that, while the equations that physicists use to predict the outcome of their experiments—and so test the value of those equations as representations of the underlying world of the very small—have consistently proven their worth, the physicists themselves remain in doubt as to whether the world that they are describing actually exists.

Without going into the entire book chapter by chapter, the issue seems to be one of describing a world so small that we cannot detect it without changing it. Atoms and their component protons, neutrons, and electrons—plus all the other subatomic particles in the Standard Model—are not fixed in space like pins on a board. As with everything else, they move, as do galaxies, stars, and planets. However, instead of occupying observable orbits and tracks across the night sky, atoms mostly jostle with random thermal energy, the same agitation that on a larger scale produces what’s called “Brownian motion,” and electrons buzz frantically and randomly around their nuclei like flies in a cathedral.

We can detect the larger celestial bodies—and even masses as small as freight trains and automobiles—with visible light without the danger of moving or deflecting them much. Bounce a few hundred thousand photons off a teacup, and you will not move it one millimeter. But subatomic particles are so small, and the wavelength of visible light so long by comparison, that the light misses the particle entirely, passing over and under it with no impact. Imagine that the wavelength is a long piece of rope that two girls are spinning in a game of Double Dutch. If a human-sized person enters the game and performs unskillfully, the rope has every chance of hitting—that is, detecting—his or her body. But if a flea jumps through the game area, the chances of that long, curved rope ever touching its body become vanishingly small.

To detect subatomic particles, physicists must use other particles, as if in a game of subatomic billiards, or photons with much shorter wavelengths and thus having much higher energies. A high-energy photon impacting a moving electron or proton will change its direction of motion. So the issue in quantum physics is that when you locate the particle you are observing here, it’s now no longer there but going somewhere else. In quantum physics terms, no particle has an exact position until it’s observed, and then it has some other position or direction of movement in response to the observation. Mathematically, the particle’s supposed position can only be defined by probability—actually, a continuous wave function that defines various probable positions—and this wave “collapses” into a single definite position at the place and time of your observation.
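
For readers who like to see the bones of the argument, the standard textbook relations behind this paragraph can be written compactly; they come from the general quantum formalism, not from anything specific in Becker’s book. The energy of a probing photon rises as its wavelength shrinks, the wave function yields only probabilities of position, and the precision of position and momentum trade off against each other:

$$E = \frac{hc}{\lambda}, \qquad P(x)\,dx = |\psi(x)|^{2}\,dx, \qquad \Delta x\,\Delta p \ge \frac{\hbar}{2}$$

In words: a probe sharp enough to locate the particle is also energetic enough to knock it off course, and the best the math can then offer is a distribution of likely positions rather than a single definite one.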

Well and good. This is what we can know—all that we can know for sure—in the world of the very small.

The first issue that Becker’s book takes up is that most of the original proponents of quantum physics, including Niels Bohr and Werner Heisenberg, took this lack of certain knowledge to an extreme. Called the “Copenhagen interpretation,” after Bohr’s institute in Denmark, their view insists that the entire point of quantum physics is the manipulation of the results of observation. The measurements themselves, and the mathematics that makes predictions about future measurements, are the only things that have meaning in the real world. The measurements are not proof that subatomic particles even exist, and the mathematics is not proof that the particles are doing what we think they’re doing. To me, this is like calculating the odds on seeing a particular hand come up in a poker game, or counting the run of cards in a blackjack game, and then insisting that the cards, the games, and the players themselves don’t necessarily exist. It’s just that the math always works.

Other physicists—including Albert Einstein—have been challenging this interpretation for years. Mostly, they pose thought experiments and new mathematical formulations to make their case. But the Copenhagen interpretation persists among quantum physicists.

A second issue in the quantum world is the nature of “entanglement.” Here two particles—two atoms, two electrons, two photons, or other bits of matter that is sometimes energy, or matter that oscillates with wave-like energy, or waves that at the instant of detection appear as singular objects—become joined so that what one of them does, the other will do. This joining and the parallel actions persist through random occurrences—such as passing through a polarizing screen—and appear to be communicated instantly across distances that would violate the light-speed limit for any object or piece of information. Here is the sort of “spooky action at a distance” that Einstein derided as a violation of special relativity.
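
A standard illustration, again from the generic textbook treatment rather than from Becker, is a pair of particles prepared in the so-called singlet state, in which neither particle has a definite spin of its own but the two are perfectly anticorrelated:

$$|\psi\rangle = \frac{1}{\sqrt{2}}\left(|\uparrow\rangle_{A}\,|\downarrow\rangle_{B} - |\downarrow\rangle_{A}\,|\uparrow\rangle_{B}\right)$$

Measure particle A and find spin up, and a measurement of particle B along the same axis will yield spin down, no matter how far apart the pair has drifted. That correlation is exactly what troubled Einstein.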

A third issue in quantum physics is the nature of Schrödinger’s cat. To illustrate the limitations of measurement, Erwin Schrödinger proposed the thought experiment of putting a cat in a sealed box with an apparatus that releases a poison when triggered by the decay of an atomic isotope. Since the atomic decay is unpredictable, the cat in the box might be alive or already dead. It was Schrödinger’s point that until an observer opens the box, the cat exists in two “superposed” states—both alive and dead at the same time, expressed by a wave function of probability—and that the wave function does not collapse and reveal the cat’s final nature until the box is opened. As a thought experiment, this is a metaphor for measurement and observation. But some physicists insist that the superposition is real. The actual cat is physically both alive and dead until discovered.
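
In the notation of the formalism, the unopened box is written as an equal superposition of the two outcomes, which “collapses” to one term or the other only when the box is opened; this is the standard way the thought experiment is expressed, not a detail taken from Becker’s book:

$$|\text{cat}\rangle = \frac{1}{\sqrt{2}}\left(|\text{alive}\rangle + |\text{dead}\rangle\right)$$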

This superposition has led some physicists to describe a splitting of the universe at the point of the box’s opening: one universe proceeds with a physicist holding a live cat; the other with a physicist mourning a dead cat. This is the “many worlds” interpretation. Both universes are equally valid, and both continue forward in time until the next quantum change that forces each universe to split again in some other way.1

Now, I freely confess that I do not have the mathematical skills to understand the equations of quantum physics. And mercifully, Adam Becker’s book does not focus on or discuss the math in detail, just the thought experiments and their supposed meaning. I also confess that I do not understand what condition enables two particles or two waves to become “entangled,” or how they interact at a distance in this state, or what might be required to untangle them. Becker does not explain any of this, either. Further, I confess that I can sometimes be simpleminded, rather literal and obvious about what I see, hear, and know, and oblivious to distinctions and nuances that other people perceive easily.

But, that said, it would seem to me that what we have here is a misinterpretation of a metaphor. The limitations of observation and measurement, as expressed in colliding particles and probabilistically dead cats, are simply reminders that we do not have direct perception of the quantum world in the same way that we can see, hear, touch, and taste, if necessary, a steam locomotive or a billiard ball. That’s a good thing to keep in mind: we don’t have all knowledge about all things. However, to insist that this metaphorical reminder means that quantum physicists are simply doing math, and that their calculations—no matter how enticingly predictive—have no meaning in the real world, that quantum physics is just a mind game … that’s taking things too literally.

I have criticized the use of mathematics to prove the improbable before.2 And I insist again that, if all you’ve got is a series of equations to prove your point, you may just be playing mind games with yourself and your fellow physicists. But the reverse is also true: the real world must exist at the quantum level. If the math works out, if the vision behind it holds together, then it must be describing something that has actual substance and energy. The details may not be exactly as we understand them. The description may be missing some elements, forces, or bits of math that we haven’t worked out yet. But the world must exist in the smallness of subatomic particles as much as it does in the vastness of stars and galaxies.

The math doesn’t exist in a quiet vacuum. The cards, the game, and the players must also exist to give the calculations meaning.

1. I have cheerfully used the many-worlds interpretation in my novel The Children of Possibility, about time travelers from the far future, and in its prequel The House at the Crossroads. But I know I’m having fun and don’t take this stuff too seriously. So much fun, in fact, that I’m now working on the sequel that picks up where Children left off.

2. See Fun with Numbers (I) and (II) from September 19 and 26, 2010.

Sunday, October 14, 2018

Courage in Authority

King Lear’s Fool

We have a young man on the board of directors of our condominium homeowners association who is consistently negative. He routinely predicts disaster in every situation. If someone proposes a solution, he calls for more consultants, more bids, more analysis, more legal review. He always criticizes proposals and decisions by other board members for their lack of “doing their homework” and “due diligence,” or their failure of “fiduciary responsibility.” If he offers a solution of his own, it is numbingly complex—if not self-contradictory—and hedged with so many technical and legal caveats that it becomes simply unworkable.

He has been responsible at times for bringing the entire organization into a state of paralysis. And if other board members vote for a motion that seeks to override his objections, he always votes against it or abstains, in order to preserve his right to later criticize the decision. Yet he never considers—or offers to take responsibility for—the negative consequences of action postponed or prevented by his criticisms and time and money spent on considering his objections.

If this young man, his attitude, and his effect on the organization were unique to our homeowners association, this might make a good story but would hardly rise above a curious local anecdote. The truth is, we see this kind of negativity too often in our current politics, at both the local and national level, and too often in the corporate sphere and elsewhere. Problems are insurmountable. Solutions are insufficient, infeasible, unprincipled, illegal, or unconstitutional. Nothing can be done, but at the same time the situation cannot be allowed to continue.

The position of the naysayer, the delayer, and the critic is an easy one to assume. It involves no great courage to demand that the organization take more time to consider, seek another opinion, gather more data, investigate all possibilities.1 The organization usually places no blame if we don’t perform an action, approve a decision, praise or support a member, or confirm a vendor. For if the action or decision is not made, or the person or situation is left in a state of uncertainty, there is no discernible result that might later be examined and criticized. It’s a no-lose position for any member of a group to take.

What requires courage is to take action, make a decision, or give your approval and blessing to another person or group. Of course, the action might fail, the decision lead to disaster, and the person in question turn out to be a liar, a thief, or a scoundrel. Those possibilities always exist. The best that anyone can do is make a judgment based on available data, personal experience, imaginative projection, good founding principles, and common sense. After that, the outcome is in the realm of probability or—in an older view—the lap of the gods.

Any position of authority requires such courage. Even when an organization has a second in command, a board of directors or council of advisors, a legal and technical staff, and an on-site actuary, most decisions come down to one person willing to act—or to formulate and spread a vision upon which others can take action. Any deliberative body, like a senate, assembly, parliament, or condo board, will, on any one issue, look to the person who will take the lead to find or imagine a solution, provide arguments for it, defend it against its critics and naysayers, and call for action or a vote.

That person must inspire confidence among those who will vote for the solution or be required to act on its implementation. They must believe he or she is a person of integrity, sound judgment, and experience. Moreover, they must believe he or she is acting in the organization’s best interest and not for personal advantage.

But still, the person in authority is taking a risk. If the action or solution fails, the proposer or promoter will be labeled a failure along with it. Even if the proposal had a unanimous vote behind it, the leader who complains, “But we all agreed …” is taking a weak position. The rest of the organization will simply respond, “Yes, but we agreed with you!”

This is why we ask of people in authority that they possess and demonstrate courage along with their other qualities of experience, judgment, integrity, and sobriety. The CEO of a corporation, the captain of a vessel, the pilot of an airplane are all required to take responsibility for their actions. They must make judgments, recommend and follow courses of action—sometimes in an instant and without recourse to advice, consultation, and second opinions—and trust that the people around them—subordinates, employees, crew, vendors, suppliers—will perform appropriately. And if the performance of the people undertaking the action, or the mechanism of the ship or plane itself, were to fail, then the CEO, captain, or pilot stands ready to take the blame. If the person in authority did not have this courage, then the company would never do anything, the ship never leave the dock, and the plane never leave the ground.

It’s a simple lesson: Action takes courage. Delay is not always wise or safe. And the path forward leads upward and requires strength.

1. For the role of the leader in making a decision, see the story of “five heartbeats” in The Immediacy of Life from April 29, 2018.

Sunday, October 7, 2018

The New Conservatism

A. Gerasimov, Lenin on a Tribune

I believe there’s a common feeling among those who follow politics and economics, based mostly on the labels assigned, that “conservatives” want things to stay the way they are, while “progressives” want things to move forward.

Conservatives are supposed to yearn for the political, economic, and social conditions of their youth. In my case that would be rock-n-roll, ducktail haircuts, the postwar boom, Eisenhower political blandness, and stable nuclear families living in suburban housing with good schools. There were some downsides to be sure: duck-and-cover drills, Jim Crow segregation, Formica in loud colors, and Melmac dinnerware. But all in all, for the white middle-class majority, it was a good time to be alive in America. We didn’t see the social and economic problems or, if we did, we minimized them.

Progressives are supposed to look ahead to better times, which means focusing on the things that need to change right now. For most progressives these days that would be income inequality, industrial and automotive pollution, environmental damage and anthropogenic climate change, racial inequality, binary gender inequality, capitalist winners and losers, housing shortages, healthcare governed by insurance companies, and cultural hostility for “the other” leading to rampant hate speech. Sure, there are some good things: advances in renewable energy, administrative regulations on industry and finance, progressive income taxes, union protections, feminism, and the #metoo movement. But these things are not enough—may never be enough—when what is needed is a true social, cultural, and economic revolution to make people equal in both their expectations and outcomes, happier with their lives, and kinder to each other.

But are these labels correct?

I believe many conservatives have a forward-looking approach in many areas, including politics and technology. They believe the social and economic climate is improving all the time, compared to the situation fifty, a hundred, or two hundred years ago. They believe in continued evolution in this regard, but not abrupt revolution. Much of their expectation is based on humankind’s increasing knowledge and technological capability, derived from the application of scientific and humanitarian principles originating in the Enlightenment of the 17th and 18th centuries.

In contrast, many progressives seem to be in the position of tacit conservatives. They don’t trust evolutionary change in social, political, or technological conditions, largely because such change is not predictable or guided by the principles to which they subscribe. In other cases, they actually want to preserve a static world which is safe and predictable until they choose to change it through a directed revolution.

Let me suggest three areas in which this is so.

First, union protections. The history of unionism has been one of fighting changes in technology and working conditions that might affect the number and skill levels of jobs, or require workers with seniority in a craft to learn new skills or enter new positions. The classic example of this tendency was “featherbedding” in the railroads during the 1930s and ’40s, preserving the jobs of firemen who stoked the boilers on steam engines when the railroad companies converted to diesel-electric locomotives. An earlier example was hand weavers who tried to destroy and ban mechanical textile mills because the machines put them out of work. Unions consistently choose older ways of working over new efficiencies if it means that certain jobs and skills will become outmoded. This is a bid for stasis over advancement and is, at least in spirit, non-progressive. What they will make of artificial intelligence and increasing automation in the workplace is totally predictable.

Second, capitalism itself. The basis of market-driven economics and capital investment is “creative destruction.” Every product and service, every company that provides products and services, competes in the marketplace for consumer attention and dollars. Consumer favoritism and brand loyalty only go so far—and not far at all if a product line or service deteriorates in terms of quality, usefulness, price, or some other dimension that customers value. Sometimes, however, frivolous products or variations are introduced and sold; the classic example is Bernie Sanders’s complaint about “twenty-three kinds of deodorant.”1 But by and large, new and useful products are coming all the time: consider the personal computer and the internet revolution.

Capitalism in a free market means giving people what they want, even if it means giving them what they only think they want—or what you can convince them to want, or deceive them into wanting. Capitalism is not predictable and directed, but decidedly uncontrolled. Sixty years ago, when I was a child, everyone confidently predicted that my car would fly by the time I was middle-aged. But no one, looking at the basement full of vacuum tubes or single transistors that was the current state of the art in computing, predicted the development of the integrated circuit, the microchip, and telephones that would eventually replace cameras, stereo systems, movies and television, telegrams, libraries, and retail stores. Creative destruction is a wild and woolly territory—just ask a taxi driver whose radio-dispatched cab is being replaced by a cellularly summoned Uber or Lyft driver.

We’ve seen enough of the command-and-control economies that were spawned from social and economic revolutions in the 20th century to know how they operate. They were all focused on preserving the status quo in terms of products, processes, and services. None of them developed the advances in computing, personal communications, or consumer goods—let alone medical technology and energy infrastructure, to name a few more areas—that we have steadily enjoyed in the capitalist West.2

Third, the environment. Is the climate changing? Oh yes! It was changing before modern industrialization and transportation fueled by coal, oil, and gas began increasing the atmosphere’s carbon dioxide load. We live on a planet with a precession in its orbit, under a variable star, with an active geology based on plate tectonics. We have gone through periodic ice ages, glaciations, warming and cooling periods, and occasional long winters due to volcanic eruptions ever since humans started recording their history—even before, if you count all the cultures with a flood story in their mythology.

Sea level rises and falls, deserts grow and shrink, forests advance and retreat, rivers change their course, all without the influence of human activity. Life has evolved on this planet to adapt to these changes. Every extant individual and species was shaped to take advantage of a particular environmental niche—except humans, of course, who use their big brains and clever hands to build shelters and machines that let us exploit areas where we otherwise could not live. Since those environmental niches—particularly the ones with marginal populations—are changing all the time, some species must either adapt, move, or die out. It matters not how picturesque or precious a species might be: if it lives too close to extinction in terms of diet or tolerance for environmental stress, it will eventually disappear. In the long run, no one can save the panda.

And yet the current crop of environmentalists would try to prevent this change wherever possible. They want a static world in which every river, swamp, and forest remains unchanged, where every butterfly and exotic plant can be preserved. They want to fix the world’s climate at some preferred set point—usually around the time and temperature of their childhood—and maintain it … forever.

Even the politics of the progressives is frozen in place and time. Their view of “the arc of history” is guided by a 19th-century view of social and economic order as prescribed by Marx and Lenin and then communicated by the anti-war radicals and anti-capitalist activists of the 1960s. It is a world view that values world peace at the expense of national sovereignty and the primacy of human-muscle labor at the expense of technological advancement. If they were alive today, Marx would not be a Marxist, and Lenin would be busily adapting and promoting some other social and economic creed.

I believe we are at a time of great confusion over labels and intentions. I also think we are at a time that demands a new teaching, a new world view, a new politics and economics that is neither “conservative” nor “progressive” but adopts a new social and philosophical stance entirely.

I just wish I knew what it was.

1. I’m sure all the ladies out there wouldn’t mind using my brand of deodorant, which has the image of a sailing ship on the package. Or that Bernie wouldn’t mind using the Secret brand—“Strong enough for a man, but made for a woman.” One of the comments about life in Russia in the 20th century was the prevalence of “Soviet scent,” as if one smell would fit all bodies.

2. To be fair, none of them made flying cars, either.