Sunday, December 9, 2018

That Voice in Your Head

Man holding a mask

Recently our group at NAMI East Bay heard a panel of consumers1 discuss their experiences with hearing voices. Two of them were from the Bay Area Hearing Voices Network, an organization that helps people with the “lived experience” of “hearing voices, seeing or sensing things that others don’t, or have extreme or unusual experiences and beliefs” come to understand and explore the meaning of these phenomena.

One of the panel members, who had taken prescribed medications to subdue the voices, found the side effects so distressing and the results so problematic that he preferred instead to live with the voices. A second panel member actively interrogated the voices and tried to understand them, asking whether they were ghosts, or pure energy from another dimension, or extraterrestrials. He would reason and debate with them when they told him to harm himself. A third felt that the voices were an inspiration, and he knew they came from outside his head because he could not feel the resonance in his skull when he himself spoke aloud. This man came to trust their answers when he needed to take an examination or make a business report.

The original Hearing Voices Network in the UK was founded in part by Ron Coleman, a consumer himself who is now in recovery and works to provide “recovery centered treatment” to other consumers. The principle seems to be that these experiences are not the symptoms of an illness but real events—as indicated by two of the panel members above—that the person feels he or she should explore in a positive way and that others—loved ones and family members—should be curious and supportive about rather than judgmental. The basic good that I can see in all this is that, if the phenomenon of hearing voices cannot be treated and eliminated with either medication or psychotherapy, at least it should not become a source of fear and anxiety for the patient. Support from and discussion with others who share the experience perhaps can approach this good result.

But I still don’t believe the voices are real—or anything more than a neurological or perceptual fault in the auditory processing centers of the brain.2

One of the panel members said that hearing voices is a common experience. He is right—in the sense that humans are a verbal species and routinely put our thoughts, however silently inside our heads, into words. We may not convert all of our sense impressions and internal thoughts into words, but we certainly try. For example, if I smell something familiar, I will usually try to identify it with a word: “This is ‘coffee.’ ” Or, “That’s ‘a rose.’ ” If I see an unusual shape, I will try to match it with a familiar shape and give it a name.

Many of the thoughts that pop into our minds are verbally arranged. For example, if I am doing something and sense it’s wrong, the thought may insert itself as a sentence: “This is a bad idea.” In the old Transactional Analysis, which was popular back in the late 1960s, the Freudian personality functions of superego, ego, and id were explained as the internalized voices of your Parent, your own Adult self, and your earlier Child self. The Adult makes rational decisions based on current needs, reason, and experience. The Parent issues decrees and warnings based on remembered authority. And the Child expresses needs and wants based on remembered emotional states. … Or something like that. The point is, these are learned and internalized reflexes that the person remembers from growing up as an immature version of self under the regime of a parent who is more mature and either guiding or punishing. Usually, these reflexes present themselves as verbal statements. When my mind generates the thought “That’s a bad idea,” it is usually in my mother’s voice.

I should note also that when I am writing, as now, the words are coming into my head as if I were speaking them aloud.3 And when I write fiction and generate dialogue, I imagine the two or three characters speaking and supplying their own favorite expressions, diction, preferred sentence structure, and even accents as they speak.

So this “common experience” of hearing voices can be pictured as a spectrum, and this matter of thinking in terms of words would be the “normal” end. It is normal in that most people are not alarmed by it, do not find it troubling, and accept it as the way their brain works. I should note that this end can have its alarming aspects. My mentor at the university, Professor Philip Klass, once told of a time he was driving faster than usual on an elevated freeway. He heard a voice in his ear say distinctly, “Slow down!” The voice was so real that he reacted instantly—and around the next turn was a wreck that, if he had not slowed, he would have plowed into. Was that voice the manifestation of a guardian angel? Or just his own mind cautioning him about driving too fast? Either way, he questioned whether the instance of hearing the voice, followed by the crash up ahead, could be mere coincidence.

Moving toward the less-normal parts of the spectrum, we have Professor Klass's one-time warning voice, as well as the times when we hear a change of air pressure at a partially cracked window and think it's a human moan or sigh, or the babble of the crowd in a busy restaurant suddenly resolves into an almost-familiar voice speaking our own name. Or—and I speak from experience here—sometimes a recent widower will hear a noise and imagine it's the whisper of his dead wife. It may be imagination, but it sounds awfully real and there is a momentary pang of recognition and regret. The point is, in this part of the spectrum we are not at all sure, however briefly, whether the voice was inside our heads or not.

Toward the middle of this spectrum are the voice hearers, like the members of the panel, who hear voices that they know or believe are not their own and not coming from inside their heads. They can have it explained to them that their brains are malfunctioning and they are listening and responding to tricks of their own imaginations, but they will not believe it. The voices are too real. One of the panel members insisted that the different voices each had their own way of speaking and accents, and that was proof to him that the voices came from outside. Also, the voices often suggest something that the person would not normally do, such as inviting him or her to commit suicide. Whether the person is following the voice or resisting, he or she acts as if dealing with an alien entity.

And finally—or so I believe—the manifestation of a supposedly external voice with its own character, diction, and other qualities might become so embedded in the mind of the hearer that it develops an entirely separate personality or, in psychologists’ terms, a “dissociative identity.”4 While the causes of a brain or mind creating more than one personality or dissociating itself from the one it was born with are debatable, the condition often occurs in someone who experienced extreme trauma as a child. So, presumably, the second and other personalities develop in order to envelop and protect the tender ego. The fact that the core personality is usually not aware of these other personalities, their actions, and their intentions is where the dissociation comes into play. It appears that an alternate personality takes control of the brain and body at various times.

Normally, I would not think of putting dissociative personalities on the spectrum of hearing voices, except one of the panel members mentioned his own childhood trauma and being treated for post-traumatic stress disorder, or PTSD. So it is possible that stress and trauma play a part in the voice hearing as well as the dissociation of a personality.

On top of all this, we must remember that human beings, with our huge and vastly interconnected brains, are the “dream animal.” We live not just in the moment and inside our surroundings, as my dog does, but also in our imaginations, in our speculations, in the what-ifs and if-thens of our subjunctive language, and in the twilight realm of our dreams, where the wildest fantasies seem real and even plausible for a short period of time. Is it any surprise that this delicately balanced and incredibly complex mechanism occasionally slips a few gears and feeds us false information?

That’s just the nature of human existence.

1. “Consumer” is the new, more polite term to refer to people with a diagnosis of severe mental illness and is preferred by people in this situation to the earlier term “patient,” which implies that they have an illness. These people are consumers of mental health services.

2. One of the panel members at the meeting, who tried for a scientific understanding of the phenomenon, stated that functional MRI scans of people when they were experiencing voices showed activity in these processing centers.

3. The generation of this word stream is complex. Some of it comes from my front-of-brain thinking and deciding: here is how the article, argument, or story must go. And some of it comes from my subconscious and its intuitive sense of what the story or article might become. For more on the role of the subconscious in my writing, see Working With the Subconscious from September 30, 2012.

4. The old diagnosis of Multiple Personality Disorder has now become Dissociative Identity Disorder. Po-tay-to, po-tah-to.

Sunday, December 2, 2018

A Classic Liberal

Balloon rising

I’ve been examining my own political stance these days—especially since anyone who believes in personal freedom, personal responsibility, and free-market capitalism while being opposed to big government, statist solutions, and socialism is now considered by some to be a “racist, misogynist, homophobic Nazi.”1 I have decided that what I am, other than a stick-in-the-mud, Eisenhower-era Republican, is a classic liberal.

What are the principles of this kind of liberal, as opposed to the more modern kind?

First, I believe in your personal freedom as much as mine. Your rights to free expression, physical movement, occupation of space, and use of time are yours to exercise and govern, as mine are my own. The province of your right to these actions extends up to about an inch from the tip of my nose, or whatever else defines personal space in our culture. If you violate my space and my time, there will be consequences—and I’m prepared to initiate them. But other than that definition of pre-existing physical and temporal limits, I am not going to prejudge you or prescribe the limits to be placed on your speech, actions, and intentions. Go your way and don’t interfere with me, and we can be trading partners, potential allies, and perhaps even friends.

Second, I grant you provisional respect and allow for your personal dignity. In my heart, I want the world to be populated by—in that old phrase—“men (and women) of good will.” I want to live in a society where people can be—and do become—productive and self-sufficient in their lives, caring about their own and their families’ and their friends’ futures, and confident and comfortable in their own skins and with their current situations. This is not always possible—sometimes through personal frailty and failure, sometimes through societal lapses—but I want people to have this chance at personal happiness and dignity. And so, if I want the world to be like this, I must grant in my own mind that such people exist and that you may be one of them. I must refrain from prejudging you as a person of gnawing envy, grasping ambition, bad habits, faulty decision making, and other personal failings that can lead to chronic unhappiness. I leave it up to you to prove me wrong in this. Please don’t disappoint me.

Third, I grant your personal agency and responsibility for your current state of being. Unless I can see and detect some congenital or acquired disability in you, such as blindness, deafness, missing and frozen limbs, or—after five minutes of casual conversation—some deficit of wits, emotional stability, or active and inquiring mentation, I will presume that you are a fully functioning human being with two legs to stand on, two hands to shape the world around you, and a capable brain to guide them both. As with your right to freedom, I believe in your ability to live as you want and operate in the world. I would hope you will grant me the same and not wish to place barriers to my developing and exercising my full human potential.

You will note that these attitudes apply personally rather than to any group. I prefer to deal with people as individuals, unique beings, and not as indiscriminate members of a race, class, gender, or other aggregate. Economists and Marxists may prefer to deal with large groups—economists by their profession, and Marxists by their obsession—but I would rather follow the rule of Sergeant Buster Kilrain: “Any man who judges by the group is a pea-wit.”2 Marxists will find this attitude hopelessly bourgeois, and so be it. I was born to the middle class and raised to be private, diligent, industrious, and resilient—not a bad way to operate and view the world, in my opinion.3

Being a rabid individualist, jealous of my rights and expectations, dealing with other people as individuals, and granting them the freedom to do and become what they want, I tend to despise one-size-fits-all prescriptions and social engineering. I understand that proponents of big government and statist solutions must, as a matter of logic and fairness, strive to treat everyone equally. And socialism by design must treat all citizens as economic cogs in the great machinery of their proposed social organization—except perhaps for those enlightened experts who are doing the designing and taking control of the command-and-control economy. While I grant that some effort must be made at social cohesion if a village, a state, or a nation is going to function, I want to see the choice to join and function—and of who will be doing the deciding—made individually and democratically. Treating people as mere numbers or as “meat robots” devalues their thinking capability and their human potential.

While I believe we should all work together as a society and function in an open economic system, I take the position that I am not responsible for your health, wealth, happiness, or well-being. That is your responsibility and not mine. If you approach me as an individual and ask for help—whether you have your hand out with a cardboard sign at a street corner, or you are wandering dazed and confused after a disaster, or you are a friend or family member in need of support—it is my choice and not my responsibility to respond positively. I am the sovereign of my time, my effort, and my purse, as you are over yours. How I choose to spend them is a matter between me and my conscience or my god—if I have either one.

Note also that these are my personal and individual guidelines, attitudes, and approaches. They are not rules prescribed for me by someone else. They are subject to revision and revocation, and I can change my mind as I see fit.4 I can be flexible without worrying about my own inconsistency, based on my previous experience with similar situations and my new experience with each person. You can’t shame me by pointing a finger and exclaiming, “Aha!”

I am not a big proponent or follower of rules and regulations, policy statements, and firm positions. I deal best with people who have and practice a personal religion but who don’t make an issue of it or expect me to believe in and follow its rules myself. I am humble enough to know that I might be wrong, and not ashamed of admitting a mistake and moving along. I trust others to have the grace to do likewise. After all, life is a still-unfolding mystery. The universe is huge beyond our wildest imagining. No one has all the answers. And we are all just finding our way.

1. And the people who believe that really ought to examine their historical referents.

2. Kilrain was a fictional character in the book The Killer Angels, later made into the movie Gettysburg.

3. As opposed, I imagine, to being a member of the aristocracy, expecting undue deference, and looking down on everybody else. Or a member of the proletariat, or underclass, or whatever the opposite of an aristo is, and looking upward with hatred and envy at anyone better situated, more industrious, or better educated. The middle is not a bad place to be.

4. I live by the dictum that no rule is universal; there are always exceptions; and no one rule can be tailored to fit all situations. Our minds were given flexibility for a reason.

Sunday, November 18, 2018

National Novel Writing Month

Midnight writer

I’ve heard of this for several years now and even had friends participate. It’s a real thing, with an organization and a website, National Novel Writing Month, or NaNoWriMo. Supposedly, once you sign up and submit a profile, you can earn “badges” for various stages of accomplishment and then “win” by submitting the text of your novel of at least 50,000 words—which is more like a novella these days—and having it validated by the site. What is unclear to me—although I have not delved into the entire site to read the fine print—is what you actually win, and then what happens to the text of your novel. Does NaNoWriMo get anything out of the process, other than the glow of helping inexperienced writers with encouragement, motivation, and a deadline? Does the organization, for example, obtain the rights to the submitted work? A professional writer would be concerned about such things.1

I have never participated in NaNoWriMo, because for me every month is novel writing month. At any time of the year, I am either drafting scenes and chapters on the current book; editing, coding, and laying out the book I’ve just completed; or plotting, outlining, and generally “noodling” the next book in my lineup. This little shop remains open seven days a week, and critical plot points and bits of dialogue may occur to me even on holidays and while traveling on vacation. Novel writing is a lifetime process.

Does writing novels pay well? Not really, if at all. Some people make it big in traditional publishing. And one hears of independents who are making the rent with their self-published books. But for me, with seventeen completed novels for sale and an eighteenth in production, the proceeds amount to coffee money each month, never more than a lunch out. To really establish yourself as a writer with a national reputation, you need to produce a novel that hits the bestseller lists, either at the New York Times or in some national sales forum. Bestsellers are not a thing you can plan, because luck is a major factor: creating just the right story to fit the national zeitgeist at just the right time.2

But that doesn’t mean you shouldn’t write a novel. Writing is a dip into the human creative process. If you have a talent for some art form, then that is the thing you must do. For me, it’s creating stories about people who never lived doing things that never happened. For others, it may be making music, putting paint on canvas, sculpting clay, or cooking gourmet meals. Not everyone—not even a fraction—among the people who love to cook gets to open a restaurant. Not everyone who paints or sculpts gets a gallery exhibition. Not every musician gets to join a band or play in an orchestra, and not every composer gets to hear her song or his symphony performed. But that does not mean you don’t try. And if your work does not make it to the public, that does not mean you don’t continue. Because the work itself is good for your soul. It’s what makes us human and different from the other animals and the robots.

Writing a novel will change you.

Unlike writing a piece of nonfiction, where the facts speak for themselves, or a short story, which you can complete in one white-hot burst in an afternoon, a novel takes gulps of time over many writing sessions. You must develop and maintain a voice that you can use, a character viewpoint—or more than one—that you can occupy, a feel for the time and place that you have created, and a mood that extends over many sessions, sometimes for months on end. I have likened the process to renting out half my brain to a troupe of traveling actors for a year at a time. They—or really, the whims and products of my subconscious3—will try out bits of dialogue or stage business in the middle of the night. They will suggest changes to the plot while I am thinking about something else entirely. And on occasion, when I sit down at the keyboard and try to start a scene from the outline, they will refuse my direction—they just won't let me. These actors and their internal director are a busy bunch, but they are also necessary to my writing process.

Until you have written a novel and submerged your active mind in the creation of another person in another world, you don’t know who you really are. You haven’t come face to face with the contents of your soul—which, for most of us, can stay safely buried in dreams that we forget, in daydreams we can ignore, and in random thoughts we can dismiss as bits of trivia or the whispers of the devil. A novelist has to wade into this mess and wring from the soul something that has real existence outside of his or her brain as words on paper. In the same way, a painter has to wring out a vision with a specific purpose and detail, a sculptor has to find a shape with meaning, and a composer has to find a melody with mood and coherence.

My own soul, I have found, is relatively stoic and restrained. My characters are not highly emotional people. Yes, they have their loves, ambitions, and desires, but those are usually deeply buried, giving direction to their lives but not to flights of words in their dialogue. They are too busy trying to figure out how the world around them works, and what they have to do to survive or thrive in the current situation; so they can’t spend time wishing and dreaming how the world might be different. They excel at mechanical contrivances and traps, and this reflects my own upbringing.4 Their reaction to adversity—and mine, too, on occasion—is one of amused cynicism, followed by a determination to work things out and not get crushed.5

If writing is your thing, then I would advise you to try writing a novel. It may not produce a bestseller, but that’s not the point. It will introduce you to yourself.

1. Oh, deeply concerned! Every writer who hopes to make it big—well, hoped, once upon a time—in publishing should be suspicious about the rights to the work. Once you sell or give them away, the text is no longer yours to use and publish, or even to have much say in how it will be used, altered, where and when published, and otherwise passed into slavery. If your novel is like your child—and the whole process for me takes about nine months—then you become protective of its dignity and its future.

2. And a bestseller is not a thing you can emulate or achieve by riding on another’s coattails. When I was cut loose from the traditional publishing world—and that’s a story for another time—and casting about for an agent, I heard many variations of “if you could only write a story about a boy wizard with glasses, I could really sell that.” This was a reference to the phenomenal success of the Harry Potter series. Later it would have been “a story about a shy girl who meets a sadistic billionaire,” or whatever is popular right now. The trouble with chasing someone else’s bestseller with your own book is timing. If that other book is booming right now enough for you to know it, you will still need some weeks—if you are very fast—or months to write your own copycat version, more months to parade it before a string of potential agents, more months after that for the agent who accepts you to secure a sale, and then about a year for the publisher to edit, typeset, print, and promote your book with the national bookstore buyers. And in that year-plus-plus, the traveling circus of reader interest will have moved along and something else will be popular. The brass ring is for that other novelist, not for you.

3. See Working With the Subconscious from September 30, 2012.

4. See Son of a Mechanical Engineer from March 31, 2013.

5. In this way, my characters are a lot like those of traditional science fiction stories, as is my own reaction to things. Well … you are what you read.

Sunday, November 11, 2018

What Is Good?

Vitruvian blood vessels

I have never concealed the fact that I am an atheist—although I sometimes sail under the flag of agnosticism, the state of “not knowing,” in order to avoid bar fights. I do not accuse or belittle people who have had an actual religious experience, heard the voice of God or the rustle of angel wings, and are guided by the principles of their religion. Peace be unto them. But I never had the experience, and I have neither the genetic makeup nor the cerebral or psychological components necessary to perceive that unseen whisper. But at the same time, I am not in G. K. Chesterton’s famous line, “capable of believing in anything.” I have my own principles, after all.

One of those principles is evolution. I have worked at both a manufacturer of biological pharmaceuticals and a developer of genetic analysis equipment. I know enough biology and have read enough about genetics and cladistics to appreciate that all life on Earth is related. The octopus is not an extraterrestrial alien dropped into this planet’s oceans—as some sources have recently claimed—but is cousin to the squid and the cuttlefish in the class Cephalopoda, just as human beings are cousin to the mouse and the lion in the class Mammalia.

Evolution is not just the “survival of the fittest,” as the popular saying goes. The evolution of a biological organism takes tiny changes in the genetic code, potentially effecting tiny changes in form and function, and then either expresses them immediately, eliminating the bearer if the change is harmful or fatal, or holds them quietly in the genome as a recessive or alternate copy of the gene until the features they engender can come into play. The DNA/RNA/protein coding system has many built-in safeguards that make most random changes in the code neither immediately fatal nor immediately helpful. For example, in each codon of the reading frame, a sequence of three base pairs calls for one of the twenty amino acids used in assembling a protein molecule, and most amino acids are called for by several alternate codons; so a change in just one of the “letters” will usually still create the intended protein. The system is robust, so that we can still have viable offspring and recognize them as human, and yet just fragile enough that changes are possible over generations.
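This redundancy in the codon table can be demonstrated with a short sketch. The dictionary below is a small, real subset of the standard genetic code (the four glycine codons plus the aspartate and glutamate pairs); the helper function counts how many single-base mutations of a codon are “silent,” producing the same amino acid.

```python
# A small subset of the standard genetic code (RNA codons -> amino acids).
GENETIC_CODE = {
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",  # four-fold degenerate
    "GAU": "Asp", "GAC": "Asp",                              # two-fold degenerate
    "GAA": "Glu", "GAG": "Glu",                              # two-fold degenerate
}

def silent_mutations(codon):
    """Count single-base changes to `codon` that still yield the same amino acid."""
    original = GENETIC_CODE[codon]
    count = 0
    for pos in range(3):
        for base in "ACGU":
            if base == codon[pos]:
                continue  # not a mutation
            mutant = codon[:pos] + base + codon[pos + 1:]
            if GENETIC_CODE.get(mutant) == original:
                count += 1
    return count

# Any change to the third letter of a glycine codon is silent:
print(silent_mutations("GGU"))  # -> 3 of the 9 possible point mutations
```

For a four-fold degenerate codon like GGU, every third-position substitution leaves the protein unchanged, which is exactly the buffer against random mutation described above.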

And those changes and their effects are not necessarily crude, achieving just basic survival or writing off the individual organism with a lethal deletion. The cheetah was not born to limp over the veldt in pursuit of its ambulating prey. Time and the millions of minute alterations to the genetic code governing the cheetah’s musculature, metabolism, and nervous system allow it to lope gracefully and efficiently, outrunning the swiftest antelopes and wildebeests, which are themselves adapted to run just fast enough—most of the time—to elude their predators. Evolution is not just a mechanism of survival but a mechanism of optimization, efficiency, and ultimately of temporary perfection.

I have called DNA the “god molecule,”1 but that is not because I worship it or think it has supernatural powers. The DNA/RNA/protein system is simply the instrument of evolution. It has created not only all the varied life we see on this planet but also, because of the impact that life has had on shaping the atmosphere, seeding the oceans with abundant life, and covering the hills with vegetation and grazing animals that change their erosion patterns, it has changed the surface of our world itself. The original Earth, before the first bacteria and blue-green algae evolved to give it an oxygen-rich atmosphere, was as hostile to our kind of life as the surfaces of Venus or Mars are today.

But the principle of evolution applies to more than just organic structure and function. Most of the structure and function of human society and the approaches in any human endeavor, from technology to the arts, have advanced by a form of social evolution: small—but sometimes large—changes introduced into a complex situation, there to be either discarded, adopted, or further adapted. In rare cases, like the mathematical thinking of a Newton or an Einstein, a single person will make a significant change in human society and history. But for the most part, what one person starts another will then adapt and improve on, so that the seminal invention is lost in a continuous flow of minor and incremental developments. The inventions of the stirrup and the wheeled plow, with their migration during the Middle Ages from Asia into Northern Europe, are two such examples.

In the same way, the structures of many human social concepts like love, justice, honesty, reciprocity, personal freedom, and other exchanges that we consider “good” and weave into the stories we tell are the products of social evolution. Human families, clans, tribes, city-states, and nations learned over time by piling one experience and its consequences on another that certain strategies of exchange either worked or did not. For example, they settled early on the basic understanding that habitual lying is harmful both to the people who must deal with the liar and ultimately to the liar himself. That fair dealing and reciprocal trade are a better system of exchange than theft and plunder. That hereditary servitude is not proper treatment for any thinking human being, and a society that practices slavery may flourish for a time but will eventually collapse. That love is a stronger bond and lasts longer than hate. And on and on. We learned these “home truths” at our mother’s knee and passed them down through the cultural wisdom of our clan and tribe long before some prophet wrote them on tablets of stone or bronze and suggested they were the teachings of the gods.

This does not mean that dishonesty, plunder, slavery, hatred, and other injustices don’t exist in the world. Or that sometimes these strategies of exchange will not work just fine in some situations—especially if there is no one stronger around to keep you from getting away with them. Ask the Romans, or the Mongols, the Nazis, the Soviets, and any of history’s other bent and crooked societies that have made a bad name for themselves. But thinking human beings, left on their own to study and consider the situation, will conclude that these negative strategies do not work for the long haul or for the greatest good of the greatest number of people.

Not only human society as a social construct but also the human nervous system as a response mechanism has evolved in tune with these beneficial strategies. Try taking from a toddler the treat that its mother has given and see if that tiny human brain does not immediately register and react to the unfairness of your action. Hear children on the playground taunting each other—perhaps even with names and descriptions having a superficial gloss of truth—and see if the recipient does not explode with anger at the perceived dishonesty. We all understand how the world works and know when others are practicing falsehoods and injustices upon our person and our sense of self.

It does not take a god from a burning bush with a fiery finger to write out the rules of what is proper and good in any human exchange. We know it from before we were born, because our brains and our society had already supplied the answer, hard-wired and ready to function. In the same way, we see the world in the colors for which our eyes were adapted, breathe the air for which our lungs were optimized, and recognize the adorable cuteness of babies and puppies because it is beneficial to both that our brains release the right endorphins at the sight of them.

Evolution says that we are at home in this world because we are the products of this world. And that is enough of a natural wonder for me.

1. See, for one example among others, The God Molecule from May 28, 2017.

Sunday, November 4, 2018

The Next Civil War

War devastation

It has been suggested for some years now, at least for the past decade, that this country is in the midst of a “cold civil war.” Disagreements of both policy and principle between the progressive left and the conservative right have reached a fever pitch. Factions are marching in the streets and attacking each other with bats and chemical sprays—although the fighting hasn’t reached the stage of firearms yet. Friendships are breaking up, sides have formed, and the lines are drawn on social media. I even know of one Facebook friend who seems ready to divorce her husband over his political views.1

We’ve been here before, back in the late 1960s, when I was at the university and the young people in college and the radical activists were protesting against the Vietnam War and in favor of civil rights and free speech. Back then, we had campus demonstrations, protest gatherings on the Washington Mall, and rioting in the streets—most notably outside the 1968 Democratic Convention in Chicago. The difference between then and now is that in the ’60s the radical view and its hard-line conservative response were both on the fringes of the political spectrum, while the two main parties could still conduct business in a relatively consensual, bipartisan fashion. Today, the two parties function in lockstep with their most radical elements. Discussion and votes in the Congress and decisions on the Supreme Court are divided along party lines with almost no crossover. The White House and the top echelon of the Executive bureaucracy swing back and forth with whichever party captures the Presidency.

On the one side, we have people who want to create and celebrate a “fundamental transformation” of the country’s political, economic, social, and environmental relations according to a perceived “arc of history.” On the other are those who don’t mind moving forward into the future by evolutionary steps but resist being pushed bodily through revolutionary action. Frustrations abound on either side, and with them come name calling, social shunning, brick throwing, and tear gas.

Some people are even speculating—myself among them, and mostly since the upheavals of the 2016 election—that the cold civil war will eventually turn hot. That our political and economic differences, our social and environmental positions, will reach a point where they can no longer be resolved by discussion and bargaining, by yielding on some points and advancing on others, to arrive at a national consensus. That the political crisis will demand a clear-cut winner and loser. That internal peace will only be achieved when one side or the other can no longer stand up for its position because its politicians and their supporters have—each man and woman—been economically subdued, personally incarcerated, or rendered dead. Or when the country has been divided by physical partition and personal and familial migration, as occurred between India and Pakistan in the late 1940s, with each party maintaining its own new national government.

The first American Civil War of the 1860s was a dispute between cohesive regions, North and South, Slave State and Free. But many people think the current differing viewpoints are too intermixed for the country to break and go to war along regional lines and across state boundaries. This view says that the coming hot war will be more like the Spanish Civil War of the 1930s, with neighbor fighting neighbor as each party vies for control of the cities and the countryside.

I can see the reasoning for either approach. In many ways, the opposing sides in this country reflect a divergence between urban progressives and rural conservatives. We keep seeing that map comparing the votes cast in Los Angeles County—which is just the urban core of the big place we think of as “LA”—and those in the seven states of the Upper Northwest, from Idaho to Minnesota. And really, even California is not a homogeneous polity, because the feeling in communities of the foothills of the Gold Country and in the Sierra is more conservative than the progressive politics of the big cities in the Central Valley and along the Coast.

But I can also see a breakup between regions. The states along the Pacific Coast, in the Northeast, and across the Upper Midwest are typically progressive, while the middle of the country is typically more conservative—with a few isolated exceptions like Colorado and New Mexico.

The question of how the country will break apart if and when war comes depends, in my mind, on what incident, what spark, finally sets it off. If the decisive point is internal, say, an election that fails to satisfy one party so greatly that it simply revolts, then we might see a piecemeal collapse as in the Spanish Civil War. But if the incident is external and the shock is to the whole country, then we might see a response that takes shape along regional and state lines.

The latter is the picture I painted as a leitmotif to my two-volume novel about life extension through stem-cell organ replacement, Coming of Age. There, the incident was the repudiation of the national debt.

When I was in college, my economics textbook said the national debt was irrelevant because it was just money that we owed to ourselves, financed by Savings Bonds held among the citizenry. No one was going to call in that debt, so the government could just keep financing it by issuing more bonds. As recently as 2014, however, almost half of our publicly held debt in the form of U.S. Treasuries, and a third of our total debt, was held by other governments and offshore banks. The biggest holders were China, Japan, Ireland, Brazil, and the Caribbean banks.

If these external holders wanted to collapse this country—which, given that our global economy is so interconnected, would be a foolish thing—they could simply sell off huge blocks of the U.S. Treasuries they now hold. The federal government would then have to scramble to make good on the sales, and so would likely impose massive economic restrictions and additional taxes on the American public. In my book, this prompts many of the states in the central part of the country—whose residents don’t feel they are well represented in the federal government’s spending decisions—to repudiate the debt and, along with it, their allegiance to the Union: either secede from the Union or go broke by staying in it.

Under those conditions, many of the National Guard units would side with their home states. And many U.S. Army, Navy, and Air Force bases located in these states might weigh their allegiance to the national government against the conservative political instincts of their commanders and troops. The split would not be uniform. The choices would not be pretty. And once initial blood was spilled in the breakup, it would not be much more of a step to spill blood in establishing either national dominance or domestic partition.

In my novel, the breakup along these economic lines came in the year 2018. Of course, that year has now come and is mostly gone. But the weight of the national debt and the simmering divisions of our domestic politics still hang over us all.

I don’t look for war or want it. But my novelist’s ear listens to the rhetoric that is now splitting the country along its fracture lines, and I cannot discount the possibility of a shooting war coming to these United States sometime soon.

1. My late wife and I had opposing political views: she an old Berkeley liberal Democrat, me an unreformed Eisenhower-era conservative Republican. But we fell in love and married in an earlier time, some forty years ago, when political differences were treated in the same way as differences of religious doctrine and practice: a private, personal matter that did not touch on the essentials of what made a good person. My wife and I shared the same values about honesty, integrity, kindness, education, and fair dealing—and that was what mattered. For the rest, we joked about making sure we each went to the polls on election day so that we could cancel each other’s vote.

Sunday, October 28, 2018

Radicals and Revolution

Joseph Stalin


I am becoming more and more concerned about the direction of politics in this country.

It used to be—back when I was growing up, and for a decade or two afterward—that Democrats and Republicans could be civil to each other. They might differ on policy issues but they agreed about the principles and process of governing. Their disagreements were more about emphasis and degree than about goals and objectives. We debated a larger or smaller role for the public and private sectors, for amounts of taxation and regulation, for our stance in international relations, for the uses of military force versus diplomacy, and for other aspects of life in this country. But we could all agree that the United States was and would remain a country of individual liberties, rights, and responsibilities, functioning in open markets that supported investor capitalism, and with sovereign powers and responsibilities in the larger world, all under a government structure spelled out in the U.S. Constitution. These things were the bedrock of our society.

Yes, there were fringe groups. In the late 1960s, radicals on the New Left demonstrated not just against a war they didn’t like but actually in favor of our Cold War enemies in the Soviet Union and China, as well as for our “hot war” enemy in North Vietnam. They envisioned an end to capitalism and the introduction of a Communist-style command-and-control economy in this country. They wanted to overthrow the U.S. government in a violent revolution. The Students for a Democratic Society and their adherents marched for it, and the Beatles sang about it. But reasonable people in the mainstream parties ignored them—or called them “crazies”—and went about the business of governing.

Some of those New Left radicals grew up and matured but never abandoned their extremist principles. Saul Alinsky became a “community organizer” and wrote his signature work, Rules for Radicals. Bill Ayers, one of the founders of the Weather Underground, became an education theorist but remained in his heart a domestic terrorist. One of Alinsky’s admirers—she actually wrote her senior thesis at Wellesley College on him—was Hillary Rodham Clinton. A Democratic politician who was early on linked to Ayers in Chicago was Barack Obama. But Obama’s and Clinton’s associations can be written off as errors in judgment due to their youth.

Now, and for the past dozen years or more, the internal direction of this country has shifted. The radical viewpoint has grown from a splinter at the extreme left of the Democratic Party to a driving force in much of that party’s current policies and rhetoric. In recent years, many Democrats have expressed sympathy with “Democratic Socialism,” which joins political democracy with government ownership of production and distribution. Many Democrats now favor open borders and the surrender of national sovereignty. They want a free-form and non-originalist interpretation of the U.S. Constitution, including revision of the First Amendment, abolition of the Second Amendment, and elimination of the Electoral College. The largest unionized group in this country is now the public sector, which bargains for increased pay and protections from politicians who are not negotiating with their own money. That’s an idea that even so liberal a politician as Franklin Roosevelt thought was disastrous and ought to be discouraged.

Since the presidential election of 2016, many Democrats have been in open revolt. Celebrities have called for bombing the White House. And seemingly rational people proclaim the current incumbent to be “Not My President.” Since the confirmation of Brett Kavanaugh to the Supreme Court went against the wishes of the Democrats in the Senate—largely because they had previously invoked the “nuclear option” of making judicial confirmation a simple 51-vote majority instead of the 60 votes formerly needed to override a potential filibuster—the party has called the legitimacy of the court itself into question.

People on the left now want the country run as a pure democracy rather than the republic of elected officials that the Constitution established. Without the Electoral College, which was put in place to give small states with scant population some parity with large states, the citizens of Los Angeles County would be able to outvote the citizens of the Upper Northwest from Minnesota to Idaho. Presidential elections are always strategic games of chess, trying to take and hold at least 270 electoral votes. The makeup of the Senate, with two Senators for every state regardless of size, protects the interests of small states that are out of sight and out of mind of large-state and big-city politicians. The Electoral College is part of the “checks and balances” put into the Constitution to protect minority interests from the whims and tyrannies of the majority.

What I fear in this new direction is what the left now seems openly to want: revolution. A country’s government is just laws and procedures, words written on pieces of paper or, worse, loosely preserved in bits and bytes somewhere in computer memory. What makes those words work is the belief of citizens and their elected politicians in the principles behind them. We agree to these things being true and necessary. We dismiss them at our peril.

The Constitution has set up a mechanism to change any part of it or add to its effective structure through the amendment process. But that process is hard, requiring any amendment to be proposed by a two-thirds vote in both the House and the Senate (or by a convention called at the request of two-thirds of the state legislatures) and then ratified by three-fourths of the states. The idea is that any change has to be popular enough and gain enough bipartisan support to pass this hurdle. We’ve done it twenty-seven times over the years, addressing issues as important as the voting rights of former slaves, women, and people age eighteen to twenty-one, and issues as ephemeral as the drinking habits of the average citizen.

The radical notion of changing or abolishing elements of the government or the Constitution itself by popular vote would invite a landslide. We have seen something of that kind operating in California, where initiatives are placed on the ballot by petition and voted on by the public at large. Those that pass acquire the force of law. Some initiatives have had good effects, some bad. But all are subject to the same verbal tricks and legal manipulations of any advertising campaign: claiming to support one thing while actually undermining or subverting it. Popular politics is a rough game to play.

The edifice of the U.S. government is fairly robust and can stand to have a few bricks knocked loose from time to time. But a popular onslaught in the name of “revolution”—whether meant picturesquely or in deadly earnest—could lead to a collapse. We’ve seen that happen before: France in 1789, Russia in 1917, Cambodia in 1975. The problem is that immediate and necessary changes in government always start with well-meaning people who have goals, a plan, and a vision for the future. But once the structure collapses, anything goes. The process starts with essentially kind-spirited liberals like Jean-Paul Marat or Alexander Kerensky; it ends with closed-minded tyrants willing to spill vast quantities of blood like Maximilien Robespierre and Joseph Stalin.

My fear is that the people so innocently dreaming of and calling for revolution today are the nation’s Marats and Kerenskys. If they get what they want, these people will go up against a wall within about six months. Waiting in the wings are the Stalins and the Pol Pots. And they will not be so liberal or kind-hearted.

Sunday, October 21, 2018

Mind Games

Subatomic particle

I am just finishing up Adam Becker’s book What Is Real? about the relationship between quantum physics and the real world it is supposed to represent. Becker tells a good story, especially as an introduction to the world of quantum physics, the players over the years, and the intellectual principles involved. His basic premise is that, while the equations that physicists use to predict the outcome of their experiments—and so test the value of those equations as representations of the underlying world of the very small—have consistently proven their worth, the physicists themselves remain in doubt as to whether the world that they are describing actually exists.

Without going into the entire book chapter by chapter, the issue seems to be one of describing a world so small that we cannot detect it without changing it. Atoms and their component protons, neutrons, and electrons—plus all the other subatomic particles in the Standard Model—are not fixed in space like pins on a board. As with everything else, they move, as do galaxies, stars, and planets. However, instead of occupying observable orbits and tracks across the night sky, atoms mostly vibrate with thermal energy, the random jostling that shows up at larger scales as “Brownian motion,” and electrons buzz frantically and randomly around their nuclei like flies in a cathedral.

We can detect the larger celestial bodies—and even masses as small as freight trains and automobiles—with visible light without the danger of moving or deflecting them much. Bounce a few hundred thousand photons off a teacup, and you will not move it one millimeter. But subatomic particles are so small, and the wavelength of visible light so long, that the light misses the particle entirely, passing over and under it with no impact. Imagine that the wavelength is a long piece of rope that two girls are spinning in a game of Double Dutch. If a human-sized person enters the game and performs unskillfully, the rope has every chance of hitting—that is, detecting—his or her body. But if a flea jumps through the game area, the chances of that long, curved rope ever touching its body become vanishingly small.

To detect subatomic particles, physicists must use other particles, as if in a game of subatomic billiards, or photons with much shorter wavelengths and thus having much higher energies. A high-energy photon impacting a moving electron or proton will change its direction of motion. So the issue in quantum physics is that when you locate the particle you are observing here, it’s now no longer there but going somewhere else. In quantum physics terms, no particle has an exact position until it’s observed, and then it has some other position or direction of movement in response to the observation. Mathematically, the particle’s supposed position can only be defined by probability—actually, a continuous wave function that defines various probable positions—and this wave “collapses” into a single definite position at the place and time of your observation.
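For readers who want the formal version of this: in standard textbook notation (a general summary, not anything specific to Becker’s book), the particle’s state is a wave function whose squared magnitude gives the probability of finding it at each position, and the spreads in position and momentum cannot both be made arbitrarily small at once:

```latex
% Born rule: probability of finding the particle near position x
P(x)\,dx = |\psi(x)|^{2}\,dx,
\qquad \int_{-\infty}^{\infty} |\psi(x)|^{2}\,dx = 1

% Heisenberg uncertainty relation: the trade-off between
% position spread and momentum spread
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

Observation “collapses” the wave function to a state sharply peaked at the measured position, which is why, after you have located the particle here, it is already headed somewhere else.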

Well and good. This is what we can know—all that we can know for sure—in the world of the very small.

The first issue that Becker’s book takes up is that most of the original proponents of quantum physics, including Niels Bohr and Werner Heisenberg, took this lack of certain knowledge to an extreme. Called the “Copenhagen interpretation,” after Bohr’s institute in Denmark, their view insists that the entire point of quantum physics is the manipulation of the results of observation. The measurements themselves, and the mathematics that makes predictions about future measurements, are the only things that have meaning in the real world. The measurements are not proof that subatomic particles even exist, and the mathematics are not proof that the particles are doing what we think they’re doing. To me, this is like calculating the odds on seeing a particular hand come up in a poker game, or counting the run of cards in a blackjack game, and then insisting that the cards, the games, and the players themselves don’t necessarily exist. It’s just that the math always works.

Other physicists—including Albert Einstein—have challenged this interpretation over the years. Mostly, they pose thought experiments and devise new mathematical formulas to make their case. But the Copenhagen interpretation persists among quantum physicists.

A second issue in the quantum world is the nature of “entanglement.” Here two particles—two atoms, two electrons, two photons, or other bits of matter that sometimes behave as energy, or matter that oscillates with wave-like energy, or waves that at the instant of detection appear as singular objects—become joined so that what one of them does, the other will do. This joining and the parallel actions persist through random occurrences—such as passing through a polarized screen—and are communicated instantly across distances that would violate the limit of light-speed travel for any object or piece of information. Here is the sort of “spooky action at a distance” that Einstein derided as a violation of special relativity.
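The simplest concrete case of entanglement, the textbook “Bell state” of two spin-half particles, shows what “what one of them does, the other will do” means formally (again, standard quantum mechanics rather than anything drawn from Becker’s book):

```latex
% A Bell state: neither particle A nor particle B has a definite
% spin on its own, but the two measurement outcomes are perfectly
% correlated whenever the pair is measured along the same axis
|\Phi^{+}\rangle
  = \frac{1}{\sqrt{2}}
    \left( |{\uparrow}\rangle_{A}|{\uparrow}\rangle_{B}
         + |{\downarrow}\rangle_{A}|{\downarrow}\rangle_{B} \right)
```

If A is measured spin-up, B will be found spin-up as well, no matter how far apart the two particles are, yet this correlation cannot be used to send a signal, which is why the effect skirts rather than breaks the light-speed limit.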

A third issue in quantum physics is the nature of Schrödinger’s cat. To illustrate the limitations of measurement, Erwin Schrödinger proposed the thought experiment of putting a cat in a sealed box with an apparatus that releases a poison when triggered by the decay of an atomic isotope. Since the atomic decay is unpredictable, the cat in the box might be alive or already dead. It was Schrödinger’s point that until an observer opens the box, the cat exists in two “superposed” states—both alive and dead at the same time, expressed by a wave function of probability—and that the wave function does not collapse and reveal the cat’s final nature until the box is opened. As a thought experiment, this is a metaphor for measurement and observation. But some physicists insist that the superposition is real. The actual cat is physically both alive and dead until discovered.
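In the formalism, the “both alive and dead” condition is simply a superposition, written (once more in standard notation, not as a quotation from the book):

```latex
% The superposed cat state before the box is opened
|\text{cat}\rangle
  = \alpha\,|\text{alive}\rangle + \beta\,|\text{dead}\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1

% Opening the box "collapses" the state: the observer finds
% a live cat with probability |alpha|^2 or a dead cat with
% probability |beta|^2
```

The dispute Becker describes is over whether this expression is merely bookkeeping for our ignorance or a literal description of the cat’s condition.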

This superposition has led some physicists to describe a splitting of the universe at the point of the box’s opening: one universe proceeds with a physicist holding a live cat; the other with a physicist mourning a dead cat. This is the “many worlds” interpretation. Both universes are equally valid, and both continue forward in time until the next quantum change that forces each universe to split again in some other way.1

Now, I freely confess that I do not have the mathematical skills to understand the equations of quantum physics. And mercifully, Adam Becker’s book does not focus on or discuss the math in detail, just the thought experiments and their supposed meaning. I also confess that I do not understand what condition enables two particles or two waves to become “entangled,” or how they interact at a distance in this state, or what might be required to untangle them. Becker does not explain any of this, either. Further, I confess that I can sometimes be simpleminded, rather literal and obvious about what I see, hear, and know, and oblivious to distinctions and nuances that other people perceive easily.

But, that said, it would seem to me that what we have here is a misinterpretation of a metaphor. The limitations of observation and measurement, as expressed in colliding particles and probabilistically dead cats, are simply reminders that we do not have direct perception of the quantum world in the same way that we can see, hear, touch, and taste, if necessary, a steam locomotive or a billiard ball. That’s a good thing to keep in mind: we don’t have all knowledge about all things. However, to insist that this metaphorical reminder means that quantum physicists are simply doing math, and that their calculations—no matter how enticingly predictive—have no meaning in the real world, that quantum physics is just a mind game … that’s taking things too literally.

I have criticized the use of mathematics to prove the improbable before.2 And I insist again that, if all you’ve got is a series of equations to prove your point, you may just be playing mind games with yourself and your fellow physicists. But the reverse is also true: the real world must exist at the quantum level. If the math works out, if the vision behind it holds together, then it must be describing something that has actual substance and energy. The details may not be exactly as we understand them. The description may be missing some elements, forces, or bits of math that we haven’t worked out yet. But the world must exist in the smallness of subatomic particles as much as it does in the vastness of stars and galaxies.

The math doesn’t exist in a quiet vacuum. The cards, the game, and the players must also exist to give the calculations meaning.

1. I have cheerfully used the many-worlds interpretation in my novel The Children of Possibility, about time travelers from the far future, and in its prequel The House at the Crossroads. But I know I’m having fun and don’t take this stuff too seriously. So much fun, in fact, that I’m now working on the sequel that picks up where Children left off.

2. See Fun with Numbers (I) and (II) from September 19 and 26, 2010.