Sunday, May 29, 2016

Postpartum Regression

As I have noted elsewhere, writing a novel is like renting part of my brain to a theater troupe for a year or more. While everyone else around me is going about the daily business of living, my thoughts are crowded and sometimes interrupted by sudden images, questions, inspirations, plot notes, and bits of dialogue—as if the actors were working out the script and rehearsing their parts. During the actual writing of the novel, it is as if the actors were giving their opening-night performance. And during the review and editing process, they are sharpening their lines, resolving dramatic issues, and preparing to take the show on the road.

Maybe other writers have a different system, but for me the whole process of creation is mostly driven by my subconscious. That is, I cannot sit down and invent a character, devise a situation in which he or she can operate, create other characters for him or her to work with or against, and then structure a neat and complete little three-act outline that captures the essence of the book. It’s not that this is hard for me; it’s just inconceivable. Maybe other writers can do this, but for me the characters are not puppets or windup toys, and the plot is not an Erector® set of prefabricated incidents and contrivances.

It takes me almost as long, if not longer, to figure out who the characters are, what they’re doing, what motivates them, and what obstacles they might encounter—all just to arrive at a plot outline with some kind of coherence—as it does for me to write the production draft. This is a time of scribbling down odd thoughts, writing out sample dialogue, asking lots of questions about what might happen, getting stuck in one unsatisfactory plot shape, searching for the plot twist that will get me unstuck, and generally building a folder full of notes and ideas. I can’t just think of a new book, but I can—like Michelangelo releasing the figure from the marble—ask my brain about the shape and texture of the book that exists out there in the ether, or deep in the recesses of my subconscious.

Since this method of writing is all-consuming, I have learned that I cannot work on two books at once: there’s only room for one troupe in my brain at any one time. So, even after I put the final edits to the book, I need to keep it “hot” in my mind so that I can deal with any problems that might surface during the editing, HTML coding, and page-layout stages. Only when the story is locked up can I let the whispers die down, dismiss the cast, shelve the script, and burn the old scenery. And then I have to let my mind return to an empty space before I can start filling it up with the next book.

Some book ideas have been with me for years, even decades. But none of them is complete and ready to write. They exist as a sentence or two, a question, a character name, and the vaguest notion of what the person is like and how to shape the story around him or her. For example, the next book I’ll be writing, The House at the Crossroads, has been in this nugatory gestation ever since I completed the novel to which it is a sequel, The Children of Possibility. But other books have come first in my writing queue, such as the one I just finished, ME, Too: Loose in the Network, which was itself a sequel to a much earlier novel. Really, life is good when you have novel projects stacked up like jetliners circling the airport.

This clearing away of the subconscious—burning the scenery, dismissing the actors, quieting the voices, and stopping the questions—always leaves me with an empty feeling. It’s like the postpartum depression a mother must feel after the birth of a child. It’s as if the purpose has gone out of my life. My head is suddenly too quiet. My brain has an abscess where a book used to be. The next novel is still just a couple of sentences, a notion, a vague shape in the mist, with a thousand possible outcomes all hanging in the air and a thousand questions still to be asked before it starts making coherent sense.

That is what the art of writing a novel really is: taking an imagined possibility and simultaneously building up the roots and branches of what it ought to become while testing and pruning away the shoots and stems of what it should not be. This is how an idea takes shape—at least in my subconscious. It’s a process of slow realization, of groping forward, of finding the edges and surfaces of the figure stuck in the marble.

But it’s an uncomfortable process, mildly painful and unsettling, like healing a broken bone or a burn. It would be really nice if I could take this span of time between writing one book and another to declare a holiday for myself, but my brain doesn’t work that way. If I am amusing myself with a trip or even a day’s outing, the business of going and doing, seeing and enjoying, meeting and remembering, crowds out that space in my head reserved for the next theater troupe. Growing a book is a process of invitation. So I need to proceed quietly, read good books, practice my karate and music, ride my motorcycle, and give the empty place time to spawn new ideas and their follow-up questions.

For me, that’s the only way to write a book.

Sunday, May 22, 2016

The Faces We Wear as Masks

An old friend and former coworker of mine recently started reading ME: A Novel of Self-Discovery. She wrote to ask me what the main character looked like. She realized that the image on the cover was supposed to be “evocative,” but could I provide a visual image—or even draw her a sketch—of the Multiple Entity?

This was also a problem that my agent at the time foresaw: how will readers relate to a character who doesn’t have a physical description? When we think of people, especially the fictitious people who live only in books, we tend to latch onto some physical characteristic like long blonde hair, a cherubic face, or dark and smoldering eyes. Even a limping gait or a twist of the shoulders will do, as in the case of Richard III. We see the character as a generalized person, but with some remarkable and memorable features.

Authors are generally of two minds about this. On the one hand, if the description is too detailed and particular, it will leave readers with a strong and indelible image but allow no room for them to create their own sense of the character based on verbal style, attitudes, choices, actions, and the reader’s own imagination. On the other hand, if a strong image is not provided relatively early in the story, readers might feel detached or envision the character in ways that conflict with later descriptive hints in the story, such as the character being tall enough to reach the top shelf of a cabinet, small enough to pass through the opening of an air duct, or possessing features that can be mistaken for somebody else’s.1

In deference to this problem of visualizing ME, I wrote into the story an episode where the computer-program-turned-cyber-spy is forced to inhabit a mechanical device, a human-shaped automaton or robot, in order to escape from an assignment across the border after the phone lines are cut. ME’s carrier ’bot also suffers damage in transit, and he has to fix it manually—which further enhances the visual context of the apparatus. But this robot is not ME, just a temporarily adopted cybernetic environment, like the various computer systems, automated factories, and pieces of mobile machinery he invades, controls, and then leaves behind in the course of his adventures.

I told my recent correspondent that ME is a spirit, disembodied, like the voice in your head. If the reader wants a more concrete visual representation, then ME is a couple of hundred thousand lines of Lisp programming code, arranged in ten modules, which can pass through the internet like a snake or array themselves in the cloud across a series of hard-drive sectors. The first module is a beak that cracks open a targeted computer core, negotiates with or simply smashes its defense mechanisms, and charms or dupes the operating system inside into letting ME take control. Behind that first module are others with dedicated functions like making executive decisions; sampling and indexing current RAM as a form of working memory; interpreting and translating all the foreign programming languages ME might encounter; juggling random numbers to support his creative impulses; performing error trapping and recursive analysis as the program passes from one environment to another; and finally, in the tenth and last module, executing a core phage to mop up all traces when ME wants to leave the pirated system.2
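For readers who think in code, here is one rough way to picture that modular structure. To be clear, this is only an illustrative sketch and not anything taken from the novel: the book imagines ME as Lisp, the toy example below is written in Python simply because it is easy to read, and every module name and function in it is my own invention for this post.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Module:
    """One functional unit in the traveling program."""
    name: str
    role: str


@dataclass
class MultipleEntity:
    """A toy picture of ME: an ordered chain of modules that moves as one."""
    modules: List[Module] = field(default_factory=lambda: [
        Module("beak", "crack the target core and co-opt its operating system"),
        Module("executive", "make top-level decisions about goals and risks"),
        Module("working-memory", "sample and index current RAM as short-term memory"),
        Module("translator", "interpret whatever programming languages it meets"),
        Module("randomizer", "juggle random numbers to feed creative impulses"),
        Module("error-trap", "catch faults and run recursive self-analysis in transit"),
        # ... the novel counts ten modules in all; the rest are left unnamed here ...
        Module("core-phage", "erase all traces before leaving a pirated system"),
    ])

    def migrate(self, target: str) -> None:
        # The modules pass through the network like a snake: the beak first,
        # the rest following in order, the core phage bringing up the rear.
        for module in self.modules:
            print(f"{module.name} -> {target}: {module.role}")


if __name__ == "__main__":
    MultipleEntity().migrate("remote-host")

The point of the sketch is only the shape: a head that opens doors, a body of specialized workers behind it, and a tail that sweeps up after them.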

But is this to say that ME, the artificial intelligence, looks like its operating program: lines of English-language source code or binary digits, each with its own function, running on a computer chip? This would be the same as saying that a human being looks like a hundred billion neurons suspended in three pounds of pink jelly, visually represented as a large, wrinkled mass: a gooey walnut.

Is that all any of us are? Of course not. But then … are we more truly our physical externalities? Are we the face we see in the mirror each morning? But then … that face is not one that another person—our family and friends, our employers and customers—would immediately recognize. The image we see is reversed left for right and subtly distorted by any asymmetries in our faces. Anyone who has seen a picture of him- or herself and wondered for an instant who that might be will understand the difficulty. The world knows us by a different aspect, a different lift of the eyebrow or curve to the smile, than the face we know so well.

When we think of ourselves acting in the world, as a character acts in the novel, what do we imagine, what do we see in our “mind’s eye”? For that matter, what do we see of ourselves in the fictitious narrative of our dreams each night? Do we visualize from the fragments of self that we can see in the daytime? That is, are we aware of the tips of our shoes and the tops of our knees flashing just below our line of sight as we walk along? Do we see our hands as they gesture to support our arguments, type on the keyboard, or manipulate knife and fork—or chopsticks—to bring food to our mouths? Do we look down and see our own bellies and laps?

I believe we all carry a stylized, ghostly image of ourselves in our mind’s eye that has about as much relation to our physical reality as the description of a character in a novel bears to the total persona that the reader builds up in imagination from the character’s verbal tics, reasoned thoughts, reactions to surprise or danger, private attitudes, needs, desires, drives, and actions. Our inward mask may be mildly distinguished by dark or fair complexion or hair, a characteristic frown if we are feeling threatened or suspicious, a smile if we are happy or feeling generous, a brisk or languid manner depending on our mood and energy level.

In social situations we may remember and force ourselves—or that outward self we wear—to smile if the occasion is happy, or to frown and look grave if the occasion is solemn. Then we are working to make our outward image conform to an expected reaction, regardless of how we might actually feel. When a person works in a business environment or service function, that conscious smile or solemn look, which is sometimes—perhaps often—at variance with our personal feelings, can become an almost unconscious reflex. But when we are turned out to our pleasures—sitting at home reading, say, or in a darkened theater watching a film—then our face will relax into a habitual shape, either a vague frown or an unforced smile, depending on what might be called the state of our soul.

But is any of this what we really are? Is a human being an image, a face, or a mask any more than a character in a novel is the physical description provided by the author? Aren’t we both more: a sum of verbal tics, thoughts, hunches, reactions, needs, desires, drives, and actions played out in a given set of circumstances? And aren’t those tics, thoughts, and actions predicated on, and predicted by, the configuration of those hundred billion neurons suspended in jelly—or a couple of hundred thousand lines of Lisp code arranged in ten modules? What does a “real” person look like, anyway?

1. A third approach, leaving aside the question of detail, is for the author to base the character description intentionally on the face and physical features of a currently prominent and popular actor. The idea—for those writers who yearn for a movie to be made from their novels—is that the general readership and eventually the actor him- or herself will see the resemblance and absorb the idea of that actor playing the character on the big screen. I never heard that this approach actually works, but it gives some writers a reason to hope.

2. Note that I have used the male pronouns he-him-his for ME. Although an artificial intelligence is technically genderless, he identifies with and responds to gender in others. As a susceptible awareness raised in the mostly male environment of a programming lab, this particular intelligence appears simultaneously drawn to, charmed by, protective of, and confused by human females while emulating and reacting competitively—either dominant or submissive—to male figures. So, when a choice of persona is offered, ME adopts the male character and pronouns.

Sunday, May 15, 2016

My Take on Voice Hearing

In the community of people with severe mental illness—now commonly called “consumers” (i.e., of mental health services) or “peers” (i.e., of others with mental illness, as opposed to the rest of humanity)—it has become accepted that auditory and visual hallucinations, such as hearing voices and seeing people and things that others cannot, represent a form of reality, a “lived experience.” It is no longer acceptable to tell consumers that the voices and the people are “all in your head” or that they are “not real.”

I am of two minds about this state of affairs. On the one hand, it really is condescending and paternalistic for the rest of us “normals” to tell people who are having visions that their reality is a delusion and the visions don’t exist. We are imposing our interpretations and prejudices on another human being, which is generally a bad way to act. On the other hand, if we accept that such a thing as “severe mental illness” exists and represents a pathology, a departure from health and the natural, unaffected operations of mind and brain—and not simply “another way to be”—then accepting, condoning, and even participating in the patient’s experience of hearing voices and having visions would seem to be anti-therapeutic. If the patient is struggling for recovery, which would be a return to health and “normal” brain function—and not just a pleasant accommodation of the illness’s symptoms—then it would seem obvious that the patient must acknowledge the common interpretation of human reality and accept that the voices and visions are a product of illness. Indeed, they are “all in your head” and “not real.” And does the notion of illness mean anything if its symptoms are promoted as simply another way of looking at the world?

From both a physiological and a philosophical point of view—that is, both neurology and psychology—we now understand that what a person sees and hears is more than just the light waves entering the eyeball and sound waves entering the ear canal.1 Our eyes and ears are only the signal inputs, reporting all received sensations to the brain centers—the occipital cortex for sight, the superior temporal gyrus in the temporal lobe for sound—where these inputs are then processed, interpreted, compared to prior experience, and coordinated with other sensory inputs.

Our sensory apparatus receives much more information than the signals that eventually become processed as experience and enter into our awareness. As you sit reading this, your brain focuses your attention on your visual cortex and on interpreting the symbols written in emitted light on a computer screen or reflected light on the paper page—if you happen to have printed out this blog. While your awareness is so engaged, your ears continue to absorb sounds from the room around you, including the soft hiss of air molecules impinging on your ear drums. Your skin records the temperature of the air around you, the weight of your clothing, and the pressure of your body on the chair or other furniture. Your limbs are sending signals about orientation and muscle tension, and your brain responds unconsciously by adjusting your position. Your nose records random smells, most of which don’t rise to the level of awareness unless they are strong or correlate with remembered experience, such as a dangerous and disagreeable odor like the sulfurous mercaptans added to natural gas for easy detection, or the pleasant aroma of someone nearby cooking a favorite food for dinner. If you take a sip of the beverage at your elbow, your taste buds will signal the drink’s flavor, which your brain in its concentration on reading may or may not accept into awareness.

Every sense is receiving and forwarding to the brain its thousands or millions of messages every minute. It is the business of the brain, both in its processing centers like the occipital cortex and the temporal lobes, and in its control of consciousness in the brain stem and forebrain,2 to interpret these signals and decide which among them are significant at the present moment and which can be safely ignored—perhaps to be stored for later analysis, perhaps lost for all time. It is my contention that the hallucinations experienced by people with a severe mental illness, as well as the visions experienced by normally healthy people in temporary states of ecstasy, drug and alcohol intoxication, extreme agitation, overstimulation, intense fatigue, impending starvation, or some other impairment of normal function, are related to these otherwise ignored or misinterpreted sensory signals.

In my novel The Professor’s Mistress, I portray a young woman experiencing the progression of schizophrenia and veering toward her first psychotic break. As she sits in a chair in her living room, the furnace comes on and the air register issues a soft, barely heard whisper. She interprets the sound as the spoken words “fish knives.” This conjures in her memory a moment of embarrassment during her wedding reception when she opened a gift of silver fish knives and, in her naïveté, didn’t know what these strange implements were used for and so clowned around with them. Later, when she is deeper into her psychosis, she mistakes a shadow in a darkened bar for the ghost of her long-dead mother and carries on a macabre conversation with it.

We know from brain imaging technologies that it’s a myth that we humans use only about ten percent of our brains. All of our brain is functioning and active most of the time, although not every function or activity rises to the level of our awareness. And our awareness is divided into the main focus on what we are doing—like reading or driving or holding a conversation—and a roving, restless, unmindful subsidiary awareness in which random thoughts, uninvited notions, and unrecognized sensations will announce themselves—like the smell of dinner cooking or the idea that a normal household sound is actually someone whispering “fish knives.” It is with this subsidiary non-focus of awareness that the subconscious intrudes on our daily thoughts. This is where, in my own case, the solution to a problem that I’ve put aside from active contemplation will suddenly rise into active focus. It is also where a rustle in the grass will put us in mind of a lurking tiger, and moonlit shadows of the leaves stirring overhead will make us think of ghosts. And in this not-quite-focused mental state, the rustling may actually be perceived as the approach of a tiger or the leaf shadows as the presence of an invisible spirit.

As a writer, I experience voices and visions all the time. My process of writing—and it may be different for some other writers—is, first, to read over and absorb the materials I will need to work on before sitting down at the keyboard. That is, I will look at the notes I have taken from an interview or research on an article assignment, or the outline with plot points, dialogue cues, and imagery suggestions for the next scene in a novel. These notes constitute the universe of things I know about the subject. Second, I need a starting point for the article or scene, which might be a question to resolve in the article, or a sound, an image, or a line of dialogue from the scene. I call this starting point the “downbeat.” But then, with all these elements floating loosely in my mind, I put my fingers on the keys and just start writing. Other than the fragments in my outline—which may or may not subsequently appear—none of what comes out is exactly planned. I am hearing the narrative voice in my head speak the prose, seeing and hearing the imagery that the characters experience, and hearing them speak the dialogue as it develops. In this process of revelation, I give my subconscious wide scope to intrude with random imagery and associations.

The difference between my seeing visions and hearing voices during the writing process and the experience of someone with schizophrenia or other mental illness is that I know where the sights and sounds originate. I know that I am indulging the “self-talk” that we all carry in our heads. This is the articulate stream of consciousness that uses language to express thoughts. It is the voice in our minds that exclaims “What’s that?” when we hear a strange sound—even if we don’t say the words out loud. Most literate people think in words rather than raw emotions and reactions. I, by long practice, have trained this self-talk to speak in complete, grammatical sentences and have lent it to my subconscious to conjure up action, imagery, adopted personas, and extended conversations in the case of my novels, or lines of argument and paragraphs of orderly explanation in the case of my blogs and articles. This talent is not unique: I’m sure every writer exercises his or her imagination in this way to some extent.

Scientists have begun to understand3 the cognitive and emotional impairments that mark schizophrenia and the other psychotic illnesses which develop in late adolescence. They attribute at least some of these symptoms to an overaggressive form of the natural pruning of excess neuron connections that have grown out during the brain’s development. A certain amount of pruning is normal in the maturing brain, but the brains of schizophrenics seem to prune away too many connections, damaging the brain’s functions. In such a case, reasoning ability and emotional stability are lost. It’s not too much of a stretch to think that a similar impairment takes place in the brain’s ability to process sensory inputs in an orderly fashion and distinguish between what is actually perceived and the false interpretations supplied from the imagination: the ghosts, the tigers, and the fish knives.

Is it possible that a person with severe mental illness might not know that this substitution of sensory imagination for actual experience is taking place? I offer three possible explanations. The first is that the brain can certainly hide whole areas of experience from active awareness. We see this in amnesiacs who can’t access their past experiences. We also see it in people with what is now called “dissociative identity disorder” and was previously known as “multiple personality disorder.”4 The second explanation is that psychologists are beginning to understand that the mind’s recall of experience is actually a fairly slippery and inexact process. We analyze and change a memory slightly every time we bring it forward into awareness. This is one of the reasons false memories are so easy to implant: the mind can incorporate notions and suggestions about an experience into every rendition of it until the false elements become as real in the awareness as an actual occurrence. And third, mental illness tends to protect itself with a symptom called “anosognosia,” in which the mind denies that it is experiencing anything unusual or out of the ordinary.5

Given all these clues to the unreliability of actual experience versus scrambled sensory inputs mixed with active imagination and impaired perception, is it really so strange that people who have visions and hear voices might insist these experiences are real and separate from their own mind? But that does not make the experience real. And it does not make the illness disappear.

1. Along with chemical traces impinging on sensors in the nose (i.e., smells) and on the tongue (tastes); mechanical and thermal stimulation of sensors in the skin (touch, pain, temperature); the pull of gravity on liquid in the inner ear (balance); and strains on muscles and tendons (body position). All of these are as subject to confusion and hallucination as the senses for sight and sound.

2. The exact location in the brain of our “consciousness” is still a matter of study and conjecture. The simple act of being awake and aware, as opposed to asleep and “unconscious,” belongs to a cluster of neuron cells in the brainstem called the reticular activating system. These cells work with other parts of the forebrain such as the hypothalamus, basal forebrain, and thalamus which participate in various pathways identified by their particular neurotransmitters—the chemical governors of the nervous system—including acetylcholine, dopamine, norepinephrine, and serotonin. These pathways access higher parts of the cerebral cortex where processing of sensory inputs, controlling motor function, executing planning and projection functions, and other discrete processes occur. For a deeper discussion of how these systems contribute to consciousness, see for example, Brain Stories, by Teddy Poh.

3. See “Scientists Move Closer to Understanding Schizophrenia’s Cause,” by Benedict Carey, The New York Times, January 27, 2016.

4. See for example, “Dispelling Myths about Dissociative Identity Disorder,” by Margarita Tartakovsky, MS, at Psych Central, or “Dissociative Identity Disorder (Multiple Personality Disorder)” at WebMD.com.

5. See I Am Not Sick, I Don’t Need Help! by Xavier Amador on the poor insight of people with a mental illness.

Sunday, May 8, 2016

Something Happening Here

What it is ain’t exactly clear …1 And maybe that’s because I’m political but not tied into the party structure and punditry of this country. My political and economic blogs usually deal more with philosophy, root causes, and underlying assumptions than any appraisal of this or that candidate, proposed bill, or current slogan. And yet, having watched the 2016 primary election process in both parties—a slow-motion train wreck, in my view—I sense that some kind of political earthquake is going on. And I don’t think anyone, either working in the mainstream media or opining on the pages of right- and left-wing blogs, has the current situation exactly in focus.

Neither do I. But I can sense a rumbling, the low-frequency hum of shifting tectonic plates, and it both scares me and excites me. Excites me, because we might for once see real, radical change in this country, and not just in the usual progressive direction. Scares me, because actual change is always interesting in the sense of the old Chinese curse.

First, let’s take the Democrats. For the past six months or so, we have watched Hillary Clinton fight off Bernie Sanders. Ever since the 2008 primaries, she has been the presumed heir apparent, the next in line. This is largely because she’s a woman, and the favoring of special interests which is the hallmark of the Democratic Party long ago dictated that, first, we needed an African American to be the standard bearer in 2008 and, now, a woman to follow after him. That was the order of precedence. And after a woman, presumably, would come a gay candidate, then perhaps a Hispanic, and then—given the current focus of favoritism—an openly transgendered psyche.2 Merit, experience, and qualifications count for something in this process, of course, but pride of place goes to ethnic or gender identity. In Hillary’s case, it helps that she has name recognition as a Clinton; experience as first lady, senator, and secretary of state; has campaigned nationally before; has access to massive funding; and is backed by the party establishment. She also has lots of baggage—but so does any politician of her standing.

And Bernie Sanders? What has he ever done? He was mayor of Burlington, Vermont, then that state’s sole representative in Congress, and later one of its senators. Styling himself a progressive and social democrat—well, some kind of socialist—he was a longtime independent who criticized the Democratic Party and only joined it late in 2015 for his 2016 presidential run. A brief look at his political career shows he was most active in opposition to pending legislation rather than sponsoring any landmark, direction-changing bills of his own. “Leader” is not the first word that comes to mind with Bernie. “Gadfly” and “old white guy” come closer to the truth.

And yet the young people love him! His avowed socialism has been, as they say in Silicon Valley, a feature, not a bug. Medicare for all! Free college tuition for all! He would gladly spend another $20 trillion that we don’t have, on top of the $19 trillion we already owe.3 And he would make up the difference by taxing into elimination the wealth of the “top 1%”—even though that wealth, when taken into the public treasury, would fund the government for about one fiscal quarter.4 What the young don’t understand is that modern socialism does not lead to communal enjoyment of shared resources but to the crony capitalism of China and Japan, where the government anoints economic winners and losers. A young person under Bernie’s kind of socialism could kiss goodbye his or her dreams of inventing the Next Big Thing—whether groundbreaking technology or a popular internet app—in the garage. The benefits of innovation, if it happens at all, will flow to established companies with big legal departments and lobbying power. And the worse danger is that Bernie would lead us, not to Denmark and Sweden, but to Venezuela.

In reaction to Sanders’s success at the polls, Hillary Clinton has pushed far to the left. Perhaps when she gets in office she can be flexible and “triangulate” her way back to reasonable economic and social policies, as her husband did after the 1994 midterms. But perhaps not. She has seen the future and it is the Big State, Leviathan, government as absolute solution, the collective will of the majority, written in stone.5

Then, in the Republican Party, we’ve had a howling mess, with up to 17 candidates competing for national prominence. In addition to a sprinkling of governors and senators, none of whom ever clearly dominated the field, we’ve had a real-estate developer, a high-technology CEO, and a brain surgeon.

My common sense says that for an executive office like the presidency, the natural choice is a person with the closest comparable experience, such as a state governor. This was the path of Franklin D. Roosevelt, Jimmy Carter, Ronald Reagan, Bill Clinton, and George W. Bush. Barring that, someone with corporate experience as a CEO would know how to lead a large, complex organization like the Executive Branch. And yet in the 2016 primaries, governors Rick Perry, Scott Walker, and Jeb Bush dropped out early, and John Kasich was a woeful laggard among the survivors. Carly Fiorina as a corporate chief executive with a history of tough choices was always a distant seventh or eighth. (Simply being a woman or ethnic minority doesn’t cut as much ice with Republicans as it does with Democrats.) And of the senators in the race—the path formerly taken by John F. Kennedy, Lyndon B. Johnson, and Barack Obama—the near-front-runners Ted Cruz and Marco Rubio have made their names as outsiders to the party establishment and spent their political capital attacking each other.

Until the recent shedding of all other candidates after the Indiana primary, most of the Republican candidates opposed and distanced themselves from the party establishment over key issues. And at this late stage, the candidate with the most primary votes and the likely nomination is the real-estate developer and reality-television host whose connections to the party and its philosophies are, at best, tenuous.6 He has clawed his way to the top of the heap with braggadocio, smears, insults, wild schemes, visceral intolerance, and childlike petulance. The rest of the candidates all damaged themselves by attacking and smearing each other rather than clearly stating their own positions and solutions—although, of course, the structure of the primary debates as media circus and moderator-directed free-for-all did not help them here.

My plaintive call for the past six months has been, “Where are the adults?” Where are the wiser heads, the party elders, the experienced people who are supposed to watch over our traditions and not let such muddle and confusion get out into public view? I think they are in hiding, hunkered down, waiting for an explosion or the apocalypse. I am reminded of the white magicians of Monte Albano in James Blish’s wonderful novel Black Easter, which was about letting all the demons out of Hell. As chaos ensues, the champions of order and justice try to gain control of the situation by summoning heavenly angels to humankind’s defense—but the only spirits who will come down to their mountaintop retreat are either fidgeting and distracted or headless and terrifying. God knows what’s about to happen, and it ain’t pretty.

So … what is happening here? I think that, on both the left and the right, we are seeing a massive frustration with the way things are going. And the result is that radical and irrational choices are becoming almost … sane.

After seven and a half years, President Obama’s progressive transformation of the country—so horrifying to conservatives and Republicans—is not moving fast enough either for the young people who follow Bernie Sanders or for the old-line Democratic Party faithful for whom half-measures and methodical approaches are anathema.7 But after a strong start, the progressive surge has run up against the natural resistance of a country whose politics are still pretty much centrist. The surge has also encountered the inertia that all utopian efforts experience when they meet the frictions of real-world economics, party politics, hedging, compromise, and everyday complacency. At the same time, the Republican attempts at stopping this transformation, bolstered by midterm elections that brought into Congress the Tea Party and other uncompromising conservatives, have encountered the frictions of congressional rulemaking, ridicule from the mainstream media, and an assured presidential veto. And so, for those same horrified conservatives, the Republican Party establishment’s opposition is not strong enough or self-assured enough to earn their confidence.

Like a giant wad of saltwater taffy, the country is being pulled by the poles of its two parties in opposite directions: Utopia Now! on the left, and Stop! Back Away! on the right. No one is specific or patiently descriptive about the kind of country and political systems toward which or away from which they want to move. When you’re temporizing and building castles in the clouds, you don’t use actual bricks. But everyone knows the direction they want to go—toward or away.

As in the 1960s, and in the song with which I started this meditation, the notion of revolution is in the air. In the Counter Culture—the environment in which many in today’s Democratic Party cut their baby teeth—the word was explicit. Today, “revolution” has become “transformation,” although the intent is the same: whatever we have at present is bad, throw it away and start over, building something really good. But it’s hard to call for explicit revolution—with its attendant riots, economic and political disruptions, social upheavals, attempted military interventions, and ultimately a change of government—when you’re the party in power. Still, the notion is there, fostered by the mechanics of agitprop and agitpunkt in Alinsky’s Rules for Radicals. All of that Counter Culture rebellion was aimed at creating so much chaos and collapse that the status quo, “the System,” “the Man,” and eventually the government would be overturned and not just gradually transformed. But old habits and attitudes die hard.

And on the right, notions of backlash and abolition—or of outright secession and ultimately civil war—are also in the air. If the transformation cannot be stopped, maybe it can be sidestepped by creating a separate and more comfortable reality. Donald Trump has already suggested a Summer of ’68–type revolt if he does not get the Republican nomination in Cleveland. His calls for political violence are stirring some instinct in his followers—and I can’t believe they are all skinheads and white supremacist thugs, who have never amounted to more than a pimple on this country’s politics. Trump is appealing to many people, though not especially to conservatives, because his solutions are bold, uncomplicated, and uncompromising. Build a wall! Ban the Muslims! Rip up the unfair trade treaties! Return to greatness! (Make the trains run on time!) His followers are no longer interested in half-measures and methodical approaches, either.

Like utopias, revolutions and rebellions have the beauty of imagined simplicity—so long as you are not specific about outcomes, or body counts, infrastructure collapse, famine, or what parts of the status quo actually have to come crashing down. Revolutions and rebellions are also unpredictable: once people start reacting with their guts and smashing anything they don’t like, the wave can move in unexpected directions.

But one thing I can predict: none of the players who started the ground shift this year will end up in power when the tectonic plates finally stop moving. In this I am reminded of the coalition of Social Democrats and other reasonable political reformers in Imperial Russia, who forced the Tsar’s abdication and set up a constitutional government but soon fell to a second revolution by the Bolsheviks, who before that had been nothing more than a pimple on the country’s politics. And I think of the National Socialists, who had been only a comic nuisance—mere street thugs and beer-hall orators—all during the years in which war reparations and the Weimar Republic ran the German economy into the ground, but when things finally fell apart and wiser heads tried to take control of the chaos, the Nazis rose to power.8 Stranger things have happened in recent history than the sudden warping and disappearance of two old, established, cherished parties such as our own Democrats and Republicans.

Like the fidgeting, headless angels of Monte Albano in Black Easter, God knows what’s about to happen, and it ain’t pretty.

1. Buffalo Springfield, “For What It’s Worth,” 1966. This song was popular just as the Counter Culture was taking off in the year I graduated from high school. It seems particularly relevant today.

2. Suggesting that a brilliant political career lies ahead for Bruce-Caitlyn Jenner.

3. When I was in college, we studied economics from Paul Samuelson’s textbook. He taught, and we all believed, that the national debt didn’t matter because “we owe it to ourselves,” and no one was going to call it in. That was then. Now the public debt is held variously by the governments and central banks of China, Japan, the European Union, and by the Caribbean banks. If we lose the reserve status of the dollar, thereby goring their national economies, these governments might just call in our debt. And yet no one in the current election cycle is particularly addressing this astounding overhang of $19 trillion—an amount we can never repay with any amount of simple spending cuts and revenue increases.

4. And when you start taxing wealth instead of income, you have the government probing even deeper into your life and personal finances than it does now. Besides, taxing private holdings would crash the economy by removing the major source of investment for many corporations and charities. See It Isn’t a Pie from October 3, 2010.

5. For this I blame my generation. While those of us who were centrist and conservative went into business and the professions, the campus radicals who protested the Vietnam War and “the System” went into government and teaching positions. We made money and built our careers, while they prepared the next generation for cultural, political, and economic revolution. And now, for the vast majority of young folks today, the air they breathe and the water they swim in is democratic socialism, if not soft, silent Marxism with a small “M.”

6. My personal feeling was and still is that Donald Trump is a Democratic Party plant and the “October surprise.” He is a longtime friend of the Clintons, a friend and patron of East Coast Democrats, and a manipulator and partaker of government influence and largesse. He has structured a campaign designed to appeal to a Democrat’s caricature of the Republican voter: greedy, intolerant, racist, and isolationist. If or when he gets the nomination, I fear he will do something outrageous at the last minute—well, something even more outrageous—in order to throw the election to Hillary.

7. Like the Affordable Care Act. For progressives, this bill was never about ensuring that everyone bought and paid for their own medical coverage through the insurance industry. It was a Trojan horse, loaded with adverse measures like community rating and mandated minimum coverages, designed to break the current private-insurance model of health care, bankrupt the commercial insurance companies, and force the transition to a nationalized, single-payer health care system. But such legislative cleverness is lost on people for whom single payer is an obvious and worthy goal that they feel should be enacted immediately.

8. For further analysis of these historical references, I suggest Bertram D. Wolfe’s Three Who Made a Revolution on the origins of the Bolsheviks and their takeover, and William L. Shirer’s classic The Rise and Fall of the Third Reich on the Nazis. Both old but good and thorough reading.

Sunday, May 1, 2016

The Curse of Subjectivity

First, as preface, let me say that I had a wonderful childhood. I had great parents, a good elder brother, and a wonderful and funny extended family. I wanted for nothing growing up, because my parents filled our home with books and music, took us to museums, had boats and taught me to love the water, and—my father being a mechanical engineer and my mother a landscape architect—filled my young life with interesting questions, gadgets, and insights. I had no doubt that my parents loved and supported me … except, of course, when I hated them and assumed they hated me—but every child hates the world at some point.

Still, I was raised in the 1950s by parents who had gone through the rigors of the Great Depression and then World War II. Even though we were in the glorious decade of America’s emergence as a superpower and, compared to every other country, wealthy beyond belief, it was not a time for coddling children. The Victorian adage “children should be seen and not heard” was still in effect, at least in our household. So my brother and I were expected to go along on all family outings, sit quietly in the backseat, and not bother the grownups. We were to be respectful to all adults. Our childish opinions might be heard, with a smile, as far as the second or third sentence, then dismissed with a pat on the head. Our heartfelt desires were tolerated only on our birthdays and Christmas morning. Our deepest fears were explained away with reason and witheringly cold logic. Our inconsolable tantrums were simply not tolerated.

It was still a time when a measure of adversity was considered good for the soul as well as the body. My mother used to say that every child must eat a peck of dirt, because she understood instinctively that exposure to the world’s microbes was good for the developing immune system. She was unfazed and practical about scraped knees and cut fingers, knowing that childhood was a rough-and-tumble experience. My parents didn’t mind if I went out to play after dinner, because they trusted I would be home before full dark. They knew that shielding a child from the world was a bad way to grow up. They also knew that getting everything I wanted was bad for character development: I had to do chores to earn my allowance, and it was—by design—never enough to buy all that I desired. I had to save, plan, make choices, and know that some wants would go forever unfilled.

Mine was not a cold household, but personal emotions and feelings never really entered into the discussion. I was supposed to get up and go to school, rain or snow, hot or cold, whether I felt like it or not.1 I was supposed to do my chores whether I liked them or not. If I asked to be excused from regular housework or the common courtesies of setting and clearing the table, I had better be delirious and running a fever—or better yet unconscious. Being awake and ambulatory meant you did your chores, your homework, your duty. “I don’t want to” was not a strong argument in our house.

We also weren’t encouraged to take things personally or see ourselves in personal terms. My parents considered it déclassé2 to ask for special favors or to expect special treatment. To assume that one was part of a particular group or class, for good or ill, and marked for particular attention, either favor or scorn, was to separate oneself from the vast community of unremarked individuals, all democratic equals, who deserved our generalized respect, public courtesy, and—within limits—our polite trust. Everyone encountered in their world was treated as a lady or gentleman—that is, expected to behave well, extended the proper courtesy, and offered a helping hand in need—unless and until he or she proved otherwise.

If I had to sum up my parents’ attitude toward the world—and the attitude they expected me to adopt in growing up—it would have been: “This isn’t about you.” That is, the world doesn’t care about me. Life doesn’t owe me anything.3 Other people are not really concerned about my situation, my prospects, and my feelings, one way or another. I am not fundamentally different from that vast community of unremarked individuals.

That is a tough lesson to learn, but it has served me well. First, it has made me relatively immune to, if not suspicious of, personal praise. If someone goes out of his or her way to praise and admire me, especially in a situation where the comment was not conspicuously earned, I know to expect some angle, some purpose that will work to that other person’s benefit and not mine. I know what I have achieved, and it is my judgment which matters, not the world’s or anyone else’s.

Second, the lesson has left me relatively insensitive to insult, if not exactly tough-skinned. If someone makes a general observation about a particular class or type of individual, or a characteristic mannerism or action, or some other comment meant to arouse anger and ill feelings, I tend to ignore it, because I don’t stop to think the comment is about me. I don’t see myself as a member of any special class or group, for good or ill. I don’t fly little flags on my person that distinguish me from those around me. I slide through insults as one of those unremarked individuals, a gentleman, worthy of and assuming the world’s generalized respect.

Third, the lesson has made me dutiful in my relationships. Since “this isn’t about me,” I treat my work, my personal obligations, and my commitments as foremost in my life and matters of personal honor. For forty years, I went to the office every day, prepared to work to my physical and mental limits,4 cope with problems as they arose, and keep moving forward, regardless of how I felt about any of it. The idea of taking, as some of my coworkers would say, “a mental health day” never entered into my thinking. If I was conscious and ambulatory, I went to work and did my best. Duty before feelings was my approach, learned at my mother’s knee.

The gift my parents gave me was an outward focus. I was trained to observe the world objectively, on its own terms, and make the best of it that I could. It was not my place to interpret everything I saw subjectively—in terms of my feelings, my likes and dislikes, or my sense of myself.5 The world is a given, complete in and of itself, and not a subjective reality that I or anyone else can control.

To be raised otherwise as a child is to see the world as revolving around yourself and your feelings. This is a kind of crippling, a lack of development in a person’s psychological immune system, and a lack of sufficient inner personal resources to cope with whatever comes next. Taken to extremes, it can make you distasteful, untrustworthy, and vulnerable to others.

1. Of course, living in New England, we had “snow days,” when the roads were too treacherous to travel and public schools were closed. After a storm the night before, we children would listen breathlessly to the morning news to see if our town had announced no school that day. Hooray! And we never calculated that for every day school was closed in winter, the school year was extended one more day into summer. But, at that age, June was in the far distant future.

2. A word they never used, although they certainly knew its meaning. To them, standing out, making demands, or claiming for yourself some kind of special standing was proof of personal insecurity, perhaps of bad intentions, and likely of insufficient personal resources. To them, capable people paid their own way, met adversity with equanimity, and never complained. My parents would have appreciated the attitude of Duke Leto Atreides in Dune: “Let us not rail about justice as long as we have arms and the freedom to use them.”

3. “The world doesn’t owe you a living” was one of my father’s favorite sayings. And by that he didn’t just mean that the world—the community, the country at large, and the economy—had no business paying me a stipend or an unearned income. That much went without saying. He meant that the world had no place reserved for me: no expectation of a job, a ready niche, or a set of opportunities waiting for me to step into. If I wanted to make my way in the world, I would have to scramble, to learn skills I could sell, to make myself good at something the world needed, and still guard myself and my household against theft, misfortune, and economic downturn. Or as the old American catchphrase put it, when the colonists turned their pigs loose to forage in the forest: “Root, hog, or die.”

4. A supervisor once said Parkinson’s Law—that work expands to fill the time available—never seemed to apply to me. I always tried to do my assigned work with dispatch and then, usually, looked around for more.

5. Of course, I have a sense of self—a rather strong one—but it is not the filter through which I see the world. It is the measure I use to mark my own actions and achievements. I am the judge of me, while other people have their own place with their own rules and values.