Sunday, July 7, 2019

A Strange Esthetic

Red file folder

I have an aversion to the obvious. The world that is apparent to the senses at first look, the answer to any question that comes first to mind, the most widely accepted social and political views—all of these fall somewhere between ennui and ick! with me.1

I prefer the second glance, the deeper meaning, the hidden truth. And that’s if I’m feeling philosophical. In other areas—music, for instance—I like the strange chord progressions, the minors over the majors, the first and fourth over the first and third, and the transitions that feel just a little bit “off” and odd. In paintings, I prefer bold but unusual color combinations, hues, and shadings, or perspectives that are slightly skewed. In photography, I like shots taken from an angle or from noticeably above or below eye level.

So a guiding principle in my writing—my esthetic if you will—is to avoid the obvious. It’s easy enough to tell a story from one point of view, the first-person narrative. Or from the omniscient narrator, who observes and reports all the action at once, sampling the story from inside A’s point of view and then, in the next sentence, popping into B’s head to get the reaction to whatever A has done or said. It’s easy enough to set the story in familiar old Grover’s Corners—the setting of Thornton Wilder’s Our Town—or the anonymous American suburbs that Steven Spielberg mined so artfully for his characters in many of his early movies.

To me, that’s bland and boring. And it lacks style. I much prefer a definite setting with a few kinks and quirks and special needs, like the summertime resort of Amity Island in Spielberg’s Jaws, where everyone is dependent on tourist dollars and so has an economic as well as a visceral reason for hating and fearing the shark. I’m not looking for an Everytown as the place to tell a story, but a town with an edge and maybe a secret.

This is one of the reasons that I have settled on telling my novels through the tangled stories of multiple characters and tightly controlling their viewpoints. In this form, every scene is told from the viewpoint—that is, from inside the head, as if written in first person but with third-person pronouns—of a single character. The narration tells, and the reader knows, only what is available through that character’s senses and perceptions, intuition and insights, and knowledge of the story so far. If I want to show the reader the immediate reaction of another character to what the viewpoint character has said or done, that reaction must be discernible from an exclamation, facial expression, or other clue visible to the viewpoint character—and it will depend on the viewpoint character being the sort of person to notice the reactions of other people in the first place.

Limiting the story to the separate viewpoints of a cast of characters forces me as a writer to consider and choose. That narrowed viewpoint is like someone holding a flashlight in a darkened room. (I’ve used this analogy before.) The viewpoint character’s attention, vision, understanding, and reactions can focus on only one thing at a time. This is like stream-of-consciousness writing, but with the ability for the character to reflect on, recall, and question what he or she is perceiving and doing.

And then I let the reader, who is riding along inside the viewpoint character’s head, have his or her own reactions to the world as the character sees it. For example, if the character sees but does not note or distinguish an obviously misplaced object or clue, the reader is tacitly invited to note it for him- or herself and thereby wonder about the perceptions, understanding, and even the intelligence of the viewpoint character.

This kind of limit on the scope of my writing—and these mind games I play with the reader—force me out of the obvious ways of telling a story. The story doesn’t start just anywhere but in a particular place and time, and with a particular viewpoint. And from there, I am using the perceptions of the viewpoint characters to make the setting unique. Not just a china cup but a china cup with a crack in the rim, or fading paint in its design, or the character’s memories of the cup once sitting in Grandma’s china cabinet. The world in this place is not obvious, not simple, not the expected. It’s a different world, filtered through the perceptions—and sometimes the misperceptions and misunderstandings—of a particular person.

By avoiding the obvious, by looking for the strange, the skewed, the particular, I am forced to make the novel’s setting and circumstances come alive in my imagination and in the reader’s mind. I give the world an element of surprise leading—sometimes but not always—to a consideration of what might be new and different this time.

Of course, there is a danger in taking this aversion to the obvious too far. Some combinations of musical notes are not mysterious but simply discordant. Some color combinations are not only surprising but clashing and garish. And some stories so violate the norms of sensibility and end up in such bad places that readers are not enticed and intrigued but simply repelled. So, as always, the dominant force in the storytelling—as in music and art—is the creator’s sense of control.

The author’s imagination—as with the composer’s ear and the painter’s eye—can run all over the place. The artist can reach for the weird simply in order to be weird. The intent can be to create the strange rather than the interesting. And sometimes, if the artist is in a bad mood, to create the repellent and offensive, to trick the reader into stepping into a metaphorical manure pile and then, presumably, to laugh as the reader vainly attempts to wipe his or her shoes.

So the aversion to the obvious requires an element of restraint. In every art form, there are reader/listener/viewer expectations that are shaped and honed by experience and catalogued for the artist in volumes concerning poetics, music theory, or art appreciation. Stories, for example, don’t always require happy endings,2 but they do have to end in a place and manner that explain the actions that have gone before and render a set of consequences that the reader finds intellectually and emotionally satisfying. To push the story in a direction or to a conclusion that avoids the obvious to the point of not making any kind of sense would be a mistake.

But, that said, I do spend a lot of time between the first impulses recorded in my outline and the final set of words on the page looking for images, responses, and story lines that rise above the obvious first take and arrive at someplace unique, interesting, and sometimes even surprising.

1. In fact, an element of my somewhat strange and dry humor is to state the obvious with a degree of apparent boldness, as if I were drawing a new insight, or absolutely deadpan. This usually gets me funny looks and explains why some people think I’m really kind of stupid.

2. For this, see my recent blog Classic Comedy from May 19, 2019.

Sunday, June 30, 2019

That Apple Story

Apple with bite

In the Bible, in Genesis 2:16-17, the Lord God warns Adam against eating the apple from the Tree of the Knowledge of Good and Evil. He and Eve do so anyway, because the Serpent—Satan in disguise—tempts Eve with flattery. And after she eats, she entices Adam to join her. This act is the one thing in all of their idyllic existence that God has forbidden them to do. It is the one fruit in the Garden, a place of plenty and pleasure, that is denied to them. And when they eat it, they are banished forever into a hard and desolate world to make their way amid toil and suffering.

For Christians and Jews, this is the “original sin.” It is the moment of mankind’s “fall from grace” and separation from God. This one ancient act removes all of us—even the most innocent, the newborn babies, the harmless imbeciles, and saints who strive to do good and create love and understanding all their lives—from God’s perfect love and understanding. This knowledge of good and evil is also, by the way, the major functional capability that separates us from the animals. This is what makes us human.

The wolf hunts the rabbit and eats it. The wolf is not mean-spirited or bad-tempered. The wolf does not hunt the rabbit because the wolf is evil. The wolf also knows nothing of love and mercy. It cannot refrain from hunting out of compassion for the rabbit’s meek little life. So the wolf also cannot become good in any moral sense. The wolf is merely fulfilling its nature as decreed by evolution or, if you will, as designed by God. The wolf does what it needs to survive in its environmental niche. If the wolf knows anything about good and evil, it is a simple equation: hunt and eat equals good; fail and starve equals bad. The wolf cannot even try to be good or sink to being bad. It can only be effective or ineffective in its ultimate purpose, which is hunting and eating rabbits and other small mammals.

Human beings have reasoning power. We also have refined emotional states. We can, we do, and we must think and feel about what we are doing. This is the cap that the prefrontal lobe—the center of the brain’s analytical, planning, directing, and decision-making capabilities—puts on our motor functions and our biological, endocrinological urges. We cannot help but think about ourselves and our actions—and ask questions about them. The content of those questions is not so much biologically based or inherited as culturally transmitted. For human beings are cultural and social beings, no matter whether we were developed that way by evolution or designed that way by God.

If your parents and your society laid emphasis on right and wrong, good and bad, you will likely ask whether what you are doing or intending to do is moral or immoral in that context. But if your parents and society laid emphasis only on utility and efficiency,1 you would probably ask only whether what you were doing or planning was going to work or not, to succeed or fail in your own interest, with less regard for whether you ought to do it or how it might affect others. But you would still ask yourself questions. You would still apply values, although biology and evolution cannot tell you what those values should be.

But later in the Bible story, in Genesis 3:22, God reveals his true purpose in denying humanity the apple. By knowing good and evil, Adam and Eve have become “as one of us”—that is, like God Himself and the higher order of angels. The human pair must be banished from the garden before they also eat from the Tree of Life and become immortal. Immortality plus ethical judgment apparently equals status as a rival godhead.

So the sin was not just disobedience—crossing a line that God had drawn around a certain tree in the garden. There are other sins of disobedience that humans have since repeatedly committed—particularly those carved in stone by the Moving Finger: taking other gods, worshipping idols, abusing the holy name, breaking the sabbath, disrespecting parents (the sins against reverence), as well as murder, adultery, theft, perjury, and envy (the sins against other people and property). None of these sins will damn a person or a race of beings for all time. And none is so grievous that it cannot be forgiven.

But the sin with the apple, coupled with the potential for sinning with the Tree of Life, was the one that could not be forgiven, ever. It consisted of these two first humans—creations of the Lord God—aspiring, whether knowingly or not, to become gods themselves. In doing so, Adam and Eve broke up the order of things. They stepped above their proper place. They were set to become uppity.

This sort of thinking, that human beings were supposed to fit into a divine order and had to know their place, smacks of the medieval faith in what was once called the “Great Chain of Being,” or in Latin, Scala Naturae, the “Ladder of Nature.” It’s an idea that goes back to the Greeks, with Aristotle and Plato. In this hierarchy, all of creation fits into a stepwise progression from inanimate matter, to plants, to animals, to human beings as a special and favored kind of animal, to angels, to God himself.2 This was the way of the world. This was the fixed order of creation.

Stepping outside this order would tend to challenge and subvert the notion of Creation itself. In a theology that depends on a Creator God, and places that God at the apex of all things inanimate and animate—in the old sense of having an animus, or life-giving spirit—the created being cannot rise above the creator. Cause and effect just don’t work that way.

But in the world in which we actually find ourselves—and putting aside cultural myths and creation stories3—the top of this world’s hierarchical existence is human-scale intelligence. Other animals may be larger, stronger, or faster. Some may even have intelligence approaching our own—like dolphins, whales, and elephants. But none of them has yet invented writing or calculus, harnessed fire, or sent radio waves above the atmosphere. And that’s not for the lack of opposable thumbs—or the fact that dolphins and whales live in the sea, without access to fire and electricity—but the lack of understanding, planning, and imagination.

We human beings did not create ourselves, or not in the original biological form. That form with its massive brain evolved from the primates, who go back to ancestors among the apes, and from there back to the mammals, reptiles, amphibians, fish, etc. We did not do a whole lot of self-inventing for the hundred thousand years or so during which we hunted the slower animals, gathered the most edible berries, and experimented with fire when we found it resulting from a lightning strike. But once we settled down in the richest river valleys, discovered agriculture, and invented writing, our days as simply clever animals were over. We invented ourselves by capturing the thoughts of other people and previous generations and committing them to stone, clay, and papyrus in forms that people who had never met the original thinkers could interpret and understand. And from there, we went on to invent culture, science, and—yes—even religion, hierarchies, and notions of “divine right.”

Aside from our meaty parts, which evolution gave us, we are the animal that invented itself. We worked out notions of right and wrong, good and evil, utility and efficiency, and all the other values which we conjure and apply to the universe around us. We worked them out from observation, shared experience, and transmitted culture. Although still Homo sapiens in physical form, we are no longer our primitive, stone-chipping, berry-picking ancestors, any more than we are still H. neanderthalensis or A. afarensis. We have continued evolving in our minds, our culture, and our understanding.

We are the closest thing to gods that exists on this planet. And that will have to do until the Saucer People show up and can teach us something new.

1. You can imagine a society composed mainly of scientists and engineers—say, the Bene Tleilax of the Dune universe—who would place efficacy and utility as values above kindness and compassion. Their world might be mechanically perfect, but still it would be a hard place to live.

2. I suppose subdivisions are possible in this hierarchy, too. For example, a horse, for its general utility and gentle nature, would be ranked higher than the wild ass or zebra. And the more useful cow would outrank the moose and deer. If the Greeks and medieval Christians had known about protozoa, microbes, and germs, they would probably have classed them somewhere between dirt and plants, down there with the corals and sponges. It’s a game scholars can play endlessly.

3. And yes, I know: in this case, the Theory of Evolution and its great chain of ancestors, from the first self-replicating molecules, to RNA, to DNA, through the Cambrian explosion, to the armored fishes, the lobe-finned fishes, tetrapod amphibians, reptiles, and finally mammals to human beings, is just another creation story, revered with semi-mystical status.

Sunday, June 23, 2019

A Lot Like Madness

Midnight writer

Writing a book is a lot like madness. This applies equally to painting a giant canvas or composing a symphony—any major work that is unseen and unshared until all the effort has been expended and the thing is finished and ready for the world to read, see, or hear.

You labor with issues, detect and solve problems, and have daily bouts of triumph and despair about which your family and friends—the people who read your last book, viewed your last painting, or listened to your most recent symphony—and the general public know nothing.

And you know that much of what you have been living with for the past month, six months, a year, or more will never be known, never seen, by those same friends, readers, and the public. You alone will have any memory of your attempts to fit the working outline, or preliminary sketches, or evolving musical themes to the vision you had when you started, which was the whole reason you began this particular project. You alone will understand the compromises you had to make, the choices you considered and discarded, the opportunities that might have taken the project in another, more exciting direction but would have meant going back to the beginning.

For every finished work on the page, the canvas, or the score, there are half a dozen or more echoes and images in your mind of what you might have done, how it might have gone, the pieces that didn’t fit, and the pieces that would only fit after you changed them irrevocably.

Between the inspired first vision and the finished work there exists a rude, splintered, tentative thing that the writer calls an “outline” and the fresco painter used to call a “cartoon.” I don’t know what the composer calls this intermediate step; perhaps it is a “theme” or a “melodic line.” It is the organic1 structure upon which the writer hangs incident, description, narrative, and dialogue. It moves from the beginning, which occurs almost anywhere in space and time, toward a definite and specific end that, when reached, the reader feels is both inevitable and satisfying. But the writer, the creator, knows nothing about inevitability and feels nothing of satisfaction. How it will all come together, viewed from the midpoint of creation, is still a mystery.2

While you, the writer/artist/composer, live with—more often wrestle with—this intermediary creation, you forget that none of it will ever be known to the reader, viewer, or listener. If the outline/sketch/theme serves its purpose, the finished work will be complete and stand alone as an organic whole. Alternative plot lines, overpainted details, and unsung melodies will not exist for the person who receives the work from your hand.

This is the source of what I call the “Frankenstein effect.” While the good doctor sees every mismatched part, every dropped stitch, and all the bits of skin he had to stretch to fit, the person who comes upon the finished production sees only the final effect, the monster. For the doctor, the body under construction remains a sad botch of might-have-beens. For the person facing it on the path, the monster is complete and terrifying—perhaps even perfectly so. And we must remember that, in Mary Shelley’s story, the good doctor is insane.

So … back to the question of madness.

In what occupation, other than the arts, does a person live so completely inside his or her own head? The surgeon is surrounded by a staff of other professionals—the anesthetist, surgical assistant, scrub nurses—who watch every cut, anticipate some of the surgeon’s moves, and are constantly aware of the progress of the surgery. The project manager or construction engineer works with a team of subcontractors, laborers, and logistics specialists, and communicating the project plan and design, so that each of them understands and can execute it, is a major part of the manager’s job. Even the orchestra conductor works through the various musicians and instrument sections, and when the conductor has a particular vision or novel interpretation of the score to execute, he or she must still communicate it to these performers, elicit their cooperation, and sometimes experience their negative feedback.

The writer, painter, or composer—especially when young or just starting out—works alone. If the artist is unsure of him- or herself, it is always possible to begin by copying and thereby studying the masters: those authors, painters, composers whose work first inspired him or her to take up this particular art form. But creating copies and rendering homage are not satisfying for the true artist, or not for long. The artist wants to tell his or her own stories, create a unique picture of the world, conjure up a theme or melody that no one else has ever heard before. A person without this personal vision or driving sense of individuality might then lapse into copying—but without the reverence due to a master—just whatever has become popular.3 There can also be money in simply going through the motions: copywriting, graphic design, elevator music. But it’s not art, not satisfying.

An experienced artist with one or more successes to his or her credit does not necessarily have to suffer the uncertainty of the beginner. After all, there are glowing reviews, public recognition, and actual royalties to confirm his or her talent and bolster the ego. But then, an artist can’t keep cranking out the same work, variations on a theme, over and over again.4 It may be what the public wants, but the job soon begins to look like a rut. And when the established artist attempts something new and different, then he or she is working alone again, and the perennial uncertainty resurfaces. Yes, you know how to write a mystery novel, a police procedural, or decent science fiction, but can you research and write historical fiction or undertake a literary novel? And suddenly you are wrestling again with the question of your own talent.

Madness is a private thing. No one outside your head knows what’s going on in there. And when you try to tell people, you get funny looks or stares of disbelief.5 So maybe your thoughts really are disjointed, delusional, and dissociated from the world of rational men and women. Maybe you’re not at all a writer or any kind of “artist.” As a writer with a desire to tell a new story, a painter trying to capture a new vision, or a musician trying to articulate an unheard theme—and not sure you even have the talent to pull it off, let alone create something that will become publishable, saleable, eventually famous, and perhaps even profitable—you are halfway to the state of the schizophrenic patient who hears voices, battles with strange ideas, and struggles to sort out the affairs of daily living from the madness going on inside his or her head.

The only thing worse than this is not doing it at all. To turn your back on the whole artistic process. To ignore the random ideas that come in the night, or while you’re doing something else in daylight, and not stop to write them down. To let die the whispers from your subconscious that have finally resolved the problem with your next chapter, and remain steadfastly incurious about whether the solution will work or not. To stop being this unique, special, creative, frustrated person, who every day owes a debt to the muse, is responsible for the assignments you set for yourself in the outline, and is dedicated to producing the book a year that you promised yourself. And then you would become just another person in the world, who gets up in the morning, goes to the paying “day job,” comes home at night to drink or watch television, and simply … exists until it’s time to die.

And that’s worse than being just a little mad.

1. “Organic” is Aristotle’s word, from his Poetics. It implies the various parts of a work growing together and functioning as a unified whole, like the parts of a plant or animal. In terms of the reader’s perception, this organic arrangement is mentally and emotionally satisfying because the interrelation of the parts is mutually supporting. For example, Julia is in love with David, but David’s lack of returned feeling makes Julia sad and angry; vengeance and murder ensue. The work’s every part relates to support the outcome. Or as Anton Chekhov said somewhere, if you’re going to show a gun in Act One, you had better fire it in Act Two or Three.

2. Any community or group of writers always discusses “the O-word,” as if the outline had strange powers and attracted superstitious awe. The alternative to developing an outline before sitting down to write the actual book is sometimes called going by the “seat of your pants,” or “pantsing.” For me, the outline is the once-through of the story at the 10,000-foot level. It ensures that I have a complete story in hand when I start to write.
    Sometimes the outline is complete in some places, fragmentary in others, but still it provides a structure, however tenuous, between start and finish. If I don’t have at least a partial outline, the problem is not that I cannot write. Without the promise of a destination, I can write pages and pages of useless description and idle dialogue between characters in search of meaning and direction. And that’s no good to anybody—not even to me, who cannot hope that a plot will coalesce out of all this chatter, like dew on the morning grass.
    My deal with myself is that the outline is there as an assignment, a starting point, and a bridge to the next time I sit down to write. My bet with myself is that, prompted by the outline—especially if the day’s assignment is not a particularly strong scene or chapter—then my subconscious will click in during the writing process and come up with something better … perhaps more exciting … maybe even superb. And so far, the little demons at work down there have always come through on the bet.

3. When I was casting about for an agent, I heard several variations on the pitch to write what I would call a “coattail” novel, to cash in on the popularity of, and public hunger for, the current bestseller. When Harry Potter was new and fresh, the thought of many envious agents and publishers was to get their untried authors to write about “a boy wizard with glasses” in a school for witches and wizards. Before that, it would have been some kind of epic about magic rings in the middle of an elf war. And later, it would be a naïve young woman in the grip of a billionaire sadist. This is always a trap, because by the time you finish and try to sell your copycat novel, the public taste will have moved on to something else newer and fresher.

4. Think of writing the thirteenth James Bond novel or the seventh Harry Potter novel. Sooner or later—unless your plots follow a greater story arc than can be told in just one novel—it all becomes repetitious and unsatisfying. I think Stephen King expressed this frustration best with his novel Misery.

5. My wife read and generally liked my early novels—up until Crygender, which I wrote on assignment from my publisher, and she found the story disgusting. We never really discussed my works in progress, because I had learned long before that you can’t talk out a story with other people: it just makes the ideas become set in place, go cold, and disappear. But one night she asked me what I was thinking about, and I described the plot point I was wrestling with. She later told me that she had thought she might become my inspiration and have a dandy solution for me, but on the spot she could think of nothing. She left me alone with my thoughts after that.

Sunday, June 16, 2019

A Preference for Weakness

Food allergy

It seems that no one wants to appear to be—and act as if they were—strong anymore. Has our culture so decayed that we actually admire—and aspire to be—weak, defenseless, vulnerable, a victim?

Item one is the sudden outbreak of food allergies among the middle class. Yes, some people have acquired allergies that may have something to do with plastics or hormones in the water. Yes, some food allergies are so severe that the person can go into anaphylactic shock and die. But that is not the phenomenon under discussion here. It seems that suddenly everyone is allergic to, or sensitive to, or just afraid of, the protein in wheat known as gluten. People with certain identified autoimmune disorders, such as celiac disease, do get sick when they ingest gluten. And people whose traditional diets did not include wheat, such as those who come from areas dependent on rice cultivation in Asia, may experience a reaction to gluten. But that is not what I’m talking about.

Far more people with a Northern European background than just a few years ago seem to be claiming gluten sensitivity, almost as if it were fashionable to throw up your hands at bread and pasta. Couple this with our wandering dietary pronouncements—meat’s bad, butter’s bad, cholesterol’s bad, butter is better than margarine, fish are good, fish are bad, carbohydrates are energy, carbohydrates are bad, avocados are bad—and you get a population that is suddenly exquisitely picky about what it eats, and no one can adequately explain why. That’s a deplorable state for a human digestive system that was tempered and hardened by a hundred thousand years of hunter-gatherer existence and can usually bolt down and profit from almost anything organic except tree bark and dead leaves.

Item two is the sudden outbreak of reactive offense. Yes, some people say and do truly mean, stupid, hurtful things, and they should be either quietly confronted or politely ignored. And yes, sometimes people intend those things to be offensive in order to get a reaction from the public at large. But it seems that suddenly everyone is incapable of taking the polite route and refraining from noticing or reacting to the rudely offensive. Now everyone almost seems to hunger for opportunities for taking offense. To quote from one of my favorite movies, The Duellists: “The duelist demands satisfaction. Honor for him is an appetite.” And so, it would seem, for many people today the satisfaction of reacting with horror, scorn, and outrage at remarks and gestures, whether meant offensively or not, has become an appetite.

In my view—and in the view of my parents, who raised me in their beliefs—to give in to such failings, to demonstrate such vulnerability, is a weakness. To succumb to precocious food sensitivities and minor discomforts was to make yourself vulnerable to the world. To react with offense at every slight and injury was to allow yourself and your emotions to be played upon by potentially hostile forces. They believed in strength and wanted their sons to be strong.

As a child, I suffered from remarkably few allergies but had a bad reaction to mosquito bites. Let one bite me on the wrist, and my lower arm would swell until my hand was fixed in position. If one bit me on the cheek, that side of my face would swell. For a boy growing up among the wetlands surrounding suburban Boston, mosquitos in summer were an inevitability. My mother sympathized with my condition, but she didn’t agonize about it. I was never taken to the emergency room, and no one suggested any medications to counter the allergy. My parents believed I would grow out of the affliction, and I did.

My parents tolerated few childish food dislikes. My brother and I had to eat what was put on our plates, like it or not—and we mostly liked it, because my mother was a good cook. I had one real aversion, to cheese. I suppose that, in later life, I could excuse this as my sensing that cheese was “rotted milk,” but as a child I just hated the taste, smell, and texture of the stuff. It wasn’t until pizza became popular that I would eat even the mildest of provolones and mozzarellas, and even then never just as a chunk of the stuff on a cracker or melted into a sauce. My father, being a cheese lover, disdained my aversion and tried to beat it out of me as an example of childish attitude. My mother, being of a kinder heart, would make separate portions without cheese for me when preparing cheese-heavy dishes. But she still considered my aversion a sign of personal weakness.

The Protestant ethic was strong in my family. You were supposed to work on yourself, learn as much as you could, eradicate your failings, take up your responsibilities, be dependable and loyal, work hard at whatever you undertook, and be ready to account for your actions. Claiming extenuating circumstances when you failed at something was just not allowed: a properly prepared mind should have foreseen those circumstances and worked to overcome them. Claiming to be a victim of other people’s actions, whether intentional or not, was unacceptable. We were the people who paid our own way and made our own fate. We helped others along as we could, but we did not succumb to their malice and their schemes, if any. We anticipated traps, spotted them in time, and side-stepped them. We were in control of our own lives, and no one else was.

In another of my favorite stories, Dune by Frank Herbert, the female school of political agents and manipulators of bloodlines, the Bene Gesserit, had as an axiom “Support strength.” I do not take this as some kind of class statement, such as favoring the oppressor over the oppressed. Instead, it means finding what is strong in each person and developing it, helping to create people who are capable, self-aware, resilient, brave, and strong. It is an attitude of which my parents would have approved.

The current preference for weakness in our popular culture—expressed in accepting every allergy and food phobia as a sign of personal sensitivity, and accepting every cause for offense as a sign of spiritual purity—is a dead end. It creates people who are incapable, self-serving, brittle, scared, and weak. This is not a people who will take up arms against an invader, or volunteer to land on the Moon or Mars, or do the hundred other daring and dangerous things that previous generations were asked to do and did without a whimper.

But we may not be that nation anymore.

Sunday, June 9, 2019

On Rational Suicide

Line of suicides

The human mind is a mechanism of many moving parts. The human brain has several major components and many subsystems or neural circuits, all built up over millions of years of evolution, based on the rudimentary brains of the early fishes. In the most fortunate of us, the mind can be a fine-tuned instrument, capable of analyzing the mathematics of splitting the atom or taking the vision of a sunset and reimagining it in words or music. For many of us, the brain is a stable platform that enables us to live happily, deal with setbacks and anxieties, and savor moments of pleasure and triumph. But for some of us, the mechanism goes out of whack, the parts don’t cooperate, and the result is a mental life full of misperceptions, missed cues, and chaos.

And then there is the issue of suicide. Most people would like to consider suicide as a form of mental illness, a breakdown in the brain’s systems. For them, it is something that might happen to other people, like clinical depression or schizophrenia, but nothing that might be waiting around the next turn of events to grab them, spin them on their head, and kill them. For these people, suicide is never a rational act, never what a sane or well-balanced person does.

This view is reinforced in our society and in Western Civilization with cultural shaming—that the suicidal person was not strong enough or did not consider his or her own value or responsibility to family, friends, and community—and with religious prohibition—that the act of suicide, self-murder, is contrary to God’s law and a usurpation of God’s will, which alone tells us the direction that our lives will take and the point at which they will end. But these cultural arguments and prohibitions against suicide are not universal. For example, in feudal Japan—at least among the samurai order—ritual suicide was linked to notions of obedience and personal honor. A samurai who failed his lord in any significant way could only atone by taking his own life, and this was considered proper and good.

In my view, which is based on notions of evolutionary biology, suicide is the unfortunate triumph of the prefrontal cortex—the center of thinking, organizing, projecting, and deciding, as well as expecting, hoping, and dreaming—and its supporting circuitry in the hippocampus and amygdala, which coordinate memory and emotion, over the hindbrain—whose medulla is the center of autonomic functions like breathing, heartbeat and circulation, swallowing, and digestion—and the reticular activating system in the brainstem that coordinates consciousness. Suicide is the temporary—although ultimately final—triumph of human thinking over the brute, animal functions of the body.

Suicide is the collision of the brain mechanism that creates expectations, performs planning, perceives hope, responds to dreams, and does whatever else drives the mind and the soul forward into the future, with what the mind perceives as reality, the end of hopes, plans, expectations, or whatever was driving it forward. And when that forward-thinking part of the mind gives up and dies, when it contemplates all the damage and despair that continued living might cause, what going forward without hope or dreams might mean, then the rational mind can overcome the brain mechanisms that keep the body breathing, eating, and living. The decision to act in harming the body, whether by self-starvation, asphyxiation, overmedication, cutting of blood vessels, or other ways of knowingly causing irreparable and permanent damage, is the temporary but permanent defeat of those systems that keep the organic parts functioning and moving forward blindly in time.

And again, there are cultural traditions and physical mechanisms that work to delay or even nullify this collision between the mind and the body.

On the cultural level, good parenting will teach a child that, first, we don’t always get what we want or deserve in life, and that disappointment and denial are a part of the human condition. Common sense will teach a person eventually that, while dreams and expectations are nice, they are not the basis of reality, that a person must work for what he or she wants, sometimes sacrificing dreams for mere survival, and that plans sometimes go awry. Wise teachers and friends will tell a person that hope does not live in one shape or future but is a free-floating quality, and that a person can find strength in the smallest moments of peace, visions of beauty, or gestures of good will. And many religious traditions, the Zen parables especially, teach that expectations and assumptions are phantasms of the mind and not a reflection of reality.

On the physical level, most of the methods for ending a life—like starvation, asphyxiation, and cutting—are painful and create their own immediate need for the forward-thinking mind to stop, reconsider its decision, and actively reverse course. Other methods—like a drug overdose—take time to create their effect, and then the busy mind focused on the blankness of the future may unmake its decision to choose death and so choose medical help. Only the methods which put in train an immediate and irreversible course of events—like jumping out of a high window or pulling the trigger on a gun aimed at the head—offer no immediate pain and allow for no useful second thoughts.

Why am I going on about suicide like this? First, as an act of individual courage—and in defiance of most social taboos—I long ago, even as a child, decided that no thought would be unthinkable for me. If it can be imagined, it can be explored rationally and soberly, as a fit subject for the human mind. Second, there have been times in my own life—not often and not recently—when I have felt the pressure of lost hope, of a great, gray blankness lying ahead with nothing to counter it, no expectation to fill it, and no reason to avoid it.

And then my dear wife, my partner in forty-one years of marriage, died a year and nine months ago. When you lose someone with whom you have shared the majority of your life, aside from the immediate grief, you also have moments of simple emptiness. Everything the two of you did together—the events, memories, pleasures, and inside jokes you shared—is irretrievably gone. The daily routines you built up in living together are now meaningless. The habits you yourself curtailed and the actions you avoided because you knew she did not like them are now empty gestures. And when so much of your life has been taken away, you can sense that great, gray void in the years ahead. In many respects, you become a ghost yourself.1

There have been times since her death when I thought I would simply “go under.” That it would not matter if my hindbrain stopped the automatic functions of pulling on my lungs, beating my heart, and cuing my desire to eat, and that I would just fade away. Now, this is not a warning to my family and friends. I am not going to follow any of this up with positive or passive action, because I share those same cultural traditions and physical mechanisms designed to prevent suicide. Not, that is, until my own brain is so far gone with dementia that I become disabled, unable to care for myself, and so mentally isolated that I cannot move forward. And that’s a vow I made long ago.

But what I am trying to say is that I can understand the impulse toward suicide. It is part of the human condition. It comes about when an animal brain grows so advanced, so complicated, and so beholden to its own vision and hope for the future that the denial of that vision and hope leads to irremediable despair with no alternative.

Suicide is not an irrational act. Its very possibility is part of what makes us human and not animal.

1. Some days, as I move around our old apartment, I have the sense that perhaps I have entered an alternate reality: that I was the one who died and now drift through my daily routines as a ghost trapped in this empty place, while she lives on, somewhere else, in a future filled with hopes, plans, and expectations.

Sunday, June 2, 2019

The Downside of Gutenberg Economics

Pipe organ

18th century pipe organ
at the Monastery of Santa Cruz, Coimbra, Portugal

A friend of mine had a long professional career as an organist and singer in jazz clubs and cocktail lounges. Upon retiring about a dozen years ago, he spent the healthy sum of $18,000 to purchase a top-of-the-line Hammond electronic organ for his home so that he could continue practicing and playing. This organ has now developed problems involving its computer chips and, despite the efforts of various expert mechanics, including the company now owned by Suzuki Musical Instrument Corporation, is considered irreparable. An instrument that, if it were a human being, would hardly have passed puberty, is now a nice wooden cabinet enclosing a couple of dead circuit boards. And therein lies a tale of the late 20th and early 21st century.

The original organ, familiar from your neighborhood church or a 1920s-vintage movie theater, is an arrangement of pipes connected to valves and an air box. Some of the pipes have notched throats, like a whistle, and some have embedded reeds, like an oboe. All draw breath from the blower feeding the air box or plenum. When the organist presses a key on the console, it activates the valve, letting air into the corresponding pipe and making it sing. This is a design going back 2,300 years to the Greeks, and the oldest extant organs go back about 600 years. If one of these organs breaks, it can be repaired. The job takes an expert who generally has apprenticed himself—because this is rarely a skill undertaken by women—to an older generation of experts and devoted his life to the job. Maintaining a church or theater organ is expensive, probably about as much as building one from scratch, but it can be done.

The original Hammond rotary organ was not based on pipes and blowing air but on something called a tone wheel. Each key on the organ corresponds to an iron disk that has a series of notches cut into its edge. The number and spacing of the notches are mathematically configured so that, with all the wheels rotating at a constant speed on a common shaft, a sensor facing the edge of each wheel picks up a signal in impulses per second that matches a musical tone—for example, 440 impulses per second generates the sound of the A above middle C. To recreate the harmonics that a pipe organ makes by pulling stops to bring in the voices of pipes in various relationships above and below the note being keyed, the Hammond organ uses drawbars to pull in more or less of the electric signals from each tone wheel above and below the wheel being sampled. If one of these original Hammonds breaks, it can be repaired. Again, the job takes specialists who may have to fabricate some of the parts, but the system is mostly mechanical and can, for a price, be restored to working order.
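The arithmetic of the tone wheel is simple enough to sketch. The numbers below are illustrative assumptions chosen to make the math come out evenly, not the actual Hammond gear ratios or wheel tooth counts:

```python
# Toy model of tone-wheel pitch generation: a notched iron wheel
# spins past a fixed magnetic pickup, and the pickup registers one
# impulse per notch per revolution. Shaft speed and notch counts
# here are hypothetical, not Hammond's real specifications.

def tone_frequency(notches: int, revolutions_per_second: float) -> float:
    """Impulses per second produced by the spinning wheel."""
    return notches * revolutions_per_second

shaft_speed = 20.0  # rev/s, an assumed common-shaft speed

# A wheel with 22 notches at 20 rev/s yields 440 impulses per
# second, the A above middle C; doubling the notches to 44 gives
# the A an octave higher at 880.
print(tone_frequency(22, shaft_speed))  # 440.0
print(tone_frequency(44, shaft_speed))  # 880.0
```

The same relationship explains why all the wheels can share one constant-speed shaft: each pitch is set purely by how many notches are cut into its wheel.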

But the organ my friend has, and all the Hammond organs you can buy new today, are electronic. There is no pipe, no tone wheel, nothing mechanical and touchable. The sound of a pipe organ or the cherished buzz of the tone wheel organ has been electronically sampled as a wave that is then encoded in a computer chip as a piece of digital information, the same as if it were a number in a spreadsheet, a word in a document, a line in a graphic, or a pixel in a photograph. When you press the key on such an organ, it calls up the corresponding note from digital memory. When you adjust the drawbars, they pull in the harmonics of other sampled notes. But it’s all just bits and bytes. If your electronic organ has a burned-out chip, and that chip is no longer made or available somewhere in stock, your organ is dead.

So, the irony is that with the right people who know how to straighten and fabricate the right parts you can fix a 200-year-old pipe organ or a 50-year-old tone-wheel organ, but nothing can resurrect a 20-year-old electronic organ, piano, keyboard, computer, cell phone, or other digital device if the defunct chips are not available.

And the chips are not available because computing technology—under the force of Moore’s law1—moves forward so rapidly. Designing, plotting, masking, and photoengraving computer chips is a function of what I call “Gutenberg economics,” where the creator makes a single template for a book page, a chip design, or any other intellectual property and then prints off as many as are needed at an ever-diminishing cost per unit.2

The downside with all of this, of course, is that once you have performed all these preparatory steps and finished your print run, you are ready to move on to the next project. If you made computer chips with capacity X and capability Y a dozen years ago, and today’s chips have a thousand times that capacity and a million times that capability but operate slightly differently—say, with different inputs and outputs on the contact points in modern circuit boards—you are not going to go back and do a limited run of antique chip designs just because someone somewhere has a board in an organ, desktop computer, or cell phone with a burned-out chip. Like the cost of makeready on a printing press, the costs of setting up for the fabrication of a semiconductor design are the largest part of production. No one pays to start the process over again for just a few copies of the printed book or newspaper—or for a couple of hundred computer chips to repair outmoded products.
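The Gutenberg economics at work here reduce to one line of arithmetic: cost per unit equals the one-time setup cost spread over the run, plus the tiny marginal cost of each copy. The figures below are invented for illustration; real fabrication costs are far larger:

```python
# Fixed-cost amortization, the heart of "Gutenberg economics."
# Setup and marginal costs are hypothetical round numbers, not
# real printing or chip-fab figures.

def cost_per_unit(setup_cost: float, marginal_cost: float, units: int) -> float:
    """Average cost of each copy in a run of a given size."""
    return setup_cost / units + marginal_cost

setup = 1_000_000.0   # one-time masking, tooling, makeready (assumed)
marginal = 0.10       # incremental cost per chip (assumed)

# A ten-million-unit run: the setup cost nearly vanishes per chip.
print(cost_per_unit(setup, marginal, 10_000_000))  # 0.2

# A 200-chip repair run for outmoded organs: ruinous per chip.
print(cost_per_unit(setup, marginal, 200))         # 5000.1
```

The asymmetry in those two results is the whole story: the same design that cost pennies apiece in volume becomes absurdly expensive to rerun for a handful of replacements.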

So, while the computer chip itself might be virtually immortal, the device in which it’s installed is susceptible to the least defect in just one of the many chips that power it. Burn out a memory component that, on the scale of a large-run chip fabrication, might originally have cost less than a dollar to make, and your $18,000 Hammond organ is electronic waste in a nice cabinet.

In the same way, you can still buy a 1929 Model A Ford and get it serviced and repaired, because there is a small but loyal following for these cars, and gaskets, bearings, carburetors, and other parts can still be sourced, refurbished, or fabricated. You can even restore a 1929 Duesenberg and have yourself an elegant town car—if you have enough time, patience, money, and access to a good machinist. But let a chip burn out in the engine management or fuel injection system of your 2009 Toyota or the charging system of your 2019 Tesla, and the chances of finding a replacement become, over the years, vanishingly small. Once the car company notifies you that it will no longer support your model year, you are pretty much on your own, and your treasured vehicle becomes a ton and a half of useless scrap.

In the same way, a document hand-lettered on parchment or printed on acid-free paper can survive for centuries. It will still be readable, even if you require a linguistic expert to translate the words. But recovering the early draft of your novel from a twenty-year-old 5.25-inch floppy disk or a ten-year-old 3.5-inch floppy can now take expert help and, within a couple of years, may not be possible at all. Despite all the standardization the computer industry has made in disk formats and plug sizes and capacities, it’s a sure bet that one day your two-terabyte hard drive or 128-gigabyte USB thumb drive will be unreadable, too.

In the same way, many of us music lovers have had to recreate our collections as vinyl gave way to tape, gave way to disk, gave way to MP3. And our movie collections have moved from VHS—or, spare me, Betamax!—to DVD, to Blu-ray, to MP4.

This is not planned obsolescence or some evil scheme by the record companies to make you buy The White Album again. This is simply the advance of technology. Last year’s hip replacement is just not going to be as good as the metal or ceramic joints that orthopedic surgeons will be installing in the year 2028. You might remember fondly that 1970 Ford Mustang you once had and want to own again, but its engine mechanics, gas mileage, and service life will not be as good as today’s model. And when your 2019 Mustang bites the dust because of a dead computer chip, the models that will be available then will be even better yet.

As I’ve said before,3 once the Industrial Revolution got underway—then morphed into the Information Age, and is now fast becoming the Automation Revolution—we all got on a fast escalator to the future. Some of it we can already see. Some only science fiction writers can imagine—and sometimes they get it wrong.4 But it’s all going to be amazing.

1. Moore’s law says—or used to say, ever since Gordon Moore first stated it in 1965 and refined it in 1975—that the transistor count, and with it the processing power, of computer chips doubles every two years. I don’t know if that still holds, because the effort to cram ever more transistors onto a silicon wafer is now approaching physical limits. And new concepts in quantum computing may bring along even greater advances in computing power.

2. See Gutenberg and Automation from February 20, 2011.

3. So many times I won’t bother with a reference.

4. See, for example, Robert A. Heinlein’s early fiction, where we travel in space but still have to program electromechanical computers the size of a room.

Sunday, May 26, 2019

Allocating Scarcity


Teasing with apple

In economics, the primary facts of life—and the first set of curves you learn—involve supply and demand. The value of any good or service is fixed by the intersection of the supply curve, from great availability to great scarcity, with the demand curve, from scant interest to intense need or desire. Dirt is of little value, because you can pick it up anywhere and nobody—other than gardeners, and they require a special quality of organic content—has much use for it. Diamonds have great value because they are relatively rare and almost everybody—at least those who love jewelry or seek a store of value—wants one. The physical space where the supply curve (the seller) and the demand curve (the buyer) meet is called a marketplace.
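That intersection can be made concrete with a toy pair of linear curves. The coefficients below are invented for illustration; real markets are never this tidy:

```python
# A toy linear market. Quantity demanded falls as price rises;
# quantity supplied rises as price rises. The market value is the
# price where the two curves cross. All coefficients are
# hypothetical.

def demand(price: float) -> float:
    return 100.0 - 2.0 * price   # buyers want less at higher prices

def supply(price: float) -> float:
    return 3.0 * price           # sellers offer more at higher prices

# Setting 100 - 2p = 3p gives the crossing point: p = 20, q = 60.
equilibrium_price = 100.0 / 5.0
print(equilibrium_price)           # 20.0
print(demand(equilibrium_price))   # 60.0
print(supply(equilibrium_price))   # 60.0
```

Shift either curve and the crossing point moves: a famine that steepens demand for wheat raises its equilibrium price, which is exactly the "wheat might trade like diamonds" case below.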

One of the enduring problems of human society is how to allocate scarce and highly desirable goods and services. A system of pure barter doesn’t cut it. How many bushels of wheat must a farmer grow to trade for a diamond to put on his fiancée’s finger? And what is the local jeweler going to do with the suddenly acquired hundreds or thousands of bushels after the exchange is made? To facilitate the trade, human societies almost immediately invented the monetary system, establishing physical markers to represent relative value. And those values, again, were set by supply and demand. Diamonds trade for a lot of markers; bushels of wheat trade for comparatively less—unless there’s a farming crisis and famine, in which case wheat might trade like diamonds because everyone has to eat.

Interestingly, it doesn’t matter what the markers are made of. Gold and silver coins have worked for some societies, because their metal content is relatively rare and desirable and so represents a store of innate value. The tribes of eastern North America used beads made from white and purple shells, called wampum, that traded like money—having innate value only because the purest colors were relatively hard to find and the beads were pretty. The Chinese emperors from the 4th century BC to the 20th century AD struck copper and bronze coins, called cash, that had value only because the government would redeem them. And the Chinese were the first to use paper printed with special markings as money. The point is, money has value because other people will accept it in exchange for goods and services.

In the Star Trek universe, there is no scarcity. All the energy a space-faring society could need is supplied by matter-antimatter reactions. All the goods they want are provided by pattern replication, from food and clothing to formed metals, and presumably gold and industrial diamonds—everything, curiously, except for the dilithium crystals needed for the energy conversion. Those crystals have to be mined and traded. And in Star Trek they don’t use money, either. All the matter-antimatter converters are presumably owned collectively and the replicators are found everywhere. I suppose people might make some kind of living by inventing and selling unique replicator patterns: a new flavor of ice cream or a new kind of fabric for clothing. But no one aboard the Enterprise seems to be dabbling in this, and they would probably give the patterns away for free as a community good.

The existence of scarcity and the rise of market values and money systems driven by need or desire bother many people. Inevitably, throughout history, those who have been able to garner and stockpile scarce or desirable commodities, or the value markers needed to buy them, have gained power over others in their society. And, always, this garnering and stockpiling have carried the whiff of unfair practice. The person who saves up canned goods and water in a time of disaster, when their neighbors are starving and thirsty, and will only trade them for more money than they can command in good times, is generally despised as a hoarder. The person who gets a government contract to operate iron and bauxite mines in wartime and to build tanks, planes, and ships, while everyone else is scrimping and saving under government rationing, is regarded as a profiteer. Even the movie producer who creates a colorful fantasy full of imaginary lust and action and then, through advertising and the manipulation of public tastes, convinces millions of teenagers to part with their $10 for two hours of vacant, mind-numbing entertainment, is regarded with suspicion.

At various times in our history, small societies have tried through enforced sharing to overcome the advantages some people may gain from scarcity and the opportunities they can reap by manipulation. The feudal manor and the Israeli kibbutz are such places. The cobbler in the manor does not trade the shoes he makes for a certain amount of the farmer’s wheat; he just commits to make shoes for anyone in the community who needs them and then takes his share of the community’s supply of food, housing, and clothing. The same for the miller and baker, the mason, and the weaver. If the community is self-sustaining in the raw materials required to support this communal exchange, everyone is happy. If the community lacks some things—for example, not enough cowhide from the current consumption of beef to provide leather for all the shoes that are needed—then people may do without, or wear last year’s shoes, or go barefoot when the season allows. Fat times and lean are amicably shared.1

All of this works on the small scale, where everyone in the manor, village, or kibbutz knows everybody else. People in these close-knit communities don’t have to look at their neighbors’ dinner tables to see what they’re eating, at their feet to see what they’re wearing, and in their cellars to see if they are hoarding either food or shoes.

But once your society grows larger than, say, the county level, things start to change. Then you don’t know what other people far away have or need, or what their advantages and disadvantages are, and whether or not they may be hoarding the goods you require to survive. The human impulse to share dwindles remarkably the farther you move beyond family, friends, and neighbors. The willingness to sacrifice and do without for the presumed benefit of strangers is not part of the human psychological makeup.

Marx thought that in the perfect society, once the lesson of “From each according to his ability, to each according to his needs” had been learned and absorbed into the human psyche, the need for a state to direct allocation of resources would wither away. People would just go on indefinitely producing whatever they could, sharing the abundance in good times, and enduring their hunger pangs in lean times.2 But human nature does not change with a few economic slogans—or with an armed revolution, state enforcement of quotas, and the liquidation of recalcitrant classes.

Every system of distribution that tries to ignore market values and monetary systems, and that grows bigger than a small town, requires a stern hand to measure regional and national need, set production quotas, and align distribution. And there are two problems with this.

First, it makes no allowance for human variability, for differences in taste and perception. “You like pistachio ice cream, comrade? Too bad, because the popular flavors—and all we have the resources to make—are chocolate and vanilla.” People in the aggregate are numbers, ciphers, zeroes, not real human beings with actual likings and loathings. In a marketplace with available capital in the form of unallocated money, someone would be inspired to figure out the percentage of people who actually like and would be willing to pay a bit extra for pistachio ice cream, then he would build a plant to make enough to serve their desires. That inspiration would create a market niche, provide jobs, and make people … happy. But in a socialist state with production experts and yearly quotas, everyone is overworked just trying to figure out how much chocolate ice cream to make and ship to California—if we even have enough dairy production in Minnesota to support such a frivolous commodity, let alone the refrigeration capacity to keep it cold in the summer—to bother with any marginal taste groups.

Second, those production experts and planners are human beings, too. Even when their work is supported by spreadsheets and computers, they are susceptible to the same frailties and failings as anyone else. The inspiration to make a market niche in pistachio ice cream wouldn’t occur to them, because the state and the other production planners would not let them benefit from the impulse to serve that market segment. Worse, they could too easily see that what would benefit them was offering a small favor to, and accepting a small gratuity from, a particular industry or region or segment of society. It is human nature to look out for oneself, one’s family, and one’s friends before taking care of some hypothetical public good. Unless the production and distribution are run by wholly artificial means—computer intelligences, dispassionate angels, or clockwork mechanisms—the possibilities and opportunities for graft and favoritism will undermine the purest instincts toward fairness and equality.

Scarcity will always exist. Even in a world of unlimited energy generation and pattern replication, there will still be material goods like artworks and handicrafts, services like medical specialties and innovations, and experiences like ziplining through the rainforest or trekking to the peak of Mount Everest that someone will want more than anything else and more than his neighbors might want. There will always be objects of desire that people will crave and fight for, no matter how stoic and rational they are supposed to be.

The only way to accommodate human needs and desires fairly, in a way that will make people happy, is to let the marketplace work. Let people make decisions for themselves about what they will value in life, how hard they want to work for it, and how they will spend their money. Do we occasionally need a regulating hand to see that some people without resources are not left to eat out of dumpsters or starve? Of course, that is the purpose of the safety net, and no one disagrees with it. But there is a huge economic leap from letting public servants watch the marketplace and occasionally make necessary corrections—to giving them total control of the whole mechanism.

Scarcity exists not only in the belly or in physical resources. It also exists in your mind and your desires. And that is a corner of the world and the human spirit that no dispassionate, socialist governmental structure can reach or touch—or satisfy. Nor should it try.

1. And no one in these communal societies needs luxuries like gold and diamonds—except the feudal lord, who can always go raiding for them. If the community does come upon such valuables, the people generally consent to use them to decorate and enrich the local church rather than bedeck only a select few members with jewelry.

2. And note that Marx predicated this method of distribution on the abundance that would be created by “unfettered” production in a fully developed socialist society—presumably without the restraints on labor placed by limited capital, human laziness, and human greed, “after labor has become not only a means of life but life’s prime want.” If this isn’t magical thinking, I don’t know what is.