Sunday, December 10, 2017

Learning as a Form of Evolution

Neuron cells

I’ve been making some existential comparisons lately—Life Like a Sword and Language as a Map—so I thought I would round out the sequence of metaphors by looking at the way we form our knowledge.

The popular conception is that we acquire new knowledge the way a coin or stamp collector makes a new acquisition: pick up the fact, place it in our memory box, recall and retrieve as necessary. Our head then becomes a database of acquired facts, like a phone contact list or a Rolodex of business cards. Take one and slot it in. Modify it with an overlying note if any of the information happens to change. And remove the card—that is, commit ourselves to forgetting the subject heading and its content—when the information is shown to be wrong or is no longer of use.

But is that really how it works, all neat and tidy, like filing little pasteboard squares?

Actually, our brains are representational devices. We interpret our sensory input and apply it to the process of assembling a model or representation of what we know and think about the world outside our heads. We are model makers, map makers, myth makers, and story tellers. What we learn goes into a vast web or network or congeries of impressions, summaries, conclusions, and projections that collectively represent the world as we know it. We are constantly testing and expanding our worldview. We are always asking, consciously or not, “Is this the way the world really works?”

We are constantly—although perhaps unconsciously—looking for comparisons with and similarities to the things we already know. When we get a new fact or form a new impression, we test it against our worldview, the structure of our model of the world. We ask, “How does this fit in?” And if the fact or impression conflicts with what we know, our brain goes through a small crisis, a scramble for immediate understanding. We test the new knowledge against its background: “Where did I get that idea?” “What was its source?” and “Do I trust it?” We also experience a small—or sometimes large—tremor in our worldview: “Why do I actually think that?” “Could I have been wrong?” and “Is this new knowledge a better way of seeing things?”

The habit of referring back to our internal model runs deep. For example, when learning a new language, such as French from the perspective of an English speaker, we leverage the grammar and the words we already know and understand. When we learn a new French word like chien, we don’t immediately associate it with a four-footed pet of a certain size range, disposition, coloring, and similar physical details. Instead, we link it to the English word dog and then concatenate onto chien all the past impressions, learned attributes, and personal feelings we already associate with the concept in English. In the same way, we adapt French grammar and syntax to our known English way of speaking, and then we extend our knowledge with new concepts, like the issue of nouns and objects that we normally think of as inanimate and sexless now acquiring a specific gender. By learning a new language, we expand our general knowledge of both our own language and its place in the way the rest of the world communicates.

In this sense, each piece of new knowledge—both the facts, impressions, and understandings that we acquire by the happenstance of general reading and daily experience, and those we acquire by conscious study such as a new language, or the history of an unfamiliar place and people, or a closed curriculum like mathematics, physics, and chemistry—each discovery is making a series of minute changes in the brain’s internal environment. And the effect that these new facts and impressions have on our existing ideas—the current model or myth that is running in our heads—is like an organism’s response to accidental modification of a protein-coding gene: the new knowledge and the resulting change in our worldview either enable us to live more fully, completely, successfully, and confidently in the environment that we actually inhabit, or the changed worldview contributes to our failure to compete and thrive by causing us to interpret wrongly, make mistakes, and suffer feelings of doubt, denial, and depression.

But some facts or interpretations—perhaps most of them—don’t cause an immediate change in our relationship with the outside world. We can carry a bit of false data, a misremembered fact, or an untested impression in our heads for months or years at a time without it affecting our personal relationships, our social standing, or the decisions we make. And then, one day, we will learn something else that will contradict the comfortable model and bring on the crisis. In the same way, some mutations to a gene have neither a helpful nor harmful effect in the current environment. The modified gene and the changed protein it makes gets passed down from generation to generation without challenging the fit of the organism to its environment. But then the environment changes, and the organism is either better able to compete under the new conditions, or the changed environment shows up an inherent weakness, and the organism either thrives or dies. Sometimes the environment doesn’t have to change, but another mutation enhances the effect of that earlier genetic change, and the organism either excels against other members of its species or fails to compete.

As an example of the mutability of our worldview, both as individuals and as a collection of academics building a body of scientific or historical interpretations, consider the advance of human knowledge in the field of genetics.

At first, back in the early 1950s and the world of Watson and Crick, we valued the newly discovered DNA molecule and its messenger RNA strands solely for the proteins they made inside the cell body. Genetic scientists held to what was then called the “central dogma” of molecular biology, that DNA transcribes to RNA, which translates to proteins. Geneticists could point to the start and stop codes associated with the protein-coding genes. By finding and fishing out these codes, they could pull out sequences of DNA, copy them over to RNA, and analyze the resulting codons, each calling for one of the twenty possible amino acids in the developing protein string. These twenty amino acids are the universal building blocks for all of an organism’s complex proteins—in fact, for all life on Earth.

This central dogma held until about the year 2000, when the Human Genome Project and Celera Genomics published draft sequences of the entire three billion base pairs in twenty-three human chromosomes. Analyzing the code, geneticists then discovered that only about ten percent of this DNA was used for making proteins.1 So what was the other ninety percent doing? Many scientists figured that this genetic material was “junk DNA,” old code left over from our prior evolution, from genes that coded for proteins that our evolutionary ancestors might have needed as fish or reptiles, but with no meaning now and so abandoned to gradually mutate into genetic mush.2

The new facts about the frequency of protein-coding genes forced a reevaluation—a modification of the scientists’ mental model—of the nature of the genome. The scientific community remained divided between the “junk” hypothesis and simple puzzlement until about 2004, when a new bit of knowledge emerged. Botanists working with certain flowers discovered that a short strand of a particular RNA, when introduced into a seed, can change the color of the flower. They hypothesized that the RNA either promoted a gene that had previously been silent or blocked a gene that had previously been expressed. They dubbed this effect “RNA interference,” or RNAi.

Soon, the genetic scientists were studying a class of short RNA strands, about fifty bases or less, that they called “microRNAs,” or miRNA. They began to see that these bits of RNA were used inside the cell nucleus to promote genes in different patterns of expression. And then Eric Davidson at Caltech, by working with sea urchin embryos, mapped out the network of genes in an undifferentiated embryonic cell that produced bits of microRNA to promote other genes to make different miRNAs—all without coding for any proteins. Depending on a cell’s position in the sphere of identical embryonic cells that develops shortly after fertilization, the pathway through this miRNA network changes. Some of these cells, through the proteins they eventually produce, become the internal gut, some the epidermal surface, and some become spines. By comparison with another organism far removed from sea urchins, the Davidson laboratory could trace out a similar network—which suggests it operates in most animals and plants, and likely in humans today. This miRNA network is the timing and assembly manual by which some embryonic cells in our bodies become liver cells, some brain cells, and some bone cells.

This discovery addressed a question that apparently no one had ever considered. If the entire genome is for producing proteins, then why doesn’t every cell in the human body make all the proteins required by all of the other cells? Why don’t neuron cells pump out liver enzymes and bone cells create and then, presumably, ignore neurotransmitters? Davidson’s work suggested that, while ten percent of the human genome makes proteins, functioning as the parts list of the human body, the other ninety percent is the sequential assembly manual.

But the story didn’t end there. Other geneticists noted that simple chemical compounds called methyl groups (CH3) often became attached to the promoter regions of genes—usually at sites where a cytosine base is followed by a guanine—and inhibited the gene’s expression. They at first considered this an environmental accident, randomly closing off gene function. But they also noted that an enzyme in the nucleus called “methyltransferase” worked to add these methyl groups to newly replicated DNA strands during cell division. If methylation was an accident, why was there a mechanism to preserve it in daughter cells?

From this question, the scientific community began studying methyl groups attached to DNA and learned that this was the cell’s way of ensuring that those brain cells didn’t begin producing liver enzymes and that bone cells didn’t make neurotransmitters. Once a cell had differentiated to become a certain type of tissue, methylation locked out its other possibilities.3

So the community of molecular biologists had to work gradually, discovery by discovery, to develop and refine their model of human genetics: from the central dogma that protein production was the only purpose of all DNA, to a whole new use of RNA to differentiate cell types, to the inclusion of “accidental” methyl groups to lock in that differentiation.

Every science goes through such an evolution and refinement of knowledge, discarding old ideas, patching in new discoveries, building, tearing down, and rebuilding the model, each time coming closer to what’s really going on in the world. In the same way, every human being learns certain skills and truths, discards old notions, patches in new understandings, building and tearing down his or her worldview, until the person attains something approaching … wisdom.

1. This did not leave them with too few genes to account for all of the body’s proteins, because they also discovered that many genes allow alternative splicings. The scientists already knew that some gene sequences had “exons,” or patterns that expressed the code for the protein, interspersed with “introns,” or non-coding intrusions into that pattern. What they learned from the human genome was that the promoter region ahead of the start code could specify different ways to combine those exons to create variations in a family of proteins. Nature is more wonderful than we can imagine.

2. Not everyone agreed with this assessment. The human body spends too much time and energy replicating the entire genome each time a cell divides for us to be carting around this much junk. After all, the phosphate groups (PO4) that form the backbone of each paired strand of DNA are also the working part of the cell’s energy molecule, adenosine triphosphate. And phosphorus is not so common, either in nature or in the body, that we can afford to hoard it and squander its potential with junk DNA.

3. Methylation also would explain why the early methods of reverting a body cell to a type of embryonic cell, by starving it until the cell almost dies, worked so poorly. This was how the scientists in Scotland cloned Dolly the Sheep, and in order to achieve the one viable Dolly, they had to sacrifice hundreds of attempts at cloned embryos and raise not a few genetic monsters. The starvation method must have essentially stripped out methylation as the cell approached its death, reverting the genome to its undifferentiated state.

Sunday, December 3, 2017

The Trials of Vishnu

Tower of Pisa

In the Hindu pantheon, three major gods form a trinity:1 Brahma the Creator, Shiva the Destroyer, and Vishnu the Preserver. Although I constantly try to emulate Brahma in my writing, creating imaginary worlds, people, and situations as if they were real, I find that in everyday life my patron deity is actually Vishnu.

I tend to hold on to things, sometimes cherishing them for sentimental value and sometimes simply because they might, one day, under the right circumstances, become useful again. For example, I have an old and now somewhat tattered bath mat that my paternal grandmother once crocheted. That’s from sentiment. I have in storage books that I decided years ago I wasn’t going to read right away but that I might need again. And in my closet I think I have the dress shoes I wore to my high-school prom, now fifty years ago. And at the back of my closet … even Vishnu doesn’t want to look there.

A lot of this preservation deals not so much with just keeping things as keeping them in their proper, pretty, pristine, and like-new state. This is a manifestation of my own particular form of obsessive-compulsive disorder, or OCD. For many people with this disorder, the compulsive task is repeatedly washing their hands, or checking for their car keys, or reconfirming that the stove is indeed turned off. Performed enough times, especially under conditions of stress, the disorder can be crippling to a normal life. My form of OCD deals primarily with the two S’s: surfaces—which includes scratches—and straightness.

If an object in my possession has a shiny surface, I am constantly checking it and cleaning it of dust and smudges: the screens of my iPhone, iPad, and computer monitor; the cases of any of these devices; and the shiny bits of my motorcycles such as gas tank, fenders, windscreen, and dial lenses.

Before I go for a ride, I use a wet paper towel to float off the dust, followed by a microfiber towel to wick off the water without leaving droplet marks or fine scratches. After I ride, I use the still-damp towel to clean off any bug carcasses and dust I’ve picked up along the way. If I find a scratch—however minute, no matter that it only shows up in direct sunlight and from certain angles—I bring out the wax or the plastic polish to address the defect. When I wash the motorcycle, I immediately follow it with a coat of wax or acrylic sealant to preserve the surface. If there’s a deeper scratch, visible under any light conditions, I bring out the polishing compound and work it to oblivion, hoping that the mar doesn’t go through the clear coat and into the paint. And if there’s a stone chip, I go after it with touchup paint, followed by compounding and sealant.

You might think that the solution here would be the new matte finishes that motorcycle manufacturers have introduced over the past several years. But they can’t be touched up for visible scratches. And then I worry about wear, especially the sides of the tank where my knees grip the surface and the fabric of my trousers would leave—horrors!—a shiny spot. There’s just no way to win this game.

This is why my favorite material is glass. It wipes down easily, and usually it will break—and thereby have to be thrown away—before it will scratch. For this reason I like tempered glass for my eyeglasses instead of the new plastic lenses. (I’ve had enough polycarbonate motorcycle windshields to know that, while they might take a bullet, they also scratch fairly easily.) I also favor polished titanium and stainless steel for my watches because of their wear resistance. I’m just picky that way.

If an object has a scratch or wear mark that I can’t polish out, I agonize over it. I see that point of infinitesimal damage more than the whole bright surface or the shape, design, and purpose that brought me to admire and desire to obtain the object in the first place. Is this a crippling affliction? To my daily round of activities—as incessant hand washing would be—no. But to my emotional stability—when I have actually considered selling a motorcycle because of a deep and unfillable stone chip in its lustrous black paint—well, yes.

The other aspect of my disorder is the alignment and straightening of things. Part of this was my upbringing as the son of a landscape architect. My mother had an innate sense of design—my father had it, too, but not to her degree—backed up by her training as a meticulous draftsman and landscape gardener. Even though her courses taught her to “avoid straight lines” when laying out a flowerbed, she appreciated things that were square and even. And she wanted everything to have its own place and its own space. So if I, as a youngster, pushed my desk into a corner of my bedroom—so that its edges were touching both walls—she would gently advise me to pull it out, at least an inch from the back wall and six or seven inches from the side wall, so that the desk “owned” its space in the room. Crowding furniture side by side and pushing area rugs up against the baseboard were a violation of her own particular feng shui.

So I practice straightness in my environment. Pictures hang level. Wall clocks have their twelve and six aligned vertically. Rugs are square with the room. When we bought the condominium, we had a hardwood floor installed instead of the usual wall-to-wall carpet. The pattern of the parquetry is a series of small wood oblongs arranged in larger squares. Thank God the installers aligned the sides of these squares with the walls of the room—although the plasterboard itself is none too straight in some places. Otherwise, I would live in a nightmare of constantly trying to square up the floor and walls in my mind, or squint until they almost aligned. But I do keep pushing the area rugs—which are all rectangles, no ovals or circles here—to align with the edges of the floor squares. And I judge the position of a table or chair by counting the wood blocks in the floor pattern at each leg. On my daily path through the apartment, I am constantly straightening a rug with my toe, squaring up the hang of a picture, pushing at a table edge, aligning a place mat, adjusting the spacing between items on a shelf. It’s an endless job.

Some of this compulsion has made me a better writer and editor. I see grammatical looseness as a violation of alignment. I see unfinished thoughts as incongruent with the shape of an argument. The nits of spelling and punctuation are minute scratches—some of which only I can see, and then only in certain lights—that must be polished or repaired. Of course, when it comes to laying out a newsletter page or a book cover, I try to give pictures and other graphic elements their own space and not crowd them. And I have an eye that is calibrated—or used to be—to a printer’s point, or a seventy-second of an inch, about a third of a millimeter.

For example, in choosing the photo of the Tower of Pisa, above, to accompany this article I selected the best image for its lighting and background. But then, in Photoshop, I had to rotate the image two-point-five degrees counterclockwise because, while the tower was leaning most dramatically, the buildings, the light pole in the background, and the implied horizon were also tilted. I notice these things. They bother me.

So, is this an affliction or a source of strength? I don’t know. It is quirks like this, if anything, that define me. But I do know that, when I am in my grave, my ghost will be haunting my last dwelling place, nudging ineffectually at a crooked rug and scrabbling with ectoplasmic fingers to straighten a tilted picture.

1. What is it with religion and triads? First, there’s the familiar Father, Son, and Holy Ghost—which never made much sense to me, because of the “ghost” part. And then, in the ancient Celtic religion, things usually came in threes and their bards were expected to declaim in rhyming triplets. And finally, the Scandinavians had the Norns, three old women, one to spin the thread of life, one to measure it, and one to cut it off. I find that in my writing, which generally comes from the subconscious, I sometimes feel the work is incomplete unless an argument is supported by three examples, a list includes three members, or an object is tagged with three distinct adjectives. I guess I take after the bards in that way.

Sunday, November 26, 2017

The Problem With “No Problem”

French marketplace

I believe that the spoken forms of courtesy are the grease that lets our creaky social machinery, as well as our badly joined—as in a cabinet whose drawers and hinges aren’t quite aligned—personal senses of self-respect and shared obligation, function within tolerable limits. Speaking those little, objectless sentence fragments like “please” and “thank you,” as our mothers taught us, is what keeps us all from screaming, leaping, and tearing at each other’s throats.

In the same way, those almost pointless physical courtesies, like making eye contact with the people you just met, to show that you are both an interesting person and willing to communicate in depth, or offering your empty hand, to show that it doesn’t hide a weapon—or, in some cultures, offering a short bow, to expose your neck as a sign of trust—are all ways that we signal not just our common humanity and a benign spirit but also our willingness to take risks in meeting strangers and new acquaintances halfway.

Almost all such polite phrases in common usage are archaic forms, spoken relics with the sharp edges and the grammatical functions worn off.

“Please,” for example, is a much-shortened form of “may it please you.” That is, with the thought fully spoken: “Do this for me only if it would give you pleasure to accommodate my request.” You see an echo of this in the French form: “s’il vous plaît,” or literally “if it pleases you.” The French verb plaire, which has its root in the Latin placere, means not just being pleasant but also enjoying or finding satisfaction. This is a fairly gracious way of thinking and acting. This little verbal elision says, “I know that you have freedom of action and can make choices in how you do things. My request is probably going to be a burden on you in some fashion, and I don’t want to cause you any trouble. So consider complying only if it would give you pleasure or satisfaction.”

In modern usage, at least in American society, “please” has become some sort of code word for an intended enhancement, like adding “really” or “very” to a descriptive adjective. The word “please” has acquired the effect of saying, “I’m serious about this.” So, from someone trapped in a locked room: “Let me out! Please let me out!” This goes back to the childhood escalation of entreaty: please, followed by pretty please, and ending with pretty please with sugar on top.

In the formula “may it please you,” the verb is in the subjunctive mood. This is an element of grammar that almost nobody teaches anymore, so students may have a hard time recognizing and using it. Yet it still sticks in the ear and in the mind, and it represents something we all more or less understand.

We all learn about the three basic tenses in English: past (something happened), present (something is happening), and future (something will happen). But these tenses also have their completed or perfect and their uncompleted, progressive, or imperfect forms: the perfect implies something that happened once and is now over and done, while the imperfect implies something that has happened before and may still be happening and recurring. In addition, we can work the changes of past, present, and future as verb forms from the viewpoint of one of the non-present tenses. You can use English to indicate something that happened in the past from the viewpoint of the past tense itself—such as, “I had gone to the store”—or something that is in the past from the perspective of the future tense—“I will have gone to the store.”

All of these tenses—which you learn in depth when you study the grammar of classical languages like Latin and Greek—are in the indicative mood, which deals with actions that really did, do, or will occur. But English and most other languages also have the subjunctive mood, describing what we hope or expect but are not certain will occur, or an occurrence about which we want to express some doubt. “May it please you,” in the example above, is a wish that honoring the request will be pleasing to you, not a statement that the request will automatically be pleasing. In the same way, when a person says, “God bless you,” it is not a statement of fact, that God actually does bless you, or a command in the imperative mood, “God, bless this person” (which is on the same plane as “Dog, sit!”). Instead, we are saying “[May] God bless you”—which is a hope, a wish, and an offering of benediction that the well-wisher is not in a position to grant him- or herself but that is for the Supreme Being to provide.1

When we say “Thank you,” things are a little more straightforward. Here the elision is simple, just dropping off the subject of a complete sentence: “I thank you.” No subjunctive need apply: you are offering actual thanks, an acknowledgement of a benefit received, and perhaps an obligation to return the favor in the future. This is that bit where we graciously entertain a risk: we are saying that we do accept the obligation to one day respond in kind. As adult individuals able to stand on our own two feet, we are shy about receiving gifts and favors. The usual word is “gratis,” meaning “free,” which goes back to the Latin gratia, meaning favor or kindness. To receive a favor or a free gift puts one in a lowered position, because only children, servants, and slaves expect to receive something for nothing. So we express thanks, with the implied obligation to return the favor, which puts us back in a position of social equality with the giver.

The response of the giver used to be, in American and in most English-speaking societies, “You are welcome.” This phrase has its roots in the words of a householder or host greeting visitors: “You are well come,” meaning it is good that you have come; your arrival is appreciated; and by extension, my house and my hospitality are at your disposal. You can see the sense of this in the opening song by the Master of Ceremonies in Cabaret: “Willkommen! Bienvenue! Welcome! Fremder, étranger, stranger. Glücklich zu sehen. Je suis enchanté. Happy to see you. Bleibe, reste, stay.” That is, to tell a person thanking you that they are welcome is to return and thereby nullify the obligation. “No, this was mine to give, and no obligation is created.” This is a pleasantly gracious turn of phrase.

But more and more in American usage I hear—and receive for my spoken thanks—the phrase, “No problem.” The underlying message here is quite different: “Providing this gift or favor, or doing this service for you, did not inconvenience me. Serving you has not been a problem for me.” Perhaps, if we want to be kind, we can interpret the message as: “I recognize that I am here to serve you. That is my purpose—and therefore not troubling to me.” But still, there is a reversal of message and a diminishment of intent here. Not “I grant you freely without obligation, as a host offering hospitality,” but “Your implied demand for service has not inconvenienced me.” Or “Your being in a position to receive a favor—as a small child or servant—might have inconvenienced me, but I want to let you know that it didn’t and I reject the notion that I had any obligation to give you a gift or provide you a service in the first place.”

“Thank you” and “You’re welcome” are a yin-yang pairing, the simultaneous creation and removal of a feeling of obligation, engendered by an originating act of kindness. The reply of “No problem” denies the intention of that original act. In this way, it feels like a rejection of politeness rather than an expression of it. Perhaps I’m being over-sensitive, but to hear “No problem” when I try to thank someone leaves a faint taste of disdain in my mouth and in my mind.2

1. The clue that a speaker is using the subjunctive is a verb conjugation that would normally sound and feel wrong. For example, the third person (he, she, it) conjugation of please is “pleases,” as in “It pleases me to see you.” But “If it please you” leaves the matter in some doubt or to be hoped. Similarly, when you see “helping” verbs like “may,” “would,” or “should,” you are often dealing with the subjunctive.

2. But cultural tastes differ. In two languages and cultures that are widely separated by geography and affinity, the Spanish use “De nada” and the Russian “Nichevo” to say the same overt thing as the English phrase “You’re welcome.” Both the Spanish and the Russian translate as “It’s nothing”—as does the French “Pas de quoi”—which is pretty close in semantic content to “No problem.” But I still find the English “You’re welcome” more gracious.

Sunday, November 19, 2017

When Wars Will End

Chinese women marching
Terra cotta soldiers

When I was growing up, I remember a tee shirt with the motto: “Be Reasonable, Do It My Way.” As with most such shirts, the intention was to be amusingly ironic. But I find it amazing that many people, especially in the realms of politics and economics today, actually think like this. No, not that they would ever voice this motto, because formulating this thought aloud in words would show them how fatuous this idea of compliant control is. But still, it lurks there in the back of their minds: “I’m right, you’re wrong, we’ll all get along when you just shut up, listen to me, and obey.”

For the person who unconsciously subscribes to this tee-shirt motto—or who can look at it, read it aloud, and still fail to find the irony—the people around them must be something other than real, live, self-actuating, independent, and strong-minded human beings. Maybe the subscribers believe they are surrounded by a species of fleshy ghosts, or puppets, stick figures, and imaginary characters, like those in a book or play. For the subscribers, other human beings may have faces and voices, but their thoughts, their reactions, and the words that come out of their mouths are somehow unrelated to reality. For the people who believe Their Way is the Only Way, the intentions, opinions, aspirations, and desires of other people are simply unimportant, fictitious, or wrong—on the order of “You can’t really believe that, can you?”

We live in a world of varied opinions. Freedom of thought and action is not just an abstract idea, to be written somewhere on a dusty parchment and forgotten when convenient. The human ability to approach ideas, propositions, and profound beliefs as intellectual objects, to dissect them, to weigh the evidence for and against each side, to reach a conclusion, and then act on it … this is all a function of each human being’s having a unique brain inside a physically separate body. Perhaps all protozoans exhibit common reactions to their environment, based strictly on their genetic code and the interactions of their internal proteins. Perhaps all fish, frogs, and reptiles are predictable in their behaviors, based on the primitive structures of their vertebrate hindbrains. But when you get into the class Mammalia, where the brain starts developing different lobes and functions, learns from its environment, and can overcome its reactions, all bets are off and groupthink becomes a relic of the distant past. Even dogs have different personalities and operate with their own senses and ideas.

Yes, there have been societies with monolithic social and political structures: Nazi Germany, Soviet Russia, Maoist China, and North Korea under the Kim regime. We are all familiar with their rallies and parades, with row after row of soldiers marching in step, usually followed by military vehicles which, lately, are towing intercontinental missiles.1 You can look at the newsreels and the pictures and think that everyone in these societies must think the same: ants with jackboots and armbands. But note also that every one of these societies has an active bureau of secret police and—somewhere out in the country, away from the cities—a growing population of political dissidents in labor camps.

So, the people who march in step and who stand out in the sunshine with their arms raised—that is, the members of these societies who don’t go into the labor camps—do they all really think alike? Are the mottos of the national party and the image and words of the dear leader really engraved on their hearts? While it’s not possible—not yet, thank God—to open a person’s skull and examine his or her brain from the outside to see the content of its innermost thoughts, I believe that even there we would find a diverse mix of ideas, intentions, and reactions. Some people actually believe the party mottos and the leader’s words because they have heard the arguments, weighed them to the best of their abilities, and agree with them. Some believe because the arguments are easy to understand and affirm, and the person—for a variety of reasons—doesn’t think or care much about politics or economics, or the greater questions to which political and economic thought applies. Some believe that by saying the words and mimicking belief in them they can be part of the larger thing that is moving through their lives whether they want it or not. Some smaller fraction of this group believes that if they say the words loudly enough and show other signs of commitment and support they can get ahead in their job or their living situation—that they can climb on the back of the beast, ride it, and perhaps one day even be allowed to take the reins. And others believe that the beast is here, is a fact of life whether they want it or not, and the course of greatest safety for themselves and their family is to pretend to show it their support.

Even in the heads of the jackbooted soldiers marching down the street—aside from immediate concerns about missing a step, scuffing the shine on their boots, or being caught with their uniform in some other kind of disorder—you would find this diversity of beliefs and expectations.

Even in the social dimensions where belief is the basis of a group of people coming together in the first place, you will find that they all have different ideas and opinions. From the simple words of a carpenter and later a rabbi who lived two thousand years ago, how many varieties of Christianity have been spawned, encountered schisms, and split over issues like the nature of the godhead, the nature of sin and right and wrong, the meaning of the words as they are written in the texts, and even the substantial nature of bread and wine? Even a supposedly monolithic religion like Islam, which adheres to the words of a single man exactly as they were written down from his lips, has split into factions over the rights of his successors, the importance and interpretation of certain passages above others, and the proper form of obedience. Even the simple and direct message of the Buddha has generated two versions of the proper way to observe and follow it, plus cultural variations in every land that has adopted Buddhism. Religions like Christianity and Islam will often act or react in a monolithic fashion when expanding into new territories or confronting outside opposition, but all the while their adherents are thinking, interpreting, adopting, and preparing to set up another sect or create a new schism.

People cannot help but have different ideas. In every land, in every generation, the restless human mind examines, interprets, adapts, and sometimes discards the thoughts and beliefs, patterns and traditions that wash across groups of people like the waves in an ocean. In reality, most people cannot hold a single belief or thought in their heads for all of their lives. Most groups larger than an extended family or small tribe cannot remain cohesive for longer than a generation—and some not even until the next election.

Whether a person believes in and cares deeply for a political or economic proposition or tradition; or follows it only because mother, father, or a teacher once voiced it and the person him- or herself really doesn’t care; or follows it because his or her neighbors are suspicious, will whisper among themselves, and will one day turn the person in to the secret police … various societies tend to build up standing waves of belief and tradition that in turn will engender consolidated political and economic actions. Societies in this state of temporary conformity, at the crest of the passing wave, will sometimes try to spread their rigid social, political, or economic order to other groups and countries. And then, in an excess of enthusiasm, war will break out, soldiers will be enlisted for fighting and killing instead of just marching, and human beings will endure the results in terror and misery.

When will the world come together in harmony? When will the various religious sects recognize that their functional similarities outweigh their doctrinal differences and join together in uniformity? When will all of the Earth’s nations agree on a single political or economic principle and a social order that is not tainted by the interpretations and cultural characteristics of one group or another? When will they put off their differences and give up their quarrels? When will the wars end?

Only when we can put off our restless human mind, stop thinking and examining the propositions of everyday life, and become something more like fleshy ghosts or stick figures than active, thinking human beings … That is, never.

1. I find it telling that the Western democracies, which allow wide latitude for political and religious dissent, don’t hold these monolithic rallies and military parades. When was the last time you saw members of the military services marching in lockstep down Pennsylvania Avenue? Not on the Fourth of July: our national parades are a time for high-school marching bands, costumed dance troupes, veterans groups, and balloons. The active-duty military only come together and march in step to honor a fallen president.

Sunday, November 12, 2017

A Sense of Honor

Puppet master

I’ve been reading disturbing stories on social media about honor violations at this nation’s military service academies like West Point and Annapolis. Evidently—and there is much controversy on both sides as to how much of this is true—standards are eroding and cadets are becoming less observant, both of their own actions and those of others. For the record, the honor code states: “A cadet will not lie, cheat, steal, or tolerate those who do.” This is verbatim from the Virginia Military Institute. Other academies’ codes are similarly simple and direct. The penalty for any violation is an honor hearing usually followed, if the infraction is proved, by expulsion.

Having a code of conduct at school is neither new nor particular to military academies. Most institutions of higher learning have codes, usually involving academic honesty and issues like falsifying research data, sharing test answers, and plagiarizing the work of others. But “lie, cheat, or steal” covers those nicely, too. The intention is that anyone who passes out of the institution’s gates with either a military commission or an academic degree is presumed to reflect the institution’s traditions and values. The person is not just in possession of new knowledge that he or she did not have before, but the graduate has also acquired a sense of responsibility, as well as an understanding of and loyalty to certain patterns of thinking and acting. He or she can be trusted in the wider world to function in a particular way in relation to others. A complete education is not just about knowledge but also about core beliefs and accepted attitudes and obligations.

This is an old-fashioned idea about what education does for both the individual and society. And it calls into question the modern substitutes for traditional brick-and-mortar university learning that are now available through online teaching resources like the Khan Academy. I am all in favor of competition—especially the kind that can be offered easily, cheaply, and without the expenses associated with printing textbooks, physically traveling to remote locations, and finding room and board there. Competition like this opens educational opportunities for more people. But I also question whether a person sitting on his or her sofa, reading HTML texts, watching prepared videos, listening to online lectures, and passing online tests, will acquire the sense of academic community, identification with a particular way of being and presenting oneself, and adherence to institutional ideals that attendance away from home in a college or university environment can provide.

And perhaps my reservations no longer matter. If a university or military academy education has stopped being a life-changing experience, intended to mold the individual’s character, and has now become just another expensive commodity, like a Mercedes-Benz or an address in the upper East Fifties, then the old assumptions no longer apply. If the quest for a university education is now totally commercial, with the goal of receiving a piece of faux-parchment signifying validation of one’s academic success, no matter whether obtained through honest study or by fraudulent shortcuts, then the modern student might as well skip the tedious classroom discussions and sit at home in front of a computer screen, soaking up and regurgitating knowledge and then receiving a convenient Diplomate® app for easy display on his or her smartphone. Then the only thing he or she will really miss is four to six years of binge drinking and hookup sex.

The whole point of an honor code, especially in our military service academies, is that we expect our future military officers—and the educated technocrats from other universities, who will eventually run our society—to be special people. We expect them to have all the fluff and laziness, the willingness to succumb to temptation, and the selfishness of thinking about their personal goals, their precious careers, and their own skins ahead of the lives and safety of others and their duty to the nation—to have all of that ground out of them. We expect them to be fearless in facing hard facts and making hard choices. We expect them to be the sort of people that others can look up to, obey, and perhaps die for at their command. Without this kind of moral training, this sense of honor, the graduates who become commissioned officers or trained professionals are just ordinary people—and perhaps cowards and thugs—who happen to have a special set of skills.

Honor is an old-fashioned idea. It seems quaintly 19th-century today. And it has been under attack for most of the 20th century.

In my terms, a sense of honor is nothing less than the codification of the superego—also known as the “parent” in Transactional Analysis—provided that this inner voice has been properly prepared and firmly established by the person’s mother, father, teacher, priest, coach, and perhaps his or her drill sergeant. In my family, with strong parents on both sides, we heard a lot of “That’s not the way we do things,” “That was a bad choice,” “Don’t be selfish,” “Don’t be small,” and similar admonishments spoken directly into our faces. If a child does not get such a thorough grounding in moral issues and proper choices by the time he or she reaches puberty, then rules that come later in life—even simple ones like “Do not lie, cheat, or steal”—will have little effect. The sense of personal honor, of obligation to something larger and more important than self—and hence “super” ego—will simply not exist within the adult’s mind. Either that, or it will exist but only as an abstraction, an intellectual curiosity, like quantum mechanics, and not something personally relevant and vital to his or her situation.

In traditional Western civilization, men have been ingrained with a sense of honor along the lines of “Do not lie, cheat, or steal,” combined with admonishments to be strong, to protect the weak, to have a firm purpose concerning our jobs and careers, to provide a good home to our families, and to be good citizens. Men are expected—were expected—to become brave soldiers, honest workers, and loyal companions.

Women, on the other hand, have traditionally had a separate sense of honor bound to their role as mothers and the founders of families. A woman’s honor has traditionally been completely physical, centered in her virginity. A woman might lie, cheat, and even steal—in fact, she was often expected to cultivate a certain artfully deceptive nature—but she was expected to defend the space between her legs, reserving it and her sexual favors only for authorized males. And that authority was always defined by other males—father, brother, husband—except in the singular instance that she might choose one man to love and cherish (and presumably obey) for the rest of her life, and tra-la-la. This was the traditional way a society that places great store in family relationships and transfer of property through inheritance tries to keep bloodlines “pure” and avoid giving favors and estates to the products of casual bastardy.

Now that women are taking a more active role in the greater society outside the home and attaining positions of authority, power, and respect, we have to widen the notion of female honor as well. We must expect women to become the same kind of brave soldiers, honest workers, and loyal companions that we require of men. A sense of honor is now a universal human requirement, not reserved to any gender roles.

Unfortunately, the trend in modern society has been moving away from any sense of honor at all. I saw this during my university days in the late ’60s, when I started meeting self-avowed hippies. The hippie lifestyle is based on doing what feels good rather than doing what feels right and appropriate. Smoke a joint, take hard drugs, get drunk, get laid, tune in, drop out—it’s all about personal gratification. That, and not being so awkward and gauche as to call other people’s values and actions into question. People with a sense of honor, with the built-in stop code that makes them think about the consequences of getting stoned or laid, were—and still are—considered “uptight,” “square,” and “rigid.” That is, unhip, uncool, and unpleasant.

A society of careless hedonists, who have no internal moral compass more sophisticated than “It feels good, so do it,” is easier to lead and control. You can sway them with cheap amusements and petty freedoms. And when you turn your back on them, they will sink down in a kind of beatific stupor rather than rising up in arms. Careless hedonists have no grit or gumption—old-fashioned words!—no expectations or goals beyond the next moment, and no sense of self that might wake up and say “No, not here,” “That’s wrong,” and “I won’t do that.”

A sense of honor is considered unfashionable these days, and so is optional. Is it any wonder then that our universities and other institutions, even our military service academies, are willing to let it erode? Is it any wonder that notions of honor and duty are fading into the past?

Or have they merely gone underground, among those of us who remember, waiting for a better day?

Sunday, November 5, 2017

Sensitivity as Vulnerability

Puppet master

When I was a child, my parents were relatively strict. They brooked little in the way of nonsense—and no sass—from me and my brother. They taught us how to be sober, industrious, capable adults. They wanted us to be strong, not so that we could terrorize the neighborhood or gain unfair advantage over our fellow men, but because life is unpredictable and sometimes hard, and you never know when you will need strength, will power, and sober attention to survive it. And they did all this without preaching or ever using words like “sober,” “capable,” or “strong.” They operated instead through setting examples and by turning common experiences into life lessons.

In our household, weakness and disability were not failings to be ashamed of but limitations to be overcome. If we were not strong in some area, then practice in order to become strong at it or find a way to compensate for the weakness. If we ever should have a disabling condition, then we would compensate for it, fix it, or cope with it in silence. If we made a mistake, then admit the error, try to correct it, and move on. Apologize as necessary, but only from a position of confidence and personal strength. As people with good standing in the community, a good education and proper upbringing, and access to family resources, we were supposed to be among those on whom others could count for assistance in emergencies or call for help in time of need.

The family ethos was to pay our way, do our duty, support the local community and the broader civilization, and stand on our own two legs. That was what it meant to be an adult in the larger world. We didn’t impose on anyone. We asked for no special favors. But we also kept what belonged to us and took care of it. And we didn’t back down from doing what was fair and right, either for ourselves and family or for others in our circle and our community.

But now I sense a change in the culture around me. It seems that people these days act—or are expected to act—as if disability and weakness were sources of strength and pride. They have become a sign that the community, the culture, the world owes you special consideration if not a bounty and some financial compensation. Maybe the world does owe you that, especially in particular cases like a civil suit, breach of contract, or class action, but my father would have suggested you’re a fool to expect that anyone else is actually going to pay it. Count the money after it’s in your hand.

The current culture also seems to suggest that a bristling sensitivity to personal slights, to acts of disrespect based on gender, race, or ethnicity, and to loss of face or character in any situation is appropriate in a well-balanced adult. And that these damages to the spirit must be redressed as a form of social justice and should carry the same weight as physical injury or damages to health and property. My mother would have called this attitude “wearing your heart on your sleeve” and suggested that this is not a survival trait.

For one thing, being overly sensitive and ready to take offense puts you in the crosshairs of the world’s bullies, road-ragers, and careless or thoughtless people. Your sensitivity makes you vulnerable to the hurts that life is all too ready to dish out, and it will require you—if you are consistent and conscientious in your approach to the world—to be constantly on the verge of taking offense and then being forced to take action. In the way that I was brought up, this is too much like work. Far better to develop a thick skin, ignore the offense—but mark the offender for future reference as someone to distrust or avoid—and then go on about the business that is important to you.

For another thing, wearing your hurts, your desires, and other aspects of your personal self so openly seems … insecure. In my family, what a person really thinks and feels, desires and needs, likes and dislikes are all matters for discussion only with other family members and intimate friends. An adult, a well-balanced individual with business to attend to, wears a personal face for that inner circle and a public face for everyone else. To expose those hurts and desires to the world at large is to give potentially unfriendly forces too much information. Your enemies—or those with an interest in becoming such—will know how to practice on you. The situations that can make you register offense or psychological hurt are usually related to your own insecurities and your inner sense of weakness or inadequacy. Why would you want to show the world at large any other persona than that of a self-sufficient, responsible, capable, balanced adult, someone with unknown access to personal and physical resources, and with unguessable limits as to your patience, tolerance, and goodwill, or to your capacity for animosity and rage?

Asking others to give you their pity and support and suggesting that they must act carefully and walk in circles around you to avoid giving you offense is a bad strategy. It is asking too much of the world. Living your life on those terms is putting yourself and your weaknesses at the center of others’ concern. That’s a nice situation, if you can get it. But the reality—as any well trained and well brought up child would know—is that you are just not that important. Not to the world. Not to the public space beyond your family and your intimate circle. And perhaps not even to them.1

To believe otherwise is conducive to neither spiritual nor physical strength. It is, in my view—and in the view of my parents and the generation they represented—a sign of weakness and vulnerability. What the world, that public space, and most of your family and friends expect from you as an adult is strength, capability, patience, persistence, and the ability to cope.

1. Unless, of course, you are the actual Prince of Wales and not the pauper Dauphin-in-hiding.

Sunday, October 29, 2017

Language as a Map

Antique map

Cartography, the art of making maps to record details of a patch of ground in human-usable form, is an artificial and abstract art. It’s artificial in that, when you come right down to the process, the map itself bears no physical relation to the surface it records. It captures none of the details that a human being standing on the ground would see, nothing that looks like the symbols on the map, not the trees, not the mountains, not the buildings. And the product is abstract in that, when a person tries to put the product to use, he or she must supply an intermediary step, that of knowing what the symbols represent and how they were used to depict the ground in two dimensions.

We might think that map reading is intuitive, but actually it’s an acquired skill. Most of us—or at least I did—picked it up at home, first from watching our parents use maps, and then by asking questions and following along ourselves. If the skill is taught in school anymore, it probably comes in the second or third grade, when the curriculum starts differentiating “social studies” from reading, writing, and arithmetic. By the time a child is studying anything so formal as “history,” he or she is expected to be able to look at a map and know the conventions, such as that the top is “north”—at least among those of us who are European descendants—and that some of the squiggly things are roads while others are rivers. If you went to the outback of Australia or the rain forest of Brazil and handed a map to an aborigine who has had no contact, or almost none, with civilization—if such people still exist in this day of satellite dishes and continuous, invasive explorations—that person would not know what to do with it. For him, it’s just a piece of paper with squiggles and colorful blotches. And when told that it describes the land surrounding him on all sides, his response would likely be, “Um … no.”

Anyone who has spent any time with antique maps knows that the conventions of cartography have changed over the centuries. Modern surveying techniques, exact measurements, and now satellite photographs have all made the details a lot sharper and more reliable. We no longer show mountains as little pictures of lumpy hills, representing the peaks only from the perspective of someone hanging fifty thousand feet in the air along the southern edge of the map—while the rest of the information pretends to be a vertical projection, invested with meaning only when viewed straight down from the document’s midpoint. We no longer draw little trees for forests or tufts of weed for swamps. For simplicity, modern maps like those used in schoolrooms show elevations as colored backgrounds, with the lowest areas usually in bright green, shading through yellows and light browns for the plains, to dark browns and reds for the highest peaks.1

A more detailed map—such as one you would use in hiking over rough ground—uses contour lines that try to represent a three-dimensional space on a two-dimensional surface. And again, it takes skill to read this kind of map. You have to supply that intermediary knowledge step to understand that each line represents an equal increment in elevation—and then to realize that where those lines are far apart the slope is gradual and the walking easy, and where they are close together the ground is actually a cliff face and you’d better bring ropes and pitons.

Maps are also drawn for specific purposes. A roadmap—such as you once could buy at a gas station, back when they serviced all of a driver’s needs, not just gasoline, packaged foods, and big drinks, and now must get from AAA®—is designed to show a dozen different grades of road, from dirt track to superhighway. But such a map is going to be vague on the subject of mountains, forests, lakes, and rivers—other than to show where the bridges and ferries are to be found. Conversely, a riverine or marine chart is extremely accurate about soundings, snags, buoys, and bearings but leaves the dry ground completely blank except for details like wharves and docks right at the water’s edge.

These days, with the ubiquitous smartphone putting gobs of online data in your hand, mapmaking as a representation intended for paper viewing is dying out. Maps are now compilations of all these details—filtered according to the application—on a screen that orients itself and picks the scale according to command and scrolls along as the user moves from one place to another. And now Google Earth does away with the abstract map entirely, showing you an actual photograph of the ground, taken by a satellite in orbit, with the ability to fly you down to “street level” and view photographs of the surface taken by a roving camera sometime within the last year or two. This is no longer a map requiring that intermediary knowledge step but a stop-action video of the ground itself that any aboriginal Australian or native Brazilian would be able to recognize—if the Google camera cars ever went to the Outback or traveled up the Amazon.

In the same way,2 language is an artificial and abstract way of representing the complex reality that we humans find all around us. It’s artificial in that no word or phrase is an exact copy of the thing or place being described. It’s abstract in that the hearer or reader of a statement made with language must supply an intermediary knowledge step involving the meanings and often the connotations of the words themselves, the mechanics of grammar and syntax, and an experienced ear to distinguish the sloppy, elided speech of everyday communication from the parsed and precise language as taught in school. Such knowledge must exist before he or she can appreciate the content and intention of the communication or description. If you don’t think this intermediate step is necessary, try walking as a non-Chinese-speaking foreigner into a bakery in Beijing and ordering a donut.

Just as maps have evolved from depicting mountains with lumpy little hills and forests with picturesque tree shapes, so human language has evolved—and in all its forms it is constantly evolving, adapting, and growing more precise and specific. If you read a history of language like John McWhorter’s The Power of Babel, you can see that language as spoken is not the product of dialect pressures and slang usage working on neatly separated language families like “English” and “Spanish.” Instead, every person, as a member of a local group or affinity, speaks and writes with a set of evolving symbols and meanings. Spanish, as McWhorter shows, is nothing but a dialect of Latin that has softened, morphed, and changed over the centuries on the Iberian Peninsula after the Roman Empire withdrew. Similarly, French is a dialect of the same Latin which evolved in northern France, while Italian is Latin that has developed in place on the Italian Peninsula. And English is just a cracked and crazed horror—the result of a millennia-long culture clash between the native Celts and the conquering Romans that was then warped and shaped by successive invasions from Northern Germany, Denmark, and French Normandy. And all of these languages went through another convulsion in the past six centuries or so, when mathematical thinking based on Arabic terms and scientific thinking based on Greek terms arrived with the Renaissance and the Enlightenment.

Just as maps are used for particular purposes—hiking, driving, or marine navigation—so languages have developed specific words and constructions for defined purposes. Two doctors discussing a patient will use precise medical terms and adopt a peculiar perspective of diagnosis (what is happening now) and prognosis (what will happen in a future with and without treatment) that the patient might only dimly understand. In the same way, two physicists discussing quantum mechanics, or two programmers describing a piece of software, will quickly subside into specialized terms—quarks, leptons, bosons, vectors … or RAM, ROM, gates, loops—that the lay person can only follow with much difficulty and then with a high potential for misunderstanding. Again, the average person doesn’t have that intermediary knowledge step, just as a speaker of English-only is lost in the Chinese bakery.

In dealing with maps, soldiers and other practical users must constantly remind themselves that “the map is not the territory.” You might think that you can plan an attack or defense in great detail just by working from a map. But maps are still artificial and abstract constructs. The slope you believe might be gradual enough for a team of soldiers to charge up carrying full packs and equipment turns out to be much steeper. The team is delayed—or cut to pieces in a crossfire. So a good officer tries to see the ground, perhaps by aerial observation, probably by walking over it in person, before committing to an action. In the same way, a ship captain knows that the chart may not include all the nuances of current and wave action common to an estuary or harbor and so employs a pilot who knows those waters. And a hiker discusses the trail with others who have walked it before committing his or her life to the wilderness.

In dealing with language, writers and other practical users must constantly remind themselves that the terms and constructions they use may not be understood by everyone who encounters the text. A good writer—whether of fiction like a novel or nonfiction like a technical manual—is constantly advised to think of the intended reader, imagine that reader’s familiarity with the intended words and concepts, and work through the differences. A technical manual to be used in a closed industrial setting will use different norms from the manual intended for home use in assembling a piece of furniture or operating a stereo set. A novel written for a science fiction audience will use different concepts, structures, and devices than one written for the romance or mystery reader.

But unlike a map, where the thing being represented is open ground and available to any set of eyes, the reality that language tries to capture is usually private and personal—at least for those of us who write fiction. What is the shape of love? Of frustration? Of rage? We cannot uniformly represent these abstract human realities with contour lines or graded roads. Instead, we must talk around them. We must present stories—circumstances, actions, results—that will elicit in the reader’s mind and memory the feelings we are trying to portray. I cannot accurately tell you the shape of love, but I can tell you a story about meeting and engaging with a particular person, living in their orbit and in their arms until their presence becomes second nature, and then losing that person to misfortune or misunderstanding, so that you can feel the delight and the pain in your own chest.

In this way, language is not just like a map; it is better than any map. Where a map is a two-dimensional representation of a three-dimensional surface, the power of language encompasses past, present, and future, involves both actions and reactions, and includes the dimension of feelings about those actions and circumstances, regret for things that never happened, hope for things that might be different … that is, a representation of the vast, multidimensional universe that is the human mind.

That is what we writers are about: we are cartographers of the human soul.

1. The color range doesn’t go through blue, though, because that color is reserved for the water of rivers, lakes, and oceans. And here, the convention is that the darker shades of blue are the deeper bodies of water.

2. This is not an original thought, but it’s one that has been rattling around in my head.

Sunday, October 22, 2017

Life Like A Sword

Forging a samurai sword

Metaphors comparing the whole span of a human being’s existence to some household object or process—“Life is like a sponge!”1—are usually cheap and easy. But they’re fun to make anyway. So here goes.

A human being is like a sword, and life is the history of its making and use. The best human beings are the product of good materials and loving care in the making.

In the Japanese art of swordmaking, the master smith smelts and refines his own iron, prepares the wood to make charcoal for the steel’s carbon content and for heating the furnace and forge, and attends to every detail of the manufacturing process, including those components made by other artisans: the handle of ray skin wrapped with cord; the guard made of decorative but strong metal forged and carved into a memorable pattern; and the wooden sheath that both protects the sword and is itself protected by layers of shining lacquer. This attention to materials is like a child who starts with good genes, is born into a loving home with attentive parents who have consciously decided to nurture another human being, and is then given over to dedicated teachers who will work to make him or her into a confident, loving, productive, and happy adult.

The Japanese sword is made from two kinds of steel. The central core is a low-carbon alloy that is relatively soft and flexible, giving the sword its strength and resistance to shattering. During the forging process, this core is hammered into the groove of an outer jacket made from a high-carbon steel that is hard and stiff, able to resist blows and to hold an edge. When these two steels are quenched after being pounded together, the inner core contracts more rapidly than the outer jacket, giving the sword its characteristically graceful upward curve.

In similar fashion, the best human life is made of both flexible and hard mental and spiritual components and attitudes. A person who will be both successful and happy must have some measure of vulnerability to the world and the people surrounding him or her, able to understand and respond to the pressures that the world brings into any life. A person who cannot bend under pressure will break. A person who cannot perceive love and pain in others and respond with compassion will live alone in a muted, unhappy, and unproductive life. At the same time, a successful person needs a shell, an outer jacket of mental and emotional toughness, able to withstand adverse opinions, direct and implied criticisms, and outright emotional and physical assaults without folding up or losing his or her drive and sense of purpose.

The Japanese smith prepares each piece of steel, core and jacket, before joining them. He hammers it out, then folds it over and hammers it out again. The steel may be folded and hammered between eight and sixteen times. The folding creates layers exponentially: one becomes two, two become four, four become eight—like the process of putting one coin on the first square of a chess board, two on the second, and so on until reaching a staggeringly immense number before loading the sixty-fourth square. With this level of combined folding, the samurai sword may have between 256 and 65,536 layers of steel in each of its component parts. These layers and the welds that are made where they fuse together give the steel its strength.
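For readers who want to check the doubling arithmetic, a few lines of Python make it concrete (the fold counts of eight and sixteen come from the paragraph above; the little helper function is purely illustrative):

```python
# Each fold doubles the number of layers in the billet,
# so n folds yield 2**n layers.
def layers_after(folds: int) -> int:
    return 2 ** folds

for folds in (8, 16):
    print(f"{folds} folds -> {layers_after(folds):,} layers")
# 8 folds -> 256 layers
# 16 folds -> 65,536 layers
```

The same runaway growth drives the chess-board story: by the sixty-fourth square the count reaches 2 to the 63rd power, more than nine quintillion coins.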

The best human life has a recursive element. Whatever a person undertakes—practicing a profession like law or medicine, playing a musical instrument, engaging in a sport or martial art, or perfecting a fine art like painting or writing—requires repeated practice, usually on a daily basis. Each time a lawyer or doctor takes on a new case, a musician confronts a new score, a player engages in a particular skill or move, a painter confronts a blank canvas, or a writer slips into the stream of a story, the person’s quality of effort improves. The practitioner discovers and sheds excess motions and bad habits. And he or she develops a deeper and deeper sense of the art and its complexity. The adept has explored both the art and his or her own psyche and abilities at a level that the beginner and the novice cannot understand. This layering of experience and expertise has its dull spots and its plateaus, like the weld line between one layer of steel and the next, where weeks of work seem to advance the practice and the art not at all. The knowledgeable student—or one guided by a knowledgeable mentor or teacher—knows that these plateaus are gathering places, where the mind and body are accumulating, analyzing, and storing past experience. Each plateau will be followed by a renewed climb with sharply increased ability and understanding.

In the same way, people build up a relationship with their life situation and with the people who inhabit it by going through periods of intense feelings of happiness, acceptance, and love, followed by periods of depression, doubt, and dislike. This is the way the brain and the mind build up a rounded picture of the physical world or of another special human being: seeing the object of life and affection from different sides at different times, making and remaking judgments about the situation and the person, reaching a state where one can say with confidence, “I know this place, this life, this other person.”

And finally, the samurai blade—like any sword or knife blade—is heated and then quenched, plunged into cold water or oil, to temper and harden it. The Japanese swordsmith coats his blade with clay in varying layers before this tempering. The thicker clay laid along the spine allows the steel in this area to cool more slowly in the water, making the sword’s backbone more flexible. The thinner clay along the cutting edge allows more rapid cooling, making the steel there harder.

Human beings are tempered by the shocks and reversals of life. A person heads in one direction, with one set of goals and expectations, only to be turned or thrust aside from the path by a personal failure, external conflict, or unexpected disaster. A person with the best preparation and attitude—the steel that is both flexible and hard—accepts the shock, adjusts his or her course, learns from the experience, and begins anew. A person with poor preparation and attitudes—the steel that is too soft or too brittle—collapses or shatters, stops, folds in on him- or herself, and does not begin again.

The finished samurai sword is a thing of both purpose and beauty. It is a weapon, created to be the best at the purpose for which it was designed: to cut a human limb or body apart with one blow, cleaving armor, clothing, tissue, and bone. It is a savage purpose, but one that is clear and obvious. At the same time, the sword is an object of love and beauty. The blade surface is polished to a mirror finish; the edge has a carefully defined, satiny appearance; and the components and appliances like the handle, guard, and sheath are examples of the highest craftsmanship.

The best human life should be a mix of purpose and beauty. The adult must take up and hold a position: a place in society, a profession or pursuit, and a role in the family or other communal group. These purposes define the life and make the person whole. At the same time, the developed life should be a work of art in itself: the person fills his or her time and expends his or her energy apart from the working world in acquiring and savoring experiences, knowledge, new skills, and wider associations that make a human being into a more rounded, capable, and attractive individual. The person creates order and beauty in the world through his or her thoughts and actions.

These are the elements and attributes of a good life—the best life. But not everyone gets to live such a life.

In feudal Japan, every samurai—the military retainer of a noble lord—carried both long and short swords on his belt, even in peacetime, setting them aside only in the most intimate moments at home, and even then keeping them within reach. The sword was both the symbol of the samurai’s position and the primary tool of his profession. During World War II, Japanese officers were expected to wear a samurai’s long sword, or katana, as a symbol of their rank. Many carried heirloom blades handed down in their families for generations. But toward the end of the war, when losses in the field had reduced the numbers of both established officers and antique swords, the crop of newly promoted lieutenants carried swords of no distinction, mass produced, and sometimes just hammered into shape and filed to an edge from any old piece of flat steel, such as a truck’s leaf spring. These swords were no better or more attractive than the black iron blades with hooked ends given to the orcs in The Lord of the Rings movies.

In these days in the West, with the decline of family life and the reshaping of education to focus more on self-esteem and a sense of entitlement than on knowledge, effort, and experience, many of our young people are left in a bleak state. A child from a broken family in a declining school, given no opportunity to recursively practice a sport or musical instrument, prepared for no particular purpose in life other than general dependency, and shielded from the mental and emotional shocks that are natural to sustained effort, faces a flat, dull existence without the tempering of personal trials and painful adjustments. Such a person is like a truck spring hammered out to look like a sword but with none of the internal qualities or external finishes that define a battle-worthy weapon.

I fear that we are making a nation with more truck springs than samurai swords these days.

1. You spend all your time wiping up messes, and when you’re old and saturated with gunk, you get thrown away. … Ewww!

Sunday, October 15, 2017

The Best Life

School picture

Irene Mary Moran (1940-2017) was born in San Francisco on 23rd Avenue, just north of Taraval Street, in a house her parents John and Delia Moran had owned since before the Great Depression. The neighborhood and the parish of St. Cecelia Church defined her early life and remained her spiritual home for more than sixty years.

The street she lived on brought friendships that Irene treasured throughout her life. It was also a steep street with smooth sidewalks that invited Irene and her friends to do crazy runs on their metal roller skates down toward Taraval, with only a sharp turn into the last driveway on the block—risking a fall and scraped knees or worse—as the way to stop from shooting out into busy traffic. Irene always said she got up the courage to do this after a breakfast that included a Cherry Coke.

When her beloved father died in 1948 of a heart attack, Irene’s life changed drastically. As a young man, John Moran had been a long-distance runner, had been wounded twice in World War I, and came to America from England to become a member of the U.S. Customs Service. Her mother Delia Carty had been born in Ireland, came to America in 1919, and worked for ten years as a domestic before meeting John in San Francisco in the late 1920s. John’s death put Delia, Irene, and her brother Desi in difficult circumstances. While the rest of the country was enjoying the rebound from World War II and then the economic growth of the 1950s, Delia received a modest inheritance and had to work as a school secretary. For Irene and Desi, these years were a continuation of the hardships of the Depression and the war years, and the experience made Irene careful about money for the rest of her life.

Irene was educated at St. Cecelia School, Mercy High School, and Lone Mountain College, where she studied history. After graduation, she worked for a while at Western Greyhound as a typist. She also had jobs during school as a sales clerk, usually at Macy’s downtown; so Irene rode the Muni streetcars on a daily basis. These work experiences—which were all that seemed to be available to a woman, even with a college education, who didn’t want to be a teacher or a nurse—convinced Irene she needed a better course. She studied library science at the University of California, Berkeley, where she took her master’s degree. This was her first time living and working in the East Bay, outside of San Francisco, and she would sometimes joke that she had moved “overseas.”

Right out of library school, Irene got a job cataloguing rare books and manuscripts at The Bancroft Library—where capitalizing “the” was a point of honor. Although she may not have realized it at the time, the Bancroft was the best place for her. It was and remains one of the most respected history libraries in the world, building on the collection of Gold Rush historian Hubert Howe Bancroft, who documented the development of California, the West, and Mexico and Central America after he arrived in San Francisco in 1852. Irene developed a great pride in the institution, made many lasting friendships in the library, and had deep respect for its Director of the time, James D. Hart.

With a permanent job and newfound freedom, Irene bought her first car—the first in her family—in 1965. It was a baby-blue Volkswagen Beetle, and she loved it. Irene was a self-taught driver and immediately took the car on a long, solo trip to northern Arizona. There she encountered her first patch of black ice, spun into a rock wall, and learned about getting her car repaired as an out-of-towner. She later took other trips in the VW with her mother to Portland, Seattle, and Vancouver. She kept that car for more than ten years and then only sold it to the son of a friend.

Irene stayed at the Bancroft for 27 years, rising to the position of Head of Public Services. There she was responsible for staffing the Reading Room and preparing the quarterly exhibits of donations to its special collections for the interest of the library’s Friends organization and the many scholars who use its amazing resources. At the end of her career, as the Bancroft and similar special-purpose libraries all across the nation put the catalogues of their unique collections online, Irene learned the new skill of computer coding and access. Working at the Bancroft in a position of authority made Irene the confident, capable woman she was.

She was always ready to help visiting scholars in their particular searches. During the mid-1970s she worked with the author Elinor Richey in developing reference materials, photographs, and drawings for Elinor’s next history project, The Ultimate Victorians of the Continental Side of San Francisco Bay. The volume was being published, like Elinor’s other works, at Howell-North Books in Berkeley. Elinor kept telling Irene, who was a tall woman at five foot eleven, about this tall young editor she was working with at Howell-North. And Irene’s response would be “Yes, yes, Elinor. But about this picture …”

Irene and Tom at Christmas

I was the tall young editor, and Elinor would tell me about this tall librarian she was working with at the Bancroft. And my response would be “Yes, yes, Elinor. But about this sentence …” I did go into the library once to retrieve some photos, and met a tall and beautiful librarian with long blonde hair. I recognized Irene from her name badge, but the only words we exchanged were her request that I use a pencil instead of my fountain pen in filling out an order form. In a rare book and manuscript library, ink was forbidden because a scholar taking notes might accidentally mark a precious resource. Those were the only words we spoke for more than a year. But I remembered the name Irene Moran.

We finally met formally, as in a date, in 1975 at the publishing party for Elinor’s book, which was held at the Oakland Museum of California. We liked each other enough to go to dinner afterwards. From there, we continued dating and got married a year later. Because friends of Irene’s in Berkeley had just been married by this smart, young woman judge on the circuit in Alameda and Contra Costa counties, we took our vows at the courthouse in Martinez on October 15.1

In preparation for living together, we had been looking at housing in the area and focused on the Gateview condominium complex in Albany. It was an easy commute to Irene’s job on campus and had good bus and BART connections for my then-current job at the Kaiser Center in Oakland. We signed the mortgage papers while we were still single and planned to move in right after the wedding. Because we were the first occupants of that condo unit, we had the balcony enclosed and hardwood floors installed—work that needed some time to prepare. It was a beautiful location, with views of the trees on Albany Hill from one side and down the shoreline to the Bay Bridge and San Francisco on the other. The price was more than anyone in either of our families had ever paid for a complete house, and we always thought we would eventually move out to a home in the Berkeley Hills. But over the years of looking and not finding, and coming back to our condo where the sun was shining and the views were inviting, we always decided to stay. We remained at Gateview for 41 years.

A major influence in Irene’s life as a young girl was her cousin Kathleen, who was some years older. Kathleen had served in the Marine Corps and eventually managed an office in Philadelphia. She showed Irene that a strong and independent woman could be successful in the world. In 1981, in the midst of plans for moving with her fifteen-year-old son Gary to California, Kathleen died suddenly of a thrombosis. Irene decided that she wanted Gary to come out west anyway and that we would make a home for him. Gary stayed with us until he graduated from high school and joined the Air Force. Irene and I never had children of our own; so Gary and his wife Jessica and son Shane have since become our family.

Although Irene loved the Bancroft, it was always, well … work. In the mid-1980s we were watching the Alex Haley television special Roots. One of Haley’s ancestors—“Chicken George,” a slave who was also an entrepreneur raising fighting cocks—declared his intention to save his money and “buy his freedom.” That notion resonated with Irene. She then and there decided to save her money and buy her own freedom—or be in a position to take advantage of the university’s occasional retirement buyout packages. She was finally able to retire in 1991.

Irene always loved to travel. During her early years, she took a solo trip around South America including Buenos Aires, Rio de Janeiro, and Machu Picchu. And she went camping in Mexico and hiking in the Rockies with friends. She also flew to Ireland several times to visit the farm where her mother grew up, and which was then in the keeping of an aunt. After she retired, Irene and I traveled to London twice, to Italy twice, to Paris, and to Amsterdam. When I was working and unable to join her, she booked travels with lady friends to Brussels, Greece, and Eastern Europe.

Her newfound free time enabled Irene to volunteer in the causes to which she felt closest. Her brother Desi had suffered a severe mental illness all his life, and that inspired Irene to join the East Bay chapter of the National Alliance on Mental Illness, or NAMI. Over the past twenty years, she worked as treasurer and office manager and coordinated the mailing of the chapter’s bimonthly newsletter. Early in her retirement, she also volunteered at the Marine Mammal Center in Sausalito, joining the Monday Day Crew. There for fifteen years she and others handled the rough physical work of herding sick and injured elephant seals and California sea lions, mixing fish mash and intubating animals that could not feed themselves, and cleaning the pens. It was vigorous outdoor work, and Irene loved it.

Irene also had twenty-plus years of serving as a volunteer usher at the Berkeley Repertory Theatre. And she served two terms on the Gateview Homeowners Association Board of Directors, both during difficult times for the association.

Irene at Richmond Art Center

Her mother Delia died in 2004 at the age of 102, and we always thought Irene would live as long. In her final years, Delia suffered short-term memory loss: she could recall people from her life on 23rd Avenue from fifty years in the past but couldn’t remember what she had for breakfast. This might have worried anyone else, but Delia remained a cheerful person with a gracious disposition. This gave me hope that there can be peace and acceptance under all of life’s conditions.

Irene battled depression for most of her life and alcohol in her later years. She hit “rock bottom” in the year her mother died, and then she decided to do something for herself. She joined Alcoholics Anonymous and took up their program with a will. She embraced its Zen-like demand for self-examination and self-honesty, as well as the AA tradition of service to others. She became a backbone of her home chapter, picking up and driving people to meetings and to their other appointments. Although Irene broke from the Catholic Church at a young age, she found peace in the AA concept of a higher power, or supreme spirit, and she began meditating.

Irene and I took our last trip together in the fall of 2012, to Arizona to visit the natural wonders and Native American heritage of the Southwest. This trip echoed one we had taken early in our relationship to the canyon lands of southern Utah and northern Arizona. Shortly after our trip, Irene suffered a heart attack and had a stent installed. This showed her that, in addition to her depression and alcohol, she had to work on getting exercise and eating right. She rose to this challenge as she had to the others. Irene was a brave, purposeful, dedicated woman.

Despite her efforts, her last couple of years were a time of failing health and diminished capacity. Earlier this year, she began experiencing headaches, nausea, and leg pains, which a neurologist diagnosed as an arterial inflammation, or vasculitis. On the morning after Labor Day, Irene succumbed to complications from this disease and the powerful steroid used to treat it.

Those who loved Irene knew her wonderful qualities. She lived the best of lives—strong, alert, interested, and purposeful. She was my wife, my love, my lady, and my best friend.

Irene’s favorite passages from Desiderata

“Beyond a wholesome discipline, be gentle with yourself. You are a child of the universe no less than the trees and the stars; you have a right to be here.

“And whether or not it is clear to you, no doubt the universe is unfolding as it should. Therefore be at peace with God, whatever you conceive Him to be.”

1. Today would have been our 41st wedding anniversary. Love you, Irene!

Sunday, October 8, 2017

A Balance of Power

Balanced rocks

In matters of politics and economics, I do not believe that any one side of an argument or a proposition possesses the ultimate truth, holds exclusive bargaining rights, represents final authority, or has been gifted with the other attributes and artifacts of power. Yes, I believe that truth, rights, and authority all exist, but they must be established, weighed, and tested on a case-by-case basis. No one has uncontested power by virtue of his or her personal beliefs, political stance, or past actions and achievements.1

But it would seem that our current political situation and its effect upon our economic situation have devolved into a philosophical fight over who should have the ultimate power to decide where the truth lies in any discussion and how society should be organized and maintained. The conflict eventually comes down to who shall have the right to life, liberty, and the pursuit of happiness—and who should be shunned, shouted down, and ultimately hunted through the woods with dogs.

In this fight, some people prefer to give power and the weight of defining truth and making decisions to those who were either elected or, in most cases, appointed and hired into government positions. Adherents of this statist philosophy view these elected representatives and appointed or hired civil servants as high-minded, selfless, and incorruptible. They believe these government people should have authority over others because, first, they are bound to be fair and impartial through having no vested interest in the outcome of the decisions made in their sphere and, second, they possess the training and experience to make the best decisions based on the latest scientific, psychological, and sociopolitical thinking. The adherents have taken to heart Plato’s ideal, expressed most notably in The Republic, that society should be ordered and maintained by a cadre of philosopher-kings.

At the same time, other people would prefer to leave power and the burden of defining truth and making decisions in the hands of individual citizens. Yes, some decisions must be made for the common good by governors, legislators, judges, and their supporting departments—but these decisions should be in strictly designated areas like providing military defense and maintaining the borders; building community infrastructure such as roads, harbors, and water supplies; and offering police and judicial services for personal protection and redress of grievances. But for the rest of the social structure, the common people should have the freedom to decide what is right for themselves as individuals and spend their time, energy, and money obtaining the goods and services that they believe will best serve their needs. And others should be free to invest in the production, trade, and distribution of goods and the offer of services in an open market to fulfill those individual needs as they see them.

Those who advocate state control regard the free market, capitalist finance, and participation based on self-interest as rewarding greed, selfishness, and intentionally hurtful action, while those who advocate personal freedom of choice see an overclass of scientific and psychological expert administrators as an invitation to inertia, laziness, pride, and the corruption of power.

But even the most libertarian advocate of free-market capitalism will admit that market forces, operating under the principles of supply and demand, value paid for value received, and other effects of letting intelligent shoppers act according to rational principles, will sometimes leave one side of the transaction in a position of advantage while the other suffers disadvantage and damage. Speculation and hoarding in times of crisis, market dominance, and monopoly power are examples. In these cases the government needs to set some economic ground rules, and the courts must be available to render judgments and exact penalties.

And even the most progressive advocate of state control will admit that some functions of daily living are inappropriate for government to supply or control. Making personal decisions about whom you will love and take into your life, what values to teach your children and how to discipline them, where and how you choose to live, what career and pastimes to pursue, and what foods you like to eat or avoid are all subject to personal choice. Of course, some extreme advocates of state control—such as doctrinaire Communists and their totalitarian cohort—would insist that any personal element is a political illusion which should be discouraged and stamped out if possible. They believe that no individual choice or action is free from its ultimate effects on other members of society, and so every element of daily life should be guided by moral and scientific experts—or removed from the human psyche altogether.

These discussions are all about to whom you want to give the power in society.

For my part, I believe that any power structure is made up of people, and people in the aggregate and as individuals are not all one thing or the other. Some are greedy, some lazy, some dedicated and conscientious, and some are fools. Whether they work in a government office or a corporate headquarters, work out in the field with a state agriculture or transportation agency, or on the front lines as a customer service representative of a large corporation—they are still people with all their strengths and weaknesses, foibles and phantasies. But, with all of this said, I still believe that most people try to do a good job as they see it and as it has been defined for them in their work environment. Most people consider themselves to be basically good and well intentioned. Only a very few people wake up in the morning and think, “Now I will be an evil bastard.”

And most positions in the power structure, whether in a government or corporate setting, offer few opportunities for personal greed, laziness, and corruption. Every government has its code of ethics, as does every business organization. They have rules, personal and departmental goals, and internal audits. The people who run either organization, public or private, know that the population has its usual share—small in most cases—of connivers and criminals. The organization wants to give good service—even the Department of Motor Vehicles has service goals—and keeps an eye on how its employees are treating the public it serves.

In almost every political and economic situation, I believe in achieving a balance of power: between citizens and their government, between consumers and providers, between workers and management, between any two or more conflicting or competing groups. When one side of the equation has complete control, the other side is bound to suffer. Being a little-D democrat, I believe in the value of reaching agreements—each side gives something and in turn gets something—if not actual consensus among conflicting intentions and interests. This is only a matter of fairness because, really, while some people may be smarter, more experienced, more learned, and more level-headed than others, no one possesses the ultimate truth, the final word, or the all-seeing eye.

This means that any group which obtains prominence and power in a situation must remember Thomas’s Law: “The catbird seat2 is a wobbly perch and tends to dump you.” No one stays up forever. The wheel of karma grinds slowly but inexorably.

If you want an example of this, consider the current situation in academia. For most of my professional lifetime, university professors have enjoyed a position of both power and security in our society. With tenure generally available, they had economic situations that were assured against administrative removal for their holding controversial views or entertaining absurd or noxious ideas. Within the closed environment of the faculty lounge, they had a life of relative ease and congeniality, even with the imperative of “publish or perish.” And as shapers of the minds of future generations, they exercised as much control over our society’s values as any Hollywood or Madison Avenue mogul. The catbird seat. But now, with widely available student loans pushing up tuition, while declining educational standards and curriculum offerings push down the economic value of a basic college diploma—coupled with widely available learning options in the form of online and for-profit education—the secure position of tenured professors is rapidly dwindling. Soon they will have to “root, hog, or die” along with the rest of us.

For another example, consider the position of the Soviet nomenklatura at the top of Russian society. For seventy years, they were in positions of extreme power so long as they could toe the Party line and avoid the backstabbing of political competitors. But in the 1990s that all changed, as the system that had nurtured and fed them collapsed under the weight of its own incompetence, unable to raise the average Russian out of a third-world existence in an economy that lagged behind every other in the developed West.

The catbird seat is a nice perch, if you can get it. And for some—like the last crop of university professors or a couple of generations in the nomenklatura—it might last until the holder is dead and gone and beyond caring. But without a balance of power, without a commitment to agreement and consensus, these niches have a relatively short half-life. Eventually, the perch wobbles and dumps you.

1. As always with a blanket statement like this, some exceptions apply. All individuals—except those previously shown to be irresponsible, such as the mentally incapacitated or convicted felons—have a right to life, bodily integrity, and freedom of person. Those who fall into the irresponsible category may give up some degree of freedom but still have a right to life and bodily integrity. Similarly, persons shown to be in possession of property in accordance with the laws of their society have a right to the use and disposition of that property under the law. Persons may be elected, appointed, or hired into positions of decision-making authority over other citizens—such as magistrates, judges, and legislators—but they hold that authority only in the sphere and under the terms of their service. With all that said, no one has a claim on ultimate truth—not even eye witnesses to the birth of creation.

2. See The Catbird Seat from September 29, 2013.