Sunday, January 31, 2016

The Roots of Art in Civilization

Several people on Facebook have recently posted a quote from Eli Broad, founder of The Broad Museum: “Civilizations aren’t remembered by their business people, bankers, or lawyers. They’re remembered by their arts.” My immediate response was: “Yes, but merchants, bankers, and lawgivers usually create the stable, civilized conditions under which art can be made and enjoyed.” And I’ve been thinking about that idea for a while since then.

Trade, access to capital, and stable, predictable laws are certainly conditions for pursuing art, great or otherwise. It’s difficult to paint and sell pictures without enough of an economy flowing through your city, state, or country to enable people to buy your art, or for gallery owners and museum curators to arrange places for people to see it. You also need banking services to enable gallery owners to operate as a business—paying out in advance for rent, utilities, commissions, and sales staff—and for patrons to amass enough disposable income to consider acquiring your art. And you need the stability of predictable laws ensuring your rights to the intellectual property embedded in your art, guaranteeing the right to ownership and retention of property in general for your patrons, and enabling transactions among individuals and institutions.

These civilizational requirements apply even more strongly to music composition, which until recently needed a group of people who could sing and play instruments at least semi-professionally to get together in one place to perform or record your work. The requirements also apply to playwrights and screenwriters,1 who need the support of a theater company or a production studio to see their scripts presented or produced. Only the poet, short story writer, or novelist can work in relative isolation, cranking out daily word counts that might never see the light of day, regardless of the external economic conditions. Even if the economy is healthy enough to support any number of printing presses and publishing houses, access to the literary market has always been restricted by the publishing gatekeepers.2 Certainly, Emily Dickinson seemed content to write her poems on the backs of shopping lists and put them in a drawer, except for the few works she circulated among family and friends.

But what other conditions are required for creating art, particularly great art?

You might think a general condition of peace would sustain the painter, composer, or writer. And certainly it is difficult to produce an object like a painting or a mural, a musical score or studio recording, or a novel either in manuscript form or epub coding while the bombs are falling around you and soldiers are raking the streets with gunfire. But one needs to consider the difference between peace as an opportune time for creativity, due to a lack of imminent personal violence, and peace as a suitable subject for contemplation and the artistic process.

Other than for landscape painters, pastoral composers, and modern activist writers, peace has always made a relatively poor subject for artistic expression. War and its lesser conflicts have given humanity more to work with in the form of divided loyalties, opposing passions, hard choices, and the clarity of mind that imminent death can bring. Peace is a long, golden afternoon where nothing much happens and we have time to drink wine and play with puppies—or paint pictures, compose songs, and write poetry. War, its dreaded approach, and its dreadful aftermath place the human soul inside a landscape of stark contrasts, confused expectations, and troubling choices. Virgil, entering a golden age under Augustus after decades of strife and civil war in Rome, turned his artistic vision back to the trials of the warrior Aeneas as he escaped from the Trojan War to wander the Mediterranean, captured and spurned the heart of the Carthaginian queen Dido, and then waged a new war in Italy in order to found an empire.

Conflict, drama, forced decisions—these are the basis of so many stories. Even love stories are, underneath, tales of conflict between two people with similar but not quite identical aims and the obstacles they face. Romeo and Juliet without the animosity of Tybalt and the warring history of the Montagues and the Capulets would be a work of five minutes full of teenage hormones. And the recently popular mode of basing stories in the “hero’s journey”3 throws the protagonist more often into physical trials and tests than into exploring the inner workings of his or her own mind.

A time of social dynamism is also necessary to great art. War and civilizational conflict can provide that dynamism, but so also can the rise of a new religion, the disruptions of new ideas and inventions, and the collapse of the long-held order that was once made possible by those merchants, bankers, and lawyers. Of course, social dynamism may be war and conflict by other means. Religions establish and distinguish themselves by contrast and conflict with earlier belief systems. New ideas and inventions create disruption precisely because they upset traditional values, established players, and formerly profitable ways of doing business. And the collapse of a civilization, whether through internal systemic failure or external challenge, often brings on a state of war and conflict.

The “culture wars” this country is currently going through—sometimes called a “cold civil war”—are based on a clash of ideals and intentions, between those people for whom the march of progress, with its promise of change toward a better, more utopian order, is not happening fast enough and those people for whom what has gone before, with its traditional values and proven methods, still has the power to provide stability and order. These conflicts have been a particularly rich source of material for modern writers and artists.

Another source of artistic stimulation is population dynamics. Having lots of babies and so an assured supply of young people seems to be good for art. People tend to reproduce when they think the future will be promising and stable, as in the great fecundity this country experienced with the “Baby Boom” after the rigors and uncertainties of the Great Depression and World War II. But people also tend to breed—to cast their genetic futures with the dice—when times are uncertain. The thought that you might not be around next year or the one after to settle down and raise a family can be a powerful promoter of survivalist urges. Most of us know, or suspect deep down, that while our own bodies are perishable and short-lived, our children, their children, and the generations to follow can make us practically immortal—or as immortal as mortality gets.

A young society is a dynamic society, full of new words, new personal styles, and new traditions as the adolescents push away from their parents to establish themselves in the world. A young society is full of hope and conflict, and that’s part of the power of the current crop of young adult literature and coming-of-age stories. The “hippie movement” and the “counter culture” of the 1960s were fine examples of this, with their outpouring of new ideas about religion, morality, and conflict, along with new hair and clothing styles, new freedoms in the arts of painting, music, and literature, and a reinterpretation of sexual codes that in the 20th century were still evolving from the restraints of the Victorian Era.

A society dominated by the elderly, where the young are in the minority and older, more established adults are hanging onto power and privilege, is a society that is already freezing into settled patterns and stale outlooks. And a society where the demographics have tipped toward the very old, such as in modern Japan, is a society on the verge of collapse and extinction. The stories, songs, and paintings from such places would be very dull indeed—right up until the end, of course, when new conflicts will create new stories and songs.

No, civilizations may best be remembered for their arts, but that art doesn’t happen without the rest of us to foster and support it. And yet civilizations in transition, especially those on the cusp of failure, may produce the greatest art of all.

1. Isn’t it interesting that “playwright” uses the ancient form “wright,” which comes from the Old English wyrhta, meaning “worker,” and is usually attached to “millwright” and “shipwright” to mean a worker in wood? It’s as if the playwright is carving the story out of an inanimate block of words in time. On the other hand, “screenwriter” goes directly to the modern action of “creating in words.”

2. Until now, when access to direct ebook and print-on-demand distribution has opened the field to any author wanting to publish his or her work. But these services rely even more on stable economic conditions, rights to intellectual property, and the existence of that great information and economic collaboration, the internet.

3. With a tip of the hat to scholar and mythmaker Joseph Campbell.

Sunday, January 24, 2016

Honesty in the Arts

We recently visited—I for the second time, my wife for the first—the “Jewel City” exhibit at the De Young Museum in San Francisco. It was a retrospective, gathering once again some of the paintings, murals, and sculpture from the 1915 Panama-Pacific International Exposition. The original exposition was the city’s celebration of the completion of the Panama Canal and its own recovery from the 1906 earthquake and fire. The modern exhibit a hundred years later was as much an eclectic affair as the original, mixing 19th century triumphalism about the westward march of civilization with French Impressionists, Norwegian Expressionists, Italian Futurists, and American painters and sculptors from the preceding half century.1

Going through that exhibit nailed down for me, once again, a favorite period in the visual and plastic arts. It was a time when painters and sculptors regularly produced imagery that another human being could examine in detail; draw from it a sense of time, place, and mood; and relate it to his or her own life and thoughts. It was all “representative” art, of course, in the sense that it tried to show something with reference to the real world—or to an imaginary world that drew life and meaning from the world we all inhabit.2 The presentation might be skewed by the artist’s sense of perspective, color, light, and detail, but it was still meant to “be” something.

I feel at home with this kind of art. Artists and their critics from the late 19th and early 20th centuries might feud bitterly about the merits of impressionism versus expressionism and the details of painterly technique. But these are the minutiae of art history. For the average human viewer, the paintings and sculptures for the most part worked their magic and made themselves accessible.

Somewhere, sometime—and I think the civilizational collapse of two world wars, an economic depression, and the resulting political extremism had a big hand in this—we lost that vision. Art became personal to the artist, abstract to the point of concealing the original subject—and the efforts of Pablo Picasso and his followers had a hand in that—and isolated from the life and thoughts of the average person.

Some interesting studies in form and color have come out of this movement. I think of Alexander Calder’s delicately balanced mobiles from the 1950s—although I recall one critic decrying them as “lacking irony.” That comment has always stuck with me and helped shape my view of modern art. A Calder mobile is interesting, evocative, draws the eye, and inspires the imagination like moving clouds, schooling fish, or flocking birds. It is not a comment on the world but a vision of pure color and motion. I can look at one of his mobiles for whole minutes at a time and feel relaxed. But a nearby exhibit of four rectangular panels painted in primary colors with a roller brush—intentionally ironic and abstractly intellectual as it might be—has nothing to absorb my eye or my mind.

We have entered a time in modern art when the artist’s statement of purpose, typed out in a page or more of frankly tangled verbiage—abusing at once both the visual arts and the written word—has become a necessary part of the exhibit. You can’t walk up to a painting or a sculpture anymore and just look at it. You need to have the artist’s view of his or her own worth and place in the ongoing process of “art.” And when the artwork is a picture of a two-dimensional human nude gagged with a surgical mask and stabbed with pins, which I remember from an exhibit at my own university in the 1960s, or a photograph of a crucifix dipped in a jar of urine, then you can guess that the artist’s view is pretty bleak and disoriented. And it’s not just the subject matter; the techniques themselves are usually slapdash and childish.

Now, I believe in personal freedom. I believe that everyone gets to do what he or she likes and thinks is important. But, and this is a big “but,” not every effort has equal value and not everyone’s “thing” should be practiced out in the open and pushed forward as supposedly important enough to engage the time and attention of everybody else. Not everyone should be—or should expect to be—paid for their sour, vindictive daubings that must be justified by paragraphs of explanatory prose.

Irony—the mode of speech or display that presents one thing but cleverly intends its opposite—does have its uses in prose and the visual arts. Socrates, for example, made a point of his pretended ignorance of the subject under discussion in order to draw out the ideas of participants in his dialogues. But when irony becomes the entire basis for an artist’s endeavors and the underpinning of his or her worldview—as in the criticism of Alexander Calder—it borders on sarcasm, ill humor, and disrespect of the viewer’s or reader’s time and attention. If the whole world is a big joke to you, something that only pretends to be what it is and whose values you know to be hollow … well then, how much further can your mode of discourse progress? There you are, snickering at life and the world—and you’re just about finished.

I’m glad that my business, that of writing novels, is one of the few art forms that cannot be turned on its head by this sour and dissipated view of life. You might be able to show your contempt for the worthlessness of hope and effort by painting black canvases or developing blank film, but it’s difficult to publish a book filled with randomly repeated words or meaningless strings of type.3 A book is supposed to involve the reader’s mind for a certain period of time, and no one willingly focuses his or her thoughts on empty content.4

As a writer, I know that to disrespect my readers is the death of art. I may play with them, tease their imaginations, and trick them with false clues, suggestive wordings, and red herrings littering the plot lines. I may use all the forms of irony to lead my readers to expect one thing and then drop them into another reality. But my readers—the alert ones, anyway—are expecting this. They read certain novelists for their double meanings and clever tricks in the same way that audiences love a good magician or a musical parody. But if I try to pass off something cheap and ill conceived, something that suggests I’m not playing with the reader’s mind but offering up trash because I don’t think my readers can tell the difference—they will know it immediately.

I know a few musicians—singers and keyboard players—and they say the same thing. You can clown around with your audience. You can imitate famous singers and make knowing parodies of popular songs and modern styles. But the minute your audience suspects that you are offering them less than full effort because you actually despise them and their sense of taste—then you are dead on stage.

To disrespect your audience is to imply that you as the artist are more important than they are, that your time and effort are more valuable, that you are somehow doing them a favor by performing. And yet they are the ones who can at any time stop reading, stop listening and watching, stop paying attention. That is the power, the superior position, held by the audience.

To disdain your audience is also to disrespect and belittle your own art. You are making light of your product’s value. You are suggesting that you waste your own time. And it is part of our culture, at least here in the Western tradition, that art in any form—even the silliness of making comic films or writing and singing funny songs—is a worthy pursuit that is due the artist’s full effort and attention.5

I don’t know how modern painters and sculptors get away with their overbearing sense of irony. It’s an attitude of contempt that no successful or even just competent writer, singer, or movie producer could survive.6 And yet the modern galleries and self-conscious “museums of modern art” are full of it.

No one likes being on the outside of an in-joke.

1. The original exhibition left enduring marks on the city. The Palace of Fine Arts, a reconstruction in concrete from the original in lath and plaster, became a landmark of the San Francisco Marina District. And then a group of western American painters who felt they had been snubbed by the Panama-Pacific International Exposition’s selection committee presented their own show at what later became the De Young Museum in Golden Gate Park. Finally, the French exhibit at the PPIE inspired sugar heiress Alma Spreckels to create the Legion of Honor Museum in Lincoln Park.

2. In the same way that the works of J.R.R. Tolkien and the fantasy writers following him—filled with elves, dwarves, and magical swords—or the science fiction of a Heinlein or a Herbert—filled with spaceships, strange creatures, and people with stranger powers—still deal with issues and concerns that have meaning for readers in the here and now.

3. Of course, there have been blank books—essentially exercises in bookbinding without the effort of typesetting—that were sold as jokes. They have titles like Everything I Learned About the Stock Market in Thirty Years of Trading. If you get one, you can always use the blank pages to start a journal or paste in recipes. But no one expects you to sit down and read the thing.

4. Well, the Zen koans ask you to do just this, but their goal is mental expansion, not diminishing attitude. And none of them runs to book length.

5. When I was studying English literature at the university, John Fowles wowed my generation with his novels. The Collector, The Magus, and The French Lieutenant’s Woman held us all spellbound. And then he suddenly declared that novels were not an important art form and he would concentrate on his poetry. It was like a smack in the face. He disappointed us all.

6. Although Hollywood is skating perilously close to audience contempt with its endless and uninspired remakes of comic-book fare.

Sunday, January 17, 2016

Socializing the Sexes

Maybe they socialize kids differently today—and from what I read and hear, I expect they do. The gender-neutral crowd must engage in serious playground monitoring and supervision to enforce the teacher’s, the new adult world’s, view of proper behavior and good relations between individuals. But it was different in my day of essentially free-range kids.

Oh, of course, we had adults to socialize us. Our parents were the first line of society’s defenses, followed only then by our teachers and other adults in our lives such as aunts and uncles, the family doctor, the pastor, the barber, the neighborhood police and postal workers, and various merchants we regularly visited, to name a few. But parents did the rough grinding, taking off the sharp corners and stubborn facets of our budding personalities in order to tame the wild beast of self-will. After all, the parents got to us first, when we were just babies and knew that everything we touched was ours and most of it belonged in our mouths.

Mothers and fathers teach children by example and correction. They teach sons and daughters to be respectful and courteous, to think of others, to be honest, brave, and loyal to the family and by extension to whatever group they join. A child learns these things early, usually before being released into the world of school and introduced to people not in the family and not bound to him or her by loyalty and obligation.

And there the differences start—or they used to.

Little boys went off at recess to play team sports and competitive games—first dodge ball, then baseball, touch football, and any game where two groups can line up, working with each other on one side and against all the others on the opposite side. In the rough-and-tumble, a boy learns an important lesson. You can trash-talk the other team, and you can taunt a batter or a pitcher while he’s on the field, but you keep your talk game-related and good-natured. If you say personally demeaning and hurtful things, striking at another boy’s sense of self and “essential vanity,”1 then you will face consequences.

At a certain point, when a boy’s pride has been stung enough, he will lash out. He will drop the good-natured banter and fling himself at you, fists flailing. This is an emotional state, an outburst, a release of control. The boy who is lashing out soon learns that the other person usually can defend himself with equal or perhaps even harder punches. And the boy who taunted the first boy to the point of distraction learns that the male ego is a brittle thing and that fighting is hard work. With time, and after enough bruised noses and split knuckles, one boy learns to control his temper and the other learns to control his tongue. Both learn a solid lesson about impulse control.

It’s a difficult way to grow up, but the alternatives are all worse. If the boy being stung beyond endurance does not “stand up for himself,” then something fragile inside him starts to die. He will see others in his group regularly giving and taking punches, while he does not dare to fight. He might pride himself on his Buddha-like passivity and mature self-restraint, but the thought will lurk in the back of his mind that a certain amount of cowardice may be involved. And he will understand instinctively that these are only children’s tests, and that later adult tests are sure to come and they will be much harder. The boy who has not bloodied his knuckles on the playground may not be ready to fight, to resist, and to hold himself upright as a man.

If the boy doing the taunting does not get his share of “knuckle sandwiches,” then something evil inside him starts to grow. He will see that others in the group—some others at first, but more over time if this goes on—do not oppose his will and ego. Eventually, he will think he can get away with anything, say anything, and control others with his tongue and the force of his personality. He will become a bully who expects fear and deference in others. But here the thought does not lurk that these are only the tests of childhood. He will probably not suspect that somewhere, sometime, he might belabor an opponent who will not fold and who will have learned to use his fists to advantage.

And for the others, those boys standing by and watching, this brief burst of combat—the lashing out and the bashing back—is cathartic. It relieves tension and clears the air. That is why they cheer it on, not because they are savages, but because they know instinctively that this is how young male primates learn their role and their place in the monkey troop. If the boy being goaded backs down and refuses to fight, they feel a little sick, knowing that something is dying. And if the boy doing the goading is not opposed, they feel a bit fearful, knowing that someday one of them will have to challenge him.

When the process works—and it almost always used to—boys learn a valuable lesson. That the world is an uncertain place, and if you shoot your mouth off too freely, there is always someone around who will close it for you. That every man is an unpredictable quantity and so worth your caution if not your respect. That power has its limits when men have the means to attack and defend themselves. That leadership—the mysterious process of getting others to do what you want—is best exercised by establishing shared values and reaching through the other person’s natural defenses to touch a core of self-respect, good sense, and humor, rather than by shaming and goading, by breaking the other person’s spirit to your will.

But girls, I think, were socialized differently. However, I must admit that, while I was on the playground, I didn’t spend much time with the girls, so I only have general notions about how they once played together and socialized among themselves.

My sense is that little girls learned to relate through either individual exercise or cooperative play and tests of individual skill. They would compete as individuals in games like jacks and hopscotch. They would cooperate in feats like double-dutch jump rope. Later they would compete in non-contact sports like badminton, tennis, and volleyball. The point was to prove your own skill, speed, or cleverness but not necessarily to dominate your opponent physically. Did groups of girls on the playground ever break out into fights? I’m sure they did, once or twice, especially in the early grades. But girls were always supposed to be “nice” and “ladylike,” which meant never resorting to anything so vulgar and hurtful as fisticuffs.

Since I don’t believe that any group of human beings is composed of angels, and since the good and bad in human nature is pretty evenly divided between males and females, I do know that girls in a group are not, and never were, as perfectly sweet, cooperative, and supportive as their mothers or their teachers expected them to be. My sense is that, where little boys learned to use their fists to defend themselves, little girls used words and verbal cleverness. When a girl was pushed beyond endurance by another girl’s hurtful taunts, she had two options: to break down and cry, or to come back with something even more clever and hurtful. If you cry, you lose. If you can excel at spite and malice, you win. Buddha-like passivity and self-restraint have nothing to do with this verbal combat, although a crushing tactic on the high ground might be to say, “I choose not to reply to your vulgar accusations!”

Yes, a girl had to “stand up for herself” in terms of verbal give-and-take and not break down in tears. And yes, a girl who could take on all comers with her mouth, making them cry without ever feeling the sting of tears herself—and that’s kind of a psychopathic personality, isn’t it?—would learn to be a bully. But for girls the fight never, or almost never, made the transition from words to fists. It never reached the stage where physical skill and endurance took over from continual brainy cleverness, where a boy learned new and surprising things about himself and his opponent. So disputes among little girls could go on and on, building tensions, never to be resolved through physical combat.

In my career of over forty years in business—first in book publishing, then in engineering and construction, at an electric and gas utility, in the pharmaceutical business, and finally in biotechnology—I’ve seen the results of this differential socialization again and again. And remember, these were the decades when women were finding parity of position and pay with men, at least in the fields where I worked—editing, technical writing, and corporate communications.

In my earliest career, I was the lone editor in a printing and publishing company owned by a husband and wife team who had a second man outside the family as their partner. Because the two men had gone off to serve in World War II, the woman had long since become president of the company, with the final authority in all decisions and the daily running of the business. And she used it. The printing business at the time was almost all male, except in the bindery, so this was a woman running a shop full of men. I would see her tongue-lash and speak rudely and dismissively to skilled typographers, photographers, and pressmen who were twice her size physically and as old or older than she was. She would inflict contempt on them to the point at which, had I been doing the talking, I would have consciously prepared to duck a blow. But she was oblivious to the hardened eyes, pursed lips, and clenched fists that her words were provoking.2

Most of the women I have met and worked with since then were better at managing their relationships with men. And, as the decades progressed, normative behaviors also developed for relations between the sexes in the workplace. But still, I occasionally saw women speak pointedly and hurtfully to men in ways that no man would consider prudent or fair.3

But then, it was always my luck to work in mixed groups, mostly among writers and editors, where the men could be courteous without being seen as weak and the women could be forceful without making things personal. In most of these groups, we represented the best of civil society and learned to value each other’s skills and opinions.4

And then, toward the end of my career, I was the manager of Internal Communications—a new position with a one-person operation, me—and for a brief period the company put me under the female vice president of Human Resources, a department composed almost exclusively of women.5 And for me that was like being transformed into a little girl and transported back to the playground. Almost every discussion was a subtle—and sometimes not so subtle—attempt by one person to establish authority over the others. Almost every comment carried a personal innuendo and hurtful meanings. The fights between groups in that department would go on for months and most were never resolved. The air was never cleared, except in a show of submissiveness. But there was never anything “nice” or “ladylike” about the atmosphere, except that they never actually drew blood from one another.

When I once had to present a detailed proposal on an important communications project, these ladies demanded copies of my slides in advance, flipped through them before I could even start speaking, and then continuously interrupted me to address their own issues and observations at random, so that I had no thread of a case to make. (When I remarked on this later to my supervisor, she explained that this was just how the group handled important decisions.) If during that presentation I had lashed out in frustration or broken down in tears, I would have lost. So I hardened my eyes, pursed my lips, clenched my fists, and persevered.

Women and men are socialized differently, or at least they once were. I don’t know what the modern, more feminized, metrosexual approach will bring in later years for the current generation. But, as a man myself, I tend to prefer the way boys did it. And I’m sure most women could speak up and defend the girls’ method of settling disputes. But I’m not sure that refraining from socking someone on the jaw is the least hurtful way to handle your differences.

1. To borrow a phrase from Frank Herbert’s excellent novel The Dosadi Experiment.

2. In contrast, her husband and the business partner both could issue a rebuke to these journeyman printers with a wry comment, a wink, or a joke to take away the sting. They might speak of how the work could have been done better, rather than belabor how stupid the man was for botching it in the first place. These men all had the rough-and-tumble of the playground in their past and could now put personal antagonism behind them.

3. And I am not talking here about women “knowing their place,” “keeping their mouths shut,” or any other misogynist reference. The issue is that, in a disagreement or dispute, some women would occasionally strike for the jugular, saying things with maximum hurtful effect, and being totally oblivious to the other person’s essential vanity in a way that men learned early on the playground to avoid doing. These women had never experienced the unpredictable volatility of human nature and had never been confronted with a true fight-or-back-down situation.

4. The most memorable contrast came when I worked at the biotech company. I was a technical writer attached to the Consumables Development and Manufacturing (CDM) group, which produced chemical reagents that ran on the company’s genetic instruments. This was a unit consisting mostly of biologists and chemists, careers where women and men were about equally represented. We had very little daily contact with the company’s engineers who designed and built the instruments. In CDM, men and women sitting around a table might disagree by saying things like, “Well, yes, but don’t you think …?” On one project, however—preparing an established instrument package to meet FDA requirements—a team from CDM had to hold regular meetings with a team from the Instruments group, almost all of whom were males. We quickly learned that, in Instruments, people disagreed by catcalling and throwing rolls of tape at each other. Yes, they also winked and laughed with each other … but they threw things!

5. There was one other man, who unlike me had a Human Resources position alongside all these women, but he mostly kept his head down and avoided large meetings.

Sunday, January 10, 2016

My One Rule

In all the ideological confusion that seems to swirl around this country and through our two great political parties these days, I have come to formulate one solid and inviolable rule for myself: I never support, caucus with, or vote for people who want to see me and mine either dead or disadvantaged. This is as much a matter of principle as it is a survival tactic. As I pass through middle age on the way to elder status and then to my eventual rendezvous with and ride on a beam of light, I have come to terms with who and what I am, what I have achieved, and whom I will love and cherish. I have no regrets—but that’s another rule, up for discussion at some other time.

The persona which I place at the center of this One Rule has several dimensions: male, American citizen, inheritor of the Western tradition, and member of the Caucasian race. Let’s take those in inverse order.

First, my race. As someone who worked for fifteen years in the biological sciences, ten of those years concerned with genetic analysis, race doesn’t mean much to me. Genes get traded back and forth pretty freely on this planet, and any one of us humans has fractionally more in common, genetically, with the average chimpanzee than we share, on a SNP level,1 with any other human being—the diversity in our genomes is that great. But only a small percentage of those genes contribute to a person’s visible phenotype, the markers by which one person recognizes the race of another: hair type and color, skin color, certain facial and body features. And none of those phenotypes is immediately important to function or survival. Much, much more of our genetic variation has to do with physical and mental strengths and weaknesses that are smeared across the entire species and found in all races. These are things like immune resistance, ability to process certain chemicals inside our cleansing organs, susceptibility to microbial and auto-immune diseases, et leaping and bounding cetera. And most genetic variations do not confer any absolute benefit or disadvantage but simply establish a predilection that works in concert with age, accident, and environmental factors.

My bottom line is that we are all one species, one humanity, and race doesn’t confer any special status for either good or ill.

But if someone looks at me and thinks “white,”2 and then assigns to that designation a legacy of past misdeeds and takings of advantage—summed up in the phrase “white privilege”—then that’s a person with whom I won’t be caucusing. Sure, I have had advantages in my life. My parents were established, professional people years before they began rearing sons. I can reliably trace the family back through four generations on either side, and all of them were people who believed in educating themselves, working towards definite goals, building for the future, contributing to their community, and raising children with these values. Over the generations, that kind of lifestyle tends to build wealth—not just through inheritance of land and cash,3 but through aspects of character, attitudes, and ideals that give an individual advantage in any situation or society.

I cannot give up this sense of myself or these attitudes and ideals—nor the knowledge, skills, and preferences they have enabled me to acquire over a lifetime—to the benefit of any other person on Earth. Disabling myself won’t make anyone else whole.

Second, my inheritance of the Western tradition. This is not the same thing as being a member of the Caucasian race, because Western ideas and values are open to anyone who will subscribe to them. Yes, many valuable cultures exist on Earth, both now and in the past. But the culture that we trace back to the Hebrew monotheists, the Greek philosophers and playwrights, the Roman lawgivers and builders, and the seeds these cultures scattered again and again throughout Europe and then, by adventurous expansion, to much of the rest of the world—that is a unique heritage. The West was the first culture to examine the life and needs of the common man against the prerogatives of nobles, chiefs, and princes. The West created and expanded ideas about the rule of law rather than the whim of tyrants, the supremacy of justice over might, and the preference for rights rather than privileges. The West created the principles of open markets, prescriptive economics, and private insurance against foreseeable accidents. The West established standards for some of the greatest and most expressive art, music, architecture, and infrastructure on Earth. The West took the advanced mathematics of the Arabs and extended it with the principles of calculus, the practice of double-entry bookkeeping, the application of mathematics to ideas about physics and chemistry, and the development of the scientific method, which guides and extends rational inquiry beyond the limits of fanciful belief. By unlocking the talents of any or all members of society through the mechanisms of trade, economics, and investment, rather than relying solely on the talents of those in a leisured noble class, and by offering everyone potential advancement into a propertied middle class, the West increased the effective use of human creativity and ingenuity.

The world is a different place through the spread of these Western ideals, principles, and technologies. We live longer, more interesting, healthier lives than our ancestors of just a few generations ago. Our children will do better.

Third, my status as an American. I take pride that this country was the first nation founded on rational principles, argued over and examined by its citizens after the War of Independence, ratified in its Constitution, and still holding strong and solid after two centuries. This is the first country that has no real “national identity” in the same way that Swedes, Japanese, or Frenchmen can distinguish people native to their national territory from mere passersby and immigrants. The first colonies were a mixture of British, Dutch, and French settlers, and though the British eventually dominated the early culture, we have happily collected immigrants and their cultures ever since to make something unique in the world. With our founding in the Western tradition, as above, with its ideals of tolerance, equality, and justice, we have remained open to practically all comers. We learned early the secret of what biologists call “hybrid vigor”: that we’re stronger as a mixture, a fortuitous amalgamation, than as a hothouse flower of cultural and genetic purity. Our language, too, reflects the ability of English to absorb, change, and grow. The American spirit, not being dependent upon any one viewpoint, cultural tradition, or ethnic makeup, will endure for centuries where more rigid, protective, class- and caste-ridden societies will crumble and fade.

Various groups and ideologies might want to close the American mind and divert the American spirit, but this sense of freedom amid self-confidence will not be easily set aside.

The United States has invented many new things. One of the best was our ability, after the tragedy of two world wars, to make allies of our former enemies in Germany, Italy, and Japan. And we have done this before, after earlier wars with the Spanish, the Mexicans, and the British. Being of no fixed ethnic background ourselves, we are a forgiving people. The one blight on this record would be what the early settlers in their westward advancement across the continent did to the native people they encountered. But really, it was a mismatch of astounding proportions. On the one side was a thin population of Stone Age people, most of whom lacked the rudimentary technologies of agriculture and animal husbandry, concepts of mathematics or economics, and even simple machines like the wheel and the lever. They claimed a vast territory because their hunter-gatherer lifestyle required a great deal of land to support them. Facing these indigenous peoples was a post-Enlightenment, newly industrialized population, inheritors of five thousand years of agriculture, advancing technology, and complex civilization. The Native Americans’ stand against this wave was both valiant and foolish. No one at the time—and no one seriously since then—would propose that a growing population should have remained bottled up on the eastern seaboard simply out of respect for the property rights of people whose culture rejected the notion of property. The result was a foregone conclusion by any rational assessment.

Fourth and finally, my status as a male. Currently popular attitudes tend to view males as superfluous people, perhaps best described as “sperm donors.” But males, when properly brought up, are necessary members of society. Men raised in the Western tradition are the inheritors of concepts of personal honor that go back to Roman-inspired medieval codes of conduct: speak the truth, fight fairly, deal plainly, take responsibility for your actions, protect women and children, defend the weak against predation, be strong in adversity, plan ahead against future uncertainty, and prepare to sacrifice yourself for the good of family and country. Although we now live in a mostly civilized, highly structured, and rule-protected world, pockets of savagery still exist because human nature is dominated by fight-or-flight reflexes. Women may be able to take care of themselves in certain circumstances and with special training, but in biological terms their role is to make themselves vulnerable to attack and adversity while bearing and raising the next generation. In biological terms, it is the man’s role to sustain and defend women in their intergenerational role. You can argue against biology, and individuals can always self-select to go against biology and take up any roles they want.4 But rightly or wrongly, these roles and responsibilities are ingrained in the human condition.

I have always identified with, been ready to accept, and followed the male role. Even if others may now dispute and disdain it, the fact is that society has been structured around these biological roles since before human beings settled down to practice agriculture, animal husbandry, and civilization. A few decades of popular theorizing to the contrary will not wipe out these imperatives.

So this is where I stand: a white, male, American inheritor of the Western tradition. Is this a perfect persona with a perfect history? Of course not, because every group and tradition is made up of and carried forward by imperfect human beings. But it’s a pretty good tradition and I have tried to be a good person within it. I make no apologies, because this is not a perfect world. … But it’s a pretty good one.

1. A SNP, or single-nucleotide polymorphism, is a one-letter change in the coding for any gene. Given that it takes three nucleotides—represented by the letters A, C, G, or T—to code for any of the amino acids going into the manufacture of a protein, and that with 64 possible combinations (four nucleotides times three positions in the codon’s reading frame) being used to call out just 20 different amino acids, there’s a lot of room for typos that end up having no immediate effect on the protein’s structure.
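The codon arithmetic in this note is easy to check in a few lines of Python. The sketch below counts the 64 possible codons and uses one real family from the standard genetic code—the six codons that all spell leucine—to show why many single-letter typos are “silent”; the helper function and its names are illustrative, not part of any standard library:

```python
from itertools import product

NUCLEOTIDES = "ACGT"

# Three positions, four letters each: 4^3 = 64 possible codons,
# yet they need to specify only 20 amino acids (plus stop signals).
codons = ["".join(triplet) for triplet in product(NUCLEOTIDES, repeat=3)]
assert len(codons) == 64

# A real entry from the standard genetic code: leucine alone is
# spelled by six different codons (DNA coding-strand notation).
LEUCINE_CODONS = {"TTA", "TTG", "CTT", "CTC", "CTA", "CTG"}

def is_synonymous(codon_a, codon_b, family=LEUCINE_CODONS):
    """True if both codons encode the same amino acid (here, leucine)."""
    return codon_a in family and codon_b in family

# A SNP that turns CTT into CTC still yields leucine—no change
# in the protein, which is the footnote's point about harmless typos.
print(is_synonymous("CTT", "CTC"))  # → True
```

With 64 codes for roughly 20 meanings, the redundancy the footnote describes falls straight out of the counting.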

2. I took the Genographic Project genetics tests a few years ago. My Y chromosome haplotype was as expected, centering geographically in northern France and southern England. But my mitochondrial genotype—the part inherited from my mother and her female line through the egg—is the X haplotype, which has an interesting split: about seven percent of native Europeans carry it, but also three percent of indigenous Americans, centering in the Algonquin peoples around the Great Lakes, exhibit this genetic marker. These are the native groups with which my Eastern Seaboard ancestors might conceivably have had sexual contact; so there may be a Native American great-great-grandmother in my genetic makeup. Still, I consider myself and present as Caucasian rather than any degree of Native American.

3. I didn’t inherit a lot of either land or cash from my father, who did well to keep us together through uncertain economic times. But no matter, he gave me a richer inheritance through his example, teaching, and discipline.

4. With the obvious exception that men can raise and nurture, but not bear, children.

Sunday, January 3, 2016

Is Science Perfect?

Science writer Matt Ridley posted a Quadrant Online article in June 2015 which examined the damage that the “climate wars” and the exaggerations and data manipulations of the global warming alarmists might be doing to the reputation of the scientific community in general. In fact, he believes science itself will eventually need “a reformation,” similar to the changes in church doctrine during the 15th and 16th centuries.1

I’m not sure I would agree with that. First, because climate science and its championing of human-caused global warming is only a fraction of the actual enterprise of science, which stretches across vast fields such as biology, geology, chemistry, physics, astronomy, and cosmology. Second, because the “climate wars” have been focused on a ding-dong battle2 between advocates and skeptics based on mathematical modeling and long-range prediction, rather than on actual discoveries and the experimentation to test and prove new principles.

The trouble with our current thinking about science—and one contribution to the vitriol driving the “climate wars”—is the notion that somehow science is a perfect process, a completely rational, supra-human activity that will in all cases yield something akin to “the truth.” Years of television and generations of actors in white coats, riding on the coattails of some pretty amazing discoveries in physics, chemistry, and biology over the last couple of centuries, have suggested to the lay public that science is somehow impervious to error or dispute. But science is still a human activity, and humans are flawed—wonderful, admirable, intelligent creatures, and the hottest thing going within a dozen light years on this end of a spiral arm of the Milky Way—but flawed. Still, we’re trying, aren’t we?

As Ridley’s article suggests, even the most respected scientists are capable of harboring pet theories, succumbing to confirmation bias, and pursuing political expediency—particularly where grant money is concerned. Not until supercomputing artificial intelligences undertake the business of science—which means observing phenomena, wondering about causes, developing hypotheses, devising experiments to test them, and awarding grants to perform the experiments—will we have anything like a perfectly rational, soulless pursuit of knowledge. And even then, I would ask who programmed the intelligent software and what errors it might be prone to make and replicate on its own.

Until then, we will have scientific inquiry that hangs upon a few notions that transcend reason, analysis, modeling, and mathematics to access the darker chambers of the human psyche.

One such hidden chamber spawns the persistent belief that humankind is destroying itself, its habitat, and this planet, and so our activities must be curtailed and our ranks reduced. This belief, which in most people sinks below the level of rational analysis, goes back much farther than anthropogenic global warming. It underlay the efforts of the Reverend Thomas Malthus, who tried to show mathematically that humans were doomed because population increases geometrically while agricultural land in production could only increase arithmetically. Or the Club of Rome, which believed that economic growth must inevitably be curbed by the scarcity of natural resources. It is a lingering sense of guilt about the footprints we leave in the sand.
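Malthus’s contrast between geometric and arithmetic growth can be sketched with toy numbers. The figures below are arbitrary, chosen only to show the shape of his argument—a doubling quantity overtakes a fixed-increment quantity no matter how generous the head start:

```python
# Hypothetical units, purely for illustration of Malthus's argument:
# population multiplies each step (geometric), while farmland only
# gains a fixed increment each step (arithmetic).
population = 1.0   # arbitrary starting units of population
farmland = 10.0    # arbitrary starting units of productive land

generations = 0
while population <= farmland:
    population *= 2     # geometric growth: multiply each generation
    farmland += 10.0    # arithmetic growth: add a fixed amount
    generations += 1

# Despite farmland's tenfold head start, doubling wins quickly.
print(generations)  # → 7
```

Whether Malthus’s premises hold in practice is another matter—his schedule for farmland famously failed to anticipate industrial agriculture—but the mathematics of the two growth curves is not in doubt.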

I would trace this sense of doom back to the creation story in Genesis. Humans once lived in a perfect garden, in balance with nature, in a kind of enduring stasis. We supposedly changed that when we “ate the fruit of the Tree of Knowledge” and came to know, and became able to choose between, good and evil. Then we ceased to be children—or advanced monkeys—and became full human beings responsible for our destiny and for all of our supposed woes. We believe this creation story not just because it’s written in a sacred book, but because we humans have a bit of race memory—similar to our stories of a past Great Flood3—that goes back to a hundred thousand years or more of hunter-gatherer existence. With our myths and legends, we can almost dimly remember a long summer afternoon when game was plentiful and easy to kill, when berries and fruits were bountiful and easy to find. And in these stories we tend to forget the wretched autumns of barren fields and forests, or the winters of freezing snow, gnawing hunger, and death.

Scientists—who are today the most fearless eaters from the Tree of Knowledge—are still susceptible to the human fear that we can know too much and are too immature to deal with the consequences of our knowledge. Or they may want to know in the abstract—through thought experiments, mathematical equations, and computerized modeling—but they oppose a thoroughgoing application of such knowledge. In this they may be guided by recent examples from the weaponization of atomic, chemical, and biological principles, or the adoption of new technologies without a commensurate analysis of their social and environmental effects.4

In a similar way, the human psyche has also been plagued by a belief in Utopia—literally, in Thomas More’s writing, “No Place.” Somehow, if we could only find the right principles and then all agree to work together as brothers, we could create a perfect society. We could arrive in a world where all human needs are met; where all human desires are either fulfilled or shown to be the product of bad thoughts and evil cravings; where no one is exalted at the expense of another; where money, possessions, and distinctions between individuals melt away; where harmony, justice, and equality are the rule rather than the exception.

I would trace this sense of human and societal possibility in either or both of two directions. One direction takes us backward to the natural state of the family and of any close-knit, tribal grouping. People in closed societies, where every member is known to the others and people have long since accounted for everyone else’s habits and foibles, can live in some kind—although often an imperfect kind—of harmony, sharing, and equality. The other direction looks forward and follows the slow but steady improvement in human civilization and in our understanding of law, economics, nature, and psychology, which started sometime before the Renaissance, continued through the Enlightenment, the Age of Science and Reason, the Industrial Revolution, the Cybernetic Revolution—and seems likely to continue from now until we reach utopia or endure some kind of atomic or climatic Armageddon. This view says we have come so far and are improving so rapidly that, really, we should be achieving social perfection any day now. And so, with just a little more effort and agreement, we can force humanity’s arrival at an endless afternoon of peace, love, and brotherhood.

This atavism affects social and economic scientists more than physicists and chemists. The 19th century’s Marxists envisioned a stateless social existence of perfect sharing, akin to the feudal village without the feudal lord. The 20th century’s socialists believed that if they could just remove the profit motive from industrial operations by having a benevolent government take over the means of production, then we would have riches for all. The modern Progressives believe that by neutralizing the wealth at the top of society—the demonized “1%”—they can raise the living standards of everyone else.

But these doomsday phobias and utopian plans all ignore the true nature of humanity. We are pretty smart and independent, compared to our apelike ancestors, and we resent being told by others how to live our lives, make choices, and prepare for the future. As a species, we may be remarkably cooperative,5 but we are still self-interested individuals and not perfectly sharing creatures like the members of a hive society. Some people and some cultures are more community-minded than others, to be sure, but none is made up of selfless drones, because self- and family preservation is a human survival instinct. Any attempts to change human nature—whether by genetics, educational and emotional conditioning, legal sanctions, or surgical intervention—will fail.

These phobias and plans also ignore the true nature of reality. Everything in the universe is in flux. From the internal dynamics of galaxies and stars to the surface conditions on every moon and planet, change is the nature of all things. Whether it’s the dissipation of physical order and energy through entropy, or its curious reversal in the case of living organisms and their evolution, everything changes. There is no stasis. There are no perfect states.

Humans aren’t perfectible. We’re malleable, but also contrary. We’re far too inventive, restless, wondering, and surly to remain frozen inside anyone’s idea of a perfect state for long. We will always skate toward the brink of destruction, but then human sense and survival instincts will pull us back. That’s what has kept us alive for a million years or so in a hostile world with a variable climate and our frail bodies without the benefit of superior muscles, claws, fangs, or stingers. We won’t be put inside a box. And we don’t go under quietly—or not all of us, and not all at once.

Our thinking processes and our current state of knowledge reflect this restlessness. We are sea creatures swimming in a universe of advancing and retreating ideas and notions. And that’s what keeps life interesting.

1. Ridley has described himself as a “lukewarmist”—a position I pretty much hold. As described in the article, he agrees that global climate does change; that humankind’s carbon burning probably has made some contribution to this ongoing climate process, but is not responsible for a preponderance of the effects; and that, while human beings may eventually have to make some adjustments and adaptations in their lifestyles and economic activities, the amount of change over human life-spans and political time frames will probably not be particularly dangerous or devastating.

2. To borrow a phrase from Frank Herbert’s Dune.

3. Every culture seems to have its own flood story, from Noah and the Flood in the Bible, to Deucalion’s Flood in Greek and Roman mythology, to the Great Flood of Gun-Yu in Chinese legend. Certainly, anyone within storytelling distance of the seacoast 11,000 years ago saw or heard about the waters advancing as the last Ice Age melted. And a garbled version of the event has come down to us today.

4. Really, who here thinks we could get a horseless carriage powered by gasoline-fueled internal combustion—either as mode of transportation or as shaper of urban landscapes—through an EPA review today? And the people who are betting on driverless cars haven’t yet tried to get one licensed for the public highways—or past an insurance company’s review process.

5. As someone once noted—and don’t ask me for a reference—imagine putting 130 breeding-age members of any other primate species inside an aluminum tube and asking them to sit quietly and cooperatively for four or five hours while you fly them to 35,000 feet—with concurrent changes in vibration, noise, humidity, and air pressure—to move them across a continent. With any other species, chaos, furniture smashing, and feces throwing would ensue.