Sunday, December 17, 2017

Dealing with Death

2001 Hotel Room

For those of you who don’t like personal stuff in blogs,1 you can stop reading right now and come back next year. If you can stand the personal view, then read on.

It’s been more than three months since my wife Irene died in early September. By now, I have gone through all her desk drawers, sorted years of old financial records from her photographs and memorabilia, and hauled the resulting boxes of wastepaper to the shredding service. I have also gone through her armoire and her various closets, separating slacks and tops from sweaters and jackets, bundling underwear and socks, and taking piles of clothing to Goodwill Industries. And, as of this writing, my co-trustee and I are in the final stages of closing her bank and credit card accounts, making the bequests in her will, and transferring her retirement accounts according to her instructions.

All of this is part of putting away a life, as much as taking her ashes to the gravesite and holding a remembrance in her honor among family members and friends.

What struck me is how long all this took.2 We all know that wills take a while to process, even if the deceased had a living trust, designated clear beneficiaries for each account, and the estate did not go to probate. Still, the companies that offer investment vehicles such as individual retirement accounts and annuities can take weeks to complete the paperwork that transfers ownership and moves the assets into a trust account. But all that’s just on the legal and financial side.

Part of the delay in wrapping up the rest of her life was my own sense of emotional inertia. I managed to go through the drawers of old bank statements and receipts in the first couple of weeks after Irene died. Partly, this was because it was easier to deal with these paper records, which have an impersonal and antiseptic flavor to them. Partly, it was because I needed to find documents like the deed to the condo, the title to her car, and any surprises like unacknowledged debts and obligations, before the legal part could proceed. Irene managed our household finances as part of the arrangements governing our marriage. We always kept separate bank accounts and credit cards, because Irene didn’t really trust me with money.3 So going through her accounts was a voyage into terra relatively incognita, territory once jealously defended.

I found it harder to go through her closets and prepare her clothes for donation. Partly, this was a lack of urgency, because those clothes weren’t going anyplace soon and I didn’t need the extra closet space. Partly, it was the personal nature of the exercise. Somehow, sorting her entire wardrobe for disposal brought home, as neither sitting in the lawyer’s office nor closing her bank accounts could, that Irene was really gone and wasn’t coming back. Not at all. Not ever.

As with the paperwork, Irene’s closets were also an unknown area in our lives.4 My wife—like all women, I think—liked to keep her wardrobe space private. Even though we dressed and undressed in the same bedroom, she wanted to appear to the world—and that included me—as a completed presentation. Her decisions about the components that went into the look of the moment—this top with those slacks, this sweater or jacket—were supposed to remain a relative mystery. She never asked for my approval of any piece of apparel. And Irene did the laundry anyway, so she could keep her clothing arrangements entirely separate from mine.

It’s not like I did no chores around the house. She managed the laundry as a preferred activity to vacuuming and dusting, because she hated the noise, the smell, and the routine dirt of those two chores. I didn’t mind cleaning the house, but I didn’t care for the dampness, fumes, and critical timing of running loads through the washers and dryers down in the laundry room. So, for forty years, we traded off these tasks. And then, I could cook but not as well as Irene, and she was more adventurous with recipes—my meals being a bachelor’s basic fare of homemade chili, hamburgers and hotdogs, pizza from scratch, and other fast-food staples. Oh, and bread, because my mother insisted her boys know how to bake. After Irene retired, she took over most of the cooking and mostly had fun with it. And because she had the car and reserved rights to the condo’s one assigned parking space, while I rode a motorcycle as my basic transportation, she did most of the shopping, which became part of her keeping the household accounts. And finally, we traded off walking the dog four times a day, because when you live in a twelfth-floor apartment, you can’t just open the back door and let the animal run out into the yard. Of such long-standing agreements is a marriage made.

Now that she’s gone—in fact, from the first week after she died—I inherited all of these chores for myself. And it’s surprising how small a deal they really are. I reverted to my bachelor routines of forty years ago for assessing and doing the laundry and shopping for my groceries. As with the cleaning, I organized these tasks into basic routines, put them on a schedule, and started executing them efficiently.

They say that when you start drinking again after a long period of abstinence, you don’t start over again with fresh tolerances and work your way back up to heavy indulgence. Even after years, the body remembers exactly where you left off. Stay sober for decades,5 and within a week of taking that first drink again, you’ll be downing a bottle of wine or a couple of six-packs a night. So it is with the routines and skills you develop early in bachelor life. Irene had little quirks about performing her chores: how to fold the towels, which items to let air-dry, how to load the dishwasher—even after I had already put my dishes in—and what pans had to be scrubbed by hand. Within two weeks after her death, the linen closet and the dishwasher looked like my bachelor days. Not less neat, and certainly not less clean, but just different. And I’ve also started modestly rearranging the furniture according to my own ideas, rather than the placements we could agree upon—or fail to reconcile—together.

But forty years of living with one person leaves a mark—no, the years wear deep grooves in your psyche. I may be adapting now to the old patterns of taking care of myself. But sometimes I hear a rattle in the hallway. I know it’s the wind, but I think it’s Irene with her keys. The dog hears it, too, and her head comes up and her ears go erect. I hear someone say, “Tom …” in a crowd, and it’s in Irene’s tone of voice. I know she won’t be coming back—I’ve done the necessary sorting and packing away—but she is still there at the edge of my mind.

The condo, which was enough for two people, now feels too big and empty.6 I feel like David Bowman from 2001: A Space Odyssey, alone at the end of his voyage in an absurd French-Empire hotel room, listening for someone to come, for something to happen. But I don’t exactly listen, because I know I’m alone. Yes, I have friends—most of them associations that Irene and I made together—and family members in the area. But the days are long. And thank heavens for the dog as a companion to give the apartment some life. Thank Irene, actually, because this particular dog was her choice—it was her turn to pick—at the animal rescue shelter.

But now I’m caught at the end of my life, too old to start over. I’m wondering what, if anything, comes next.

1. Of course, my blogs are all personal, but some are on a higher intellectual and emotional plane than this.

2. In the recent movie Arrival, the heroine asks about one of the aliens who has gone missing after a bomb attack. Her interlocutor replies, “Abbott is death process.” I now understand that phrase, “death process,” all too well.

3. In every relationship, one person is always the Grasshopper from Aesop’s fable, while the other is the Ant. Guess who was the Grasshopper? In the same way, in every relationship, one partner is always Oscar from The Odd Couple, while the other is Felix. Guess who was Oscar?

4. One thing I learned from handling every piece of Irene’s clothing is how lightweight and insubstantial women’s clothes really are. Men’s slacks are made of heavier material and then thickly sewn with substantial waistbands, belt loops, gussets, and lots of pockets. Men’s shirts usually have reinforced collars, shoulder yokes, button plackets, and one or two chest pockets. Women’s slacks and blouses are thin material with hidden seams and virtually no pockets. Even items that would seem to be common to both, like polo shirts, are made of a different weight of yarn or a lighter weave. I guess this is because women’s clothing is supposed to drape and cling, while men’s clothing is meant to be essentially shapeless.

5. I now have thirty-two years of sobriety, and forty-four years without tobacco, both after going cold turkey for the sake of my health. Not that I’m thinking about starting up either vice again.

6. Friends keep asking if I plan to stay there or move. This was the home Irene and I moved into when we first got married, and we never left. Partly, that was because we didn’t want to be house-rich and cash-poor at Bay Area real-estate prices. Partly, it was because we never could agree on any other house we saw or any move we might make in any direction. But the condo will always be Irene’s-and-mine and not mine-alone. I think this is what my friends sense.

Sunday, December 10, 2017

Learning as a Form of Evolution

Neuron cells

I’ve been making some existential comparisons lately—Life Like a Sword and Language as a Map—so I thought I would round out the sequence of metaphors by looking at the way we form our knowledge.

The popular conception is that we acquire new knowledge the way a coin or stamp collector makes a new acquisition: pick up the fact, place it in our memory box, recall and retrieve as necessary. Our head then becomes a database of acquired facts, like a phone contact list or a Rolodex of business cards. Take one and slot it in. Modify it with an overlying note if any of the information happens to change. And remove the card—that is, commit ourselves to forgetting the subject heading and its content—when the information is shown to be wrong or is no longer of use.

But is that really how it works, all neat and tidy, like filing little pasteboard squares?

Actually, our brains are representational devices. We interpret our sensory input and apply it to the process of assembling a model or representation of what we know and think about the world outside our heads. We are model makers, map makers, myth makers, and story tellers. What we learn goes into a vast web or network or congeries of impressions, summaries, conclusions, and projections that collectively represent the world as we know it. We are constantly testing and expanding our worldview. We are always asking, consciously or not, “Is this the way the world really works?”

We are constantly—although perhaps unconsciously—looking for comparisons with and similarities to the things we already know. When we get a new fact or form a new impression, we test it against our worldview, the structure of our model of the world. We ask, “How does this fit in?” And if the fact or impression conflicts with what we know, our brain goes through a small crisis, a scramble for immediate understanding. We test the new knowledge against its background: “Where did I get that idea?” “What was its source?” and “Do I trust it?” We also experience a small—or sometimes large—tremor in our worldview: “Why do I actually think that?” “Could I have been wrong?” and “Is this new knowledge a better way of seeing things?”

The habit of referring back to our internal model runs deep. For example, when learning a new language, such as French from the perspective of an English speaker, we leverage the grammar and the words we already know and understand. When we learn a new French word like chien, we don’t immediately associate it with a four-footed pet of a certain size range, disposition, coloring, and similar physical details. Instead, we link it to the English word dog and then concatenate onto chien all the past impressions, learned attributes, and personal feelings we already associate with the concept in English. In the same way, we adapt French grammar and syntax to our known English way of speaking, and then we extend our knowledge with new concepts, like nouns for objects we normally think of as inanimate and sexless now acquiring a specific grammatical gender. By learning a new language, we expand our general knowledge of both our own language and its place in the way the rest of the world communicates.

In this sense, each piece of new knowledge—both the facts, impressions, and understandings that we acquire by the happenstance of general reading and daily experience, and those we acquire by conscious study such as a new language, or the history of an unfamiliar place and people, or a closed curriculum like mathematics, physics, and chemistry—each discovery is making a series of minute changes in the brain’s internal environment. And the effect that these new facts and impressions have on our existing ideas—the current model or myth that is running in our heads—is like an organism’s response to accidental modification of a protein-coding gene: the new knowledge and the resulting change in our worldview either enable us to live more fully, completely, successfully, and confidently in the environment that we actually inhabit, or the changed worldview contributes to our failure to compete and thrive by causing us to interpret wrongly, make mistakes, and suffer feelings of doubt, denial, and depression.

But some facts or interpretations—perhaps most of them—don’t cause an immediate change in our relationship with the outside world. We can carry a bit of false data, a misremembered fact, or an untested impression in our heads for months or years at a time without it affecting our personal relationships, our social standing, or the decisions we make. And then, one day, we will learn something else that will contradict the comfortable model and bring on the crisis. In the same way, some mutations to a gene have neither a helpful nor a harmful effect in the current environment. The modified gene and the changed protein it makes get passed down from generation to generation without challenging the fit of the organism to its environment. But then the environment changes, and either the organism proves better able to compete under the new conditions, or the changed environment shows up an inherent weakness; the organism thrives or dies accordingly. Sometimes the environment doesn’t have to change, but another mutation enhances the effect of that earlier genetic change, and the organism either excels against other members of its species or fails to compete.

As an example of the mutability of our worldview, both as individuals and as a collection of academics building a body of scientific or historical interpretations, consider the advance of human knowledge in the field of genetics.

At first, back in the early 1950s and the world of Watson and Crick, we valued the newly discovered DNA molecule and its messenger RNA strands solely for the proteins they made inside the cell body. Genetic scientists held to what was then called the “central dogma” of molecular biology: that DNA transcribes to RNA, which translates to proteins. Geneticists could point to the start and stop codes associated with the protein-coding genes. By finding and fishing out these codes, they could pull out sequences of DNA, copy them over to RNA, and read off the resulting three-base codons, each calling for one of the twenty possible amino acids in the developing protein string. These twenty amino acids are the universal building blocks for all of an organism’s complex proteins—in fact, for all life on Earth.
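For readers who like to see the mechanics, here is a minimal sketch of that lookup process in Python. Everything in it is illustrative: the codon table is abridged to a handful of entries (the full standard genetic code has sixty-four), and the DNA string is an invented toy example.

```python
# A toy illustration of the "central dogma": DNA transcribes to
# messenger RNA, and the RNA is read three bases (one codon) at a
# time, each codon calling for one of the twenty amino acids or a
# stop signal. Abridged table for illustration only.
CODON_TABLE = {
    "AUG": "Met",  # the start code, which also calls for methionine
    "UUU": "Phe", "UUC": "Phe",
    "GGC": "Gly", "GCU": "Ala", "UGG": "Trp",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def transcribe(dna: str) -> str:
    """Copy the DNA coding strand into messenger RNA (T becomes U)."""
    return dna.upper().replace("T", "U")

def translate(mrna: str) -> list[str]:
    """Read codons from the start code until a stop code appears."""
    start = mrna.find("AUG")          # assumes a start code is present
    chain = []
    for i in range(start, len(mrna) - 2, 3):
        amino = CODON_TABLE.get(mrna[i:i + 3], "?")
        if amino == "STOP":
            break
        chain.append(amino)
    return chain

dna = "TTAATGTTTGGCGCTTGGTAA"         # an invented snippet of DNA
print(translate(transcribe(dna)))     # ['Met', 'Phe', 'Gly', 'Ala', 'Trp']
```

The point is only the mechanics: find the start code, read three bases at a time, and stop at a stop code.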

This central dogma held until about the year 2000, when the Human Genome Project and Celera Genomics published draft sequences of the entire three billion base pairs in the twenty-three pairs of human chromosomes. Analyzing the code, geneticists then discovered that only about two percent of this DNA was actually used for making proteins.1 So what was the other ninety-eight percent doing? Many scientists figured that this genetic material was “junk DNA,” old code left over from our prior evolution, from genes that coded for proteins that our evolutionary ancestors might have needed as fish or reptiles, but with no meaning now and so abandoned to gradually mutate into genetic mush.2

The new facts about the scarcity of protein-coding genes forced a reevaluation—a modification of the scientists’ mental model—of the nature of the genome. The scientific community was left with either the “junk” hypothesis or a state of simple wonder until about 2004, when a new bit of knowledge emerged. Botanists working with certain flowers had discovered that a short strand of a particular RNA, when introduced into a seed, can change the color of the flower. They hypothesized that the RNA either promoted a gene that had previously been silent or blocked a gene that had previously been expressed. They dubbed this effect “RNA interference,” or RNAi.

Soon, the genetic scientists were studying a class of short RNA strands, only about twenty-two bases long, that they called “microRNAs,” or miRNA. They began to see that these bits of RNA were used inside the cell nucleus to promote genes in different patterns of expression. And then Eric Davidson at Caltech, working with sea urchin embryos, mapped out the network of genes in an undifferentiated embryonic cell that produced bits of microRNA to promote other genes to make different miRNAs—all without coding for any proteins. Depending on a cell’s position in the sphere of identical embryonic cells that develops shortly after fertilization, the pathway through this miRNA network changes. Some of these cells, through the proteins they eventually produce, become the internal gut, some the epidermal surface, and some the spines. By tracing out a similar network in another organism far removed from sea urchins, the Davidson laboratory could show that it operates in most animals and plants, and likely in humans today. This miRNA network is the timing and assembly manual by which some embryonic cells in our bodies become liver cells, some brain cells, and some bone cells.
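A toy model may help picture what such a network does. The sketch below is not Davidson’s actual map. The gene names and wiring are my inventions, and the real network involves hundreds of genes, but it shows how a cell’s starting signal can steer it down one developmental pathway or another.

```python
# A made-up Boolean gene-regulatory cascade: each active gene switches
# on its target genes, and the cell's starting signal (its position in
# the embryo) determines which protein-coding genes finally fire.
NETWORK = {
    "signal_inside":  ["gut_gene_A"],
    "signal_surface": ["skin_gene_A"],
    "gut_gene_A":  ["gut_gene_B"],
    "gut_gene_B":  ["make_gut_proteins"],
    "skin_gene_A": ["skin_gene_B"],
    "skin_gene_B": ["make_skin_proteins"],
}

def run(initial_signals: set) -> set:
    """Propagate activation through the network until nothing changes."""
    active = set(initial_signals)
    changed = True
    while changed:
        changed = False
        for gene in list(active):
            for target in NETWORK.get(gene, []):
                if target not in active:
                    active.add(target)
                    changed = True
    return active

print(run({"signal_inside"}))   # ends by activating make_gut_proteins
print(run({"signal_surface"}))  # ends by activating make_skin_proteins
```

Two cells running identical “code” arrive at different fates purely because they started with different signals, which is the essence of the assembly-manual idea.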

This discovery addressed a question that apparently no one had ever considered. If the entire genome is for producing proteins, then why doesn’t every cell in the human body make all the proteins required by all of the other cells? Why don’t neurons pump out liver enzymes, and why don’t bone cells create and then, presumably, ignore neurotransmitters? Davidson’s work suggested that, while the small protein-coding fraction of the human genome functions as the parts list of the human body, the rest is the sequential assembly manual.

But the story didn’t end there. Other geneticists noted that simple chemical compounds called methyl groups (CH3) often became attached to the promoter regions of genes—usually at sites where a cytosine base is followed by a guanine—and inhibited the gene’s expression. They at first considered this an environmental accident, randomly closing off gene function. But they also noted that an enzyme in the nucleus called “methyltransferase” worked to add these methyl groups to newly replicated DNA strands during cell division. If methylation was an accident, why was there a mechanism to preserve it in daughter cells?

From this question, the scientific community began studying methyl groups attached to DNA and learned that this was the cell’s way of ensuring that brain cells didn’t begin producing liver enzymes and bone cells didn’t start making neurotransmitters. Once a cell had differentiated to become a certain type of tissue, methylation locked out its other possibilities.3
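Here, too, a small sketch can make the bookkeeping concrete. The cell type, the gene names, and the blunt “methylated means silenced” rule below are simplifications of mine, but they show why copying the methylation marks at each division keeps a cell line locked into its tissue type.

```python
from dataclasses import dataclass, field

@dataclass
class Cell:
    cell_type: str
    methylated: set = field(default_factory=set)  # silenced genes

    def divide(self) -> "Cell":
        # Methyltransferase re-applies the parent's methylation marks to
        # the newly replicated DNA, so daughters inherit the same locks.
        return Cell(self.cell_type, set(self.methylated))

    def can_express(self, gene: str) -> bool:
        return gene not in self.methylated

neuron = Cell("neuron", methylated={"liver_enzyme", "bone_collagen"})
daughter = neuron.divide()
print(daughter.can_express("neurotransmitter"))  # True
print(daughter.can_express("liver_enzyme"))      # False: locked out
```

Without the re-copying step in divide(), every division would wipe the locks and daughter cells could wander back toward an undifferentiated state.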

So the community of molecular biologists had to work gradually, discovery by discovery, to develop and refine their model of human genetics: from the central dogma of protein production being the only purpose of all DNA, to a whole new use of RNA to differentiate cell types, to the inclusion of “accidental” methyl groups to lock in that differentiation.

Every science goes through such an evolution and refinement of knowledge, discarding old ideas, patching in new discoveries, building, tearing down, and rebuilding the model, each time coming closer to what’s really going on in the world. In the same way, every human being learns certain skills and truths, discards old notions, patches in new understandings, building and tearing down his or her worldview, until the person attains something approaching … wisdom.

1. This did not leave them with too few genes to account for all of the body’s proteins, because they also discovered that many genes have alternate splicings. The scientists already knew that some gene sequences had “exons,” or patterns that expressed the code for the protein, interspersed with “introns,” or non-coding intrusions into that pattern. What they learned from the human genome was that the cell could combine those exons in different ways to create variations in a family of proteins. Nature is more wonderful than we can imagine.

2. Not everyone agreed with this assessment. The human body spends too much time and energy replicating the entire genome each time a cell divides for us to be carting around this much junk. After all, the phosphate bonds (PO4) that are the backbone of each paired strand of DNA are also the working part of the cell’s energy molecule, adenosine triphosphate. And phosphorus is not so common, either in nature or in the body, that we can afford to hoard it and squander its potential with junk DNA.

3. Methylation also would explain why the early methods of reverting a body cell to a type of embryonic cell, by starving it until the cell almost dies, worked so poorly. This was how the scientists in Scotland cloned Dolly the Sheep, and in order to achieve the one viable Dolly, they had to sacrifice hundreds of attempts at cloned embryos and raise not a few genetic monsters. The starvation method must have essentially stripped out the methylation as the cell approached its death, reverting the genome to its undifferentiated state.

Sunday, December 3, 2017

The Trials of Vishnu

Tower of Pisa

In the Hindu pantheon, three major gods form a trinity:1 Brahma the Creator, Shiva the Destroyer, and Vishnu the Preserver. Although I constantly try to emulate Brahma in my writing, creating imaginary worlds, people, and situations as if they were real, I find that in everyday life my patron deity is actually Vishnu.

I tend to hold on to things, sometimes cherishing them for sentimental value and sometimes simply because they might, one day, under the right circumstances, become useful again. For example, I have an old and now somewhat tattered bath mat that my paternal grandmother once crocheted. That’s from sentiment. I have in storage books that I decided years ago I wasn’t going to read right away but that I might need again. And in my closet I think I have the dress shoes I wore to my high-school prom, now fifty years ago. And at the back of my closet … even Vishnu doesn’t want to look there.

A lot of this preservation deals not so much with just keeping things as keeping them in their proper, pretty, pristine, and like-new state. This is a manifestation of my own particular form of obsessive-compulsive disorder, or OCD. For many people with this disorder, the compulsive task is repeatedly washing their hands, or checking for their car keys, or reconfirming that the stove is indeed turned off. Performed enough times, especially under conditions of stress, the disorder can be crippling to a normal life. My form of OCD deals primarily with the two S’s: surfaces—which includes scratches—and straightness.

If an object in my possession has a shiny surface, I am constantly checking it and cleaning it of dust and smudges: the screens of my iPhone, iPad, and computer monitor; the cases of any of these devices; and the shiny bits of my motorcycles such as gas tank, fenders, windscreen, and dial lenses.

Before I go for a ride, I use a wet paper towel to float off the dust, followed by a microfiber towel to wick off the water without leaving droplet marks or fine scratches. After I ride, I use the still-damp towel to clean off any bug carcasses and dust I’ve picked up along the way. If I find a scratch—however minute, no matter that it only shows up in direct sunlight and from certain angles—I bring out the wax or the plastic polish to address the defect. When I wash the motorcycle, I immediately follow it with a coat of wax or acrylic sealant to preserve the surface. If there’s a deeper scratch, visible under any light conditions, I bring out the polishing compound and work it to oblivion, hoping that the mar doesn’t go through the clear coat and into the paint. And if there’s a stone chip, I go after it with touchup paint, followed by compounding and sealant.

You might think that the solution here would be the new matte finishes that motorcycle manufacturers have introduced over the past several years. But they can’t be touched up for visible scratches. And then I worry about wear, especially the sides of the tank where my knees grip the surface and the fabric of my trousers would leave—horrors!—a shiny spot. There’s just no way to win this game.

This is why my favorite material is glass. It wipes down easily, and usually it will break—and thereby have to be thrown away—before it will scratch. For this reason I like tempered glass for my eyeglasses instead of the new plastic lenses. (I’ve had enough polycarbonate motorcycle windshields to know that, while they might take a bullet, they also scratch fairly easily.) I also favor polished titanium and stainless steel for my watches because of their wear resistance. I’m just picky that way.

If an object has a scratch or wear mark that I can’t polish out, I agonize over it. I see that point of infinitesimal damage more than the whole bright surface or the shape, design, and purpose that brought me to admire and desire the object in the first place. Is this a crippling affliction? To my daily round of activities—as incessant hand washing would be—no. But to my emotional stability—when I have actually considered selling a motorcycle because of a deep and unfillable stone chip in its lustrous black paint—well, yes.

The other aspect of my disorder is the alignment and straightening of things. Part of this was my upbringing as the son of a landscape architect. My mother had an innate sense of design—my father had it, too, but not to her degree—backed up by her training as a meticulous draftsman and landscape gardener. Even though her courses taught her to “avoid straight lines” when laying out a flowerbed, she appreciated things that were square and even. And she wanted everything to have its own place and its own space. So if I, as a youngster, pushed my desk into a corner of my bedroom—so that its edges were touching both walls—she would gently advise me to pull it out, at least an inch from the back wall and six or seven inches from the side wall, so that the desk “owned” its space in the room. Crowding furniture side by side and pushing area rugs up against the baseboard were violations of her own particular feng shui.

So I practice straightness in my environment. Pictures hang level. Wall clocks have their twelve and six aligned vertically. Rugs are square with the room. When we bought the condominium, we had a hardwood floor installed instead of the usual wall-to-wall carpet. The pattern of the parquetry is a series of small wood oblongs arranged in larger squares. Thank God the installers aligned the sides of these squares with the walls of the room—although the plasterboard itself is none too straight in some places. Otherwise, I would live in a nightmare of constantly trying to square up the floor and walls in my mind, or squint until they almost aligned. But I do keep pushing the area rugs—which are all rectangles, no ovals or circles here—to align with the edges of the floor squares. And I judge the position of a table or chair by counting the wood blocks in the floor pattern at each leg. On my daily path through the apartment, I am constantly straightening a rug with my toe, squaring up the hang of a picture, pushing at a table edge, aligning a place mat, adjusting the spacing between items on a shelf. It’s an endless job.

Some of this compulsion has made me a better writer and editor. I see grammatical looseness as a violation of alignment. I see unfinished thoughts as incongruent with the shape of an argument. The nits of spelling and punctuation are minute scratches—some of which only I can see, and then only in certain lights—that must be polished or repaired. Of course, when it comes to laying out a newsletter page or a book cover, I try to give pictures and other graphic elements their own space and not crowd them. And I have an eye that is calibrated—or used to be—to a printer’s point, or a seventy-second of an inch, about a third of a millimeter.

For example, in choosing the photo of the Tower of Pisa, above, to accompany this article I selected the best image for its lighting and background. But then, in Photoshop, I had to rotate the image two-point-five degrees counterclockwise because, while the tower was leaning most dramatically, the buildings, the light pole in the background, and the implied horizon were also tilted. I notice these things. They bother me.

So, is this an affliction or a source of strength? I don’t know. It is quirks like this, if anything, that define me. But I do know that, when I am in my grave, my ghost will be haunting my last dwelling place, nudging ineffectually at a crooked rug and scrabbling with ectoplasmic fingers to straighten a tilted picture.

1. What is it with religion and triads? First, there’s the familiar Father, Son, and Holy Ghost—which never made much sense to me, because of the “ghost” part. And then, in the ancient Celtic religion, things usually came in threes and their bards were expected to declaim in rhyming triplets. And finally, the Scandinavians had the Norns, three old women, one to spin the thread of life, one to measure it, and one to cut it off. I find that in my writing, which generally comes from the subconscious, I sometimes feel the work is incomplete unless an argument is supported by three examples, a list includes three members, or an object is tagged with three distinct adjectives. I guess I take after the bards in that way.