Five years ago, in a meditation on the speed with which our technological base is expanding,1 I wrote the following:
“In the 20th century we started another transformation, from mechanics to informatics. Consider that for 100,000 years of human history a tool was a physical object like a hammer or an axe blade. But in the last 40 years techniques like crossfoot accounting, computerized spreadsheets, scheduling programs, risk analysis, and datamining have spawned a generation that uses nonmaterial tools on a daily basis. For 20,000 years, animal husbandry meant taming and working with whole animals or crossbreeding whole crops of corn. In the last 20 years the manufacture and use of enzymes, proteins, plasmids, and antigens has created a biological industry that employs the barest parts of, rather than whole, organisms.”
But that’s not quite accurate, is it?
Animal Husbandry
Humans have been using parts of animals from the very beginning. Hunter-gatherer cultures represent an advanced technology in themselves, and they leave almost nothing to waste. Consider the uses to which the Native Americans put the buffalo—an animal that they neither domesticated nor herded but hunted—deriving meat for their bellies and skins for their clothing and lodges. They gathered other animal parts, from porcupine quills to eagle feathers, for personal decoration. And the Eskimos used every part of the seal and walrus: blubber and meat for food; bones for needles, knives, and harpoon points; skins for clothing; sinews and nerves for cord and thread.
In a cave called Divje Babe in Slovenia, paleontologists have found a 43,000-year-old fragment of a bear femur with holes drilled for a flute. Debate continues over whether this was a Homo neanderthalensis or H. sapiens artifact, but it seems indisputable that the holes are artificial and their spacing represents a diatonic musical scale.
Humans in ancient times used bees—hardly domesticated as pets, although they lived in wicker and straw hives of human construction that people placed conveniently close to their planted fields—to obtain honey. Similarly, ancient societies harvested shellfish for a particular shade of purple dye and scraped tree sap for incense and perfumes.
All of these were uses of animal parts in advance of any coordinated effort at domestication, care, and feeding such as humans have given to their dogs, horses, cows, and pigs from prehistoric times. And even domesticated animals have yielded more than their meat, hide, and muscle power. Horse hooves have been boiled for glue from prehistoric times. Sausage casings have been made from layers of the small intestines of sheep, goats, and pigs, while modern edible casings are made of collagen processed from cattle hides or cellulose processed from plants. And various plants have yielded more than their seeds, stalks, and roots for food, too. Consider the antiquity of weaving reeds and grasses into baskets and hats, or flax and cotton fibers into cloth. So people have been using animal and plant parts in industrial processes for hundreds if not thousands of years.
Still, I will stand by the intention of that paragraph above. In the last twenty years or so, especially since we learned to read and manipulate the genotypes of various organisms, our use of animal and plant proteins, enzymes, and antibodies has flourished. We routinely insert fragments of human or animal DNA into yeasts and various cell types of the Muridae, the rodent family that includes mice and rats, to create targeted proteins used in medicine and research. And laboratories all over the world are now working on genetic modifications that will let green algae produce and release lipids—various fatty molecules—that can be processed in place of oil drilled from the ground. The goal, of course, is to move directly from sunlight to a fuel precursor in one step that can be repeated endlessly with no net addition of carbon dioxide to the atmosphere.
And I don’t think I was wrong about the speed with which this industrial adaptation has accelerated. Our chemistry and biology give us a power over the carbon molecule and other artifacts from fossil life of which an 18th-century charcoal burner, a 19th-century coal-tar chemist, or a 20th-century petroleum engineer could barely conceive, let alone envy.
Information Processing
If knowledge is power, then humans have been wielding it since at least the invention of writing.
Consider the ancient Chinese classic I Ching, or Book of Changes. As a system of divination, it was no more accurate than reading Tarot cards or flipping coins, but that’s not the point. The I Ching’s arrangement of 64 hexagrams represents a system of thought, a proposed relationship among human characteristics, everyday circumstances, and the workings of chance. People have been using it to order their thinking and inspire their imaginations and insights for more than two millennia.
Consider Sun Tzu’s Art of War, probably the world’s first and best military handbook. The text is a distillation of principles that, yes, any experienced war leader could probably work out for himself. Like Machiavelli’s The Prince, the teachings are logical and easy to understand and build upward from one concept to the next. Like knowing the rules of chess and studying the most common gambits, reading either of these books in advance of going to war or entering politics gives the practitioner an edge over his or her untutored opponents.
Consider the principles of accounting. Crossfooting—the method of verifying a result by adding the totals across columns of numbers—was always possible and useful when working on a paper ledger in pencil or ink. But the system really took off with the invention of first the computer and then the spreadsheet, where the process became automatic and could be programmed to raise warning flags if the totals didn’t match.
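To see exactly what the spreadsheet automates, here is a minimal Python sketch of a crossfoot check; the ledger layout and the figures in it are invented for illustration. It foots the ledger down its columns, crossfoots it across its rows, and raises a warning flag if the two grand totals disagree.

```python
# A minimal crossfoot check. The ledger layout and figures are illustrative.
rows = {
    "January":  [1200.00, 340.50,  89.99],
    "February": [ 980.25, 410.00, 120.00],
    "March":    [1105.75, 300.25,  95.50],
}

# "Footing" sums each column; "crossfooting" sums the row totals and checks
# that both paths arrive at the same grand total.
column_totals = [sum(column) for column in zip(*rows.values())]
row_totals = [sum(values) for values in rows.values()]

grand_total_down = round(sum(column_totals), 2)
grand_total_across = round(sum(row_totals), 2)

if grand_total_down != grand_total_across:
    print(f"Crossfoot error: {grand_total_down} down vs. {grand_total_across} across")
else:
    print(f"Crossfoot check passed: {grand_total_across}")
```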
Actually, though, the greatest advance in accounting was probably double-entry bookkeeping, which Italian merchants had practiced for generations before Luca Pacioli codified it in Venice in the late 15th century. This system records every transaction twice, once as a debit and once as a credit, so that assets always balance against liabilities and the ledger tracks both the money and the activity in any business. So taking out a loan, for example, puts cash in the asset side of the business ledger and simultaneously enters a debt to be repaid on the liability side.
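A short sketch, again in Python with invented account names and amounts, shows how that double entry keeps the books in balance: every transaction is posted once as a debit and once as a credit, so the ledger always sums to zero.

```python
from collections import defaultdict

# A minimal double-entry ledger. Account names and amounts are illustrative.
balances = defaultdict(float)

def post(debit_account, credit_account, amount):
    """Record one transaction twice: a debit in one account, a credit in another."""
    balances[debit_account] += amount
    balances[credit_account] -= amount  # credits carried as negatives here

# Taking out a $10,000 loan: cash (an asset) goes up, and an equal
# debt to be repaid (a liability) is entered at the same time.
post("Cash", "Loans Payable", 10_000.00)

# Spending $2,500 of that cash on equipment just shifts value between assets.
post("Equipment", "Cash", 2_500.00)

# Every entry has an equal and opposite counterpart, so the ledger sums to zero.
assert round(sum(balances.values()), 2) == 0.0
for account, amount in balances.items():
    print(f"{account:15s} {amount:>12,.2f}")
```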
Another accounting advance was reckoning the “time value of money”—that is, the rate of return that an amount of cash or other asset could earn if it were lent at the current interest rate or converted into some other form of investment over a specified period of time. The time value of money drives businesses to put their excess cash to work and lets the owner compare investment and asset strategies on a reasonable basis. Although this is easier to figure with a computer or calculator, the principle has existed as long as banking and alternative investments have been around, and the formulas for figuring present value and future value can be worked with pencil and paper. The whole realm of accounting is an informational tool that guides the decision-making of alert businessmen and -women.
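For readers who want those pencil-and-paper formulas spelled out, here is a minimal sketch; the 5 percent rate and ten-year term are illustrative assumptions, not figures from the text. Future value compounds a sum forward at the going rate, and present value discounts a future payment back to today.

```python
# Present-value and future-value formulas; the 5% rate and 10-year term
# are illustrative assumptions, not figures from the text.

def future_value(principal, rate, periods):
    """What a sum grows to when compounded at `rate` per period."""
    return principal * (1 + rate) ** periods

def present_value(amount, rate, periods):
    """What a future amount is worth today, discounted at `rate` per period."""
    return amount / (1 + rate) ** periods

fv = future_value(1_000.00, 0.05, 10)   # $1,000 at work at 5% for 10 years
print(f"Future value:  {fv:,.2f}")      # about 1,628.89

pv = present_value(fv, 0.05, 10)        # that future payment, valued today
print(f"Present value: {pv:,.2f}")      # back to 1,000.00
```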
So, yes, previous cultures have developed and used raw information as a tool. And this tendency only increased when the printing press made paper copies of organized information caches like books, pamphlets, and scholarly treatises widely available. Then it increased again when the computer automated the processing of information.
Still, I stand by the premise in the paragraph above. We are moving from an analysis of simple data patterns—like the movements of armies in the field or the flow of cash through a business—to appreciation of, and predictions based on, much more complex situations. For example, IBM is now selling a service called Watson Analytics. It is clearly built on the associational algorithms that IBM’s computer scientists put into the brain box called “Watson,” the algorithms that made it a Jeopardy champion by blending knowledge of historical and cultural references, facility with linguistic and logic puzzles, and human-scale memory recall into an instrument for recognizing patterns and finding answers. IBM calls this “cognitive computing,” meaning that the computer can analyze unfamiliar data and answer questions about them without a human programmer having to code the questions and the search terms for each step of the analysis.
The future is still coming at us faster and faster, but it all builds logically and progressively on the patterns of human invention that have come before.
1. See Coming at You from October 24, 2010.