Humans have had what we call civilization—graduating from hunter-gatherer, nomadic lifestyles to settled towns and cities built on agriculture and individual craftsmanship—for about 5,000 years, give or take. Some people, looking at ancient stone ruins that nobody can link to a recordkeeping civilization, think this is more like 10,000 to 12,000 years. Still, a long time.
And in all that time, unless you believe in visiting extraterrestrials or mysterious technologies that were lost with those more-ancient civilizations, the main work performed in society was by human muscles and the main transportation was by domesticated animals. The human muscles were often those of enslaved persons, at least for the more dirty, strenuous, or repetitive jobs—as well as for body servants and household staff that the settled and arrogant could boss around. And the animals may have been ridden directly or used to pull a cart or sled, but the world didn’t move—overland at least, because water travel from early on was by oar or sail—without them.
Think of it! Millennium after millennium with not much change in “the way we do things around here.” Oh, sometime around those 5,000 years ago people learned to roast certain kinds of dirt and rock to get metals like copper, tin, and eventually iron. They learned to let jars of mashed seeds and fruits sit around partly filled with water until the stuff turned to alcohol. They learned to make marks on bark and clay tablets, or cut them in stone, to put words aside for others to read and know. All of these were useful skills and advances in knowledge, but still performed with human muscles or carried on the backs of animals.
The Romans invented concrete as a replacement for stone in some but not most building projects about 2,000 years ago. But it wasn’t until the general use of twisted steel reinforcing bars (“rebar”), about 170 years ago, and then pre- and post-tensioned cables embedded in the concrete, that building with the stuff really took off. Today, we use stone mostly for decorative work—or add small stones as “aggregate” to strengthen the concrete slurry—but big blocks and slabs are purely optional.
The Chinese discovered, about 1,200 years ago, that mixing charcoal, sulfur, and potassium nitrate produces an exothermic reaction—a form of rapidly expanding burning. But it wasn’t used militarily as the driving force behind projectiles fired from cannons for another 400 years. Before that, presumably, the powder was used for fireworks and firecrackers, probably to scare away evil spirits. It wasn’t until the Italian chemist Ascanio Sobrero invented nitroglycerin about 170 years ago, and Alfred Nobel turned it into dynamite about 20 years later, that the era of really big bangs began. And then, 100 years later during World War II, we developed “plastic explosives”—still powered by the energetic recombination of nitrogen atoms from unstable molecules into stable diatomic nitrogen—that made all sorts of mayhem possible.
The discovery that microbes were associated with most infections—the “germ theory of disease”—also dates from about 170 years ago. And not until then did doctors think to wash their hands between operations and use antiseptics and disinfectants like carbolic acid and simple alcohol to clean their instruments. The first antibacterial drug, Salvarsan, used to treat syphilis, didn’t come for another 60 years. And the first general-use antibiotic, penicillin, is now less than 100 years old.
The first practical steam engine (external combustion) was used to pump water out of coal mines a little more than 300 years ago. But it didn’t come into general use as the motive power driving boats and land vehicles running on steel rails for another 100 years and more. The first commercial internal combustion engine was introduced about 160 years ago and didn’t become a practical replacement for the horse in private use until about 120 years ago. And just before that, the Wright brothers used it to power the first successful powered, heavier-than-air craft (people had been riding in balloons lifted by hot air or lightweight gases like hydrogen for 120 years by then). Less than 70 years after the Wright brothers, rockets burning kerosene and liquid hydrogen with liquid oxygen shot a spacecraft away from Earth on its way to put people on the Moon.
The first electrical telegraph was patented by Samuel Morse about 190 years ago and quickly came into general use for long-distance communication. The first practical telephone, carrying a human voice instead of just a coded alphabet, arrived 40 years later. The first radio signals carrying Morse’s code came 20 years after that, and the first radio broadcast carrying voices and music just 10 years or so later. The first television signals with images as well as voice—but transmitted by wire rather than over the “air waves”—came just 20 years later.
The first computers came into general business use, and not as scientific and military oddities, about 60 years ago. Those were basements full of heavily air-conditioned components whose use was restricted to specially trained operators. We got the first small “personal computers” about 20 years later. And only in the last 40 years or so did the melding of digital and radio technologies create the first commercially available “mobile phone.” Today those technologies put a computer more powerful than anything that used to live in the basement into the palm of your hand, incorporating the capabilities of communicating by telephone, telegraph, and television; obtaining, storing, and playing music; capturing, showing, and sending both still and moving pictures; and performing all sorts of recordkeeping and communication functions. Most of that wasn’t possible until the first computer networking, which started as a scientific enterprise some 50 years ago and became available to the public about 20 years later.
Scientists have known the structure of our genetic material—deoxyribonucleic acid, or DNA—for about 70 years. And with that they could identify the bits of it associated with coding for some of our proteins. But it wasn’t until about 20 years ago that we recorded the complete genetic sequence for human beings, first for our species as a whole and then for individuals, as well as for many other life forms of interest. Only then could gene sequences be established in relation to human vulnerability to diseases and congenital conditions. And only then could we begin to understand the function of viruses in human disease and how to fight them—if not exactly cure them.
So-called “artificial intelligence”1 has been around in practical, publicly available use for less than two years. Most of these “generative AI” programs create mediocre writing samples and mediocre pictures and videos (well, interesting text and images, but often full of telltale glitches). So far, they are clever toys that should be used with a long-handled spoon. Still, the whole idea shows promise. The Google company DeepMind is making scientific discoveries, like analyzing proteins to determine their folding patterns. That’s incredibly tricky and requires tracing thousands of molecular bonding points, but understanding how protein sequences fold helps us figure out how they function in the body and how they can be manipulated by altering their amino acid (which is to say their genetic) sequence. Other AI platforms are proposing and studying the behavior and function of novel chemical compounds, creating and testing materials and applications in silico, without having to mix them physically in the laboratory. The world of AI is still in its infancy, with much more to come.
And finally, for all those 5,000 or 12,000 years, human beings lived on a rocky globe at the center of a universe composed of Sun, Moon, and a few planets orbiting around the Earth under nested, invisible spheres that held the fixed stars. It was only in the last 500 years or so that astronomers like Copernicus and Kepler came to understand that the Sun, not the Earth, was the center of this system, and then that the stars were much farther away and formed an “island universe” centered on the river of stars—actually a disk seen edge-on—that we call the Milky Way. And it was just less than 100 years ago that astronomer Edwin Hubble looked at fuzzy patches of light in the night sky—which astronomers at the time called nebulae or “clouds”—and figured out that some of them were actually other galaxies like the Milky Way but much more distant. A universe of upwards of a trillion galaxies, each with perhaps 100 billion stars, has been observed since then. Most of them were identified and theories about them proposed—including the notion that many or most of them hide a super-massive black hole at their centers—just within my own lifetime.
Phew!2
And my point to all this is that much of the world we know, and that many of us take for granted, has come about in just the last 200 years of scientific collaboration and advancement. Someone from ancient times brought forward to the world of, say, 1700 would have found it comprehensible. A lot of it would be richer, more refined, and better mannered than the world he or she knew. Some of it would have to be explained, like the rapid burning of gunpowder or the strength of the bright metal we call steel. But you could bring that person up to speed in an afternoon.
By the end of the 1800s, the world would be a lot more complicated and noisier, and the explanations would extend to a week or more.
By the end of the 1900s, the world would be a magical place, or possessed of invisible spirits, and the explanations would require an undergraduate course in several areas of study.
And in the last quarter-century, the advances are coming so fast that those of us not born with them are feeling uneasy, trying hard to keep an open mind and roll with what’s coming.
The future is coming at us all faster and faster. As a science fiction writer, I sometimes despair trying to figure out where computerization, or medical advances, or our core knowledge of physics and chemistry will take us in the next hundred years, let alone the next thousand. I don’t think we’ll blow ourselves up or poison ourselves or destroy our living world. But I’m not so sure that I or anyone alive today will understand it or feel at home in it.
1. As noted in a recent blog post, these software platforms aren’t really “intelligent.” They are probability-weighting machines, easily programmed by non-specialist users to analyze a particular database of material and create their output by predicting the next logical step in a predetermined pattern. That kind of projection can be incredibly useful in certain applications, but it’s not general, human-scale intelligence.
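To make that “probability-weighting” picture concrete, here is a minimal, purely illustrative sketch of next-step prediction. The toy_model table and the next_word helper are invented for this example; a real generative system learns its weights from vast amounts of training text and conditions them on the entire prompt, but the basic move is the same: weigh the candidates, pick one, repeat.

```python
# Toy sketch of "predicting the next step" by probability weighting.
# The probabilities below are invented for illustration only; a real
# generative model learns them from training data and conditions them
# on the full context, not just the last two words.
import random

# Hypothetical, hand-written next-word probabilities for two contexts.
toy_model = {
    "the cat": {"sat": 0.6, "ran": 0.3, "pondered": 0.1},
    "cat sat": {"on": 0.8, "quietly": 0.2},
}

def next_word(context, model):
    """Pick the next word by sampling from the model's weighted candidates."""
    candidates = model.get(context, {"...": 1.0})
    words = list(candidates)
    weights = list(candidates.values())
    return random.choices(words, weights=weights, k=1)[0]

print("the cat", next_word("the cat", toy_model))
```

Run it a few times and the same context produces different continuations: weighted chance, not understanding.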
2. And note that I have not touched on similar stories about the advancement from alchemy to chemistry, the periodic table, and atomic theory; or various forms of energy from wood and wind to coal, oil, natural gas, nuclear fission, and photovoltaics; or advances in physics and the understanding of subatomic particles, general relativity, and quantum mechanics. All of this happening faster and faster.