As a science-fiction writer, although no scientist myself, I am a fan of science. My grandfather was a civil engineer, and my father a mechanical engineer. I started reading science fiction as a child. For most of my corporate career I worked with and interviewed engineers and scientists. And I still subscribe to the magazines Science, Nature, Scientific American, and Astronomy. I don’t always read the original articles, which can be pretty dense, but I read most of the magazines’ summaries for lay readers.
When I worked at the biotech company, I routinely scanned scientific articles for studies that used our genetic-analysis products, so that I could write my own brief summaries to report back to our employees about how our equipment was being used. That gave me a general view, at least in the realm of the biological sciences, of how far and deep what I now call the “enterprise of science” extends. Principal investigators and teams of researchers in university departments, corporate facilities, and government labs around the world are all at work extending and enhancing our knowledge and sharing their findings through peer-reviewed publications. And I’m sure the same effort goes on in chemistry, physics, and the softer sciences as well.
So in no sense do I consider myself a “science denier.” But still … I am aware that not all of the work that passes for science is conducted at the same level or with the same rigor, and not all of it deserves the same respect for its results. Science is, after all, a human endeavor, and not all human beings think and act the same, or value the same things. So scientific results are variable.
Biology is pretty sound, I think. Over the centuries, and especially in recent decades, the field has progressed beyond pure description and simple cataloguing. Using techniques from chemistry and now genetics, and guided by evolutionary theory, biologists are discovering not only what things are but how they function and why they are related. Their work is not so much theoretical as analytical and demonstrable. That is, provable beyond doubt. Biology’s stepchild, medicine, with its study of mechanisms, pathogens, and now disease tendencies transmitted through genetics, has advanced human life immeasurably in the past two hundred years or so.
Chemistry is in the same advanced state. Since the breakthrough in understanding atomic structure and the ordering of the Periodic Table, chemists have been able to analyze existing molecules and compounds, understand their structure, predict their characteristics, propose changes to their nature, and even create novel molecules and materials. The field is creating unheard-of substances like high-temperature superconductors and advanced battery substrates. Chemistry and its stepchild, electronics—and its stepchild, computer science—have created the modern world.
Physics has made remarkable progress in understanding the world around us, focusing on the mechanics of light and sound, energy and inertia, and other, well, physical—that is to say, tactile, observable, and measurable—characteristics of what we might call the “real world.” But in the 20th century, and continuing into the 21st, physicists’ deep dive into the realm of the unseen and only guessable, with quantum mechanics and probability theory, seems to have veered into a kind of mysticism. To my layman’s eye, certain physicists now seem to be playing imaginative games where anything might be real if only the numbers can be made to add up. So we have String Theory, where all subatomic particles ultimately resolve into tiny, vibrating loops of “string”—but only so long as the real world consists of ten or eleven dimensions and not just the observable three dimensions of space plus one of time. And the ruminations of Erwin Schrödinger and his sometimes-alive-sometimes-dead cat have led to the Many Worlds Theory, where every imponderable probability splits the universe into finer and finer branchings of alternate realities. This has just become a game of having fun with numbers.[1]
The stepchild of physics, astronomy, is on firmer ground—well, no ground, really, because all that astronomers can work with are photons at various wavelengths and the electromagnetic effects of subatomic particles. Or at least, the ground is firm when they can observe and compare physical results. Astronomy has been remarkably perceptive in analyzing these effects and resolving the conjectures based on them into what may be provable truths. For example, we can be pretty sure that the Earth and other planets revolve around the Sun, that the Moon revolves around the Earth, and now that other stars have planets similar to our own. But we are still in the conjecture stage about the nature of neutron stars and black holes,[2] and cosmology and its theories about dark matter and dark energy depend on our incomplete understanding of the nature of time, space, and gravity.[3]
Psychology, among the softer sciences, has made great advances in understanding the human mind and behavior since the purely conjectural theories of Freud, but it works best when it pairs with neurology and molecular biology to understand the brain that underlies the functions of mind. Still, psychologists are studying the elusive product of a trillion firing synapses, similar in all human beings according to their genetic nature but different in their chemical environment and learned responses. Psychology is a work of aggregation among many individual samples—and the art lies in picking the samples and applying a priori conjectures. Its stepchild, sociology, is in an even more tenuous state, because unlike the psychologist, the sociologist works with aggregates from the start rather than with the personal reports of individual human beings.
And then there is environmental science, which shares a good deal with chemistry and physics, and has benefited greatly from advances in the understanding of geology, plate tectonics, and the ocean’s heat and salt transport, not to mention satellite imaging and the widespread use of local ground-based radar. We can now understand most weather phenomena and predict patterns out to several weeks and trends out to possibly next year. But the Earth’s atmospheric systems are still highly complex and subject to many and varied influences. The current view that the entire planet is warming overall due to a single human-caused variable, the industrial release of carbon dioxide, along with predictions of worldwide temperature increases to within fractions of a degree eighty years into the future—well, that is not observable science. Such predictions come from science’s bastard stepchild, computer modeling.
Computers can model complex systems by aggregating variables and assigning weights to them. As I understand the current process—having followed the story of “global warming” since James Hansen presented his case at an Energy Daily conference in 1987 (before which “climate change” was called “global cooling”)—carbon dioxide is a weak greenhouse gas, unless it dominates an atmosphere like that of Venus. The climate models all assume that the tiny push from additional carbon dioxide will drive more water vapor into the atmosphere, where it acts as a much more potent greenhouse gas. And the models are all built on positive feedbacks in this cycle, ignoring negative feedbacks such as the fact that carbon dioxide is a necessary plant food and that more abundant plants draw it back out of the air.
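To make the arithmetic of that argument concrete, here is a toy sketch in Python of how a single feedback gain turns a small direct push into a larger or smaller settled answer. It is nothing like a real general-circulation model, and every name and number in it is an illustrative assumption of mine, not a measured value.

    # Toy illustration only: one feedback loop, T = forcing + gain * T,
    # iterated until it settles. Real climate models track thousands of
    # coupled variables; every number below is a made-up assumption.

    def equilibrium_warming(direct_forcing, feedback_gain, steps=1000):
        """Settle the loop T <- direct_forcing + feedback_gain * T.

        Converges to direct_forcing / (1 - feedback_gain) when |gain| < 1.
        """
        temperature = 0.0
        for _ in range(steps):
            temperature = direct_forcing + feedback_gain * temperature
        return temperature

    direct = 1.0  # hypothetical direct warming from CO2 alone, in degrees

    # A positive water-vapor feedback amplifies the direct push (1.0 -> 2.0):
    print(equilibrium_warming(direct, feedback_gain=0.5))

    # A net negative feedback, such as increased plant uptake, damps it
    # instead (1.0 -> about 0.67):
    print(equilibrium_warming(direct, feedback_gain=-0.5))

The only point of the sketch is that the settled answer is dominated by the assumed sign and size of the feedback term, which is precisely the term the modelers must choose.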
The models supporting anthropogenic global warming also tend to ignore the influence of the Sun, which is known to be a variable star. Michael Mann and company produced a global temperature chart based on tree rings[4] laid down over the past several thousand years. Their chart showed global temperatures holding steady for more than two thousand years and only rising—in the famous “hockey stick”—during the last two centuries. Mann’s chart thus purported to erase the Roman warm period, the Dark Age cold period, the Medieval warm period, and the Little Ice Age of the late Renaissance. But each of these fluctuations coincides with the 800-year cycles—400 years of rising peaks, 400 years of declining peaks—in the familiar 11-year sunspot cycles. A spotted sun is a warmer sun: although the spots themselves are cooler regions where magnetic fields break through the solar surface, an active sun is brighter overall, and it also produces flares and mass ejections that release excess energy. Each of the historic warm or cold periods corresponds to a peak or a drought in sunspot activity, sometimes a total disappearance like the “Maunder Minimum,” named after the astronomer who identified it. Since the middle 1700s, the Earth has been climbing out of the minimum responsible for the Little Ice Age. Peak sunspot activity probably occurred in the 1990s, and the peaks of the 11-year cycles since then have been falling off.[5]
So yes, the Earth’s climate does change, no question about that. But it does so in complex cycles. We have also had ancient periods of “snowball Earth,” when the planet was in the deep freeze, and more recently four great “ice ages” over the past 400,000 years or so. None of this had to do with a weak greenhouse gas like carbon dioxide.[6] I do not deny climate change: the Earth’s climate changes continually and is certainly changing now. But I doubt that human industrial activity is the only or the most important cause of the current changes, or that drastically altering our economy will have much effect in the next hundred years. Human beings have been adapting to cold and warm periods, along with rising and falling sea levels, for our entire span on this planet—and we will adapt to whatever else we might find out among the stars. We will live through whatever changes Earth throws at us in the next hundred years.
As I said, science is a human enterprise, and human beings are fallible. They make mistakes, and they are also susceptible to peer pressure and “group think.” They sometimes direct their curiosity toward phenomena and ideas that happen to be popular in their scientific circles. And not all of them are above proposing studies and shaping their conjectures to follow the grant money offered by interested parties.
I am not—no, not ever—a “science denier.” But I am a realist, and I know you have to be careful about what you believe, take for granted, and choose to follow.
[1] For more on this, see my blog posts Fun With Numbers (I) and (II) from September 2010.
[2] For example, Stephen Hawking theorized that black holes evaporate, and that the small ones that may have been created in the Big Bang must have evaporated completely by now. He based these conjectures on the idea that pairs of particles and antiparticles constantly pop into existence in the vacuum of space, everywhere and at any time, and immediately annihilate each other, and that this must often happen at the very edge of a black hole’s event horizon. Occasionally one member of such a pair falls over the event horizon while its partner escapes; the escaping particle carries energy away, and the hole’s mass shrinks to pay for it. Over time, that process would deplete and erase the black hole entirely. In terms of provable, knowable reality, Hawking might as well have said that pixies ate them.
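A rough check on that conjecture, for what it is worth: taking Hawking’s published result at face value (and ignoring anything the hole might swallow in the meantime), the evaporation time scales with the cube of the hole’s mass,

$$ t_{\text{evap}} = \frac{5120\,\pi\,G^{2}M^{3}}{\hbar c^{4}} \approx 8.4\times10^{-17}\,\text{s}\times\left(\frac{M}{1\,\text{kg}}\right)^{3}, $$

so a solar-mass hole (about 2 × 10^30 kg) would outlast the current age of the universe by more than fifty orders of magnitude, while a primordial hole lighter than a few times 10^11 kg would indeed be gone by now.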
[3] See also Three Things We Don’t Know About Physics (I) from December 2012 and (II) from January 2013.
[4] However, I have always understood tree-ring size to correlate more closely with available water than with temperature. And hotter climates are not always dry.
[5] To look at recent sunspot cycles, see the Cycle 24 Prediction from the Wikipedia page on this latest cycle.
[6] One of the articles I read while scanning for studies that used our genetic-analysis products (and I don’t have the reference now) said that drilling into and through the glaciers in Greenland had uncovered arboreal duff—twigs and pieces of dead trees—whose genetic drift from current identifiable species was about 700,000 years. This finding suggests to me that the interglacial period between the last ice age (which ended about 12,000 years ago, after peaking somewhere in the last 200,000 years) and the ice age that preceded it never completely melted the glaciers over Greenland. Otherwise, this ancient duff would have been washed away and replaced with younger stuff.