Know you what it is to be a child?… it is
to believe in belief….
– Francis Thompson, 19th c. British poet
We don’t forget our first ah-ha experience any more than we forget our first kiss. The difference is we have some idea of what to expect from a kiss, but we don’t know what to make of an enlightening incident. The experience lingers in memory as something special, but since we can’t account for it, we’re apt to keep it to ourselves.
Only in my thirties did I realize that an experience I’d had in my teens was the analogue of that first kiss. About six years after discovering that our third grade science book contained mistakes, it struck me that anything could be wrong. There were no infallible truths, no ultimate explanations.
In high school we were learning that science theories and models were not to be regarded as absolute truths, but rather taken to be useful descriptions that might someday be replaced with better ones. I accepted this way of holding scientific truth—it didn’t seem to undercut its usefulness. But I still wanted to believe there were absolute, moral truths, not mere assumptions, but unimpeachable, eternal verities. My mother certainly acted as if there were.
But one day, alone in my bedroom, I had the premonition that what was true of science applied to beliefs of every sort. I realized that, as in science, political, moral, or personal convictions could be questioned and might need amending or qualifying in certain circumstances. The feeling reminded me of consulting a dictionary and realizing that there are no final definitions, only cross references. I remember exactly where I was standing, and how it felt, when I discovered there was no place to stand, nothing to hold on to. I felt sobered, yet at the same time, strangely liberated. After all, if there were no absolutes, then there might be an escape from what often seemed to me to be a confining social conformity.
With this revelation, my hopes for definitive, immutable solutions to life’s problems dimmed. I shared my experience of unbelief with no one at the time, knowing that I couldn’t explain myself and fearing others’ mockery. I decided that to function in society I would have to pretend to go along with the prevailing consensus—at least until I could come up with something better. For decades afterwards, without understanding why, I was drawn to people and ideas that expanded my premonition of a worldview grounded not on immutable beliefs, but rather on a process of continually improving our best working assumptions.
Science Models Evolve
It’s the essence of models that they’re works in progress. While nothing could be more obvious—after all, models are all just figments of our fallible imaginations—the idea that models can change, and should be expected to yield their place of privilege to better ones, has been surprisingly hard to impart.
Until relatively recently we seem to have preferred to stick to what we know—or think we know—no matter the consequences.
Rather than judge for ourselves, we’ve been ready to defer to existing authority and subscribe to received “wisdom.” Perhaps this is because of a premium put on not “upsetting the apple cart” during a period in human history when an upright apple cart was of more importance to group cohesiveness and survival than the fact that the cart was full of rotten apples.
Ironically, our principal heroes, saints and geniuses alike, have typically spilled a lot of apples. Very often they are people who have championed a truth that contradicts the official line.
A turning point in the history of human understanding came in the seventeenth century when one such figure, the English physician William Harvey, discovered that the blood circulates through the body. His plea—“I appeal to your own eyes as my witness and judge”—was revolutionary at a time when physicians looked not to their own experience but rather accepted on faith the Greek view that blood was made in the liver and consumed as fuel by the body. The idea that dogma be subordinated to the actual experience of the individual seemed audacious at the time.
Another milestone was the shift from the geocentric (or Ptolemaic) model (named after the second-century Alexandrian astronomer Ptolemy) to the heliocentric (or Copernican) model (named after the sixteenth-century Polish astronomer Copernicus, who is regarded by many as the father of modern science).
Until five centuries ago, it was an article of faith that the sun, the stars, and the planets revolved around the earth, which lay motionless at the center of the universe. When the Italian scientist Galileo embraced the Copernican model, which held that the earth and other planets revolve around the sun, he was contradicting the teaching of the Church. This was considered sacrilegious and, under threat of torture, he was forced to recant. He spent the rest of his life under house arrest, making further astronomical discoveries and writing books for posterity.
In 1992, Pope John Paul II acknowledged that the Roman Catholic Church had erred in condemning Galileo for asserting that the Earth revolves around the Sun.
The Galileo affair was really an argument about whether models should be allowed to change without the Church’s consent. Those in positions of authority often deem acceptance of their beliefs, and with that the acceptance of their role as arbiters of beliefs, to be more important than the potential benefits of moving on to a better model. For example, the discovery of seashells on mountaintops and fossil evidence of extinct species undermined theological doctrine that the world and all living things were a mere six thousand years old. Such discoveries posed a serious challenge to the Church’s monopoly on truth.
Typically, new models do not render old ones useless; they simply circumscribe their domains of validity, unveiling and accounting for altogether new phenomena that lie beyond the scope of the old models. Thus, relativity and quantum theory do not render Newton’s laws of motion obsolete. NASA has no need for the refinements of quantum or relativistic mechanics in calculating the flight paths of space vehicles. The accuracy afforded by Newton’s laws suffices for its purposes.
Some think that truths that aren’t absolute and immutable disqualify themselves as truths. But just because models change doesn’t mean that anything goes. At any given time, what “goes” is precisely the most accurate model we’ve got. One simply has to be alert to the fact that our current model may be superseded by an even better one tomorrow. It’s precisely this built-in skepticism that gives science its power.
Most scientists are excited when they find a persistent discrepancy between their latest model and empirical data. They know that such deviations signal the existence of hitherto unknown realms in which new phenomena may be discovered.
The presumption that models of nature are infallible has been replaced with the humbling expectation that their common destiny is to be superseded by more comprehensive and accurate ones.
Toward the end of the nineteenth century, many physicists believed they’d learned all there was to know about the workings of the universe. The consensus was that between Newton’s dynamics and Maxwell’s electromagnetism we had everything covered. Prominent scientists solemnly announced the end of physics.
There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.
– Lord Kelvin (1900)
Then a few tiny discrepancies between theory and experiment were noted, and as scientists explored them, they came upon the previously hidden realms of atomic and relativistic physics, and with them technologies that have put their stamp on the twentieth century.
Albert Einstein believed that the final resting place of every theory is as a special case of a broader one. Indeed, he spent the last decades of his life searching for a unified theory that would have transcended the discoveries he made as a young man. The quest for such a grand unifying theory goes on.