The American
When Darwin ceases to astonish us, we will cease to understand him.
Today is the 200th anniversary of the birth of the English naturalist Charles Darwin. To be widely remembered two centuries after your birth is not the usual lot of mortals, and it is rarer still to be a subject of intense and often quite bitter controversy. Had Darwin been content to remain a wealthy country gentleman with a taste for dogs, no one today would be arguing about him, or celebrating his birthday. But Darwin was more than that. He was a natural thinker—a kind of tinkerer in ideas.
The Dutch philosopher Spinoza once wrote that “most people believe they sufficiently understand something when they cease to be amazed by it,” and if Darwin had one great gift above all others, it was his capacity to be continually amazed by the most ordinary things in the world around him. What most everyone else took for granted was often what troubled and puzzled Darwin the most. By his own admission, whenever he was confronted by a conundrum, he found himself generating a hypothesis in order to explain it. Living in the age of Queen Victoria, he nevertheless retained many of the intellectual characteristics associated with the boldly speculative imagination of the early Greek thinkers. Fiercely independent and free from debilitating skepticism, Darwin was not afraid to make bold and sweeping conclusions on the basis of slender evidence, while he was always prepared to revise, or even reject, his conclusions in the light of additional information. Sickly in body all his life, his mind kept its boyish vitality until the end.
To read Darwin’s short Autobiography is to encounter a man whose sterling qualities are palpable. This is someone, we cannot help thinking, that we would all like if we met him—humorously self-deprecating, kind-hearted, clear, and to the point. Even so, those who have dipped into his two most famous works, The Origin of Species and The Descent of Man, can understand why a man as modest and charming as Darwin can still lead to fighting words 200 years after his birth. Yes, it is true that he offered up a brilliant theory—the theory of evolution by natural selection; but he also swept away the foundational certitudes upon which Western civilization had rested for thousands of years. If Darwin was right, then the Bible was wrong. God had not created Adam; evolution had.
To be sure, skepticism about the literal truth of the Bible did not begin with Charles Darwin. Spinoza in the latter half of the 17th century had written a treatise, published anonymously, in which he had applied the canons of critical reason to examine the sacred texts of Judaism and Christianity. The Huguenot Pierre Bayle later carried this project forward, and soon many intelligent Europeans came to believe that much of the Bible simply could not be taken literally—a healthy skepticism that became the hallmark of the European Enlightenment. Educated men were no longer willing to believe that Joshua had made the sun stand still, or that the ass Balaam rode on had spoken to him, or that the walls of Jericho had come tumbling down at the concerted blare of trumpets, or that certain patriarchs lived to be 900 years old.
At first glance, the story of Adam and Eve might appear to belong to the same class of implausible fables. Like these other dubious scriptural incidents, the story of the Garden of Eden is chock full of manifest absurdities. Eve is created out of Adam’s rib; she is seduced by a talking snake that walks on legs; she is driven forth from paradise, along with her helpless mate, by a pair of angels, armed with flaming swords—all of which, it must be admitted, has the suspicious ring of a fairy tale.
Yet, despite this superficial similarity, there was a radical difference between the Garden of Eden fairy tale and the other ones, such as the story of Joshua’s making the sun stand still, or that of Balaam’s talking ass. If a man who has previously taken the Bible literally should decide to stop believing that Joshua made the sun stand still, he is no longer required to explain how such a miraculous event occurred. He simply stops believing in this particular fable, and thereby returns to the common-sense position that men cannot command the sun to stop at their good pleasure. Similarly, if he rejects the assertion that Balaam’s ass talked, he is again conforming to the common-sense view that holds asses incapable of articulate speech. In both cases, by jettisoning the implausible fable, the scriptural skeptic finds himself in a position where he has nothing left to account for. You don’t have to explain why the sun does not stop—you would only have to explain why it did. In fact, the skeptic, once he has eliminated all elements of the fabulous, doesn’t have to explain anything at all—it is rather those who continue to uphold these fables who find themselves confronted with the daunting tasks of explaining the inexplicable.
Discard the fable of Adam and Eve, however, and you at once find yourself faced with a quandary. If this is not the correct account of human origins, then what is? In rejecting all those dubious Biblical miracles, the 18th-century skeptic was simply reaffirming healthy common sense. But what was he reaffirming if he rejected the story of Genesis? In the 18th century, there was simply no healthy common-sense alternative to the story of the creation of Adam out of malleable clay. The realistic alternative to the sun standing still was for the sun to move as it normally moves; but what, in the 18th century, was the realistic alternative to the story of man’s creation by God?
The myth of man’s special creation, when discarded, offered nothing obvious to take its place. Worse, it left a whole array of perplexing questions. The account in Genesis had offered neat and rather elegant explanations of the origin of human institutions—language, the family, agriculture, pastoralism, hunting, violence, and so forth. If these explanations were exposed as mere fables, what alternative explanations could there be?
The dilemma faced by the skeptic when contemplating the fables of Genesis was similar to the dilemma faced by those who first argued for the Copernican theory of the universe. At first glance, making the sun the center of everything appears to us to have been an obvious step in the right direction—so why the vehement opposition to the Copernican revolution on the part of so many otherwise intelligent men?
In part this opposition was fueled by those who were concerned about the implicit challenge to the passage in Joshua about the sun standing still—how could the sun be made to stand still if it didn’t move to begin with? Yet by far the most powerful obstacle to the general acceptance of Copernicus was not scriptural literalism, but rather ordinary healthy “common sense.” If it was the Earth that was moving, and not the sun, why couldn’t we feel it moving, in the same way that we can feel it when a vehicle we are riding in is moving? Furthermore, if the Earth was moving as fast as it would have to move to make a circle around the sun every year, how could we possibly hope to hang on to it—common sense shows that we would have long ago been blown off the surface of our planet, along with everything else that was detachable.
The rejection of the Ptolemaic theory required a rejection of the “common sense” physics that had provided human beings with their working model of the universe since the time of Aristotle. Newton’s First Law begins by making a statement that flies in the face of all our ordinary ideas about motion: it declares that a body in motion will stay in motion forever, with nothing to push or shove it along, so long as no other force acts upon it. Yet who had ever witnessed such a thing? Arrows, bullets, and cannonballs remained in motion without such pushing or shoving—a problem that had confounded Aristotle—but their motion always came to an end after a relatively short period of time.
In reference to our intuitive expectations concerning motion, the switch from the Ptolemaic to the Copernican system required a massive retooling of ordinary common sense. But, at least in this case, there was an alternative model to the Ptolemaic system, namely, the heliocentric model of the universe offered by the ancient Greek astronomer Aristarchus of Samos—a model that was already available to men like Copernicus and Galileo when they became increasingly dissatisfied with the geocentric model. But, take away the story of God’s creation of Adam, and what alternatives were left to thinking men of the 18th century? If human beings wished to shift their paradigm of human origins, in what direction should they shift it? Thus even intellectuals who were otherwise skeptical of the Biblical narratives continued to accept the core idea that man had been the subject of a special act of creation. What other choice did they have? Of course, the talking snakes could go—but Adam and Eve had to remain.
When Darwin proposed that human beings descended from an ancestor that we shared with apes and monkeys, he was playing the role of Copernicus. To accept Darwin’s new truth required a rejection of an enormous number of other hitherto established “truths” in a wide range of different fields. It was to gain one piece of knowledge at the cost of jettisoning virtually everything else we thought we knew—hardly anyone’s idea of a fair exchange. No wonder Darwin, like Copernicus, faced such stiff resistance. The wonder is that they weren’t resisted even more.
Most scientific breakthroughs do not come at such a high price in terms of our general level of cognitive certitude. Often a discovery in one field will shed unexpected light on seemingly unrelated fields, setting off a positive chain reaction as scientists in a variety of fields seek to apply the discovery in their own domain of inquiry. New information is acquired without the need to discard the old. Newton invented calculus to aid him in figuring out the laws of gravity, but it was immediately applied to a host of quite different problems. It was pure gain, with virtually no loss.
The case of Darwin, like that of Copernicus, was different. In these two instances, the intellectual chain reaction set off by the breakthrough was largely negative. Instead of offering new insights in a variety of different fields, the new discoveries called into question everything that authorities in their different fields felt that they knew for sure. Worse, both called into question the ordinary man’s notion of healthy common sense. Where the chain reaction is positive, the new breakthrough will lead to innumerable “Ah ha!” experiences as different individuals see how the breakthrough illuminates problems in their own particular areas of study. Where the chain reaction is negative, the new breakthrough will cause men who previously thought of themselves as experts to realize that they had no clue what they were talking about. Hence no one should be surprised that some scientific breakthroughs have been immediately accepted and hailed, whereas others have been attacked and derided. Few human beings are exempt from the law of cognitive inertia: It is easy to welcome a discovery that adds to our achieved store of knowledge without subtracting anything from our store, but quite a different thing to welcome a discovery that forces us to rethink everything we thought we already knew.
Perhaps those who feel frustrated that there is still debate over Darwin should recall that the 200th anniversary of the birth of Copernicus fell on February 19, 1673—roughly 14 years before Newton published the book that would solve virtually all the problems that Copernicus’s great breakthrough had created when it hurled our planet from the center of the universe. Even at the distance of two centuries, we are still much closer to Darwin’s era than we would like to think. There are many people today who are sincerely convinced that they have accepted Darwin’s theory, but who, by hook or by crook, are conniving to evade its inexorable implications for their most cherished notions of humanity. Such people have ceased to be amazed that, despite the universal struggle for existence, a breed of successful monkeys has managed to create for themselves a set of moral ideals so lofty that only angels could ever hope to live by them.
When Darwin ceases to astonish us, we will cease to understand him. So today those who think his theory is obviously right should spend a little time thinking it could be wrong, while those who see it as obviously wrong should spend a little time thinking it just might be right.
Lee Harris is the author of The Suicide of Reason. He appeared at the American Enterprise Institute in September of 2007 to discuss his book with AEI scholar Ayaan Hirsi Ali.
Image by Darren Wamboldt/The Bergman Group.
© 2016 American Enterprise Institute for Public Policy Research