The following article is copyright © 1992 by the Skeptics Society, P.O. Box 338, Altadena, CA 91001, (818) 794-3119. Permission has been granted for noncommercial electronic circulation of this article in its entirety, including this notice. A special Internet introductory subscription rate to Skeptic is available. For more information, contact Jim Lippard (email@example.com).
James Randi is one of these. So too are the founders and fellows of the Committee for the Scientific Investigation of Claims of the Paranormal (CSICOP), the predecessor to the Skeptics Society, whose journal - the Skeptical Inquirer - has set the standard toward which this and other such publications must strive in the pursuit of skepticism. But what does it mean to be skeptical? The word is a troublesome one because of the heavy baggage it carries, and it means different things to different people. (We considered many names for this organization but decided that, as long as it is clearly defined, the word is a useful one. Among the names we considered was Institute for Rational Skepticism, but we rejected it for fear that we might become known as the IRS, an organization about which many people are already skeptical!)
Since this position is sterile and unproductive and held by virtually no one (except a few confused solipsists who doubt even their own existence), it is no wonder that so many find skepticism disturbing. A more productive meaning of the word skeptic is the second usage given by the OED: "One who doubts the validity of what claims to be knowledge in some particular department of inquiry; one who maintains a doubting attitude with reference to some particular question or statement."
The history of the words skeptic and skepticism is interesting and often amusing. In 1672, for example, the Philosophical Transactions VII records this passage: "Here he taketh occasion to examine Pyrrhonisme or Scepticisme, professed by a Sect of men that speak otherwise than they think." The charge is true. The most ardent skeptics enjoy their skepticism as long as it does not encroach upon their most cherished beliefs. Then incredulity flies out the window. I received a call recently from a gentleman who professed to be a skeptic, wanted to support the organization, and agreed with our skepticism about everything except the power of vitamins to restore health and attenuate disease. He hoped I would not be organizing any skeptical lectures or articles on this field, which, he explained, had now been proven scientifically to be effective. "Your field wouldn't be vitamin therapy would it?" I inquired. "You bet it is!" he responded.
It is easy, even fun, to challenge others' beliefs when we are smug in our certainty about our own. But when ours are challenged, it takes great patience and ego strength to listen with an unjaundiced ear. But there is a deeper flaw in pure skepticism. Taken to an extreme the position by itself cannot stand. The OED gives us this 1674 literary example (Tucker Lt. Nat. II): "There is an air of positiveness in all scepticism, an unreserved confidence in the strength of those arguments that are alleged to overthrow all the knowledge of mankind." Skepticism is itself a positive assertion about knowledge, and thus turned on itself cannot be held. If you are skeptical about everything, you would have to be skeptical of your own skepticism. Like a decaying subatomic particle, pure skepticism uncoils and spins off the viewing screen of our intellectual cloud chamber.
Skepticism alone does not produce progress. It is not enough simply to reject the irrational; the rejection must be followed by something rational - something that does produce progress. The Austrian economist Ludwig von Mises warned against those anti-communists who presented no rational alternative to the system of which they were so skeptical (1956, p. 112):
An anti-something movement displays a purely negative attitude. It has no chance whatever to succeed. Its passionate diatribes virtually advertise the program they attack. People must fight for something that they want to achieve, not simply reject an evil, however bad it may be.

Carl Sagan sounded a similar warning to the professional skeptics at the 1987 CSICOP annual meeting: "You can get into a habit of thought in which you enjoy making fun of all those other people who don't see things as clearly as you do. We have to guard carefully against it" (in Basil, 1988, p. 366).
It is for these reasons that it might be useful to modify the word skeptic with "rational." Again, it is constructive to examine the usage and history of this commonly used word. Rational is given as: "Having the faculty of reasoning; endowed with reason" (OED, p. 2420). And reason as "A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief" (p. 2431). It may seem pedantic to dig through the dictionary and pull out arcane word usages and histories. But it is constructive to know how a word was intended to be used and what it has come to mean. The two are often not the same, and more often than not, words have multiple usages, such that when two people communicate they are frequently talking at cross purposes. One person's skepticism may be another's credulity. And who does not think they are rational when it comes to their own beliefs and ideologies?
It is also important to remember that dictionaries do not give definitions. They give usages. For a listener to understand a speaker, and for a reader to follow a writer, important words must be defined with semantic precision for communication to be successful. What I mean by skeptic is the second usage above: "One who doubts the validity of what claims to be knowledge in some particular department of inquiry." And by rational: "A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief." But these usages leave out one important component: the goal of reason and rationality. The ultimate end to thinking is to understand cause-and-effect relationships in the world around us. It is to know the universe, the world, and ourselves. Since rationality is the most reliable means of thinking, a rational skeptic may be defined as:
One who questions the validity of particular claims of knowledge by employing or calling for statements of fact to prove or disprove claims, as a tool for understanding causality.

But what method shall we employ? Just being skeptical will lead us to no conclusion other than the Socratic conclusion that we do not know. The answer, in a word, is science, and the method, in two words, is the scientific method.
Science is a specific way of thinking and acting - a tool for understanding information that is perceived directly or indirectly ("observed or inferred"). "Past or present" refers to both the historical and the experimental sciences. Cognitive methods include hunches, guesses, ideas, hypotheses, theories, paradigms, etc.; behavioral methods include background research, data collection, data organization, colleague collaboration and communication, experiments, correlation of findings, statistical analyses, manuscript preparation, conference presentations, publications, etc. This definition is discussed in greater detail in a later essay. More controversial, and less likely to find agreement among practitioners, is a definition of the scientific method. In fact, one of the more insightful and amusing observations on this problem was made by the Nobel laureate and philosopher of science, Sir Peter Medawar (1969, p. 11):
Ask a scientist what he conceives the scientific method to be and he will adopt an expression that is at once solemn and shifty-eyed: solemn, because he feels he ought to declare an opinion; shifty-eyed, because he is wondering how to conceal the fact that he has no opinion to declare.

A sizable body of literature exists on the scientific method, and there is little consensus among the authors. This does not mean that scientists do not know what they are doing. Doing and explaining may be two different things. For the purpose of outlining a methodology for the rational skeptic to apply to questionable claims, the following four-step process may represent, on the simplest of levels, something that might be called the "scientific method":

1. Induction: Forming a hypothesis by drawing general conclusions from existing data.

2. Deduction: Making specific predictions based on the hypotheses.

3. Observation: Gathering data, driven by hypotheses that tell us what to look for in nature.

4. Verification: Testing the predictions against further observations to confirm or falsify the initial hypotheses.
Observations are what flesh out the hypothetico-deductive process and serve as the final arbiter for the validity of the predictions. As Sir Arthur Stanley Eddington noted: "For the truth of the conclusions of science, observation is the supreme court of appeal" (1958, p. 9).
Through the scientific method we form the following generalizations:

Hypothesis: A testable statement accounting for a set of observations.

Theory: A well-supported and well-tested hypothesis or set of hypotheses.

Fact: A conclusion confirmed to such an extent that it would be reasonable to offer provisional agreement.
Through the scientific method we aim for:

Objectivity: The basing of conclusions on external validation.

And we avoid:

Mysticism: The basing of conclusions on personal insights that lack external validation.
There is nothing wrong with personal insight. Many great scientists have attributed important ideas to insight, intuition, and other equally difficult-to-define concepts. Alfred Russel Wallace said that the idea of natural selection "suddenly flashed upon" him during an attack of malaria. Timothy Ferris called Einstein "the great intuitive artist of science." But insightful and intuitive ideas do not gain acceptance until they are externally validated. As Richard Hardison explained (1988, pp. 259-260):
Mystical "truths," by their nature, must be solely personal, and they can have no possible external validation. Each has equal claim to truth. Tea leaf reading and astrology and Buddhism; each is equally sound or unsound if we judge by the absence of related evidence. This is not intended to disparage any one of the faiths; merely to note the impossibility of verifying their correctness. The mystic is in a paradoxical position. When he seeks external support for his views he must turn to external arguments, and he denies mysticism in the process. External validation is, by definition, impossible for the mystic.

Science leads us toward:
Rationalism: The basing of conclusions on the scientific method. For example, how do you know the Earth is round? Because observation and reason support the conclusion: the Earth casts a curved shadow on the moon during a lunar eclipse, ships disappear hull-first over the horizon, and photographs from space show a sphere.
And science helps us avoid:

Dogmatism: The basing of conclusions on authority rather than science. For example, how do you know the Earth is round? Because a teacher, a parent, or a book said so.
Because of the importance of this self-correcting feature, there is in the profession what Richard Feynman calls "a principle of scientific thought that corresponds to a kind of utter honesty - a kind of leaning over backwards." Feynman says: "If you're doing an experiment, you should report everything that you think might make it invalid - not only what you think is right about it: other causes that could possibly explain your results" (1988, p. 247).
Despite these built-in mechanisms, science is still subject to a number of problems and fallacies that can trouble even the most careful scientist and rational skeptic. We can, however, find inspiration in those who have overcome them to make monumental contributions to our understanding of the world and ourselves. Charles Darwin is a sterling example of a scientist who struck the right balance in what Thomas Kuhn calls the "essential tension" in science between total acceptance of, and devotion to, the status quo and an open willingness to explore and accept new ideas (1962, 1977). This delicate balance forms the basis of the whole concept of paradigm shifts in the history of science. When enough of the scientific community (particularly those in positions of power) are willing to abandon the old orthodoxy in favor of the (formerly) radical new theory, then, and only then, can the paradigm shift occur.
This generalization about change in science is usually made about the paradigm as a system, but we must recognize that the paradigm is a cognitive framework in the minds of individuals. Darwinian scholar Frank Sulloway identifies three characteristics of Darwin's intellect and personality that mark him as one of the handful of giants in the history of science who found the right balance (1991, p. 28): "First, although Darwin indeed had unusual reverence for the opinions of others, he was obviously quite capable of challenging authority and thinking for himself." Second, "Darwin was also unusual as a scientist in his extreme respect for, and attention to, negative evidence." Darwin included, for example, a chapter on "Difficulties on Theory" in the Origin of Species; as a result his objectors were rarely able to present him with a challenge that he had not already confronted or addressed. And third was Darwin's "ability to tap the collective resources of the scientific community and to enlist other scientists as fellow collaborators in his own research projects." Darwin's collected correspondence comprises more than 16,000 extant letters, most of which involve lengthy discussions and question-and-answer sequences about scientific problems. He was constantly questioning, always learning, confident enough to formulate original ideas, yet modest enough to recognize his own fallibility.
A fourth that might be mentioned is that Darwin maintained a good dollop of modesty and cautiousness that Sulloway sees as "a valuable attribute" that helps "prevent an overestimation of one's own theories." There is much to be learned in this regard from Darwin's Autobiography. Darwin confesses that he has "no great quickness of apprehension or wit which is so remarkable in some clever men," a lack of which makes him "a poor critic: a paper or book, when first read, generally excites my admiration, and it is only after considerable reflection that I perceive the weak points." Unfortunately many of Darwin's critics have selectively quoted such passages against him, not seeing the advantage Darwin saw in the patient avoidance of regrettable mistakes made in haste (1892, p. 55):
I think that I have become a little more skillful in guessing right explanations and in devising experimental tests; but this may probably be the result of mere practice, and of a larger store of knowledge. I have as much difficulty as ever in expressing myself clearly and concisely; and this difficulty has caused me a very great loss of time; but it has had the compensating advantage of forcing me to think long and intently about every sentence, and thus I have been often led to see errors in reasoning and in my own observations or those of others.

His is a lesson in science and in life well worth learning. What Sulloway sees as particularly special about Darwin was his ability to resolve the essential tension within himself. "Usually, it is the scientific community as a whole that displays this essential tension between tradition and change," Sulloway observes, "since most people have a preference for one or the other way of thinking. What is relatively rare in the history of science is to find these contradictory qualities combined in such a successful manner in one individual" (p. 32).
Carl Sagan summed up the essential tension between skepticism and credulity in his CSICOP lecture on "The Burden of Skepticism":
It seems to me what is called for is an exquisite balance between two conflicting needs: the most skeptical scrutiny of all hypotheses that are served up to us and at the same time a great openness to new ideas. If you are only skeptical, then no new ideas make it through to you. You never learn anything new. You become a crotchety old person convinced that nonsense is ruling the world. (There is, of course, much data to support you.) On the other hand, if you are open to the point of gullibility and have not an ounce of skeptical sense in you, then you cannot distinguish the useful ideas from the worthless ones. If all ideas have equal validity then you are lost, because then, it seems to me, no ideas have any validity at all (in Basil, 1988, p. 366).

There is some hope that rational skepticism, and the vigorous application of the scientific method, can help us navigate the treacherous straits between pure skepticism and unmitigated credulity.
It is assumed that human beings are born with the ability to perceive cause-and-effect relationships. When we are born we have no cultural experience whatsoever. But we do not come into the world completely ignorant. We know lots of things - how to see, hear, digest food, track a moving object in the visual field, blink at approaching objects, become anxious when placed over a ledge, develop a taste aversion for noxious foods, and so on. We also inherit the traits our ancestors evolved in a world filled with predators and natural disasters, poisons and dangers, and risks from all sides. We are descended from those ancestors who were most successful at understanding causality.
Our brains are natural machines for piecing together events that may be related and for solving problems that require our attention. One can envision an ancient hominid from Africa chipping and grinding and shaping a rock into a sharp tool for carving up a large mammalian carcass. Or perhaps we can imagine the first individual who discovered that striking flint would create a spark with which to start a fire. The wheel, the lever, the bow and arrow, the plow - inventions intended to allow us to shape our environment rather than be shaped by it - started civilization down a path that led to our modern scientific and technological world.
In his discussion of the rewards of science, Vincent Dethier, whose words opened this manifesto, runs through the pantheon of the obvious ones - monetary, security, honor - as well as the transcendent: "a passport to the world, a feeling of belonging to one race, a feeling that transcends political boundaries and ideologies, religions, and languages." But he brushes these aside for one "more lofty and more subtle." This is the natural curiosity of humans in their drive to understand the world:
One of the characteristics that sets man apart from all the other animals (and animal he indubitably is) is a need for knowledge for its own sake. Many animals are curious, but in them curiosity is a facet of adaptation. Man has a hunger to know. And to many a man, being endowed with the capacity to know, he has a duty to know. All knowledge, however small, however irrelevant to progress and well-being, is a part of the whole. It is of this the scientist partakes. To know the fly is to share a bit in the sublimity of Knowledge. That is the challenge and the joy of science (pp. 118-119).

Children are naturally curious, inquisitive, and exploratory of their environment. It is normal to want to know how things work and why the world is the way it is. At its most basic level, this is what science is all about. As Richard Feynman observed: "I've been caught, so to speak - like someone who was given something wonderful when he was a child, and he's always looking for it again. I'm always looking, like a child, for the wonders I know I'm going to find - maybe not every time, but every once in a while" (1988, p. 16). The most important question in education is this: what tools are children given to understand the world?
On the most basic of levels we must think or die. Those who are alive are thinking and using reason to a greater or lesser extent. Those who use more reason, those who employ rational skepticism, will attain greater satisfaction because they understand the cause of their satisfaction. It cannot be otherwise. As Ayn Rand concluded in her magnum opus Atlas Shrugged (1957, p. 1012):
Man cannot survive except by gaining knowledge, and reason is his only means to gain it . . . . Man's mind is his basic tool of survival. Life is given to him, survival is not. His body is given to him, its sustenance is not. His mind is given to him, its content is not. To remain alive, he must act, and before he can act he must know the nature and purpose of his action. He cannot obtain his food without a knowledge of food and of the way to obtain it. He cannot dig a ditch - or build a cyclotron - without a knowledge of his aim and of the means to achieve it. To remain alive, he must think.

Over three centuries ago the French philosopher and skeptic René Descartes, after one of the most thorough skeptical purges in intellectual history, concluded that he knew one thing for certain: "Cogito ergo sum." "I think therefore I am."
By a similar analysis, to be human is to think. Therefore, to paraphrase Descartes:
"Sum Ergo Cogito." "I Am Therefore I Think."