From Skeptic vol. 1, no. 1, Spring 1992, pp. 15-21.

The following article is copyright © 1992 by the Skeptics Society, P.O. Box 338, Altadena, CA 91001, (818) 794-3119. Permission has been granted for noncommercial electronic circulation of this article in its entirety, including this notice. A special Internet introductory subscription rate to Skeptic is available. For more information, contact Jim Lippard (lippard@skeptic.com).

A SKEPTICAL MANIFESTO

By Michael Shermer, Ph.D.

On the opening page of the splendid little book To Know a Fly, Vincent Dethier makes this humorous observation of how children grow up to become scientists: "Although small children have taboos against stepping on ants because such actions are said to bring on rain, there has never seemed to be a taboo against pulling off the legs or wings of flies. Most children eventually outgrow this behavior. Those who do not either come to a bad end or become biologists" (1962, p. 2). The same could be said of skepticism. In their early years children are knowledge junkies, questioning everything in their view, though exhibiting little skepticism. Most never learn to distinguish between inquisitiveness and credulity. Those who do either come to a bad end or become professional skeptics.

James Randi is one of these. So too are the founders and fellows of the Committee for the Scientific Investigation of Claims of the Paranormal (CSICOP), the predecessor to the Skeptics Society, whose journal - the Skeptical Inquirer - has set the standard toward which this and other such publications must strive in the pursuit of skepticism. But what does it mean to be skeptical? The word is a troublesome one because of the heavy baggage it carries, and it means different things to different people. (We had considered many names but decided that as long as it is defined, the word skeptic is a useful one. Among the names we had also considered was Institute for Rational Skepticism, but we rejected it for fear that we might become known as the IRS, an organization about which many people are already skeptical!)

The Meaning and Limits of Skepticism

Skepticism has a long historical tradition dating back to ancient Greek thought. The foremost historian of skepticism, Richard Popkin, tells us: "Academic scepticism, so-called because it was formulated in the Platonic Academy in the third century B.C., developed from the Socratic observation, 'All I know is that I know nothing'" (1979, p. xiii). Two popular received meanings of the word today are that a skeptic believes nothing, or that a skeptic is closed-minded to certain beliefs. There is good reason for the perception behind the first meaning. The Oxford English Dictionary (OED) gives this common usage for the word skeptic: "One who, like Pyrrho and his followers in Greek antiquity, doubts the possibility of real knowledge of any kind; one who holds that there are no adequate grounds for certainty as to the truth of any proposition whatever" (1971, Vol. 2, p. 2663).

Since this position is sterile and unproductive and held by virtually no one (except a few confused solipsists who doubt even their own existence), it is no wonder that so many find skepticism disturbing. A more productive meaning of the word skeptic is the second usage given by the OED: "One who doubts the validity of what claims to be knowledge in some particular department of inquiry; one who maintains a doubting attitude with reference to some particular question or statement."

The history of the words skeptic and skepticism is interesting and often amusing. In 1672, for example, the Philosophical Transactions VII records this passage: "Here he taketh occasion to examine Pyrrhonisme or Scepticisme, professed by a Sect of men that speak otherwise than they think." The charge is true. The most ardent skeptics enjoy their skepticism as long as it does not encroach upon their most cherished beliefs. Then incredulity flies out the window. I received a call recently from a gentleman who professed to be a skeptic, wanted to support the organization, and agreed with our skepticism about everything except the power of vitamins to restore health and attenuate disease. He hoped I would not be organizing any skeptical lectures or articles on this field, which, he explained, has now been proven scientifically to be effective. "Your field wouldn't be vitamin therapy, would it?" I inquired. "You bet it is!" he responded.

It is easy, even fun, to challenge others' beliefs when we are smug in our certainty about our own. But when ours are challenged, it takes great patience and ego strength to listen with an unjaundiced ear. And there is a deeper flaw in pure skepticism: taken to an extreme, the position cannot stand on its own. The OED gives us this 1674 literary example (Tucker Lt. Nat. II): "There is an air of positiveness in all scepticism, an unreserved confidence in the strength of those arguments that are alleged to overthrow all the knowledge of mankind." Skepticism is itself a positive assertion about knowledge, and thus, turned on itself, cannot be held. If you are skeptical about everything, you must be skeptical of your own skepticism. Like a decaying subatomic particle, pure skepticism uncoils and spins off the viewing screen of our intellectual cloud chamber.

Skepticism alone does not produce progress. It is not enough simply to reject the irrational; skepticism must be followed by something rational, something that does produce progress. The Austrian economist Ludwig von Mises issued just such a warning against those anti-communists who presented no rational alternative to the system of which they were so skeptical (1956, p. 112):

An anti-something movement displays a purely negative attitude. It has no chance whatever to succeed. Its passionate diatribes virtually advertise the program they attack. People must fight for something that they want to achieve, not simply reject an evil, however bad it may be.
Carl Sagan sounded a similar warning to the professional skeptics at the 1987 CSICOP annual meeting: "You can get into a habit of thought in which you enjoy making fun of all those other people who don't see things as clearly as you do. We have to guard carefully against it" (in Basil, 1988, p. 366).

The Rational Skeptic

The second popular notion, that skeptics are closed-minded to certain beliefs, comes from a misunderstanding of skepticism and science. Skeptics and scientists are not necessarily "closed-minded" (though they may be, since they are human). They may have been open-minded to a belief, but when the evidence failed to support it, they rejected it. There are already enough legitimate mysteries in the universe, for which the evidence provides scientists with fodder for research, that taking the time to consider "unseen" or "unknown" mysteries is not always practical. When the non-skeptic says, "You're just closed-minded to the unknown forces of the universe," the skeptic responds: "We're still trying to understand the known forces of the universe."

It is for these reasons that it might be useful to modify the word skeptic with "rational." Again, it is constructive to examine the usage and history of this word that is so commonly used. Rational is given as: "Having the faculty of reasoning; endowed with reason" (OED, p. 2420). And reason as "A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief" (p. 2431). It may seem rather pedantic to dig through the dictionary and pull out arcane word usages and histories. But it is constructive to know how a word was intended to be used and what it has come to mean. They are often not the same, and more often than not, they have multiple usages such that when two people communicate they are frequently talking at cross purposes. One person's skepticism may be another's credulity. And who does not think they are rational when it comes to their own beliefs and ideologies?

It is also important to remember that dictionaries do not give definitions. They give usages. For a listener to understand a speaker, and for a reader to follow a writer, important words must be defined with semantic precision for communication to be successful. What I mean by skeptic is the second usage above: "One who doubts the validity of what claims to be knowledge in some particular department of inquiry." And by rational: "A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief." But these usages leave out one important component: the goal of reason and rationality. The ultimate end to thinking is to understand cause-and-effect relationships in the world around us. It is to know the universe, the world, and ourselves. Since rationality is the most reliable means of thinking, a rational skeptic may be defined as:

One who questions the validity of particular claims of knowledge by employing or calling for statements of fact to prove or disprove claims, as a tool for understanding causality.

But what method shall we employ? Just being skeptical will lead us to no conclusion other than the Socratic conclusion that we do not know. The answer, in a word, is science, and the method, in two words, is the scientific method.

Science and the Rational Skeptic

Needless to say, a review of the usages and history of the word science would run inappropriately long here, and I have already done this to a certain extent in the essay at the end of this issue. For purposes of clarity, science will be taken to mean: a set of cognitive and behavioral methods designed to describe and interpret observed or inferred phenomena, past or present, aimed at building a testable body of knowledge open to rejection or confirmation.

Science is a specific way of thinking and acting - a tool for understanding information that is perceived directly or indirectly ("observed or inferred"). "Past or present" refers to both the historical and the experimental sciences. Cognitive methods include hunches, guesses, ideas, hypotheses, theories, paradigms, etc.; behavioral methods include background research, data collection, data organization, colleague collaboration and communication, experiments, correlation of findings, statistical analyses, manuscript preparation, conference presentations, publications, etc. This definition is discussed in greater detail in the later essay. More controversial, and less likely to find agreement among practitioners, is a definition of the scientific method. In fact, one of the more insightful and amusing observations on this problem was made by the Nobel laureate and philosopher of science, Sir Peter Medawar (1969, p. 11):

Ask a scientist what he conceives the scientific method to be and he will adopt an expression that is at once solemn and shifty-eyed: solemn, because he feels he ought to declare an opinion; shifty-eyed, because he is wondering how to conceal the fact that he has no opinion to declare.
A sizable body of literature exists on the scientific method, and there is little consensus among its authors. This does not mean that scientists do not know what they are doing; doing and explaining may be two different things. For the purpose of outlining a methodology for the rational skeptic to apply to questionable claims, the following four-step process may represent, on the simplest of levels, something that might be called the "scientific method":
  1. Observation: Gathering data through the senses or sensory enhancing technologies.
  2. Induction: Drawing general conclusions from the data. Forming hypotheses.
  3. Deduction: Making specific predictions from the general conclusions.
  4. Verification: Checking the predictions against further observations.
Science, of course, is not this rigid; and no scientist consciously goes through such "steps." The process is a constantly interactive one between making observations, drawing conclusions, making predictions, and checking them against further evidence. This process constitutes the core of what philosophers of science call the hypothetico-deductive method, which involves "(a) putting forward a hypothesis, (b) conjoining it with a statement of 'initial conditions', (c) deducing from the two a prediction, and (d) finding whether or not the prediction is fulfilled" (Bynum, Browne, Porter, 1981, p. 196).
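The hypothetico-deductive cycle just described can be illustrated with a toy sketch (a modern, hypothetical example of my own; the hypothesis, numbers, and function names are invented for illustration, not drawn from the essay):

```python
# Toy sketch of the hypothetico-deductive cycle: (a) a hypothesis,
# (b) initial conditions, (c) a deduced prediction, (d) verification.
# Invented example hypothesis: a dropped object falls d = 0.5 * g * t**2.

def predict_fall_distance(t_seconds, g=9.8):
    """(c) Deduction: from the hypothesis plus the initial condition
    (the drop time), predict the distance fallen, in metres."""
    return 0.5 * g * t_seconds ** 2

def verify(prediction, observation, tolerance=0.1):
    """(d) Verification: does the prediction agree with the observation
    to within a 10% tolerance?"""
    return abs(prediction - observation) <= tolerance * observation

# (b) Initial condition: the object is dropped for 2 seconds.
prediction = predict_fall_distance(2.0)  # 19.6 metres
observed = 19.4                          # a hypothetical measurement
print(verify(prediction, observed))      # True: the hypothesis survives this test
```

A failed verification would not "prove" the hypothesis false outright, but, as the essay notes, the prediction checked against observation is the arbiter; the loop then begins again with revised hypotheses.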

Observations are what flesh out the hypothetico-deductive process and serve as the final arbiter for the validity of the predictions. As Sir Arthur Stanley Eddington noted: "For the truth of the conclusions of science, observation is the supreme court of appeal" (1958, p. 9).

Through the scientific method we form the following generalizations:

Hypothesis: A testable statement accounting for a set of observations.

Theory: A well-supported and well-tested hypothesis or set of hypotheses.

Fact: A conclusion confirmed to such an extent that it would be reasonable to offer provisional agreement.

These may be opposed to a construct: a non-testable statement to account for a set of observations. The observation of living organisms on Earth may be accounted for by God or by evolution. The first statement is a construct, the second a theory. Most biologists would even call evolution a fact by the above definition.

Through the scientific method we aim for:

Objectivity: The basing of conclusions on external validation.

We avoid:

Mysticism: The basing of conclusions on personal insights that lack external validation.

There is nothing wrong with personal insight. Many great scientists attributed important ideas to insight, intuition, and other equally difficult-to-define concepts. Alfred Wallace said that the idea of natural selection "suddenly flashed upon" him during an attack of malaria. Timothy Ferris called Einstein "the great intuitive artist of science." But insightful and intuitive ideas do not gain acceptance until they are externally validated. As Richard Hardison explained (1988, pp. 259-260):

Mystical "truths," by their nature, must be solely personal, and they can have no possible external validation. Each has equal claim to truth. Tea leaf reading and astrology and Buddhism; each is equally sound or unsound if we judge by the absence of related evidence. This is not intended to disparage any one of the faiths; merely to note the impossibility of verifying their correctness. The mystic is in a paradoxical position. When he seeks external support for his views he must turn to external arguments, and he denies mysticism in the process. External validation is, by definition, impossible for the mystic.
Science leads us toward:

Rationalism: The basing of conclusions on the scientific method. For example, how do you know the Earth is round?

  1. The shadow on the moon is round.
  2. The mast of a ship is the last thing seen.
  3. The horizon is curved.
  4. Photographs from space.

Science helps us avoid:

Dogmatism: The basing of conclusions on authority rather than science. For example, how do you know the Earth is round?

  1. My parents told me.
  2. My teachers told me.
  3. My minister told me.
  4. My textbook told me.

Dogmatic conclusions are not necessarily invalid but they do pose another question: How did the authorities come by their conclusions? Did they use science or some other means?

The Essential Tension Between Skepticism and Credulity

It is important too that we recognize the fallibility of science and the scientific method. But within this fallibility lies its greatest strength: self-correction. Whether mistakes are made honestly or dishonestly, whether a fraud is unknowingly or knowingly perpetrated, in time it will be flushed out of the system through the lack of external verification. The cold fusion fiasco is a classic example of the system's swift consequences for error and hasty publication.

Because of the importance of this self-correcting feature, there is in the profession what Richard Feynman calls "a principle of scientific thought that corresponds to a kind of utter honesty - a kind of leaning over backwards." Feynman says: "If you're doing an experiment, you should report everything that you think might make it invalid - not only what you think is right about it: other causes that could possibly explain your results" (1988, p. 247).

Despite these built-in mechanisms, science is still subject to a number of problems and fallacies that can trouble even the most careful scientist and rational skeptic. We can, however, find inspiration in those who have overcome them to make monumental contributions to our understanding of the world and ourselves. Charles Darwin is a sterling example of a scientist who struck the right balance in what Thomas Kuhn calls the "essential tension" in science between total acceptance of and devotion to the status quo, and an open willingness to explore and accept new ideas (1962, 1977). This delicate balance forms the basis of the whole concept of paradigm shifts in the history of science. When enough of the scientific community (particularly those in positions of power) are willing to abandon the old orthodoxy in favor of the (formerly) radical new theory, then, and only then, can the paradigm shift occur.

This generalization about change in science is usually made about the paradigm as a system, but we must recognize that the paradigm is a cognitive framework in the minds of individuals. Darwinian scholar Frank Sulloway identifies three characteristics of Darwin's intellect and personality that mark him as one of the handful of giants in the history of science who found the right balance (1991, p. 28): "First, although Darwin indeed had unusual reverence for the opinions of others, he was obviously quite capable of challenging authority and thinking for himself." Second, "Darwin was also unusual as a scientist in his extreme respect for, and attention to, negative evidence." Darwin included, for example, a chapter on "Difficulties on Theory" in the Origin of Species; as a result his objectors were rarely able to present him with a challenge that he had not already confronted or addressed. And third was Darwin's "ability to tap the collective resources of the scientific community and to enlist other scientists as fellow collaborators in his own research projects." Darwin's collected correspondence comprises more than 16,000 extant letters, most of which involve lengthy discussions and question-and-answer sequences about scientific problems. He was constantly questioning, always learning, confident enough to formulate original ideas, yet modest enough to recognize his own fallibility.

A fourth that might be mentioned is that Darwin maintained a good dollop of modesty and cautiousness that Sulloway sees as "a valuable attribute" that helps "prevent an overestimation of one's own theories." There is much to be learned in this regard from Darwin's Autobiography. Darwin confesses that he has "no great quickness of apprehension or wit which is so remarkable in some clever men," a lack of which makes him "a poor critic: a paper or book, when first read, generally excites my admiration, and it is only after considerable reflection that I perceive the weak points." Unfortunately many of Darwin's critics have selectively quoted such passages against him, not seeing the advantage Darwin saw in the patient avoidance of regrettable mistakes made in haste (1892, p. 55):

I think that I have become a little more skillful in guessing right explanations and in devising experimental tests; but this may probably be the result of mere practice, and of a larger store of knowledge. I have as much difficulty as ever in expressing myself clearly and concisely; and this difficulty has caused me a very great loss of time; but it has had the compensating advantage of forcing me to think long and intently about every sentence, and thus I have been often led to see errors in reasoning and in my own observations or those of others.
His is a lesson in science and in life well worth learning. What Sulloway sees as particularly special about Darwin was his ability to resolve the essential tension within himself. "Usually, it is the scientific community as a whole that displays this essential tension between tradition and change," Sulloway observes, "since most people have a preference for one or the other way of thinking. What is relatively rare in the history of science is to find these contradictory qualities combined in such a successful manner in one individual" (p. 32).

Carl Sagan summed up the essential tension between skepticism and credulity in his CSICOP lecture on "The Burden of Skepticism":

It seems to me what is called for is an exquisite balance between two conflicting needs: the most skeptical scrutiny of all hypotheses that are served up to us and at the same time a great openness to new ideas. If you are only skeptical, then no new ideas make it through to you. You never learn anything new. You become a crotchety old person convinced that nonsense is ruling the world. (There is, of course, much data to support you.)

On the other hand, if you are open to the point of gullibility and have not an ounce of skeptical sense in you, then you cannot distinguish the useful ideas from the worthless ones. If all ideas have equal validity then you are lost, because then, it seems to me, no ideas have any validity at all (in Basil, 1988, p. 366).

There is some hope that rational skepticism, and the vigorous application of the scientific method, can help us navigate through the treacherous straits between pure skepticism and unmitigated credulity.

The Tool of the Mind

Science is the best method humankind has devised for understanding causality. Therefore the scientific method is our most effective tool for understanding the causes of the effects we are confronted with in our personal lives as well as in nature. There are few human traits that most observers would call truly universal. Most would agree, however, that survival of the species as a whole, and the achievement of greater happiness of individuals in particular, are universals that virtually every human being seeks. We have seen the interrelationship between science, rationality, and rational skepticism. Thus, we may go so far as to say that the survival of the human species and the attainment of greater happiness for individuals depend on the ability to think scientifically, rationally, and skeptically.

It is assumed that human beings are born with the ability to perceive cause-and-effect relationships. When we are born we have no cultural experience whatsoever. But we do not come into the world completely ignorant. We know lots of things - how to see, hear, digest food, track a moving object in the visual field, blink at approaching objects, become anxious when placed over a ledge, develop a taste aversion for noxious foods, and so on. We also inherit the traits our ancestors evolved in a world filled with predators and natural disasters, poisons and dangers, and risks from all sides. We are descended from the most successful ancestors at understanding causality.

Our brains are natural machines for piecing together events that may be related and for solving problems that require our attention. One can envision an ancient hominid from Africa chipping and grinding and shaping a rock into a sharp tool for carving up a large mammalian carcass. Or perhaps we can imagine the first individual who discovered that knocking flint would create a spark with which to start a fire. The wheel, the lever, the bow and arrow, the plow - inventions intended to allow us to shape our environment rather than be shaped by it - started civilization down a path that led to our modern scientific and technological world.

In his discussion of the rewards of science, Vincent Dethier, whose words opened this manifesto, runs through the pantheon of the obvious ones - monetary, security, honor - as well as the transcendent: "a passport to the world, a feeling of belonging to one race, a feeling that transcends political boundaries and ideologies, religions, and languages." But he brushes these aside for one "more lofty and more subtle." This is the natural curiosity of humans in their drive to understand the world:

One of the characteristics that sets man apart from all the other animals (and animal he indubitably is) is a need for knowledge for its own sake. Many animals are curious, but in them curiosity is a facet of adaptation. Man has a hunger to know. And to many a man, being endowed with the capacity to know, he has a duty to know. All knowledge, however small, however irrelevant to progress and well-being, is a part of the whole. It is of this the scientist partakes. To know the fly is to share a bit in the sublimity of Knowledge. That is the challenge and the joy of science (pp. 118-119).
Children are naturally curious, inquisitive, and exploratory of their environment. It is normal to want to know how things work and why the world is the way it is. At its most basic level, this is what science is all about. As Richard Feynman observed: "I've been caught, so to speak - like someone who was given something wonderful when he was a child, and he's always looking for it again. I'm always looking, like a child, for the wonders I know I'm going to find - maybe not every time, but every once in a while" (1988, p. 16). The most important question in education is this: what tools are children given to understand the world?

On the most basic of levels we must think or die. Those who are alive are thinking and using reason to a greater or lesser extent. Those who use more reason, those who employ rational skepticism, will attain greater satisfaction because they understand the cause of their satisfaction. It cannot be otherwise. As Ayn Rand concluded in her magnum opus Atlas Shrugged (1957, p. 1012):

Man cannot survive except by gaining knowledge, and reason is his only means to gain it . . . . Man's mind is his basic tool of survival. Life is given to him, survival is not. His body is given to him, its sustenance is not. His mind is given to him, its content is not. To remain alive, he must act, and before he can act he must know the nature and purpose of his action. He cannot obtain his food without a knowledge of food and of the way to obtain it. He cannot dig a ditch - or build a cyclotron - without a knowledge of his aim and of the means to achieve it. To remain alive, he must think.
Over three centuries ago the French philosopher and skeptic René Descartes, after one of the most thorough skeptical purges in intellectual history, concluded that he knew one thing for certain: "Cogito ergo sum." "I think therefore I am."

By a similar analysis, to be human is to think. Therefore, to paraphrase Descartes:

"Sum Ergo Cogito." "I Am Therefore I Think."

Bibliography

Basil, R. 1988. Not Necessarily the New Age. Buffalo: Prometheus Books.
Bynum, W.F., E.J. Browne, R. Porter. 1981. Dictionary of the History of Science. Princeton: Princeton University Press.
Darwin, C. 1892. The Autobiography of Charles Darwin. Francis Darwin (Ed.). New York: Dover.
Dethier, V.G. 1962. To Know a Fly. San Francisco: Holden-Day.
Eddington, A. S. 1958. The Philosophy of Physical Science. Ann Arbor: University of Michigan Press.
Feynman, R.P. 1988. What Do You Care What Other People Think? New York: W.W.Norton.
Gould, S. J. 1983. Hen's Teeth and Horse's Toes. New York: W.W. Norton.
Hardison, R.C. 1988. Upon the Shoulders of Giants. New York: University Press of America.
Kuhn, T. S. 1962. The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
_____ . 1977. The Essential Tension: Selected Studies in Scientific Tradition and Change. Chicago: University of Chicago Press.
Medawar, P. 1969. Induction and Intuition in Scientific Thought. London.
Mises, L.V. 1956. The Anti-Capitalistic Mentality. New York: D. Van Nostrand.
Oxford English Dictionary. 1971. Oxford.
Popkin, R. H. 1979. The History of Scepticism from Erasmus to Spinoza. Berkeley: University of California Press.
Rand, A. 1957. Atlas Shrugged. New York: Random House.
Sulloway, F. J. 1991. "Darwinian Psychobiography." A review of Charles Darwin: A New Life by John Bowlby. The New York Review of Books, October 10.