
Scientific method

Science is far from the perfect instrument of knowledge. It's just the best we have.
Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark[1]

The scientific method is an epistemological system for deriving and developing knowledge. Some people consider it the best method for making useful and practical additions to human knowledge about the physical world, and it has driven the technological leaps made as it spread throughout the Western world.[citation needed] The scientific method can also be described as a learning process.

Galileo Galilei (1564-1642) and Francis Bacon (1561-1626) figured among the first European scientists to use the scientific method as we know it, prioritizing it over the Ancient Greek tradition of knowledge-generation, a tradition that favored rational thought over empiricism. Prior to this, thinkers of the "Islamic Golden Age"[2] (8th to 13th centuries) made wide use of scientific methodology. In 1021 CE, for example, Ibn al-Haytham, working in Cairo, emphasized the primacy of experimentation in his Book of Optics.[3][4][5][6] Arabic science also generated a system of peer review.[7][8] The scientific method was not always accepted during its period of development, and the work of Hungarian physician Ignaz Semmelweis (1818-1865) on disinfection procedures provides a telling example of what happens when know-it-alls ignore the method, or the conclusions arrived at by its use.

At the core of modern scientific practice is the idea that the value of a hypothesis, theory, or concept is best determined by its ability to make falsifiable predictions that one can test against empirical reality. This means that supernatural entities, or concepts that are meaningless or logically contradictory, cannot be included in a scientific hypothesis (not least because of the difficulty of putting a sample of a deity in a test-tube). Consequently, when carrying out investigations, scientists assume a position of methodological naturalism.

Humans, including scientists, are fallible and irrational apes by nature. The scientific method, accordingly, helps these highly imperfect beings iron out their biases, attain a reasonable degree of objectivity, and develop reliable and sometimes useful results.

How to do science

Aah, there's nothing more exciting than science. You get all the fun of sitting still, being quiet, writing down numbers, paying attention... Science has it all.
—Principal Skinner, Bart's Comet[9]

The scientific method isn't a simple, linear process, but is wrapped up in the complexities of research in the real world and the practicalities of what is possible. However, the idea of testing a hypothesis and refining knowledge based on observation is a constant theme of science.[10]

Despite the lack of simple linearity in reality, the method has often been codified into stages that make it easier to understand. Essentially, the following five steps make up the scientific method (a toy sketch in code follows the list):

  1. Observe - Look at the world and find a result that seems curious. As Isaac Asimov put it, "The most exciting phrase to hear in science, the one that heralds new discoveries, is not Eureka! (I found it!) but rather, 'Hmm... that's funny...'"
  2. Hypothesize - Come up with a possible explanation.
  3. Predict - The most important part of a hypothesis or theory is its ability to make predictions that have yet to be observed. A hypothesis that makes no new predictions is scientifically worthless. Predictions must be falsifiable (theoretically, new evidence can show the prediction to be false) and specific (what is predicted must not be open to interpretation after the experiment begins, or else the only thing you're testing is your ability to reinterpret your incorrect theory).
  4. Test Predictions (in physical sciences this is called Experiment) - Compare the predictions with new[11] empirical evidence (usually experimental evidence, often supported by mathematics). This step is the reason why a hypothesis or theory has to be falsifiable — if there's nothing to falsify, then the experiment is pointless because it's guaranteed to tell you nothing new. Information from the experiment can disprove the original hypothesis, which might be refined into a better one.
  5. Reproduce - Ensure the result is a true reflection of reality by verifying it with others.
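
To make the loop concrete, here is a deliberately toy sketch in Python. Nothing in it comes from the article itself: the hidden law (y = 3x + 2), the tolerances, and the functions observe, hypothesize, predict, and test are all invented stand-ins for the five steps above, not a real research workflow.

    import random

    # Toy "universe": a hidden law the investigator does not know in advance.
    def universe(x):
        return 3 * x + 2

    def observe(n=5):
        """Step 1, Observe: gather some curious data points from the world."""
        xs = [random.uniform(-10, 10) for _ in range(n)]
        return [(x, universe(x)) for x in xs]

    def hypothesize(data):
        """Step 2, Hypothesize: guess a linear law y = a*x + b from two observations."""
        (x1, y1), (x2, y2) = data[0], data[1]
        a = (y2 - y1) / (x2 - x1)
        return a, y1 - a * x1

    def predict(hypothesis, x):
        """Step 3, Predict: state in advance what we expect to see at a new x."""
        a, b = hypothesis
        return a * x + b

    def test(hypothesis, trials=100, tolerance=1e-6):
        """Step 4, Test: compare predictions with new evidence; one miss falsifies."""
        for _ in range(trials):
            x = random.uniform(-10, 10)
            if abs(predict(hypothesis, x) - universe(x)) > tolerance:
                return False
        return True

    law = hypothesize(observe())
    # Step 5, Reproduce: an independent repeat of the test should agree with the first.
    print("hypothesis: y = %.2f*x + %.2f" % law)
    print("survives testing and replication:", test(law) and test(law))

The only point of the sketch is that predictions are checked against new observations the hypothesis had no hand in producing; a hypothesis that forbade nothing could never fail the test and would tell us nothing.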

The testing of hypotheses allows for error correction and the development of better models. One of the notable examples is the development of atomic theory - the theory that describes what atoms "look like." From Dalton's indivisible model, to Thomson's "plum pudding" model, to Rutherford's teeny-tiny nucleus model, and then to the Bohr model and modern quantum physics, the model of the atom developed in steps because each version made predictive statements that could be tested. A theory is thus refined over time as observational evidence accumulates in its support. Evidence supporting a hypothesis implies that the hypothesis is stronger (and more likely to be valid) than before the test; evidence against a hypothesis, on the other hand, renders it invalid, thus falsifying it.[12] It is an inductive method, although its results can be used deductively as well.
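
Footnote [12]'s Bayesian point can be made explicit with a few lines of arithmetic. The numbers below are invented purely for illustration: evidence that a hypothesis strongly predicts (and its rivals do not) nudges the hypothesis's probability upward without ever reaching certainty, while an observation the hypothesis flatly forbids drives it straight to zero, i.e. falsifies it.

    def bayes_update(prior, p_e_given_h, p_e_given_not_h):
        """Posterior probability of hypothesis H after seeing evidence E.

        prior           -- P(H) before the test
        p_e_given_h     -- P(E | H): how strongly H predicts the evidence
        p_e_given_not_h -- P(E | not H): how likely the evidence is anyway
        """
        numerator = p_e_given_h * prior
        return numerator / (numerator + p_e_given_not_h * (1 - prior))

    # Repeated confirming evidence: confidence grows, but never hits 1.0.
    p = 0.5
    for _ in range(3):
        p = bayes_update(p, p_e_given_h=0.9, p_e_given_not_h=0.2)
        print(round(p, 3))   # 0.818, then 0.953, then 0.989

    # Evidence the hypothesis forbids (P(E|H) = 0): the posterior collapses to zero.
    print(bayes_update(p, p_e_given_h=0.0, p_e_given_not_h=0.5))   # 0.0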

All but the first two steps are omitted from the process in pseudosciences such as intelligent design (where step 3 would be impossible) and most borderline-supernatural alternative medicines like homeopathy. Pseudosciences do observe the world, and do come up with explanations, but are often unable or unwilling to follow through by testing them more thoroughly. Refining the hypotheses is also undesirable in pseudoscience, as this could lead to abandoning the central dogma of the belief - imagine where modern technology would be if the scientists of the 20th century had refused to modify their model of the atom as new observational evidence came in. However, because observations and explanations still form a part of pseudoscience and can be phrased in a scientific style, pseudosciences may mistakenly appear to have scientific authority.

Application in different disciplines

In practice, different academic disciplines apply the scientific method in what may at first appear to be different ways, but fundamentally all use strong inference based on falsifiable hypotheses. For example, in fields such as astrophysics, evolution, and geology, experiments can be difficult or impossible because of the scale of space and time involved. We can't set up a controlled experiment involving hundreds of light years, millions of years, or replicates of hundreds of Earths. Instead, we can use mathematical models of planetary behaviour to understand orbital patterns, comparative analysis of the characteristics of fossil and extant organisms to build evolutionary trees, and bore-hole samples to interpret subsurface geology.
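
As an illustration of testing without a controlled experiment, the sketch below checks Kepler's third law, which predicts T² ≈ a³ (orbital period T in years, semi-major axis a in astronomical units), against standard textbook orbital figures. It is only a worked example of the general point, not something the article itself prescribes.

    # Observed semi-major axis (AU) and orbital period (years); textbook values.
    planets = {
        "Mercury": (0.387, 0.241),
        "Venus":   (0.723, 0.615),
        "Earth":   (1.000, 1.000),
        "Mars":    (1.524, 1.881),
        "Jupiter": (5.203, 11.862),
        "Saturn":  (9.537, 29.457),
    }

    # The model's falsifiable prediction: T = a**1.5 (Kepler's third law).
    for name, (a, observed_T) in planets.items():
        predicted_T = a ** 1.5
        error = abs(predicted_T - observed_T) / observed_T
        print(f"{name:8s} predicted {predicted_T:7.3f} yr, "
              f"observed {observed_T:7.3f} yr ({error:.2%} off)")

A single well-measured orbit that strayed far from the predicted relation would count against the model; the fact that none does is what the agreement is worth.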

Role of observation and insight

Observation and insight are a key part of scientific inquiry. For example, in the history of biology, much of the early work involved detailed collection, description, and classification of organisms. The extensive early work documented in museum collections and old tomes, along with personal experience as an exploratory biologist on the HMS Beagle, served as the fodder for Charles Darwin's conception of evolution by natural selection. Similarly, Albert Einstein's theory of relativity was based on a solid understanding of Newtonian physics, along with personal observations of relative movement while gazing out a train window. Observation and insight provide the grist for generating hypotheses and theories, and the full scientific method is necessary for those hypotheses and theories to withstand the test of time.

Skepticism

Scientific skepticism is a vital element in the scientific process, ensuring that no new hypothesis is considered a Theory (capped T) until sufficient evidence is provided and other scientists have had their chances to debunk it. Even then, all of science is always considered a "good working model" and the "best understanding we have at the present time." No scientific idea is ever considered "the final word," nor the Word of God. It is always assumed that someone, somewhere is out to disprove the current theory.

You must not say that this cannot be, or that that is contrary to nature. You do not know what Nature is, or what she can do; and nobody knows; not even Sir Roderick Murchison, or Professor Owen, or Professor Sedgwick, or Professor Huxley, or Mr. Darwin, or Professor Faraday, or Mr. Grove, or any other of the great men whom good boys [and girls] are taught to respect. They are very wise men; and you must listen respectfully to all they say: but even if they should say, which I am sure they never would, "That cannot exist. That is contrary to nature," you must wait a little, and see; for perhaps even they may be wrong.
—Sir Charles Kingsley[13]

Objectivity and bias

The scientific method helps us pursue the ideal of scientific objectivity, protecting against bias that could lead to false conclusions. Bias, in the sense of inclinations or preconceptions, is part of being human, and has a role in scientific inquiry insofar as it guides what questions to ask and how to ask them. At the same time, bias can lead to championing a particular conclusion a priori, independent of the evidence: belief, not necessarily reality. The scientific method explicitly seeks to remove bias through rigorous hypothesis testing and the reproduction of results. Bias can enter in many different ways, including the initial framing of an inquiry, the time scale examined, and innate properties of the system being examined. For example, a pharmaceutical compound may be approved as safe because it appears safe and effective in short-term studies, while it may later be shown to be ineffective or unsafe in long-term studies. In essence, the scientific method serves as a tool to keep bias in check.
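
The time-scale point can be illustrated with a small simulation. Every number below is invented and describes no real drug or trial; it only shows how a follow-up window chosen too short can make a harmful compound look safe.

    import random

    random.seed(42)

    # Hypothetical compound: 10% of patients eventually develop an adverse
    # effect, but only one to five years after starting treatment.
    def adverse_event_month():
        """Month at which one patient's adverse effect appears, or None."""
        if random.random() < 0.10:
            return random.uniform(12, 60)
        return None

    patients = [adverse_event_month() for _ in range(10_000)]

    def observed_rate(follow_up_months):
        hits = sum(1 for t in patients if t is not None and t <= follow_up_months)
        return hits / len(patients)

    print(f"6-month trial: {observed_rate(6):.2%} adverse events")    # looks safe
    print(f"5-year study:  {observed_rate(60):.2%} adverse events")   # it wasn't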

Philosophical perspectives

The philosophy of science dates back to the Greeks, but it began to take its modern form during the scientific revolution. Two competing schools of thought emerged at this point: the rationalist tradition associated with René Descartes and the empiricist tradition of Francis Bacon. During the 18th century, David Hume philosophically undermined the scientific method with his problem of induction[14] and his deconstruction of causation.[15]

A synthesis of rationalism and empiricism arose in the 18th century with the work of Immanuel Kant[16] and continued in the 19th century among pragmatist philosophers such as Charles Sanders Peirce.[17] During the 20th century, the logical positivists attempted to do away with pesky metaphysics and a number of other branches of philosophy altogether. The enterprise failed when it was noticed that the verification principle on which logical positivism was built was self-refuting. Karl Popper (1902-1994) replaced verifiability with falsifiability: for an idea to be popperly "scientific", it must be possible to devise an experiment (even a thought experiment) that could render it false. Popper intended falsification both as a solution to the demarcation problem and as a workaround for Hume's problem of induction.[18]

Thomas Kuhn took a more historical approach to thinking about science, aiming to get a better picture of how science was practiced in reality. He described the dynamics of scientific change, coining the terms scientific revolution and paradigm shift to help describe how what he saw as a fundamentally conservative set of ideas could be overturned and become a new, different set of conservative ideas. Kuhn rejected the idea that there was only one scientific method. This influenced the practitioners of what would become the sociology of science, as well as other philosophers such as Imre Lakatos, who conceived of science as split into numerous paradigms he called "research programmes", each making use of its own methodology and assumptions. (Summary: Humans remain humans and don't naturally think in a scientific manner, but have to learn it, and easily backslide.)

Other schools of "scientific criticism" look at science critically from an economic perspective, or focus on discourse, but these are more academic and less practical critiques.

Unintentional short-circuiting of the scientific method

In order to look for "data" you need to have a model or "structure" of how the world works. The problem, as James Burke pointed out in the "Worlds Without End" episode of The Day the Universe Changed, is that this structure can drive every part of your research, even what you accept as reliable data.

This possibility of the structure driving the data, rather than the data driving the structure, had been hammered home in anthropological circles back in 1956 with Horace Miner's bitingly satirical "Body Ritual among the Nacirema."[19] Often referenced as a satirical look at American culture, it was also a look at the anthropological work of the day and its "look at these poor primitives who believe in magic, whom we are so much wiser than" attitude, so common in the professional publications of the time. Miner showed that, with that model, any culture (even that of the then-modern 1950s United States) could be dismissed as a bunch of magic-using savages.

In "Worlds Without End" Burke points out one of the reasons the Piltdown hoax lasted as long at it did was it fitted the then prevalent structure of finding a human like skull with an ape-like face. In fact, in 1913, David Waterston of King's College London stated in Nature that the find and an ape mandible and human skull[20] and French paleontologist Marcellin Boule said the same thing in 1915. In 1923 Franz Weidenreich stated after careful examination that the Piltdown find was a modern human cranium and an orangutan jaw with filed-down teeth[21] but because Piltdown fit the structure so well other scientists let the model drive their thinking rather than the evidence itself.

Extra Credits points out in God Does Not Play Dice - The Danger of Unquestioned Belief that you have to have a series of postulates to even begin to formulate anything, but that if you hold on to those postulates as if they were fact, they can and will blind you to acknowledging that the system being used may be flawed.


A related problem is that more information makes one more confident in the theory one has formulated, but that confidence does not correlate with how accurate the theory actually is.[22]

Cheating the scientific method

Pseudoscientists have discovered an obvious way to 'cheat' the scientific method. It goes like this:

  1. Pick a personal belief that you already 'know' is true, but for which you want 'proof'.
  2. Perform some related observations or experiments, and note the results.
  3. Generate a hypothesis that shoehorns said results into your personal belief.
  4. Falsely claim that your personal belief predicts the particular results, and that the observations/experiment confirmed your suspicions.

This is a blatant perversion of the scientific method, but to someone not versed in science, fallacies, or psychology, it might seem similar enough to be accepted as legitimate.
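
The gap between honest prediction and post-hoc shoehorning can be shown with a small simulation. In the sketch below (invented data, no real study or belief system), the measurements are pure noise: a prediction fixed before looking usually fails, while rummaging through the results afterwards always turns up something that can be dressed up as confirmation.

    import random

    random.seed(0)

    # Pure noise: twenty "measurements" with no real effect behind them.
    measurements = {f"variable_{i}": random.gauss(0, 1) for i in range(20)}

    # Honest science: one specific, falsifiable prediction made before looking.
    name, threshold = "variable_7", 2.0          # "variable_7 will exceed 2.0"
    print("Pre-registered prediction held:", measurements[name] > threshold)

    # The cheat: look at the results first, then 'predict' whatever already happened.
    best = max(measurements, key=lambda k: abs(measurements[k]))
    print(f"Post-hoc 'discovery': {best} = {measurements[best]:.2f}"
          " (which the belief, of course, 'predicted' all along)")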

This manner of cheating has been used by proponents of intelligent design. Note that this isn't limited to pseudoscientists such as those trying to grant legitimacy to intelligent design, but is a mistake frequently made even by "proper" scientists, if they focus too much on finding evidence that supports their hypothesis (their "belief"), instead of focusing on attempting to find evidence that would refute it, or on attempting to find evidence that would refute competing hypotheses.

See also

For those of you in the mood, RationalWiki has a fun article about Pseudoscientific method.

External links

References

  1. Science and Democracy, Carl Sagan
  2. Falagas et al., Arab science in the golden age (750–1258 C.E.) and today, The FASEB Journal
  3. El-Bizri, Nader, "A Philosophical Perspective on Ibn al-Haytham's Optics", Arabic Sciences and Philosophy 15 (2005-08-05), 189–218
  4. Malik, Kenan (2010-10-22). "Pathfinders: The Golden Age of Arabic Science, By Jim Al-Khalili". The Independent. Retrieved 2014-10-22.
  5. Haq, Syed (2009). "Science in Islam". Oxford Dictionary of the Middle Ages. ISSN 1703-7603. Retrieved 2014-10-22.
  6. Sabra, A. I. (1989). The Optics of Ibn al-Haytham. Books I–II–III: On Direct Vision. London: The Warburg Institute, University of London. pp. 25–29. ISBN 0-85481-072-2.
  7. Spier, Ray (2002). "The history of the peer-review process". Trends in Biotechnology 20 (8): 357–8. doi:10.1016/S0167-7799(02)01985-6. PMID 12127284.
  8. Arabic science went into a steady decline as the Mongol invasion ravaged the Islamic world in the 13th century. [1]
  9. The Simpsons Wiki: Bart's Comet
  10. Robinson, W. R. "The Inquiry Wheel, An Alternative to the Scientific Method." J. Chem. Ed. 2004(81): 791-2.
  11. As in, not the same one from the "observe" stage.
  12. That the scientific method works this way is implied by Bayesian mathematics.
  13. The Water Babies, Sir Charles Kingsley, 1862.
  14. The Problem of Induction: Hume, Induction, and Justification, Stanford Encyclopedia of Philosophy
  15. David Hume: Causation, Internet Encyclopedia of Philosophy
  16. Immanuel Kant, Stanford Encyclopedia of Philosophy
  17. Charles Sanders Peirce: Pragmatism, Pragmaticism, and the Scientific Method, Stanford Encyclopedia of Philosophy
  18. Interestingly, both the positivists and Popper acknowledged that their principles could not live up to their own standards (i.e., the principle of verification was itself unverifiable and falsification was unfalsifiable), but they did not consider this to be a problem.
  19. Miner, Horace (1956). Body Ritual among the Nacirema. American Anthropologist 58:3, June 1956.
  20. Gould, Stephen J. (1980). The Panda's Thumb. W. W. Norton and Co., pp. 108–124, ISBN 0-393-01380-4
  21. MacRitchie, Finlay (2011). Scientific Research as a Career. CRC Press. p. 30. ISBN 1439869650.
  22. "Fazio LK1, Brashier NM2, Payne BK3, Marsh EJ2. Knowledge Does Not Protect Against Illusory Truth" Journal of Experimental Psychology: General © 2015 American Psychological Association 2015, Vol. 144, No. 5, 993–1002