Reproducibility

Reproducibility is a key part of the scientific method: it refers to the ability of a third party to reproduce an experimental result by following the same procedures as the original experimenter.

Having the experiment repeated by an independent investigator reduces the impact of a single study's experimental error and increases the chances of the hypothesis being accepted by the scientific community. Of course, reproducibility alone is not enough, since even random numbers can "reproduce" prior results.[1]
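
As a back-of-the-envelope illustration of why independent repetition helps, the following Python sketch (with made-up numbers: a true effect of 0.5 and per-study noise of 1.0, not figures from any cited study) pools k independent studies of the same effect; the spread of the pooled estimate shrinks roughly as 1/sqrt(k).

    # Illustrative simulation (all numbers are assumptions, not from any cited study):
    # averaging k independent studies of the same effect shrinks the error ~ 1/sqrt(k).
    import numpy as np

    rng = np.random.default_rng(0)
    true_effect, noise_sd, n_sims = 0.5, 1.0, 10_000

    for k in (1, 4, 16):
        # each simulated "literature" is k independent studies of the same true effect
        estimates = rng.normal(true_effect, noise_sd, size=(n_sims, k)).mean(axis=1)
        print(f"{k:2d} studies: spread of pooled estimate = {estimates.std():.3f}")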

A lack of reproducibility in its experiments is one of the signs of pseudoscience. Legitimate scientific experiments can also fail to reproduce,[2] but this just indicates that the original experiment was flawed, that a better understanding of the problem is needed, or that the phenomenon by its nature occurs only rarely.[3]

Replication crisis

The replication crisis refers to the fact that many scientific experiments fail to replicate. Replication attempts are part of the philosophy of good science; however, in the contemporary academic community there are powerful disincentives against them. Academic journals are generally not interested in publishing replication attempts, preferring novel experiments with significant results, and publications are what matter for a scientist's career.

A failure to replicate an experiment does not necessarily mean that there was anything wrong with the original experiment or that questionable research practices took place; the effect found in the original experiment may be real, and the failed replication may be due to chance or to differences in methodology. Obtaining the same or similar results using a different methodology does, however, increase confidence in the results.
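
Chance failures are easier to appreciate with a concrete, entirely hypothetical example: the sketch below assumes a real effect of 0.5 standard deviations and 30 participants per group (numbers chosen for illustration, not taken from any cited study) and counts how often a faithful replication still misses the conventional p < 0.05 threshold.

    # Hedged sketch: even when an effect is real, an exact replication can "fail"
    # by chance if the sample is small. Parameters below are illustrative assumptions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    true_effect, n_per_group, n_replications = 0.5, 30, 5_000

    failures = 0
    for _ in range(n_replications):
        control = rng.normal(0.0, 1.0, n_per_group)
        treated = rng.normal(true_effect, 1.0, n_per_group)
        # a faithful replication that nonetheless misses p < .05
        if stats.ttest_ind(treated, control).pvalue >= 0.05:
            failures += 1

    print(f"Real effect, yet {failures / n_replications:.0%} of replications 'fail'")

With these assumed numbers, roughly half of the simulated replications "fail" even though the effect is real, which is why a single non-replication is weak evidence on its own.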

The replication crisis is most pronounced in the field of psychology.[4] A large-scale replication project, published in 2015, attempted to reproduce psychology experiments originally published in 2008. Among social psychology experiments, only 14 out of 55 (25%) replicated; among cognitive psychology experiments, only 21 out of 41 (50%) replicated.[5][6] However, the replication crisis has also emerged in the biomedical field. The drug company Bayer could replicate only 20% to 25% of 67 peer-reviewed studies in oncology, women's health, and cardiovascular disease performed between 2007 and 2011.[7] Another drug company, Amgen, could replicate only 6 out of 53 "landmark" studies in cancer research.[8]

Notably, though, the most famous "failed replication" study seems to have replicability issues of its own: its failures to replicate a given study tended to coincide with methodology that diverged from the original.[9]

References

  1. Brown, N. J. L., MacDonald, D. A., Samanta, M. P., Friedman, H. L., & Coyne, J. C. (2014). A critical reanalysis of the relationship between genomics and well-being. Proceedings of the National Academy of Sciences, 111(35), 12705-12709.
  2. Is redoing scientific research the best way to find truth? During replication attempts, too many studies fail to pass muster
  3. For example, the claimed detections of magnetic monopoles.
  4. Closed Thinking: Without scientific competition and open debate, much psychology research goes nowhere (Science News, May 16, 2013)
  5. Psychology results evaporate upon further review: Surprising reports, findings with marginal statistical significance least likely to be reproduced, study concludes, by Bruce Bower, Science News, August 27, 2015.
  6. Nosek, B. A., et al. (2015). Estimating the reproducibility of psychological science. Science, 349(6251). DOI: 10.1126/science.aac4716.
  7. Prinz, F., Schlange, T., & Asadullah, K. (2011). Believe it or not: How much can we rely on published data on potential drug targets? Nature Reviews Drug Discovery, 10(9), 712. DOI:10.1038/nrd3439-c1
  8. Begley, C. G., & Ellis, L. M. (2012). Drug development: Raise standards for preclinical cancer research. Nature, 483, 531–533. DOI:10.1038/483531a
  9. Study that undercut psych research got it wrong, Harvard Gazette
