Summary: Meta-research aims to identify and address problems that cut across scientific disciplines (physics, biology, economics, and so on) – for example, by improving common statistical methods. Surprisingly, our analysis shows that even many hard sciences, such as climate physics, macroeconomics, and cancer biology, have reproducibility problems of the kind meta-research addresses. By improving many fields of science at once, meta-research is a potentially high-value funding opportunity. Thus, donors might want to focus on indirect “superficial” contributions with wide applicability rather than relatively “profound” contributions limited to one narrow field, because funding meta-research can be much more effective than funding a single science, “just as a lake can contain a lot more water than a well, even if the well is deeper”[1]. Or as John Ioannidis puts it: “Even 1% improvement in the yield and translation of useful discoveries effected through better research practices reflects value equivalent of many Nobel or Breakthrough prizes.”[2]


Table of contents

What is meta-research?

The scale of the problem: waste in research

Case studies where better research could have prevented deaths

Neglectedness / Crowdedness

Tractability

Arguments against investing in meta-research

“Little relevance to hard sciences” argument

Does meta-research contribute to differential technological development?

Scientific fields with reproducibility problems

References


What is meta-research?

Meta-research, sometimes called ‘meta-science’, has been formally defined as:

“an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science)”.[3]

For instance, many research findings are based on flawed statistics and cannot be replicated – a phenomenon often referred to as the “replication crisis”[4]. This extends far beyond the field of psychological science (which is sometimes singled out) to many ‘harder’ sciences. Surprisingly, our literature review shows that reproducibility concerns have been voiced in virtually all main branches of the empirical sciences, including the mathematical sciences, computer science (e.g. signal processing[5]), physics (e.g. climate science), chemistry, biology (e.g. cancer biology), economics (e.g. research on the value of a statistical life used for budget prioritization at government agencies), and other social sciences (e.g. health psychology). We give a more comprehensive overview of which fields are affected, with more examples, below. The reproducibility crisis leads some researchers to conclude that a lot of research, and spending on research, is wasted (more below).

Meta-research seeks to address issues such as these. Table 1 shows the different research areas within meta-research and the specific topics investigated in each.

Superficial innovations with very wide applicability might be systematically undervalued compared to profound innovations in a narrow field. Advances in tools (e.g. basic statistics and scientific instruments, such as the microscope or the computer) and institutional innovations such as the peer-reviewed journal have wide applications and have been very useful across the board[6]. There might be a bias here: ‘breakthrough’ contributions in narrow fields (e.g. drug discovery) are often funded very heavily, while widely applicable contributions are not.

Table 1 taken from [7]. “Major themes covered by meta-research.”

For a more detailed introduction on meta-research see [8].

The scale of the problem: waste in research

In 2010, global expenditure on health R&D – i.e. life sciences (mostly biomedical) research by both the public sector and industry[9] – was about $240 billion[10]. As much as 85%, or roughly $200 billion, of this investment might be wasted because of poor research practices, such as asking the wrong questions, using poor methodology, reporting poorly, not publishing the results[11], or even questionable research practices (see Figure 3). For instance, just 22% of taxpayer-funded trials comply with the mandatory reporting requirements of the government registry of clinical studies, ClinicalTrials.gov[12], as cited in [13]. Figures 1a and 1b below show the different stages of research waste. At each stage of the research process, a substantial share of studies drops out of the race for reproducibility due to poor quality. This adds up: the majority of studies are of poor quality, and only very few ‘survive’ and make it to the finish line of good, replicable research.[14]

Figure 1a: Waste in biomedical research, taken from researchwaste.net (http://researchwaste.net/wp-content/uploads/2014/03/Waste-at-four-stages-of-research.png), based on [15].

Figure 1b: Stages in research production that lead to waste. The dashed box represents an addition by the National Institute for Health Research to the 2009 model shown in Figure 1a.
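To make the arithmetic behind the “>85%” figure concrete, here is a minimal sketch of the cumulative filter model, assuming – as the Lancet model quoted in footnote [14] does – a roughly 50% loss at each of the three stages after the first. The dollar figures are our own illustrative back-of-envelope numbers, not precise estimates:

```python
# Minimal sketch of the cumulative "filter" model of research waste,
# assuming (as in the Lancet model quoted in footnote [14]) a roughly
# 50% loss at each of the three stages after the first.
annual_spending = 240e9    # global health R&D spending in USD (2010 estimate)
survival_per_stage = 0.5   # assumed fraction of studies passing each filter
stages = 3                 # stages 2-4 of the four-stage model

surviving_fraction = survival_per_stage ** stages   # 0.5^3 = 0.125
wasted_fraction = 1 - surviving_fraction            # 0.875, i.e. >85%
implied_waste = annual_spending * wasted_fraction   # roughly $210 billion

print(f"Surviving fraction: {surviving_fraction:.1%}")   # 12.5%
print(f"Wasted fraction:    {wasted_fraction:.1%}")      # 87.5%
print(f"Implied waste:      ${implied_waste / 1e9:.0f} billion per year")
```

Because the losses compound multiplicatively, even moderate per-stage attrition yields a large cumulative loss, which is why the model arrives at a figure above 85%.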

Another, more recent study similarly concludes that more than half of all preclinical research cannot be reproduced, at a cost of $28 billion per year in the US alone[16].

There are critics of this model: one recent paper “summarized the empirical evidence that most published research is not reproducible, not replicable, or false” and found that “the most extreme opinions about these issues likely overstate the rate of problems in the scientific literature.”[17]

We find the filter model above somewhat plausible and in keeping with our subjective impression of the research process: a great number of studies are not very valuable because of errors that are systemic across many scientific fields. Thus, the figure of $200 billion a year wasted on poor research is plausible to us on its face, but it should not be taken literally; it is better seen as a rough upper bound on how much research funding is wasted. One reason it might be somewhat pessimistic is that studies which are filtered out can still have value. For instance, imagine a study of a cancer drug shows very strong effect sizes, such that everyone in a large treatment group was completely cured of cancer whereas no one in the control group was. Even if this study was not properly blinded, or was never interpreted in the context of a systematic review, the research might still be useful and not wasted. The filter model might unfairly categorize such a study as ‘research waste’. However, this does not mean that the filter model is very far off: studies show that about half of all clinical trials remain unpublished[18] and thus do not have much value (they might retain some value because they give junior researchers experience in running experiments, and their outcomes, though not published, might be disseminated through informal networks).

In the end, it is plausible that resources spent on research have a much higher social impact than resources spent on, say, entertainment or leisure, so research dollars should probably be valued more highly than public expenditure in those areas. Making research better is therefore a very important problem. If research has created a lot of social impact even though substantial portions of it have been flawed (i.e. the money spent on those portions was wasted), then it seems plausible that significantly increasing the proportion of good research would be very valuable indeed. Or as Ioannidis puts it: “Even 1% improvement in the yield and translation of useful discoveries effected through better research practices reflects value equivalent of many Nobel or Breakthrough prizes.”[19]
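To put that quote in rough quantitative terms, here is a back-of-envelope sketch using the $240 billion annual health R&D figure from above. This is our own illustration, not a calculation from [19]:

```python
# Back-of-envelope value of a 1% improvement in research yield,
# using the ~$240 billion/year global health R&D figure cited above.
annual_health_rd = 240e9   # USD per year (2010 estimate)
improvement = 0.01         # a 1% improvement in useful research output

value_per_year = annual_health_rd * improvement
print(f"Implied value: ~${value_per_year / 1e9:.1f} billion per year")  # ~$2.4bn
```

Even on this crude reckoning, a 1% improvement in the yield of health research alone would be worth billions of dollars every year.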

Case studies where better research could have prevented deaths

There are some concrete case studies of how bad research has indirectly cost lives. For instance, one study[20] finds that if researchers had conducted better systematic reviews, some beneficial and harmful effects of treatments could have been identified earlier: dangerous antiarrhythmic drugs could have been taken off the market years sooner if a proper systematic review had been conducted, and this might have saved between 20,000 and 75,000 lives per year in the United States[21],[22] as cited in [23]. It is conceivable that there are other, similar cases in which drugs did actually work but never found their way onto the market because of bad systematic review practices.

There have been similar concerns outside of clinical trials, in the context of systematic review in epidemiology. For instance, the evidence that babies lying on their front is a cause of sudden infant death syndrome could have been recognized at least ten years earlier with the help of proper systematic review. This might have saved tens of thousands of lives; for similar reasons, the dangers of secondhand smoke could have been acted on much earlier, saving lives[24].

In the effective charity space, people often donate to deworming charities. Some argue that the evidence for the effectiveness of mass-deworming programmes rests primarily on the results of a single trial[25], which was not properly preregistered. We think that if the trial had been preregistered, the evidence for deworming would be substantially stronger.

Neglectedness / Crowdedness

Is meta-research neglected? Which scientific field does it belong to, and does it already receive considerable funds? Table 2 shows the distribution of funds from both federal and philanthropic sources among the different research disciplines. The life sciences, medicine, and translational research generally receive the majority of funding from both government and philanthropists. Meta-research could be categorized as fundamental, transdisciplinary research and would thus not fall into the categories that receive the most funds[26]. This suggests that meta-research is neglected; by contrast, given the sizable philanthropic funding for biomedical science, that field might have hit steeply diminishing marginal returns on further investment.

Philanthropists should perhaps converge toward funding more fundamental science, in line with the government, because they generally have less expertise in prioritizing funding than the scientists who review grant proposals. They might also be biased toward funding more proximal welfare improvements (e.g. funding a new vaccine, but not the basic, fundamental biological research that makes better vaccines possible)[27]. Marginal donor dollars should thus probably fund more fundamental science, which seems to be more neglected.

Table 2 taken from [28]: “Comparison of Federal funding obligations to academia by research field (2008) to Major Philanthropic gifts (>$10M) by field (2005-2011 average) for the period 1999-2009 in US$ millions (taken from the NSF Science and engineering Statistics 2012)”

Figure 2 taken from [29]: “Major Individual gifts to science philanthropy by subject in Millions $ (adapted from Chronicle of Philanthropy data). The subject analysis has only been completed for science philanthropy gifts over $10M. These amount to $19BN in individual gifts for the period 2005-2011 of which $10B are categorized as science philanthropy. NOTE: the category “interdisciplinary” is for gifts to support research across the entire campus.”

Table 3 taken from [30]: “Federal obligations for research, by broad field of science and engineering and agency in rank order: FY 2014 (in Millions of current dollars) adapted from:

DOD = Department of Defense; DOE = Department of Energy; HHS = Department of Health and Human Services; NASA = National Aeronautics and Space Administration; NSF = National Science Foundation; nec = not elsewhere classified; USDA = Department of Agriculture.  "Other sciences nec" is used for multidisciplinary or interdisciplinary projects that cannot be classified within one of the broad fields of science.”

Apart from taking the top-down, bird’s-eye view of the funding landscape, we also did a bottom-up search for individual funders who fund meta-research explicitly. These include the Laura and John Arnold Foundation[31],[34], the Meta-Research Innovation Center at Stanford (METRICS)[32], the Center for Open Science[33], the Open Philanthropy Project[35],[42], the Andrew W. Mellon Foundation[36],[37], the Gordon and Betty Moore Foundation[38],[39], the Berkeley Institute for Data Science[40], the Office of Research Integrity[41], GiveWell[43], the Chan Zuckerberg Initiative[44], and projects such as Semantic Scholar[45].

Meta-research might soon become more crowded as a cause area. For instance, the replication crisis has entered the public consciousness, with many mainstream media articles and even late-night television reporting on it.[46],[47],[48] It is generally acknowledged that the replication crisis is real and problematic, that (partial) solutions to it exist, and that science can and should be improved. In contrast, research on other promising topics, such as cognitive enhancement, anti-aging research, or the safety of artificial intelligence, might be more neglected and less well funded because it is perceived as ‘weird’ or unnatural.

Another argument against the neglectedness of meta-research is that it is easy to understand and thus perhaps more readily funded. By its very nature, making grants in meta-research does not require subject-matter expertise in any one scientific field (as opposed to, say, making good grants in computational neurochemistry), and many donors seem to focus more heavily on easy-to-understand research[49]. However, contrary to this hypothesis, interdisciplinary research, of which meta-research is a part, has been shown to receive consistently lower funding[50].

Tractability

Stakeholders seem to agree that there are systemic problems in science and that (partial) solutions to these problems exist. Meta-research therefore appears to be a tractable area.

However, there are arguments that improving science through meta-research is hard and that the area is not very tractable. Special interest groups such as publishing houses are profit-driven and might resist changes to existing practices that would improve research. Institutional inertia within the academy might also work against attempts to improve science.

Arguments against investing in meta-research

“Little relevance to hard sciences” argument

One argument against funding more meta-research is that it will mostly help softer sciences such as psychology as opposed to harder sciences such as physics, which might be seen to have fewer systemic problems.

For example, the experiments run on the Large Hadron Collider seem to be very well blinded[51], and significance levels appear to be much more stringent in high-energy physics[52], which uses a “five-sigma” significance threshold, at which the probability of the results occurring by chance alone is about 1 in 3.5 million. In contrast, many social sciences accept a much less stringent level of statistical significance (‘p < 0.05’), at which the probability of the results occurring by chance is 1 in 20. And studies show that the softer social sciences do in fact report more positive results than harder sciences, with the life sciences in the middle[53], suggesting they are less methodologically rigorous.
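To illustrate the size of this gap, here is a minimal sketch comparing the two thresholds, assuming the one-sided Gaussian tail convention used in particle physics:

```python
# Compare the one-sided Gaussian tail probabilities behind the
# "five-sigma" standard in high-energy physics and the p < 0.05
# convention common in the social sciences.
from scipy.stats import norm

p_five_sigma = norm.sf(5)  # P(Z > 5), approximately 2.87e-7
p_convention = 0.05        # conventional significance threshold

print(f"5-sigma:  p ~ {p_five_sigma:.2e} (about 1 in {1 / p_five_sigma:,.0f})")
print(f"p < 0.05: 1 in {1 / p_convention:.0f}")
print(f"The 5-sigma threshold is ~{p_convention / p_five_sigma:,.0f}x stricter.")
```

Running this reproduces the “1 in 3.5 million” figure and shows that the particle-physics standard is over five orders of magnitude stricter than the conventional social-science threshold.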

A similar argument has been brought up by the Open Philanthropy Project:

“The key obstacles to biomedical progress are related to reproducibility and reliability of studies.” I think that reproducibility is important, and potentially relevant to most types of research, but it is most core to clinical trials (F). Studies on humans are generally expensive and long-running, and so they may affect policy and practice for decades without ever being replicated. By contrast, for many other kinds of research, there is some cheap effective “replication” - or re-testing of the basic claims - via researchers trying to build on insights in their own lab work, so a non-reproducible study might in many cases mean a relatively modest waste of resources. I’ve heard varying opinions on how much waste is created by reproducibility-related issues in early-stage research, and think it is possible that this issue is a major one, but it is far from clear that it is the key issue.[54]

In economics, it has been argued that since most articles are ignored and never cited, it is not important that they are wrong and never get replicated[55]. (It turns out that ‘most papers never get cited’ is a bit of a myth and the issue is more complicated, with one analysis suggesting that only about 21% of all papers are never cited.)[56] In contrast, important and heavily cited papers (e.g. in labor economics) are replicated conceptually, and their data are increasingly deposited in open-access repositories. Moreover, exact replication of a particular study is not as important as testing the underlying fundamental theory in a different context. If the underlying theory turns out to be wrong, then important mistakes do get caught, because theories are tested on different data.[57]

However, this argument has been criticized in the context of biology, and we believe the criticism also generalizes to other fields such as economics:

“There is actually a strong disincentive toward reproducing something in the very same way as originally done. There is pressure to have something new to say, to do things differently. This leads to conceptual rather than exact replication. Then, when the pieces of evidence on the same theme are substantially different, the successful scientist weaves a common story, a consistent narrative. Triangulation has thus become the art of building biological fairy tales. I do acknowledge that conceptual replication and triangulation can be useful in some situations. However, they have a major drawback: almost anything can fit into a triangulation narrative by invoking some speculative “biological plausibility” as the connecting glue.

It is likely that most of these conceptual and triangulation links are overstated leaps of faith. Otherwise, it is very difficult to explain why we have so many successful narratives in the basic sciences, but very few of these proposed discoveries eventually work in humans. Moreover, a published conceptual replication with a different design and/or experimental conditions does not say how many laboratories have tried and how many different designs and experimental conditions failed and remain unpublished. In the current environment where there is florid selective reporting and chasing of significant results, it is unknown whether 0, 3, 10, or 100 experiments and variants have been tried and failed for each successful publication. The approach of the reproducibility project to preregister the replication effort is thus essential. There is a preregistered detailed protocol and even a preregistered report. Essentially, one writes the paper (without the exact numbers) before running the experiments. This approach safeguards that selective reporting bias will not be an issue.” [58]

Further, we believe there is good evidence that meta-research generalizes further than the above arguments suggest. The problems with reproducibility that meta-research addresses do not seem to be limited to softer social sciences such as psychology. In a survey of 1,500 scientists conducted by the journal Nature, a surprisingly high number of scientists from harder sciences such as physics and chemistry report having failed to reproduce others’ experiments in their own fields[59] (see Figure 3).

Figure 3 taken from [60]: Survey on reproducibility conducted by the journal Nature

A systematic review suggests that while the replication crisis in biomedical research has received most of the attention, concerns have touched almost every field in the biological and social sciences and beyond.[61] 

Below, we review the literature on which scientific fields have reported worries about reproducibility that meta-research might be able to address.

We find that even though the softer sciences seem to be particularly affected by methodological concerns, scientific fields ranging from hard to soft worry about poor research methodology. Some problems even seem to be getting worse, with data showing that negative results are disappearing from most disciplines (though the trend is stronger in the social sciences and some biomedical disciplines).[62]

Finally, even if we grant that harder sciences such as physics are not as affected by methodological problems, most funding goes to the relatively soft biomedical sciences and not to harder sciences, so meta-research might still have a high social return by improving the majority of science funding. If we accept that funding meta-research improves much of biomedical science by a non-trivial amount, and is thus better than funding biomedical science directly, then the only way to fund science more effectively than by funding meta-research would be to fund harder sciences such as physics directly, since they benefit less from meta-research. For this to be the case, physics would need to have a very high social return on investment – which is not inconceivable (e.g. breakthroughs in creating clean energy would be highly valuable). However, some have argued that investment in physics research does not actually have high social returns either.[63] Similarly, there has been some recent skepticism about whether breakthrough fundamental ideas in biomedical science, such as genetics, really have a higher social return on investment than public health interventions.[64],[65]

Does meta-research contribute to differential technological development?

Finally, some people have argued that we need differential technological development rather than accelerating (scientific) innovation across the board[66]. This is because much research is so-called dual-use: it can improve well-being, but at the same time carries substantial risks. An example is nuclear physics, whose advances provide low-carbon energy but also enabled the creation of nuclear weapons. Similar concerns are now emerging in the life sciences.[67] One might worry that funding meta-research indiscriminately speeds up all research, including research that carries a lot of risk. However, for the reasons above, we believe that meta-research predominantly improves social science and applied clinical science (“p-value science”), and so has a strong differential technological development element that hopefully makes society wiser before further risks from technology emerge through innovation. That said, there are some reproducibility concerns in harder sciences, such as basic biological research and high-energy physics, which meta-research might also speed up, thereby carrying some risk from emerging technologies[68].

Scientific fields with reproducibility problems

Reproducibility issues that meta-research might be able to address have been voiced in the following fields:

Table 4 taken from [107]: “Examples of Some Reported Reproducibility Concerns in Preclinical Studies. ALS indicates amyotrophic lateral sclerosis; MIAME, minimum information about a microarray experiment; NGS, next-generation sequencing; and VPA, valproic acid (model of autism).”

Table 5 taken from [108]: “Additional Basic Science Fields Where Concerns Regarding Reproducibility Have Been Raised”

References


[1] "Three Ways to Advance Science - Nick Bostrom." https://www.nickbostrom.com/views/science.pdf. Accessed 19 Sep. 2018.

[2] "Meta-research: Why research on research matters - PLOS." 13 Mar. 2018, https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2005468. Accessed 24 Oct. 2018.

[3] "Meta-research: Evaluation and ...." https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002264. Accessed 1 Oct. 2018.

[4] "Replication crisis - Wikipedia." https://en.wikipedia.org/wiki/Replication_crisis. Accessed 19 Sep. 2018.

[5] "Reproducible research in signal processing - IEEE Xplore Document." http://ieeexplore.ieee.org/document/4815541. Accessed 16 Jan. 2017.

[6] "Three Ways to Advance Science - Nick Bostrom." https://www.nickbostrom.com/views/science.pdf. Accessed 19 Sep. 2018.

[7] "Meta-research: Evaluation and ...." https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002264. Accessed 1 Oct. 2018.

[8] "A manifesto for reproducible science - Nature." 10 Jan. 2017, https://www.nature.com/articles/s41562-016-0021. Accessed 1 Oct. 2018.

[9] “Of the US$214 billion invested in high-income countries, 60% of health R&D investments came from the business sector, 30% from the public sector, and about 10% from other sources (including private non-profit organisations)”: "Mapping of available health research and development ... - The Lancet." 20 May. 2013, http://www.thelancet.com/journals/lancet/article/PIIS0140-6736(13)61046-6/abstract. Accessed 20 Jan. 2017.

[10] "Mapping of available health research and development data: what's ...." 20 May. 2013, https://www.ncbi.nlm.nih.gov/pubmed/23697824. Accessed 1 Oct. 2018.

[11] "Biomedical research: increasing value, reducing waste - The Lancet." https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(13)62329-6/abstract. Accessed 1 Oct. 2018.

[12] "Compliance with mandatory reporting of clinical trial results on ...." 3 Jan. 2012, https://www.bmj.com/content/344/bmj.d7373. Accessed 1 Oct. 2018.

[13] "1 Transparency for Clinical Trials and NIH-Sponsored Research by ...." 3 Mar. 2015, https://www.arnoldfoundation.org/wp-content/uploads/2015/05/Transparency-for-Clinical-Trials-and-NIH-Sponsored-Research1.pdf. Accessed 1 Oct. 2018.

[14] “Since research must pass through all four stages shown in the figure, the waste is cumulative. If the losses estimated in the figure apply more generally, then the roughly 50% loss at stages 2, 3, and 4 would lead to a greater than 85% loss, which implies that the dividends from tens of billions of dollars of investment in research are lost every year because of correctable problems. Although we have mainly used evidence about the design and reporting of clinical trials, we believe it is reasonable to assume that the problems also apply to other types of research.” "Avoidable waste in the production and reporting of research evidence ...." 15 Jun. 2009. http://www.thelancet.com/journals/lancet/article/PIIS0140673609603299/fulltext?rss=yes Accessed 20 Jan. 2017.

[15] "Avoidable waste in the production and reporting of research evidence ...." 15 Jun. 2009, https://www.thelancet.com/journals/lancet/article/PIIS0140673609603299/fulltext. Accessed 1 Oct. 2018.

[16] "The Economics of Reproducibility in Preclinical Research - PLOS." 9 Jun. 2015, https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002165. Accessed 17 Oct. 2018.

[17] "Is Most Published Research Really False? - Annual Reviews." https://www.annualreviews.org/doi/abs/10.1146/annurev-statistics-060116-054104. Accessed 25 Oct. 2018.

[18] "Timing and completeness of trial results posted at Clinical Trials ... - PloS." 3 Dec. 2013, http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1001566. Accessed 20 Jan. 2017.

[19] "Meta-research: Why research on research matters - PLOS." 13 Mar. 2018, https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2005468. Accessed 24 Oct. 2018.

[20] "How to increase value and reduce waste when research priorities are ...." 8 Jan. 2014, https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(13)62229-1/abstract. Accessed 1 Oct. 2018.

[21] "Registering Clinical Trials | JAMA | JAMA Network." https://jamanetwork.com/journals/jama/fullarticle/197661. Accessed 1 Oct. 2018.

[22] "Deadly Medicine: Why Tens of Thousands of Heart Patients Died in ...." https://www.amazon.com/Deadly-Medicine-Thousands-Patients-Americas/dp/0684804174. Accessed 1 Oct. 2018.

[23] "Transparency for Clinical Trials and NIH-Sponsored Research - Laura ...." 19 May. 2015, http://www.arnoldfoundation.org/transparency-clinical-trials-nih-sponsored-research/. Accessed 1 Oct. 2018.

[24] "How to increase value and reduce waste when research priorities are ...." 8 Jan. 2014, https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(13)62229-1/abstract. Accessed 1 Oct. 2018.

[25] "Reanalysis of the Miguel and Kremer deworming experiment | GiveWell." https://www.givewell.org/international/technical/programs/deworming/reanalysis. Accessed 21 Sep. 2018.

[26] However, some translational biomedical researchers e.g. those who work on clinical trial methodology might be categorized as meta-researchers and thus the field might benefit from sizable life sciences and medicine funding pots.

[27] However, it could also be the case that philanthropic funding is more optimally allocated: there is more political pressure for federal funding to serve the national interest. Also, it might be that philanthropists try to fill the gap in the federal research funding landscape and this is why their allocation differs from federal funding.

[28] "Evaluating the Role of Science Philanthropy in American Research ...." http://www.nber.org/papers/w18146. Accessed 1 Oct. 2018.

[29] "Evaluating the Role of Science Philanthropy in American Research ...." http://www.nber.org/papers/w18146. Accessed 1 Oct. 2018.

[30] "nsf.gov - Federal Funding for Research Increases by 6% in FY 2014 ...." https://www.nsf.gov/statistics/2016/nsf16311/. Accessed 1 Oct. 2018.

[31] "Research Integrity - Laura and John Arnold Foundation." http://www.arnoldfoundation.org/initiative/research-integrity/. Accessed 11 Jan. 2017.

[32] "Meta-Research Innovation Center at Stanford | METRICS." http://metrics.stanford.edu/. Accessed 11 Jan. 2017.

[33] "Center for Open Science - Wikipedia." https://en.wikipedia.org/wiki/Center_for_Open_Science. Accessed 11 Jan. 2017.

[34] "Grants - Laura and John Arnold Foundation." http://www.arnoldfoundation.org/grants/. Accessed 13 Jan. 2017.

[35] "Our Landscape of the Open Science Community | Open Philanthropy ...." 11 Jul. 2013, http://www.openphilanthropy.org/blog/our-landscape-open-science-community. Accessed 13 Jan. 2017.

[36] "2 - Advanced Search | The Andrew W. Mellon Foundation." https://mellon.org/grants/grants-database/advanced-search/?page=2&city=&year-end=&country=&amount-low=&amount-high=&q=&p=114&state=&per_page=25&year-start=. Accessed 13 Jan. 2017.

[37] "Hypothes.is Project - The Andrew W. Mellon Foundation." 13 Jun. 2016, https://mellon.org/grants/grants-database/grants/hypothesis-project/21600654/. Accessed 13 Jan. 2017.

[38] "Gordon and Betty Moore Foundation: Home." https://www.moore.org/. Accessed 13 Jan. 2017.

[39] "Tackling data-driven challenges in science - Article Detail." 28 Dec. 2016, https://www.moore.org/article-detail?newsUrlName=tackling-data-driven-challenges-in-science&tagToFilterBy=ab660061-a10f-68a5-8452-ff00002785c8. Accessed 13 Jan. 2017.

[40] "Call for Reproducibility Workflows | Berkeley Institute for Data Science." 26 Apr. 2016, https://bids.berkeley.edu/news/call-reproducibility-workflows. Accessed 15 Jan. 2017.

[41] "ORI Funds Seven Grants on Research Integrity | ORI - The Office of ...." https://ori.hhs.gov/blog/ori-funds-seven-grants-research-integrity. Accessed 15 Jan. 2017.

[42] "Meta-research | Open Philanthropy Project." http://www.openphilanthropy.org/conversations/meta-research. Accessed 15 Jan. 2017.

[43] "Meta-research - The GiveWell Blog." 11 Jun. 2012, http://blog.givewell.org/2012/06/11/meta-research/. Accessed 15 Jan. 2017.

[44] "Chan Zuckerberg Initiative acquires and will free up ... - TechCrunch." 23 Jan. 2017, https://techcrunch.com/2017/01/23/chan-zuckerberg-initiative-meta/. Accessed 24 Jan. 2017.

[45] "Semantic Scholar." https://www.semanticscholar.org/. Accessed 24 Jan. 2017.

[46] "Science|New Truths That Only One Can See - The New York Times." 21 Jan. 2014, https://www.nytimes.com/2014/01/21/science/new-truths-that-only-one-can-see.html. Accessed 15 Jan. 2017.

[47] "Combating bad science: Metaphysicians - The Economist." 15 Mar. 2014, http://www.economist.com/news/science-and-technology/21598944-sloppy-researchers-beware-new-institute-has-you-its-sights-metaphysicians. Accessed 15 Jan. 2017.

[48] "Scientific Studies: Last Week Tonight with John Oliver (HBO) - YouTube." 8 May. 2016, https://www.youtube.com/watch?v=0Rnq1NpHdmw. Accessed 25 Jan. 2017.

[49] "Investigating Neglected Goals in Scientific Research | Open ...." 26 Mar. 2015, http://www.openphilanthropy.org/blog/investigating-neglected-goals-scientific-research. Accessed 15 Jan. 2017.

[50] "Interdisciplinary research has consistently lower funding ... - Nature." https://www.nature.com/articles/nature18315. Accessed 30 Sep. 2018.

[51] "Blinding and unblinding analyses | CMS Experiment - CERN." 28 Jun. 2012, http://cms.web.cern.ch/news/blinding-and-unblinding-analyses. Accessed 16 Jan. 2017.

[52] "Should you get excited by your data? Let the Look-Elsewhere Effect ...." 19 Aug. 2011, http://cms.web.cern.ch/news/should-you-get-excited-your-data-let-look-elsewhere-effect-decide. Accessed 16 Jan. 2017.

[53] "“Positive” Results Increase Down the Hierarchy of the Sciences - PloS." 7 Apr. 2010, http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0010068. Accessed 16 Jan. 2017.

[54] "The Path to Biomedical Progress | Open Philanthropy Project." 27 Feb. 2015, http://www.openphilanthropy.org/blog/path-biomedical-progress. Accessed 16 Jan. 2017.

[55] "Replication in Labor Economics: Evidence from Data, and ... - SSRN." 19 Dec. 2016, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2886774. Accessed 16 Jan. 2017.

[56] "The science that's never been cited - Nature." 13 Dec. 2017, https://www.nature.com/articles/d41586-017-08404-0. Accessed 21 Sep. 2018.

[57] "Replication in Labor Economics: Evidence from Data, and ... - SSRN." 19 Dec. 2016, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2886774. Accessed 16 Jan. 2017.

[58] "The Reproducibility Wars: Successful, Unsuccessful, Uninterpretable ...." http://clinchem.aaccjnls.org/content/63/5/943. Accessed 17 Oct. 2018.

[59] "1,500 scientists lift the lid on reproducibility: Nature News & Comment." 25 May. 2016, http://www.nature.com/news/1-500-scientists-lift-the-lid-on-reproducibility-1.19970. Accessed 16 Jan. 2017.

[60] "1,500 scientists lift the lid on reproducibility: Nature News & Comment." 25 May. 2016, http://www.nature.com/news/1-500-scientists-lift-the-lid-on-reproducibility-1.19970. Accessed 16 Jan. 2017.

[61] "What does research reproducibility mean? | Science Translational ...." 1 Jun. 2016, http://stm.sciencemag.org/content/8/341/341ps12.full. Accessed 16 Jan. 2017.

[62] "Negative results are disappearing from most disciplines and countries ...." http://link.springer.com/article/10.1007/s11192-011-0494-7. Accessed 16 Jan. 2017.

[63] "PhysWell - foreXiv - Jess Riedel." 15 Mar. 2016, http://blog.jessriedel.com/2016/03/15/physicswell/. Accessed 16 Jan. 2017.

[64] “Drs Miller and Sittig challenge our claim that the big ideas that have long dominated biomedical research have underperformed. Interestingly, they admit that morbidity and mortality can be reduced by “getting the general population to eat less and exercise more; reducing the incidence of drunk driving; increasing the use of seat belts, child restraints, and bicycle helmets; and reducing cardiovascular disease risk through aspirin, blood pressure control, cholesterol reduction, and smoking cessation.” Indeed, these interventions constitute a terrific research agenda! Why not devote more research funds to finding out how to make these interventions more efficient?”. Joyner MJ, Paneth N, Ioannidis JPA. Underperforming Big Ideas in Biomedical Research—Reply. JAMA. 2017;317(3):322. doi:10.1001/jama.2016.20003

[65] "What Happens When Underperforming Big Ideas in Research ... - JAMA." 4 Oct. 2016, http://jama.jamanetwork.com/article.aspx?articleid=2541515. Accessed 21 Jan. 2017.

[66] "Differential technological development - Wikipedia." https://en.wikipedia.org/wiki/Differential_technological_development. Accessed 2 Oct. 2018.

[67] "Dual Use Research of Concern - NIH Office of Science Policy." 11 Sep. 2017, https://osp.od.nih.gov/biotechnology/dual-use-research-of-concern/. Accessed 2 Oct. 2018.

[68] "On methods for correcting for the look-elsewhere effect in searches for ...." 11 Feb. 2016, https://arxiv.org/abs/1602.03765. Accessed 1 Oct. 2018.

[69] "Reproducible Research in the Mathematical ... - Stanford University." 20 Jun. 2014, https://web.stanford.edu/~vcs/papers/PCAM_20140620-VCS.pdf. Accessed 16 Jan. 2017.

[70] "Incorrect results in software engineering experiments - ScienceDirect." 28 Mar. 2015, http://www.sciencedirect.com/science/article/pii/S0164121215000679. Accessed 16 Jan. 2017.

[71] "Enhancing reproducibility for computational ... - Stanford University." 23 Dec. 2016, http://web.stanford.edu/~vcs/papers/ERCM2016-STODDEN.pdf. Accessed 16 Jan. 2017.

[72] "Reproducible Pattern Recognition Research: The Case of Optimistic ...." 27 Dec. 2016, https://arxiv.org/abs/1612.08650. Accessed 16 Jan. 2017.

[73] "Reproducible research in signal processing - IEEE Xplore Document." http://ieeexplore.ieee.org/document/4815541. Accessed 16 Jan. 2017.

[74] "On methods for correcting for the look-elsewhere effect in searches for ...." 11 Feb. 2016, https://arxiv.org/abs/1602.03765. Accessed 1 Oct. 2018.

[75] "EconStor: Publication bias in measuring climate sensitivity." https://www.econstor.eu/handle/10419/120431. Accessed 16 Jan. 2017.

[76] "No evidence of publication bias in climate change science | SpringerLink." 28 Dec. 2016, http://link.springer.com/article/10.1007/s10584-016-1880-1. Accessed 16 Jan. 2017.

[77] "The $10 trillion value of better information about the ... - View inline." 5 Oct. 2015, http://rsta.royalsocietypublishing.org/content/373/2054/20140429. Accessed 16 Jan. 2017.

[78] "Selective reporting and the social cost of carbon - ScienceDirect." http://www.sciencedirect.com/science/article/pii/S0140988315002327. Accessed 16 Jan. 2017.

[79] "A Third Wave in the Economics of Climate Change | SpringerLink." 1 Oct. 2015, http://link.springer.com/article/10.1007/s10640-015-9965-2. Accessed 16 Jan. 2017.

[80] "Reproducibility in Chemical Research - Wiley Online Library." http://onlinelibrary.wiley.com/doi/10.1002/anie.201606591/full. Accessed 16 Jan. 2017.

[81] "Striving for Reproducible Science - ACS Publications - American ...." 17 Nov. 2015, http://pubs.acs.org/doi/pdfplus/10.1021/acs.analchem.5b04300. Accessed 16 Jan. 2017.

[82] "Reproducibility in Science | Circulation Research." 2 Jan. 2015, http://circres.ahajournals.org/content/116/1/116. Accessed 16 Jan. 2017.

[83] "An open investigation of the reproducibility of cancer biology ... - eLife." https://elifesciences.org/content/3/e04333. Accessed 16 Jan. 2017.

[84] "An open investigation of the reproducibility of cancer biology ... - eLife." 10 Dec. 2014, https://elifesciences.org/content/3/e04333. Accessed 25 Jan. 2017.

[85] "Plan to replicate 50 high-impact cancer papers shrinks to just 18 ...." 31 Jul. 2018, http://www.sciencemag.org/news/2018/07/plan-replicate-50-high-impact-cancer-papers-shrinks-just-18. Accessed 22 Sep. 2018.

[86] "Managing risks in drug discovery: reproducibility of published findings.." https://www.ncbi.nlm.nih.gov/pubmed/26883784. Accessed 16 Jan. 2017.

[87] "Methods to increase reproducibility in differential gene expression via ...." 14 Sep. 2016, http://nar.oxfordjournals.org/content/45/1/e1.short. Accessed 16 Jan. 2017.

[88] "Questionable research practices in ecology and evolution - PLOS." 16 Jul. 2018, https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0200303. Accessed 22 Sep. 2018.

[89] "Reproducible and replicable pain research: a critical ... - LWW Journals." https://journals.lww.com/pain/Fulltext/2018/09000/Reproducible_and_replicable_pain_research___a.5.aspx. Accessed 8 Nov. 2018.

[90] "How Modeling Standards, Software, and Initiatives Support ... - NCBI." 2 Jun. 2016, https://www.ncbi.nlm.nih.gov/pubmed/27295645. Accessed 16 Jan. 2017.

[91] "The elephant in the room: reproducibility in toxicology | Particle and ...." 22 Aug. 2014, http://particleandfibretoxicology.biomedcentral.com/articles/10.1186/s12989-014-0042-8. Accessed 16 Jan. 2017.

[92] "What's To Know About The Credibility Of ... - Wiley Online Library." 11 Apr. 2013, http://onlinelibrary.wiley.com/doi/10.1111/joes.12032/abstract. Accessed 16 Jan. 2017.

[93] "False-Positive Psychology: Undisclosed Flexibility in Data Collection ...." http://journals.sagepub.com/doi/pdf/10.1177/0956797611417632. Accessed 24 Jan. 2017.

[94] "Transparency, Reproducibility, and the Credibility of Economics ...." http://emiguel.econ.berkeley.edu/research/transparency-reproducibility-and-the-credibility-of-economics-research. Accessed 16 Jan. 2017.

[95] "FAQ: Reinhart, Rogoff, and the Excel Error That Changed History ...." 18 Apr. 2013, https://www.bloomberg.com/news/articles/2013-04-18/faq-reinhart-rogoff-and-the-excel-error-that-changed-history. Accessed 16 Jan. 2017.

[96] "The error that could subvert George Osborne's austerity programme ...." 18 Apr. 2013, https://www.theguardian.com/politics/2013/apr/18/uncovered-error-george-osborne-austerity. Accessed 16 Jan. 2017.

[97] "Economic downturns, universal health coverage, and ... - Science Direct." 19 Aug. 2016, https://www.sciencedirect.com/science/article/pii/S0140673616005778. Accessed 1 Oct. 2018.

[98] "Financial crisis may have caused 500,000 cancer deaths - Yahoo News." https://www.yahoo.com/news/financial-crisis-may-caused-500-000-cancer-deaths-224552583.html.

[99] "GBD Compare | IHME Viz Hub." http://vizhub.healthdata.org/gbd-compare/. Accessed 22 Sep. 2018.

[100] "ECONOMICS SERIES SWP 2016/1 The power of bias in economics ...." https://www.deakin.edu.au/__data/assets/pdf_file/0007/477763/2016_1.pdf. Accessed 16 Jan. 2017.

[101] "Transparency, Reproducibility, and the Credibility of Economics ...." http://emiguel.econ.berkeley.edu/research/transparency-reproducibility-and-the-credibility-of-economics-research. Accessed 16 Jan. 2017.

[102] "Evaluating replicability of laboratory experiments in economics | Science." 3 Mar. 2016, http://science.sciencemag.org/content/early/2016/03/02/science.aaf0918. Accessed 22 Sep. 2018.

[103] "The Power of Bias in Economics Research ... - Wiley Online Library." https://onlinelibrary.wiley.com/doi/abs/10.1111/ecoj.12461. Accessed 22 Sep. 2018.

[104] "Accounting and Public Policy: The Importance of Credible Research ...." http://aaajournals.org/doi/abs/10.2308/apin-51158. Accessed 16 Jan. 2017.

[105] "Estimating the reproducibility of ...." http://science.sciencemag.org/content/349/6251/aac4716. Accessed 16 Jan. 2017.

[106] "Estimating the reproducibility of ...." 28 Aug. 2015, http://science.sciencemag.org/content/349/6251/aac4716. Accessed 22 Sep. 2018.

[107] "Reproducibility in Science | Circulation Research." 2 Jan. 2015, http://circres.ahajournals.org/content/116/1/116. Accessed 16 Jan. 2017.

[108] "Reproducibility in Science | Circulation Research." 2 Jan. 2015, http://circres.ahajournals.org/content/116/1/116. Accessed 16 Jan. 2017.