Verifying results requires disparate lines of evidence - a technique called triangulation. Marcus R. Munafò and George Davey Smith explain.
Several studies across many fields estimate that only around 40% of published findings can be replicated reliably. Various funders and communities are promoting ways for independent teams to routinely replicate the findings of others.
These efforts are laudable, but insufficient. If a study is skewed and replications recapitulate that approach, findings will be consistently incorrect or biased. Consider a commonly used assay in which the production of a fluorescent protein is used to monitor cell activity. If the compounds used to manipulate cell activity are also fluorescent, as has happened1, reliably repeatable results will not yield robust conclusions.
We have both spent much of our careers advocating ways to increase scientific certainty. One of us (M.R.M.) participated in work by UK funding agencies to develop strategies for reproducible science, and helped to craft a manifesto for reproducibility2.
But replication alone will get us only so far. In some cases, routine replication might actually make matters worse. Consistent findings could take on the status of confirmed truths, when they actually reflect failings in study design, methods or analytical tools.
We believe that an essential protection against flawed ideas is triangulation3. This is the strategic use of multiple approaches to address one question. Each approach has its own unrelated assumptions, strengths and weaknesses. Results that agree across different methodologies are less likely to be artefacts.
Isn't this how science is meant to operate? Perhaps so, but scientists in today's hyper-competitive environment often lose sight of the need to pursue distinct strands of evidence.
The problem was aptly described in May 2017, when cancer researcher William Kaelin lamented that the goal of the scientific paper had shifted from testing narrow conclusions in multiple ways to making a broadening series of assertions, each based on limited evidence4. Consequently, he said, "papers are increasingly like grand mansions of straw, rather than sturdy houses of brick".
The scientific community should address this lack of depth strategically and establish practices that facilitate triangulation. Specifically, we advocate a system to support multidisciplinary teams, each created around a common question (see 'Triangulation'). This, we believe, would result in robust insights - mansions of brick rather than straw.