
Rigor and Transparency in Research

Guidelines for Researchers at the University of Southern California Toward Reproducibility

Principles

USC’s Academic Senate has endorsed these principles (see full document for details):

Transparency: Sharing of raw data, together with information describing data collection methods and research analyses. When feasible, for hypothesis-driven research, an electronic disclosure of the hypothesis should be logged before data collection begins (see the sketch following this list).

Pre-registration of research studies is often required, particularly for clinical trials (see clinicaltrials.gov).

Good Institutional Practices: Training programs and courses in rigorous experimental design, research standards, statistics, meta-analyses and objective evaluation of data should be offered by the university.

Consideration in Merit Review and Promotion: Strategies that encourage robust research design, data and code sharing, and high-quality mentoring should be considered in merit review and promotion decisions.

Participation in Reproducibility Work: Efforts aimed at replicating prior results are encouraged and should be documented in technical reports that are publicly shared.

Increasing Visibility of the Topic of Reproducibility: Reproducibility in research should be incorporated into the broader curriculum throughout the schools and departments of USC. Transparency and rigor in research practices should be both discussed and encouraged among all members of the research community, including students, technicians, assistants, and other staff.

Authenticating Key Resources: Reproducibility depends on controlling the accuracy and consistency of all research inputs and research methods, including the authentication of key biological and chemical resources.
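The electronic disclosure of hypotheses mentioned in the first principle can be as simple as recording a timestamped, tamper-evident fingerprint of the hypothesis document before any data are collected. In practice, registries such as the Open Science Framework or clinicaltrials.gov provide this service; the Python sketch below only illustrates the underlying idea, and the file names in it are hypothetical.

    import hashlib
    import json
    from datetime import datetime, timezone

    def log_preregistration(hypothesis_path, log_path="prereg_log.jsonl"):
        """Record a timestamped SHA-256 fingerprint of a hypothesis document."""
        with open(hypothesis_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        entry = {
            "file": hypothesis_path,
            "sha256": digest,  # changes if the document is later altered
            "registered_at": datetime.now(timezone.utc).isoformat(),
        }
        with open(log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(entry) + "\n")  # append-only log
        return entry

    # Hypothetical usage: log the hypothesis before data collection begins.
    print(log_preregistration("hypothesis.txt"))

An independent registry is preferable to a purely local log, since a registry's timestamp cannot be altered by the investigator.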


What is Reproducibility?

Reproducibility means that an experiment, repeated under like conditions, will achieve the same results within statistical margins of error. The idea of reproducibility is central to the scientific method, through which empirical studies are conducted to validate or generate theories and hypotheses that describe nature and human behavior. It is central to all areas of science, including social and behavioral science. The accumulation of knowledge through science depends on sharing the outcomes of experiments with an assurance that they were conducted with precision, and on providing open access to well-documented designs and outcomes so that others may repeat and verify the experiments. USC is committed to these principles (see the memo from the Vice President of Research).
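As a toy illustration of "results within statistical margins of error" (not a formal replication standard), one can simulate the same experiment twice on independent samples and ask whether the replication estimate falls within the original estimate's 95% confidence interval; all numbers below are invented for the example.

    import random
    import statistics

    def run_experiment(seed, n=200, true_effect=0.5):
        """Simulate one experiment: n noisy measurements of a true effect."""
        rng = random.Random(seed)
        return [true_effect + rng.gauss(0, 1) for _ in range(n)]

    def mean_and_ci(sample):
        """Sample mean with an approximate 95% confidence interval."""
        m = statistics.mean(sample)
        half_width = 1.96 * statistics.stdev(sample) / len(sample) ** 0.5
        return m, m - half_width, m + half_width

    original = run_experiment(seed=1)
    replication = run_experiment(seed=2)  # "like conditions", new sample

    m_orig, low, high = mean_and_ci(original)
    m_rep, _, _ = mean_and_ci(replication)
    print(f"original: {m_orig:.3f} (95% CI {low:.3f} to {high:.3f})")
    print(f"replication: {m_rep:.3f}; within original margin: {low <= m_rep <= high}")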

It should be noted that the inability to reproduce research is not predominantly an issue of scientific misconduct or conflicts of interest, though either may be present. Irreproducibility is attributed primarily to sloppiness in experimental design, documentation and execution. Following good scientific practice with rigor and transparency, supported by well-trained students and staff, is the primary way to make research reproducible.

Reproducibility is related to the concept of generalizability. Federal regulations on human subject research define research as “a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge.” In a human experiment, research seeks outcomes that are transferable to other situations or populations, and in that way reproducible. However, contextual factors may affect reproducibility and should be accounted for in study design and reporting. A behavioral experiment conducted in China in 1950, for instance, would not necessarily be reproducible if repeated today in Europe.

Expectations from Research Sponsors

Investigators are encouraged to cite the commitment of USC’s Academic Senate to rigor and transparency in research in their applications for funding. Specific considerations follow:

National Institutes of Health: Newly revised grant application instructions clarify expectations to ensure that NIH is funding the best and most rigorous science, highlight the need for applicants to describe details that may previously have been overlooked, highlight the need for reviewers to weigh such details through revised review criteria, and minimize additional burden. These new instructions and revised review criteria focus on four areas deemed important for enhancing rigor and transparency: the scientific premise of the proposed research, rigorous experimental design, consideration of relevant biological variables such as sex, and authentication of key biological and chemical resources.

Investigators are strongly encouraged to discuss these revised application instructions with NIH program staff before submitting applications. Further information is available on the NIH website.

National Science Foundation: The NSF has developed a reproducibility “framework” to define and enhance the confidence in and reliability of science and engineering research. The framework suggests a discipline-specific approach to improving research rigor and transparency; proposes to develop approaches to data collection, sharing, and curation; seeks to create best practice standards for research practices, including instrumentation, models, and interpretation of findings across disciplines; and encourages the dissemination of replications and studies yielding negative as well as positive results. Further information may be found here.

Research sponsors may also expect a data sharing plan at the time of proposal submission. Mandates are sometimes specific to the type of research and program, for which field-specific data repositories have been created. Example requirements can be found here.


Resources Available to USC Faculty

Digital.usc.edu provides access to various resources that enable sharing of research information. Researchers should in particular consider using the Open Science Framework as a vehicle for recording and sharing research designs, hypotheses and research data.

These resources and others permit research investigators to share information that goes well beyond what would normally be permitted in the space of a journal article. It is also possible to control versions of documents, datasets, software code and other study inputs through such resources.
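As one example of how such sharing might look in practice, the sketch below uses the third-party osfclient Python package (pip install osfclient) to upload a results file to an Open Science Framework project and list the project's shared files. The token, project identifier and file names are hypothetical placeholders, and the package's interface should be checked against its current documentation.

    from osfclient import OSF

    # Hypothetical credentials and project: replace with a real OSF
    # personal access token and the five-character GUID of your project.
    osf = OSF(token="YOUR_OSF_TOKEN")
    project = osf.project("abc12")

    storage = project.storage("osfstorage")

    # Upload a local results file so collaborators can retrieve it.
    with open("analysis_output.csv", "rb") as fp:
        storage.create_file("analysis_output.csv", fp)

    # List everything currently shared in the project's default storage.
    for remote_file in storage.files:
        print(remote_file.path)

OSF storage also retains prior versions when a file is re-uploaded, which supports the version control of documents, datasets and code mentioned above.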

National Academies report on Fostering Integrity in Research – Free PDF available for download.

Other References:

Baker M. How quality control could save your science. Nature. 2016 Jan 28;529(7587):456-8.

Enhancing Research Reproducibility: Recommendations from the Federation of American Societies for Experimental Biology. 2016 Jan 14.

Freedman LP, Cockburn IM, Simcoe TS. The Economics of Reproducibility in Preclinical Research. PLoS Biol. 2015 Jun 9;13(6):e1002165.

Nature special collection: Challenges in Irreproducible Research.

Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015 Aug 28;349(6251):aac4716.

Van Bavel JJ, Mende-Siedlecki P, Brady WJ, Reinero DA. Contextual sensitivity in scientific reproducibility. PNAS. 2016;113(23):6454-6459. doi:10.1073/pnas.1521897113.

The Basics of Reproducibility

Reproducibility of Scientific Data – Presentation by the Center for Excellence in Research (CER)

Data Reproducibility & Good Research Practices, by Mike Jamieson – Presentation from the Navigating the University Faculty Retreat, August 19, 2016

Best Practices in Data Analysis and Sharing in Neuroimaging using MRI

Scientific Paper of the Future

Towards Continuous Scientific Data Analysis and Hypothesis Evolution (Yolanda Gil et al.)