Rigor, Transparency and Reproducibility
The USC Office of Research supports open scholarship and research, so that we can maximize the transparency, effectiveness and public benefits of our work. Toward that end, we endorse the FAIR Data Principles (making data findable, accessible, interoperable and reusable), supported by infrastructure provided by the university. We support making our research data and products available to others at minimal or no cost, within reasonable bounds, to enable sharing of knowledge and perspectives, collaboration, broadened participation, and review of our research for accuracy, reliability and reproducibility. Reasonable bounds include protection of the privacy of human data; restrictions needed for national security; and delays for peer review or publication, sponsor review (but not approval), and protection of intellectual property.
USC’s Academic Senate has endorsed these principles (see full document for details):
Transparency: Sharing of data and information describing data collection methods, raw data and research analyses. When feasible, for hypothesis-driven research, an electronic disclosure of the hypothesis should be logged prior to the onset of data collection.
Pre-registration of research studies is often required, particularly for clinical trials (see clinicaltrials.gov).
Good Institutional Practices: Training programs and courses in rigorous experimental design, research standards, statistics, meta-analyses and objective evaluation of data should be offered by the university.
Consideration in Merit Review and Promotion: Strategies that encourage robustness of research design, data and code sharing and high-quality mentoring should be considered.
Participation in Reproducibility Work: Efforts aimed at replicating prior results are encouraged and should be shared publicly as technical reports.
Increasing Visibility of the Topic of Reproducibility: Reproducibility in research should be incorporated in the broader curriculum throughout the schools and departments of USC. Transparency and rigor in research practices should be both discussed and encouraged among all members of the research community, including students, technicians, assistants and other staff.
Authenticating Key Resources: Reproducibility depends on controlling the accuracy and consistency of all research inputs and research methods. This includes:
- Biological and chemical resources (including chemicals, cell lines, antibodies and other biologics)
- Software code, physical materials and devices
- Calibration of research tools and instruments, and systematically tracing when calibration occurs
- Use of electronic notebooks for recording these practices, and consideration of open and public reporting of authentication efforts
What is Reproducibility?
Reproducibility means that an experiment, when repeated under like conditions, will achieve results within statistical margins of error. The idea of reproducibility is central to the scientific method, through which empirical studies are conducted to validate or create theories or hypotheses that describe nature and human behavior. It is central to all areas of science, including social and behavioral science. The accumulation of knowledge that occurs through science depends on sharing the outcomes of experiments with an assurance that the experiments were conducted with precision, and on offering open access so that others may repeat and verify the experiments through well-documented design and outcomes. USC is committed to these principles. (See the memo from the Vice President of Research.)
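The definition above frames reproducibility statistically. As a rough illustration only (this is a simplified sketch, not a USC-endorsed criterion; the 95% confidence interval and normal approximation are illustrative assumptions), the snippet below checks whether a replication's mean estimate falls inside the original study's confidence interval:

```python
import random
import statistics

def confidence_interval_95(sample):
    """Approximate 95% CI for the mean, using a normal approximation."""
    mean = statistics.mean(sample)
    sem = statistics.stdev(sample) / len(sample) ** 0.5
    return mean - 1.96 * sem, mean + 1.96 * sem

def replicates(original_sample, replication_sample):
    """Crude reproducibility check: does the replication's mean fall
    inside the original study's 95% confidence interval?"""
    low, high = confidence_interval_95(original_sample)
    return low <= statistics.mean(replication_sample) <= high

# Simulated data: two independent studies measuring the same true effect.
random.seed(42)
true_effect = 0.5
original = [random.gauss(true_effect, 1.0) for _ in range(200)]
replication = [random.gauss(true_effect, 1.0) for _ in range(200)]
print(replicates(original, replication))
```

Real replication assessments use richer criteria (effect-size comparisons, meta-analysis), but the sketch captures the core idea: "like conditions, results within statistical margins of error."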
It should be noted that the inability to reproduce research is not predominantly an issue of scientific misconduct or conflicts of interest, though either may be present. Irreproducibility is primarily attributable to sloppiness in experimental design, documentation and execution. Following good scientific practice through rigor and transparency, supported by well-trained students and staff, is the primary way to make research reproducible.
Reproducibility is related to the concept of generalizability. Federal regulations on human subject research define research as follows: “a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge.” In a human experiment, research seeks to discover outcomes that are transferable to other situations or populations, and in that way reproducible. However, contextual factors can affect reproducibility and should be accounted for in study design and reporting. A behavioral experiment conducted in China in 1950 would not necessarily be reproducible if repeated today in Europe, for instance.
Expectations from Research Sponsors
Investigators are encouraged to cite the commitment of USC’s Academic Senate to rigor and transparency in research in their applications for funding. Specific considerations follow:
National Institutes of Health: Newly revised grant application instructions clarify expectations to ensure that NIH is funding the best and most rigorous science, highlight the need for applicants to describe details that may have been previously overlooked, highlight the need for reviewers to consider such details in their reviews through revised review criteria, and minimize additional burden. These new instructions and revised review criteria focus on four areas deemed important for enhancing rigor and transparency:
- Scientific Premise of Proposed Research
- Rigorous Experimental Design
- Consideration of Sex and Other Relevant Biological Variables
- Authentication of Key Biological and/or Chemical Resources
Investigators are strongly encouraged to discuss these revised application instructions with NIH program staff prior to submission of applications. Further information is provided on the NIH website.
National Science Foundation: The NSF has developed a reproducibility “framework” to define and enhance the confidence in and reliability of science and engineering research. The framework suggests a discipline-specific approach to improving research rigor and transparency; proposes to develop approaches to data collection, sharing, and curation; seeks to create best practice standards for research practices, including instrumentation, models, and interpretation of findings across disciplines; and encourages the dissemination of replications and studies yielding negative as well as positive results. Further information may be found here.
Research sponsors may also expect a data sharing plan at the time of proposal submission. Mandates are sometimes specific to the type of research and program, where field-specific data repositories have been created. Example requirements can be found here.
Resources Available to USC Faculty
Digital.usc.edu provides access to a variety of resources for sharing research information. In particular, researchers should consider using the Open Science Framework as the vehicle for recording and sharing research design, hypotheses and research data.
These and other resources permit research investigators to share information well beyond what the space of a journal article would normally allow. They also make it possible to control versions of documents, datasets, software code and other study inputs.
National Academies report on Fostering Integrity in Research – Free PDF available for download.
Video presentation of Private Secrets, Data Sharing and Trust in Science
Transparency and Openness Promotion (TOP) guidelines provide standards for journals to support transparency of methods and resources
USC Norris library can assist in data management plan creation
Baker M. How quality control could save your science. Nature. 2016 Jan 28;529(7587):456-8.
Freedman LP, Cockburn IM, Simcoe TS. The Economics of Reproducibility in Preclinical Research. PLoS Biol. 2015 Jun 9;13(6):e1002165.
Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015 Aug 28;349(6251):aac4716.
Van Bavel JJ, Mende-Siedlecki P, Brady WJ, Reinero DA. Contextual sensitivity in scientific reproducibility. Proc Natl Acad Sci USA. 2016;113(23):6454-6459. doi:10.1073/pnas.1521897113.
Reproducibility of Scientific Data – Presentation by the Center for Excellence in Research (CER)
Data Reproducibility & Good Research Practices, by Mike Jamieson – Presentation from the Navigating the University Faculty Retreat, August 19, 2016