Registered Reports: Piloting a Pre-Results Review Process at the Journal of Development Economics

Editor's note: This is a cross-post that originally appeared on the World Bank Development Impact Blog.

By Andy Foster, Dean Karlan, and Ted Miguel.

The world is a messy place. What happens when the results of an empirical study are mushy or inconsistent with prevailing theories? Unfortunately, papers with unclear or statistically insignificant results often go unpublished, even if they have rigorous research designs and good data. In such cases, the research community is typically left to consider only the papers that tell a “neat” and clean story. When economic and social policy relies on academic knowledge, this publication bias can be costly to society.

In a new effort to potentially help address this problem, the Journal of Development Economics (JDE) now offers authors the opportunity to have their prospective empirical projects reviewed and approved for publication before the results are known. This track for article submissions will be available as part of a pilot project, which will allow us to better understand the extent to which pre-results review can be applied at the JDE, and possibly in economics as a whole. This new article format may not apply to all papers currently being published at the JDE; we envision that it will be particularly attractive to development economists working on research projects for which the data have yet to be collected (or to researchers who do not yet have access to the data they will analyze).

To our knowledge, this is among the first attempts to introduce pre-results peer review at an economics journal. However, more than 90 academic journals in biology, medicine, political science, psychology, and other disciplines are already implementing pre-results review, with the subsequent published articles usually called ‘Registered Reports’ (RRs). We have tried to learn from their efforts in preparation for this pilot, and are coordinating our activities with the Berkeley Initiative for Transparency in the Social Sciences (BITSS) to help support authors and referees during this pilot phase.

What is pre-results review?
The JDE pre-results submissions track is a two-stage review process, available under “Registered Reports” on the JDE online submissions portal.
At Stage 1, authors will submit plans for their prospective empirical projects, which would typically contain a literature review, research question(s), hypotheses, and a detailed methodological framework. This submission, which looks much like a pre-analysis plan, will be peer-reviewed based on the merits of its research question and theoretical and methodological framework before any empirical results are realized. If positively evaluated, the plan will be accepted based on pre-results review. This constitutes a commitment by the journal to later publish the full paper regardless of the nature of its empirical results, as long as the data collection and analysis maintain standards of quality and were implemented in alignment with the research design accepted at Stage 1.

Where the intervention cannot be implemented as pre-specified, the Editors will try to be reasonable about accommodating deviations and will strive to err in favor of authors as long as deviations are reported transparently[1] (learn more in the JDE RR Author Guidelines here). We hope that this pilot will help us better identify examples of deviations that are unacceptable. Nonetheless, following acceptance based on pre-results review, authors can post the paper on their websites and accurately label it as “accepted at the JDE based on pre-results review.”

Note: being accepted in Stage 1 does not constitute a commitment to then submit for Stage 2. Should an author want to first submit the final paper to a top general interest journal, they are welcome to do so. We would ask that they still include in the paper a mention of having been approved by the JDE pre-results review process.

Authors will then collect and analyze their data and submit the full paper (including the results and discussion sections) for final review in Stage 2. At this stage, reviewers will be asked to assess whether the data are of sufficient quality to test the proposed hypotheses, whether the intervention (if any) was implemented as specified, and whether the data were appropriately analyzed and interpreted. Authors will be free to include findings of analyses beyond those specified at Stage 1, but these should be clearly noted as the result of exploratory analyses.

The final publication resulting from the pre-results review process will look like any other article published in the JDE, with two exceptions: first, the main text will contain a footnote stating that the paper was submitted for pre-results review, and second, the pre-specified research plan will be included in the supplementary online appendix.

What evidence do we have, and what are we hoping to learn from this pilot? 
The scientific publishing community is still learning about the potential benefits and challenges of this new approach to reviewing research articles, as just over 90 studies have been published as registered reports across all implementing journals to date.

We have some evidence that journal editors with experience with this format are optimistic about its potential to reduce publication bias; however, submission rates also appear to have been low.[2]

In this pilot, the JDE aims to explore whether pre-results review could be useful in the field of development economics. Several questions will be important to consider as the pilot progresses:

  • Are we receiving high quality studies and/or results that would otherwise go unpublished? Does this approach to article review help reduce publication bias?
  • Is there sufficient interest from development economics researchers in this format (i.e., are we receiving many submissions)?
  • Is the format sustainable? Does it increase the level of effort demanded from referees? Are we as a journal spending considerable time reviewing plans that are never implemented, or are ultimately withdrawn by the authors?
  • Is this format particularly beneficial to junior scholars, who may be able to share an acceptance based on pre-results review with a tenure or hiring committee?
  • To what extent can we realistically expect authors to implement their research design largely as specified in their Stage 1 submission? Or will the real-world complications of field work generally make this impossible?

We also hope to learn whether this approach to article review can be sustainably integrated into the JDE’s publishing process.

How can authors submit their work?
If interested in submitting a research study for pre-results review, authors should initiate a submission as a ‘Registered Report Stage 1: Proposal’ on the JDE’s online submissions portal. Please see the JDE Registered Reports Author Guidelines for additional information.

Both JDE and BITSS staff are available to support authors in preparing submissions. Please contact Aleksandar Bogdanoski (abogdanoski@berkeley.edu) with questions about study pre-registration, pre-analysis plans, and how to submit your article for pre-results review; at the JDE, please contact either Lead Editor Prof. Andrew Foster (andrew_foster@brown.edu) or Editor Prof. Dean Karlan (karlan@northwestern.edu). This is a pilot project and your feedback is extremely valuable; please reach out to us with comments or suggestions for improvement.

 
[1] If a deviation is major, for example a change in the primary hypothesis, then the authors will have the option of withdrawing their paper and submitting it as a new “normal” submission. This will not affect the likelihood of the paper being accepted.
[2] BITSS surveyed editors of the 76 journals that had implemented pre-results review as of December 2017. Out of 36 responses, we learned that 92% of respondents thought that pre-results review has the potential to reduce publication bias in their discipline. In the same survey, 44% of editors reported low submissions as a challenge with registered reports; some note that their journal had yet to receive any submissions for pre-results review.
March 12, 2018