"Unlocking the file drawer" to ensure research results—even null results—are shared
... and other topics discussed at the annual meeting of the Berkeley Initiative for Transparency in the Social Sciences
Last month we presented at an event hosted by the Berkeley Initiative for Transparency in the Social Sciences (BITSS) on the "file drawer problem": when a study is not published in an academic journal, how can we ensure its results, even null results, are still reported—instead of remaining inaccessible in a (metaphorical) file drawer?
The event, a one-day workshop followed by BITSS' one-day annual meeting, brought together an interdisciplinary group of researchers, funders, journal editors, design specialists, and research administrators to discuss integrated approaches for improving the tracking of funded research outputs, with special consideration for projects that yield null results.
The workshop, “Unlocking the File Drawer,” focused on post-trial reporting of results in social science research; we discussed ideas for making it easier for researchers to report the results of their studies. At the annual meeting the next day, participants presented new research on research transparency, some of it funded through BITSS' SSMART grant program.
Some of the ideas that emerged over the course of the two-day event included:
A template for null results reporting
Scott Williamson, a graduate student at Stanford, presented his work with co-authors at the Immigration Policy Lab (IPL) on a proposed null results reporting template. The template is designed to make it easier for researchers to think about how they can publish their results in formats other than peer-reviewed journals (which are unlikely to accept papers reporting null results).
The template would be short (approximately five pages), with a clear outline and structure, and would include several components taken from pre-analysis plans or already completed analysis, reducing the need for new work.
Progress on post-trial reporting in registries
Nici Pfeiffer from the Center for Open Science, which hosts the Open Science Framework, mentioned that COS has a process for sending reminder "nudges" to researchers to update their pre-registrations with results.
Merith Basey of Universities Allied for Essential Medicines discussed their joint work with TranspariMED on FDAAA 2007, a law that requires certain clinical trials to report results within twelve months of study completion. They've released results of their own research on US universities' compliance with clinical trial transparency regulations (non-compliant trials can technically be fined $10,000 per day).
Cecilia Mo of UC Berkeley raised the question of whether successful replications should also be treated like null findings, since they too are underreported and end up in the file drawer.
I (James Turitto) provided an update on the American Economic Association's RCT registry, which is managed at J-PAL, covering changes made to the registry over the past few years (such as the introduction of a registry review system and Digital Object Identifiers) and the current status of post-trial results reporting by researchers.
Jessaca Spybrook of Western Michigan University presented on REES, the Registry of Efficacy and Effectiveness Studies. REES is both a registry for education research and a repository for pre-analysis plans; its systematic categorization of studies distinguishes it from other registries. (See slides from her presentation.)
Journal updates on steps to improve research transparency
Andrew Foster of Brown University, Editor-in-Chief of the Journal of Development Economics (JDE), provided an update on the journal's pre-results review pilot. So far, the JDE has received 85 pre-results review submissions, and the pilot has been largely successful.
Lars Vilhuber of Cornell University, American Economic Association Data Editor, presented on the AEA’s new data publication policy.
New in open access and open data
Elizabeth Marincola of the African Academy of Sciences (AAS) spoke about AAS's innovative open access publishing platform, AAS Open Research, which allows African scientists to publish their research quickly on a fully accessible, peer-reviewed platform. The platform covers all fields of science, and the types of research it publishes (or will publish) include traditional research articles, systematic reviews, research protocols, replication and confirmatory studies, data notes, negative or null findings, and case reports. (See slides from her presentation.)
Daniella Lowenberg and John Chodacki recently published a book, Open Data Metrics, which proposes a path forward for the development of open data metrics by discussing data citations and how to credit researchers for the data they produce.
Assessing the effectiveness of pre-analysis plans
Daniel Posner of UCLA presented on Pre-Analysis Plans: A Stocktaking, joint work with George K. Ofosu, which analyzes a representative sample of 195 PAPs from the AEA and EGAP registration platforms to assess whether PAPs are sufficiently clear, precise, and comprehensive to achieve their objectives of preventing “fishing” and reducing the scope for post-hoc adjustment of research hypotheses.
Recorded presentations for both days can be found here: Day 1: Unlocking the File Drawer, Day 2: Annual Meeting.
About BITSS and J-PAL transparency work
BITSS aims to enhance research practices across social science disciplines (including economics, psychology, and political science) in ways that promote transparency, reproducibility, and openness. J-PAL works closely with BITSS and has sought to be a leader in making research more transparent for over a decade, developing a registry for randomized evaluations and publishing data from research studies conducted by our network of 194 affiliated professors at universities around the world. To learn more about J-PAL's research transparency efforts, read about our core activities.