Moving social psychology into the age of open science
It is 2011 and you’ve just read an article in the prestigious Journal of Personality and Social Psychology that seems to offer statistical support for ESP and parapsychology. The article was written by a well-known researcher at an Ivy League institution. The media is in a frenzy, calling all of psychological research into question. Six years later, the problem worsens: a different well-known researcher from the same Ivy League institution is caught falsifying data. You might start questioning psychology as well, with valid questions such as:
• What methods are scientists using to test their hypotheses?
• Are they making up data or analyzing it just to find significant results (e.g., fishing or p-hacking)?
• Do these results replicate or is only one scientist able to find them?
• Is our understanding of science biased because only significant results are being published (i.e., publication bias & file-drawer problem)?
People both inside and outside the scientific community are calling for openness, accessibility, and transparency in science, which has led to the Open Science movement. Open Science takes several forms and has many benefits for researchers and consumers of science, addressing many of the questions above. For example, within the Open Science Framework (OSF), you can do things such as:
• Share materials: Rather than extending the length of your article, you can share materials on OSF so other researchers can replicate your findings.
• Pre-register hypotheses: Scientists state their hypotheses and planned analyses ahead of time to limit p-hacking and fishing.
• Share data: Other researchers can examine the findings, reducing concerns about falsification, p-hacking, and fishing.
• Register a report: Journals committed to Open Science are now accepting registered reports, which are submitted early in the research cycle, prior to data collection. By accepting a registered report, a journal validates the project as important research that deserves to be conducted and commits to publishing it regardless of whether the results are significant. As scientists, it would be valuable to know what research has been done that did not support its hypotheses, so that other researchers are not forced to duplicate those efforts.
For the average person outside of academia, the benefits of Open Science include better access to the primary sources that might be misrepresented in mass media, and consequently, an increased sense of trust in the scientific community. With recent examples of scientific misconduct and a general growing skepticism of experts (anti-vax, climate change denial), it is clear that science must become more open and transparent. If people have access to scientists’ methods and data, with the opportunity to analyze that data themselves, their trust in the field may improve.
Excitingly, Legal Psychology is moving into the Open Science Era. Law and Human Behavior (LHB), the flagship journal of the American Psychology-Law Society, under the editorial leadership of Bradley McAuliff, has committed to contributing to the Open Science movement. In Taking the Next Steps: Promoting Open Science and Expanding Diversity in Law and Human Behavior, McAuliff and the other LHB editors (2019) outline LHB’s open science strategy, which includes: implementation of the Transparency and Openness Promotion (TOP) Guidelines; improving author-reviewer fit; and promoting diversity of content and people.
The TOP Guidelines are designed to reduce questionable research practices by giving readers clear insight into what researchers actually did in a study. Specifically, LHB is focusing on practices that prevent authors from skewing their data and cherry-picking significant results, including requiring authors to:
• Report all independent and dependent variables (rather than just a subset);
• Report any excluded measures and describe why they were excluded;
• Report all statistical tests, even if not significant; and
• Not round down p-values to make results appear significant.
LHB is also making science more accessible to readers by changing the format of abstracts to highlight the most important aspects of a study. The new format will have the following clearly labeled sections: Objective, Hypotheses, Method, Results, and Conclusions. An outline like this can be especially useful for non-experts interested in a research topic, as academic writing can be intimidating and difficult for many people. LHB has also affirmed its commitment to publishing the best possible articles on a variety of topics by a diverse set of authors.
LHB’s commitment to the Open Science movement and to encouraging diversity in science is refreshing. The Legal Decision Lab is looking forward to the new era of Legal Psychology. Hopefully it will be filled with clearly explained methods, easily replicable experiments, and strong research from diverse authors, published even when the results are not significant.