From Outcome to Process Focus: Fostering a More Robust Psychological Science Through Registered Reports and Results-Blind Reviewing

Published on Jul 1, 2018 in Perspectives on Psychological Science
DOI: 10.1177/1745691618767883
James A. Grand (UMD: University of Maryland, College Park) H-Index: 8
Steven G. Rogelberg (UNCC: University of North Carolina at Charlotte) H-Index: 30
+ 2 Authors
Scott Tonidandel (Davidson College) H-Index: 22
A variety of alternative mechanisms, strategies, and “ways of doing” have been proposed for improving the rigor and robustness of published research in the psychological sciences in recent years. In this article, we describe two existing but underused publication models—registered reporting (RR) and results-blind reviewing (RBR)—that we believe would contribute in important ways to improving both the conduct and evaluation of psychological research. We first outline the procedures and distinguishing features of both publication pathways and note their value for promoting positive changes to current scientific practices. We posit that a significant value of RR and RBR is their potential to promote a greater focus on the research process (i.e., how and why research is conducted) relative to research outcomes (i.e., what was observed or concluded from research). We conclude by discussing what we perceive to be five common beliefs about RR and RBR practices and attempt to provide a balanced perspective of the...
  • References (33)
  • Citations (6)
#1 Kevin R. Murphy (UL: University of Limerick) H-Index: 57
#2 Herman Aguinis (GW: George Washington University) H-Index: 56
The practice of hypothesizing after results are known (HARKing) has been identified as a potential threat to the credibility of research results. We conducted simulations using input values based on comprehensive meta-analyses and reviews in applied psychology and management (e.g., strategic management studies) to determine the extent to which two forms of HARKing behaviors might plausibly bias study outcomes and to examine the determinants of the size of this effect. When HARKing involves cherr...
4 Citations · Source
#1 James A. Grand (UMD: University of Maryland, College Park) H-Index: 8
#2 Steven G. Rogelberg (UNCC: University of North Carolina at Charlotte) H-Index: 30
Last. Donald M. Truxillo (PSU: Portland State University) H-Index: 31
view all 8 authors...
Credibility and trustworthiness are the bedrock upon which any science is built. The strength of these foundations has been increasingly questioned across the sciences as instances of research misconduct and mounting concerns over the prevalence of detrimental research practices have been identified. Consequently, the purpose of this article is to encourage our scientific community to positively and proactively engage in efforts that foster a healthy and robust industrial and organizational (I-O...
16 Citations · Source
#2 Ken Kelley H-Index: 25
Last. Scott E. Maxwell H-Index: 50
view all 3 authors...
The sample size necessary to obtain a desired level of statistical power depends in part on the population value of the effect size, which is, by definition, unknown. A common approach to sample-si...
53 Citations · Source
#1 Ernest H. O’Boyle (UI: University of Iowa) H-Index: 10
#2 George C. Banks (Longwood University) H-Index: 22
Last. Erik Gonzalez-Mule (UI: University of Iowa) H-Index: 8
view all 3 authors...
The issue of a published literature not representative of the population of research is most often discussed in terms of entire studies being suppressed. However, alternative sources of publication bias are questionable research practices (QRPs) that entail post hoc alterations of hypotheses to support data or post hoc alterations of data to support hypotheses. Using general strain theory as an explanatory framework, we outline the means, motives, and opportunities for researchers to better thei...
85 Citations · Source
#1 Jeffrey S. Harrison (UR: University of Richmond) H-Index: 29
#2 George C. Banks (Longwood University) H-Index: 22
Last. Jeremy C. Short (OU: University of Oklahoma) H-Index: 37
view all 5 authors...
Publication bias is the systematic suppression of research findings due to small magnitude, statistical insignificance, or contradiction of prior findings or theory. We review possible reasons why publication bias may exist in strategy research and examine empirical evidence regarding the influence of publication bias in the field. Overall, we conclude that publication bias affects many, but not all, topics in strategic management research. Correlation inflation due to publication bias ranged in...
29 Citations · Source
#1 John Antonakis (UNIL: University of Lausanne) H-Index: 32
In this position paper, I argue that the main purpose of research is to discover and report on phenomena in a truthful manner. Once uncovered, these phenomena can have important implications for society. The utility of research depends on whether it makes a contribution because it is original or can add to cumulative research efforts, is rigorously and reliably done, and is able to inform basic or applied research and later policy. However, five serious “diseases” stifle the production ...
64 Citations · Source
#1 John P. A. Ioannidis (Stanford University) H-Index: 151
Policy Points: Currently, there is massive production of unnecessary, misleading, and conflicted systematic reviews and meta-analyses. Instead of promoting evidence-based medicine and health care, these instruments often serve mostly as easily produced publishable units or marketing tools. Suboptimal systematic reviews and meta-analyses can be harmful given the major prestige and influence these types of studies have acquired. The publication of systematic reviews and meta-analyses should be rea...
265 Citations · Source
#1 Michael G. Findley (University of Texas at Austin) H-Index: 19
#2 Nathan M. Jensen (University of Texas at Austin) H-Index: 3
Last. Thomas B. Pepinsky (Cornell University) H-Index: 15
view all 4 authors...
In 2015, Comparative Political Studies embarked on a landmark pilot study in research transparency in the social sciences. The editors issued an open call for submissions of manuscripts that contained no mention of their actual results, incentivizing reviewers to evaluate manuscripts based on their theoretical contributions, research designs, and analysis plans. The three papers in this special issue are the result of this process that began with 19 submissions. In this article, we describe the ...
15 Citations · Source
#1 George C. Banks H-Index: 22
#2 Steven G. Rogelberg H-Index: 30
Last. Deborah E. Rupp (Purdue University) H-Index: 36
view all 5 authors...
Purpose Questionable research or reporting practices (QRPs) contribute to a growing concern regarding the credibility of research in the organizational sciences and related fields. Such practices include design, analytic, or reporting practices that may introduce biased evidence, which can have harmful implications for evidence-based practice, theory development, and perceptions of the rigor of science.
22 Citations · Source
#1 Daniel T. Gilbert (Harvard University) H-Index: 53
#2 Gary King (Harvard University) H-Index: 74
Last. Timothy D. Wilson (UVA: University of Virginia) H-Index: 56
view all 4 authors...
A paper from the Open Science Collaboration (Research Articles, 28 August 2015, aac4716) attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high.
148 Citations · Source
Cited By 6
#1 Russell Craig (Durham University) H-Index: 31
view all 4 authors...
This paper explores the nature and impact of research misconduct in psychology by analyzing 160 articles that were retracted from prominent scholarly journals between 1998 and 2017. We compare findings with recent studies of retracted papers in economics, and business and management, to profile practices that are likely to be problematic in cognate social science disciplines. In psychology, the principal reason for retraction was data fabrication. Retractions took longer to make, and ge...
#1 Herman Aguinis (GW: George Washington University) H-Index: 56
#2 George C. Banks (UNCC: University of North Carolina at Charlotte) H-Index: 22
Last. Wayne F. Cascio (University of Colorado Denver) H-Index: 44
view all 4 authors...
Efforts to promote open-science practices are, to a large extent, driven by a need to reduce questionable research practices (QRPs). There is ample evidence that QRPs are corrosive because they make research opaque and therefore challenge the credibility, trustworthiness, and usefulness of the scientific knowledge that is produced. A literature based on false-positive results that will not replicate is not only scientifically misleading but also worthless for anyone who wants to put kno...
#1 Qingwei Chen (SCNU: South China Normal University) H-Index: 1
#2 TT Ru (SCNU: South China Normal University)
Last. Guofu Zhou (SCNU: South China Normal University) H-Index: 2
view all 8 authors...
Lighting Research & Technology (LRT) is an influential journal in the field of light and lighting dating back to 1969. To celebrate its 50th birthday, the current study explored its bibliometric ch...
#1 George C. Banks (UNCC: University of North Carolina at Charlotte) H-Index: 22
#2 James G. Field (WVU: West Virginia University) H-Index: 7
Last. Steven G. Rogelberg (UNCC: University of North Carolina at Charlotte) H-Index: 30
view all 7 authors...
Open science refers to an array of practices that promote openness, integrity, and reproducibility in research; the merits of which are being vigorously debated and developed across academic journals, listservs, conference sessions, and professional associations. The current paper identifies and clarifies major issues related to the use of open science practices (e.g., data sharing, study pre-registration, open access journals). We begin with a useful general description of what open science in ...
4 Citations · Source
#1 Herman Aguinis (GW: George Washington University) H-Index: 56
#2 N. Sharon Hill (GW: George Washington University) H-Index: 8
Last. James R. Bailey (GW: George Washington University) H-Index: 18
view all 3 authors...
We offer best-practice recommendations for journal reviewers, editors, and authors regarding data collection and preparation. Our recommendations are applicable to research adopting different epist...
1 Citation · Source
#1 Ernest H. O’Boyle (IU: Indiana University) H-Index: 10
#2 George C. Banks (UNCC: University of North Carolina at Charlotte) H-Index: 22
Last. Zhenyu Yuan (UI: University of Iowa) H-Index: 6
view all 5 authors...
Moderated multiple regression (MMR) remains the most popular method of testing interactions in management and applied psychology. Recent discussions of MMR have centered on their small effect sizes and typically being statistically underpowered (e.g., Murphy & Russell, Organizational Research Methods, 2016). Although many MMR tests are likely plagued by type II errors, they may also be particularly prone to outcome reporting bias (ORB) resulting in elevated false positives (type I errors). We te...
2 Citations · Source
#1 Jonathan Wai (UA: University of Arkansas) H-Index: 12
#2 Diane F. Halpern (CMC: Claremont McKenna College) H-Index: 41
The open science or credibility revolution has divided psychologists on whether and how the “policy” change of preregistration and similar requirements will affect the quality and creativity of future research. We provide a brief history of how norms have rapidly changed and how news and social media are beginning to “disrupt” academic science. We note a variety of benefits, including more confidence in research findings, but there are possible costs as well, including a reduction in the number ...
1 Citation · Source