
Replications in Economics: A Progress Report

Published on Dec 3, 2014 in Econ Journal Watch
Maren Duvendack, H-index: 12
Richard Palmer-Jones, H-index: 18
W. Robert Reed, H-index: 20
Abstract
This study reports on various aspects of replication research in economics. It includes (i) a brief history of data sharing and replication; (ii) the results of the authors’ survey administered to the editors of all 333 “Economics” journals listed in Web of Science in December 2013; (iii) an analysis of 155 replication studies published in peer-reviewed economics journals from 1977 to 2014; (iv) a discussion of the future of replication research in economics; and (v) observations on how replications can be better integrated into research efforts to address problems associated with publication bias and other Type I error phenomena.
References (83)
Michael A. Clemens (CGD: Center for Global Development), H-index: 27
The welcome rise of replication tests in economics has not been accompanied by a single, clear definition of replication. A discrepant replication, in current usage of the term, can signal anything from an unremarkable disagreement over methods to scientific incompetence or misconduct. This paper proposes an unambiguous definition of replication, one that reflects currently common but unstandardized use. It contrasts this definition with decades of unsuccessful attempts to standardize terminolog...
34 citations
Laura Camfield (Department for International Development), H-index: 22
Maren Duvendack, H-index: 12
Richard Palmer-Jones, H-index: 18
The thrust for evidence‐based policymaking has paid little attention to problems of bias. Statistical evidence is more fragile than generally understood, and false positives are all too likely given the incentives of policymakers and academic and professional evaluators. Well‐known cognitive biases make bias likely for not dissimilar reasons in qualitative and mixed methods evaluations. What we term delinquent organisational isomorphism promotes purportedly scientific evaluations in inappropriat...
12 citations
John Gibson (University of Waikato), H-index: 35
David L. Anderson (Queen's University), H-index: 5
John Tressler (University of Waikato), H-index: 8
The ranking of an academic journal is important to authors, universities, journal publishers and research funders. Rankings are gaining prominence as countries adopt regular research assessment exercises that especially reward publication in high impact journals. Yet even within a rankings-oriented discipline like economics there is no agreement on how aggressively lower ranked journals are down-weighted or on how wide the universe of journals considered should be. Moreover, since it is typically less...
30 citations
Stan J. Liebowitz (UTD: University of Texas at Dallas), H-index: 26
This article examines how economics departments judge research articles and assign credit to authors. It begins with a demonstration that only strictly prorated author credit induces researchers to choose efficient sized teams. Nevertheless, survey evidence reveals that most economics departments only partially prorate authorship credit, implying excessive coauthorship. Indeed, a half-century increase in coauthorship may be better explained by incomplete proration than b...
22 citations
Annie Franco (Stanford University), H-index: 4
Neil Malhotra (Stanford University), H-index: 26
Gabor Simonovits (Stanford University), H-index: 6
We studied publication bias in the social sciences by analyzing a known population of conducted studies—221 in total—in which there is a full accounting of what is published and unpublished. We leveraged Time-sharing Experiments in the Social Sciences (TESS), a National Science Foundation–sponsored program in which researchers propose survey-based experiments to be run on representative samples of American adults. Because TESS proposals undergo rigorous peer review, the studies in the sample all...
342 citations
Vegard Iversen, H-index: 13
7 citations
Simone Schnall, H-index: 1
Johnson, Cheung, and Donnellan (2014a) reported a failure to replicate Schnall, Benton, and Harvey’s (2008) effect of cleanliness on moral judgment. However, inspection of the replication data shows that participants provided high numbers of severe moral judgments – a ceiling effect. In the original data, the percentage of extreme responses per moral dilemma correlated negatively with the effect of the manipulation. In contrast, this correlation was absent in the replications, due to almost all items...
15 citations
David J. Johnson (MSU: Michigan State University), H-index: 11
Felix Cheung (MSU: Michigan State University), H-index: 11
M. Brent Donnellan (MSU: Michigan State University), H-index: 48
Schnall, Benton, and Harvey (2008) hypothesized that physical cleanliness reduces the severity of moral judgments. In support of this idea, they found that individuals make less severe judgments when they are primed with the concept of cleanliness (Exp. 1) and when they wash their hands after experiencing disgust (Exp. 2). We conducted direct replications of both studies using materials supplied by the original authors. We did not find evidence that physical cleanliness reduced the severity of m...
52 citations
In empirical economics, a twofold lack of incentives leads to chronic problems with replicability: for authors of empirical studies, providing replicable material is not rewarded in the same way as publishing new irreplicable studies, and neither is authoring replication studies. We offer a strategy to set incentives for replicability and replication. By integrating replication studies into the education of young scholars, we raise awareness of the importance of replicability among the next gene...
1 citation
Zacharias Maniadis, H-index: 6
Fabio Tufano (University of Nottingham), H-index: 5
John A. List (U of C: University of Chicago), H-index: 75
Some researchers have argued that anchoring in economic valuations casts doubt on the assumption of consistent and stable preferences. We present new evidence that questions the robustness of certain anchoring results. We then present a theoretical framework that provides insights into why we should be cautious of initial empirical findings in general. The model importantly highlights that the rate of false positives depends not only on the observed significance level, but also on statistical po...
93 citations
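The point Maniadis, Tufano, and List make about false positives follows from Bayes' rule: among studies that report a statistically significant result, the share that are false positives depends not only on the significance level but also on statistical power and on the prior probability that the tested hypothesis is true. A minimal sketch of that arithmetic (the function name and the illustrative prior and power values below are my own, not from the paper):

```python
def false_positive_share(prior_true, alpha=0.05, power=0.80):
    """Share of statistically significant findings that are false positives.

    prior_true: prior probability that a tested hypothesis is actually true.
    alpha:      significance threshold (Type I error rate).
    power:      probability of detecting a true effect (1 - Type II error rate).
    """
    false_pos = alpha * (1 - prior_true)  # null is true, but the test rejects
    true_pos = power * prior_true         # effect is real and is detected
    return false_pos / (false_pos + true_pos)

# With well-powered tests of plausible hypotheses, few positives are false:
print(round(false_positive_share(0.5, power=0.80), 3))  # 0.059
# With low power and long-shot hypotheses, most "discoveries" are false:
print(round(false_positive_share(0.1, power=0.20), 3))  # 0.692
```

Holding the significance level fixed at 0.05, cutting power and the prior turns a roughly 6% false-positive share into nearly 70%, which is why initial findings in underpowered literatures warrant caution.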
Cited By (32)
Emma McManus (UEA: University of East Anglia), H-index: 3
David Turner (UEA: University of East Anglia), H-index: 31
Tracey Sach (UEA: University of East Anglia), H-index: 30
The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) modelling taskforce suggests decision models should be thoroughly reported and transparent. However, the level of transparency and indeed how transparency should be assessed are yet to be defined. One way may be to attempt to replicate the model and its outputs. The ability to replicate a decision model could demonstrate adequate reporting transparency. This review aims to explore published definitions of replication s...
1 citation
Stephan B. Bruns (GAU: University of Göttingen), H-index: 6
Johannes König (University of Kassel), H-index: 1
David I. Stern (ANU: Australian National University), H-index: 46
We replicate Stern (1993, Energy Economics), who argues and empirically demonstrates that it is necessary (i) to use quality-adjusted energy use and (ii) to include capital and labor as control variables in order to find Granger causality from energy use to GDP. Though we could not access the original dataset, we can verify the main original inferences using data that are as close as possible to the original. We analyze the robustness of the original findings to alternative definitions of variab...
4 citations
Six years ago, the International Initiative for Impact Evaluation (3ie) launched a programme to promote and fund replication studies of impact evaluations in international development. We designed the programme with the objective of improving the quality of evidence for development policy-making, using replication research to both validate the results of published impact evaluations and build the incentives for more transparent and high quality research going forward. The programme’s focus is in...
1 citation
Management literature may be populated by studies that report exaggerated levels of significance, and one potential solution to this problem is providing support for replication research. Drawing on the analysis of editorials published by top management journals between 1970 and 2015, I show how the issue of replication research was framed and discussed and how policy toward replication research was communicated to the readers. Only 67 of 1901 editorials published within that period inv...
Frank Mueller-Langer (MPG: Max Planck Society), H-index: 7
Benedikt Fecher (German Institute for Economic Research), H-index: 5
Gert G. Wagner (MPG: Max Planck Society), H-index: 43
(one further author not listed)
We investigate how often replication studies are published in empirical economics and what types of journal articles are eventually replicated. We find that from 1974 to 2014, 0.10% of publications in the Top 50 economics journals were replications. We take into account the results of replication (negating or reinforcing) and the extent of replication: narrow replication studies are typically devoted to mere replication of prior work, while scientific replication studies provide a broader analysis...
4 citations
There is growing interest in enhancing research transparency and reproducibility in economics and other scientific fields. We survey existing work on these topics within economics, and discuss the evidence suggesting that publication bias, inability to replicate, and specification searching remain widespread in the discipline. We next discuss recent progress in this area, including through improved research design, study registration and pre-analysis plans, disclosure standards, and open sharing...
18 citations
Benjamin Leiva (UGA: University of Georgia), H-index: 1
Zhongyuan Liu (UGA: University of Georgia), H-index: 1
Over two decades ago, Stern (1993) published evidence in this journal of unidirectional Granger-causality running from GDP to a raw measure of energy use, from a weighted measure of energy use to GDP, and of other relations pertaining to labor and capital. In this paper we verify the original results and conduct a two-fold reanalysis: on one hand we revisit Granger-causality using the Toda and Yamamoto (TY) procedure to account for integrated series, and on the other we study super exog...
3 citations
George Alter (Inter-university Consortium for Political and Social Research), H-index: 20
Richard Gonzalez (UM: University of Michigan), H-index: 41
8 citations