REVISITING THE FILE DRAWER PROBLEM IN META-ANALYSIS: AN ASSESSMENT OF PUBLISHED AND NONPUBLISHED CORRELATION MATRICES

Authors


  • A previous version of this paper was presented at the meetings of the Academy of Management, San Antonio, TX, August 2011. Also, a previous, much-abbreviated version excluding the detailed numerical results reported in this paper was published in the Academy of Management Best Paper Proceedings (2011). We are grateful to each of the researchers who anonymously and so graciously provided us with data for Study 3. These individuals serve as an example of scholarship and collegiality.

Herman Aguinis, Department of Management and Entrepreneurship, Kelley School of Business, Indiana University, 1309 E. 10th Street, Suite 630D, Bloomington, IN 47405–1701; haguinis@indiana.edu.

Abstract

The file drawer problem rests on the assumption that statistically non-significant results are less likely to be published in primary-level studies and less likely to be included in meta-analytic reviews, thereby resulting in upwardly biased meta-analytically derived effect sizes. We conducted 5 studies to assess the extent of the file drawer problem in nonexperimental research. In Study 1, we examined 37,970 correlations included in 403 matrices published in Academy of Management Journal (AMJ), Journal of Applied Psychology (JAP), and Personnel Psychology (PPsych) between 1985 and 2009 and found that 46.81% of those correlations are not statistically significant. In Study 2, we examined 6,935 correlations used as input in 51 meta-analyses published in AMJ, JAP, PPsych, and elsewhere between 1982 and 2009 and found that 44.31% of those correlations are not statistically significant. In Study 3, we examined 13,943 correlations reported in 167 matrices in nonpublished manuscripts and found that 45.45% of those correlations are not statistically significant. In Study 4, we examined 20,860 correlations reported in 217 matrices in doctoral dissertations and found that 50.78% of those correlations are not statistically significant. In Study 5, we compared the average magnitude of a sample of 1,002 correlations from Study 1 (published articles) versus 1,224 from Study 4 (dissertations) and found that they were virtually identical (i.e., .2270 and .2279, respectively). In sum, our 5 studies provide consistent empirical evidence that the file drawer problem does not produce an inflation bias and does not pose a serious threat to the validity of meta-analytically derived conclusions as is currently believed.
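The counts of "not statistically significant" correlations reported above rest on the standard test of a Pearson correlation against zero given the study's sample size. As a minimal sketch only (assuming a conventional two-tailed test at alpha = .05; the function name and example values are illustrative and are not taken from the paper's method sections):

```python
from math import sqrt
from scipy import stats

def is_significant(r: float, n: int, alpha: float = 0.05) -> bool:
    """Two-tailed test of a Pearson correlation r against zero, sample size n."""
    if n <= 2 or abs(r) >= 1:
        raise ValueError("need n > 2 and |r| < 1")
    t = r * sqrt(n - 2) / sqrt(1 - r ** 2)   # t statistic with n - 2 degrees of freedom
    p = 2 * stats.t.sf(abs(t), df=n - 2)     # two-tailed p value
    return p < alpha

# The same correlation can be non-significant or significant depending on n:
print(is_significant(0.15, 100))  # False (p ~ .14)
print(is_significant(0.15, 200))  # True  (p ~ .03)
```

This illustrates why the proportion of non-significant correlations in a matrix depends jointly on the magnitudes of the correlations and on sample size, which is the quantity the five studies tally.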
