Addicted to constructs: science in reverse?
Article first published online: 30 MAY 2013
© 2013 Society for the Study of Addiction
Volume 108, Issue 9, pages 1532–1533, September 2013
How to Cite
Larsen, K. R., Voronovich, Z. A., Cook, P. F. and Pedro, L. W. (2013), Addicted to constructs: science in reverse? Addiction, 108: 1532–1533. doi: 10.1111/add.12227
- Issue published online: 16 AUG 2013
Advancement in science requires clarity of constructs. Like other fields in behavioral science, addiction research is being held back by researchers' use of different terms to mean similar things (synonymy) and the same term to mean different things (polysemy). Journals can help researchers to stay focused on novel and significant research questions by challenging new terms introduced without adequate justification and requiring authors to be parsimonious in their use of terms. To support construct lucidity, new modes of thinking about research integration are needed to keep up with the aggregate of relevant research.
Rampant synonymy (differently named identical constructs) and polysemy (identically named dissimilar constructs)—idiomatically, the Jingle and Jangle Fallacies [1, 2]—hinder behavioral science. A linguistic hypothesis attributes these problems to imprecise language, wherein fewer than 20% of all individuals express the same idea using the same words. An alternate, motivational hypothesis postulates that researchers are addicted to new-construct creation because innovation drives both promotion and funding. We posit that these two phenomena cause significant confusion, rendering current behavioral science a ‘quixotic pursuit’.
Information theory suggests that any test of the relationship between constructs X and Y diminishes the information yield of subsequent X–Y tests. If two divergently named but functionally identical constructs X′ and Y′ are later created—whether through opportunism or obliviousness—a test of the X′–Y′ relationship provides less value. The test also reduces the value of extant research by eroding the meaning of, and attention to, X and Y.
Multi-disciplinary fields such as addiction science may be particularly vulnerable to this scenario, because researchers from diverse disciplines express similar ideas using different words. Research that does not build on existing findings inhibits the science of human behavior and obstructs the formulation of meaningful interdisciplinary theory, which in turn hinders application of theory to practice. Data from the Inter-Nomological Network (INN) interdisciplinary construct search engine (http://inn.colorado.edu) show that for a sample of 500 Journal of Personality and Social Psychology (JPSP) papers published from 2005 to 2009, the average paper contains approximately 20 variables. Ignoring directionality, mediating and moderating relationships, the average paper therefore potentially extends our knowledge by no more than the 190 pairwise tests of relationships its 20 variables allow (20 × 19/2). However, no two constructs in different papers can, a priori, be assumed to be identical. As a result, in the JPSP set there are 10 000 variables and 95 000 testable hypotheses, but almost 50 million untested relationships. Furthermore, the hypothetical addition of a paper #501, with 20 potentially unrelated variables, again tests no more than 190 hypotheses but increases the number of unknown relationships by more than 200 000. Thus, in our current paradigm, potential knowledge increases arithmetically, while potential confusion increases quadratically. This is the behavioral sciences' ‘reverse progress problem’.
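The arithmetic behind the reverse progress problem can be checked with a short script. The paper and variable counts are the INN figures quoted above; everything else is elementary combinatorics:

```python
from math import comb

PAPERS = 500          # JPSP sample size reported by INN
VARS_PER_PAPER = 20   # average variables per paper

# Within one paper: undirected pairs among 20 variables
pairs_per_paper = comb(VARS_PER_PAPER, 2)      # 190

# Across papers, constructs cannot a priori be assumed identical,
# so every cross-paper pair is an unknown relationship.
total_vars = PAPERS * VARS_PER_PAPER           # 10 000 variables
testable = PAPERS * pairs_per_paper            # 95 000 testable hypotheses
all_pairs = comb(total_vars, 2)                # 49 995 000 total pairs
untested = all_pairs - testable                # 49 900 000 untested

# Adding paper #501 with 20 potentially unrelated variables:
new_tested = pairs_per_paper                   # still only 190 tests
new_unknown = VARS_PER_PAPER * total_vars      # 200 000 new cross-paper pairs

print(pairs_per_paper, testable, untested, new_unknown)
```

Each additional paper thus tests a constant 190 relationships while adding roughly 200 000 unknown ones, which is the linear-versus-quadratic gap the text describes.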
Relevance to addiction research
The requirement for research integration has traditionally been addressed by creating taxonomies, ontologies, reviews and meta-analyses that are often outdated before they are seen in print. With the exponential growth of knowledge, these approaches—while critical—are no longer sufficient. Journal editors and reviewers can improve cumulative behavioral science in addiction research by focusing carefully upon the underlying meaning of constructs and by requiring authors to link their construct terms more closely to existing research. Our own work on INN has shown that construct-level citations seldom exist, and when they do, they often trace back to differently named constructs—if the cited source contains relevant constructs at all. However, to function in this role, editors and reviewers must have access to new tools to clarify constructs.
The US National Cancer Institute (NCI) funds a large amount of research on smoking, an addictive behavior linked to lung cancer. NCI developed the Grid-Enabled Measures (GEM) database to address the reverse progress problem through expert consensus on a reduced set of high-quality constructs and measures. GEM is a web-based tool that allows researchers to examine cross-domain constructs such as substance abuse, smoking/tobacco use and stress. It allows users to submit additional constructs of interest, to identify similar terms and to comment on other researchers' submissions. Our recent analysis found that of 238 GEM constructs, there were four exactly synonymous self-efficacy constructs, only two of which were actually called ‘self-efficacy’. Similarly, the constructs ‘excessive alcohol use’ and ‘tobacco use’ each had a synonymous construct, with excessive alcohol use also appearing as ‘problematic drinking’ (clear synonymy) and tobacco use defined either as any use or specifically as smoking (potential polysemy). Additional evidence of polysemy has been found for the constructs alcohol-related ‘aggression’ and ‘self-control’, each of which has been measured in multiple ways that intercorrelate poorly yet are still presumed to measure identical constructs. Although GEM originated as a repository of only the highest-quality constructs, measures and data sets with minimal overlap, its Web 2.0 design leaves it vulnerable to both linguistic and motivational synonymy and polysemy problems. GEM could take advantage of this reality and encourage inclusion of synonymous and polysemous constructs, while resolving the associated problems by mapping the constructs onto higher-level frameworks or ontologies of meaning.
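The mapping step proposed above can be sketched minimally: resolve each reported construct name to a canonical construct before integrating results, so that synonyms collapse to a single entry. The names and mapping below are hypothetical illustrations built from the examples in this section, not GEM's actual schema:

```python
# Hypothetical synonym table: reported name -> canonical construct.
# Entries are illustrative, drawn from the synonymy examples in the text.
CANONICAL = {
    "problematic drinking": "excessive alcohol use",
    "perceived self-efficacy": "self-efficacy",
    "self-efficacy beliefs": "self-efficacy",
}

def canonicalize(name: str) -> str:
    """Map a reported construct name to its canonical construct."""
    key = name.lower().strip()
    return CANONICAL.get(key, key)

# Five reported names collapse to three canonical constructs.
reported = ["Self-Efficacy", "perceived self-efficacy",
            "problematic drinking", "excessive alcohol use", "tobacco use"]
distinct = {canonicalize(n) for n in reported}
print(sorted(distinct))
```

A real system would also need the reverse operation—splitting a polysemous name such as ‘tobacco use’ into its distinct meanings—which requires construct definitions, not just name matching.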
Behavioral science needs an effort analogous to the work of the HUGO Gene Nomenclature Committee. This committee reduced a bewildering array of proposed gene names down to a manageable set while providing links to all known synonymous gene names. A similar national-level effort to address construct diffusion—perhaps with GEM at the center—is needed in behavioral science. Higher-level classification of constructs such as the Theoretical Domains Framework [8, 9] can accelerate these taxonomic efforts and increase research transparency. Individually, researchers must clearly identify constructs in published papers and develop automated nomenclature tools capable of tracking theories and extracting and integrating constructs from papers. The task of integrating research by connecting synonymous constructs and parsing polysemous constructs is an urgent one if behavioral science is to advance. Not addressing this problem may force us to accept behavioral science merely as a solipsistic and proprietary pursuit that feeds upon rather than nourishes science and the populations it purports to benefit.
Declaration of interests
Sources of funds include National Institutes of Health/National Cancer Institute ($24 900 for ‘Semi-automatic detection of synonymous constructs in the Grid-Enabled Measures (GEM) database’, 2012–2013. PI) and grant no. NSF 0965338: National Science Foundation (NSF; $358 000 for ‘TLS and DAT: Construct Utilization in the Behavioral Sciences’, 2010–2013. PI with Jintae Lee and Eliot Rich).
- 1. Thorndike E. L. An Introduction to the Theory of Mental and Social Measurements. New York: Teachers College, Columbia University; 1904.
- 2. Kelley T. L. Interpretation of Educational Measurements. Oxford: World Book Company; 1927.
- 4. Mischel W. The toothbrush problem. APS Obs 2008; 21: 1–3.
- 7. Duckworth A. L., Kern M. L. Jingle Jangle: A Meta-Analysis of Convergent Validity Evidence for Self-Control Measures [manuscript]. University of Pennsylvania, Department of Psychology; 2009.