Simplifying complexity: Mixture toxicity assessment in the last 20 years

  • See Table S1 for the number of citations and rank of all the “Top 100” papers, which in this essay are references [7,8].

Chemical monitoring provides ample evidence that organisms in their environment are exposed to complex cocktails of contaminants rather than to individual chemicals. Thus, the challenge arises of how to assess the potential combined effects of such mixture exposures. The body of toxicological and ecotoxicological literature in the 1990s suggested that exposure to chemical mixtures may indeed cause combined effects of toxicological significance in humans and the environment. Furthermore, abundant statements seemed to agree that synergy was a major problem for environmental risk assessment. At the same time, it was clear that assessing synergies by experimentally testing all possible mixtures occurring or anticipated in the environment was not a viable option for risk assessment. Moreover, the existing uncertainty factors in chemical risk assessment were not intended to account for mixture effects, and formulating additional factors that would ensure the desired level of protection could not be based on any evidence. Chemical risk assessment therefore seemed stuck in a deadlock without any management option, and revisiting component-based methodologies for assessing combined effects appeared to be the most promising way forward.

The empirical evidence of the time was derived mainly from studies of binary pesticide mixtures in short-term aquatic toxicity assays, which were analyzed by a plethora of different and often conflicting approaches. The confusion in terminology, methods, and conceptual premises reached almost Babylonian proportions, which led to the widespread perception of unpredictability as the dominating feature of mixture (eco)toxicology. The various mathematical models and tools available fostered the assessment of combined effects without explicitly formulating the underlying, often fundamentally different, assumptions of the different approaches. Furthermore, these assumptions were all too often not translated into consistent guidance for the experimentalist, and purely descriptive tools seemed too simplistic for a sound mixture toxicity assessment.

Synergism was a commonly used term for describing the toxicity of a mixture. However, it was often overlooked that synergism, like any other assessment term, is a comparative statement: it describes a mixture toxicity that is substantially higher than expected. Its use therefore requires a definition of the expected response of a mixture; only such a definition allows an unambiguous use, by comparing experimental observations with the expected response. The fundamental question of whether knowledge of the individual toxicities of the mixture components would allow the prediction of their combined effect therefore remained largely unanswered.

This started to change with a series of pioneering studies by Hermens and coworkers [1], which showed that the simple assumption of concentration addition provided quantitatively reasonable expectations for the joint toxic effects of multi-component mixtures of nonpolar, nonreactive organic compounds. In human-oriented environmental risk assessment, the debate in the United States at that time focused on modes of action, aiming to identify compounds that act similarly and could therefore be assessed together on the basis of the concentration addition (CA) principle (e.g., Mileson et al. [2]). In addition, a general belief was that determining effect thresholds for individual chemicals was sufficiently protective for most mixtures [3].

Efforts to identify patterns of combined effects for different types of mixtures led to the idea of comparatively testing the predictive power of the general concepts of CA (also called dose addition or Loewe additivity) [4] and independent action (IA; also known as response addition, effect multiplication, or Bliss independence) [5, 6], instead of developing mixture-specific models. This strategy also allowed tackling the question of whether the effect thresholds of individual mixture components are protective against combination effects. A series of studies was designed to test multi-component mixtures whose components had well-known mechanisms and modes of action, investigating the following: 1) reference mixtures composed of compounds with entirely similar and with entirely dissimilar action [7, 8]; 2) mixtures with components at concentrations at and below those causing statistically significant effects [9]; and 3) 2 different biological systems (bacteria and algae) to provide a proof of principle [7, 9]. As a prerequisite for analyzing low effect concentrations in the mixture experiments, rigorous concentration-response estimations were performed, employing a so-called “best-fit modeling” based on the simultaneous application of different nonlinear concentration-effect regression models [10].
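
For orientation, the two reference concepts can be stated in their commonly used form (a standard formulation; the notation is chosen here for illustration and is not taken from the cited studies). CA predicts the total concentration of a mixture with molar component fractions $p_i$ that provokes a given effect level $x$ from the individual effect concentrations $\mathrm{EC}_{x,i}$, whereas IA combines the fractional effects $E_i$ that the components would cause individually at their concentrations in a mixture of total concentration $c_{\mathrm{mix}}$:

$$\mathrm{EC}_{x,\mathrm{mix}} = \left(\sum_{i=1}^{n} \frac{p_i}{\mathrm{EC}_{x,i}}\right)^{-1}, \qquad E_{\mathrm{mix}}(c_{\mathrm{mix}}) = 1 - \prod_{i=1}^{n}\left[1 - E_i\!\left(p_i \, c_{\mathrm{mix}}\right)\right]$$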

The investigations demonstrated a surprisingly consistent agreement between the observed combination effects and those expected from the information on the individual toxicities, the molar fractions of the components, and their modes of action [7-9]. In fact, the observed combined effects of the multi-component mixtures were, independent of the actual mixture ratios, effect levels, chemical composition, biological endpoint, and test organism, almost perfectly captured by the predictions based on CA for the mixtures of anticipated similar action and on IA for those of anticipated dissimilar action. Moreover, there was clear evidence of combination effects even at concentrations that did not cause significant effects for the individual components [11, 12]. While this was conceptually to be expected for similarly acting compounds, the finding initially came as a surprise for dissimilarly acting components and has since been traced back to our understanding of low-dose effects.

The idea of using the generic predictive concepts directly took off in subsequent studies. Instead of using various models specific to a certain organism or mixture, CA and IA were thereafter employed more and more often right from the start for designing and assessing mixture studies. Different mixture types composed of pesticides, pharmaceuticals, metals, endocrine disruptors, or mycotoxins, to name a few, were systematically investigated, and reviews became available (e.g., for pesticides [13]). Studies collected evidence in assays of different complexity, ranging from developmental endpoints in rodents, through micro- and mesocosms, to cellular and receptor-binding assays (e.g., Kortenkamp et al. [14]).

The finding that multi-component mixture experiments could be more informative than seemingly simpler binary mixture studies, provided that solid knowledge of the concentration-response relationships of the individual compounds is at hand, proved especially inspiring for a whole range of studies that contributed to the discussion on the relevance of endocrine disruption for human health and the environment [14].

The notion of following the reductionist idea of CA or IA as a reasonable reference prediction was subsequently developed further for mixtures of similarly and dissimilarly acting compounds by integrating both CA and IA into a joint model [15]. This joint model was later adapted for combining measured and QSAR-predicted effects in the assessment of mixtures of compounds and their transformation products [16] and for calculating potentially affected fractions of species, and it was taken on board in the effect-directed analysis of complex contaminated environmental samples for the identification and confirmation of effect-contributing substances. Most recently, IA is becoming a popular reference case for assessing the joint effect of chemical and nonchemical stressors [17].
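
To make the reasoning behind such a joint use of CA and IA concrete, the following is a minimal sketch of one common way to combine the two concepts: components are grouped by their assumed mode of action, CA is applied within each group, and the group effects are then combined by IA. It is an illustration under stated assumptions (log-logistic individual concentration-response curves, arbitrary example values), not a reimplementation of the joint model of [15]; all function names and parameters are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq

def ecx(x, ec50, slope):
    """Concentration provoking fractional effect x under a log-logistic
    concentration-response curve E(c) = 1 / (1 + (ec50 / c) ** slope)."""
    return ec50 * (x / (1.0 - x)) ** (1.0 / slope)

def ca_effect(conc, params):
    """CA-predicted effect of one group of (assumed) similarly acting
    components: the effect level x at which the toxic units sum to 1."""
    if all(c == 0 for c in conc):
        return 0.0
    def toxic_units_minus_one(x):
        return sum(c / ecx(x, ec50, slope)
                   for c, (ec50, slope) in zip(conc, params)) - 1.0
    return brentq(toxic_units_minus_one, 1e-9, 1.0 - 1e-9)

def two_step_prediction(groups):
    """CA within each mode-of-action group, then IA (Bliss independence)
    across groups: E_mix = 1 - prod(1 - E_group)."""
    group_effects = [ca_effect(conc, params) for conc, params in groups]
    return 1.0 - float(np.prod([1.0 - e for e in group_effects]))

# Hypothetical example: two similarly acting components in one group,
# one dissimilarly acting component in a second group (units arbitrary).
groups = [
    ([0.02, 0.05], [(0.10, 2.0), (0.20, 1.5)]),  # group 1, combined by CA
    ([0.01],       [(0.05, 3.0)]),               # group 2, joined via IA
]
print(f"Predicted fractional mixture effect: {two_step_prediction(groups):.2f}")
```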

In particular, the finding that combined effects may be caused even by low or individually insignificant toxic concentrations of the components calls for explicit mixture risk assessment. Regulatory activities to account for this knowledge in Europe started to take off with the Mixture Conclusion of the Council of European Ministers in 2009, the requested state-of-the-art report on mixture toxicity [14], and the follow-up opinion by the European Scientific Committees [18], which then culminated in the Communication from the Commission on the combination effects of chemicals (COM/2012/0252 final). Similar activities and reviews were pursued, for example, by the US EPA, Sweden, the UK, Norway, and the WHO IPCS program [19, 20].

Translating the findings from experimental mixture studies into regulatory proposals requires the development of risk assessment approaches that make optimum use of the available exposure and toxicity data on the individual chemicals. Any approach to this end has to recognize data and knowledge gaps and identify different assessment options in order to provide reasonable, protective, and pragmatically useful schemes [21]. In human risk assessment, the perspective is still largely focused on mixtures of similarly acting compounds, which often leads to monographic efforts to review the available knowledge on the mixture components' modes and mechanisms of action. By contrast, in environmental risk assessment, simulation studies have shown the suitability and feasibility of rather generic worst-case approaches that derive from CA with some additional simplifications. The latter opens the route to a more comprehensive and straightforward mixture assessment approach, which can be adapted to the specific needs of different regulatory frameworks. Specific schemes have since been proposed, for example, for biocides, pesticides, and fracking fluids.
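
One frequently discussed simplification of this kind (a standard formulation, given here only for illustration and not as the specific scheme of any cited framework) is to sum the risk quotients of the individual components, in analogy to the hazard index used in human health assessment:

$$\mathrm{RQ}_{\mathrm{mix}} = \sum_{i=1}^{n} \frac{\mathrm{PEC}_i}{\mathrm{PNEC}_i}$$

A value above 1 then flags a potential mixture risk. Because the individual PNEC values typically refer to different species and endpoints, such a sum is a conservative surrogate for a strict CA calculation rather than a direct application of it.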

The reductionist approach of CA and IA has proven useful for the quantitative prediction and assessment of combined effects for various chemicals, biosystems, and exposure settings, but the biological responses employed have been limited mainly to apical endpoints such as growth, reproduction, or lethality. Future work could therefore explore the utility of the concepts when extended to the molecular level. The area of toxicogenomic studies, and specifically transcriptome, proteome, or metabolome investigations, has opened novel routes toward understanding modes of action and thus perspectives for interspecies or short-term to long-term effect extrapolation [22]. Meanwhile, over 40 experimental studies have investigated the applicability of such approaches to mixture assessment. However, they have failed to deliver an unambiguous picture, possibly because of a lack of coherence between the studied hypotheses, designs, and assessments [23]. Similarly, an area still largely underexplored is that of responses of systems more complex than the individual organism or population. A promising line of discussion has developed around community-based studies, mainly using microbial assemblies and their accessible functional traits (e.g., Knauert et al. [24]), where a major challenge seems to lie in devising reliable tools for detecting responses to different modes of toxic action. For ecologically more complex systems (e.g., micro- and mesocosms comprising invertebrates or even fish), however, empirical data on the effects of multi-component mixtures are still lacking.

Next, we have to acknowledge that no matter how accurate combined effect predictions might be, we are currently constrained to assuming static mixture exposure scenarios, which may be reasonable worst cases in prospective chemical assessment. For fluctuating exposures in site-specific situations, this is clearly unsatisfactory. Here, process-based models have been suggested as a way forward [25], but again their current evidence base is rather limited, with a prevailing focus on invertebrate studies. The way ahead would need to entail a more systematic identification of the essential information required for higher-tier combined effect assessment.

One may hope that the substantial progress made in understanding the ecotoxicology of mixtures over the last decade also percolates into human risk assessment. This applies specifically to mixtures of compounds that do not act strictly similarly. A data gap often encountered in human health as well as environmental assessment is the lack of sound, holistic exposure data. One focus for improving the realism of future assessments could therefore be to work toward better exposure estimates, as already discussed under the heading of exposome research [26].

Despite all the progress in predicting mixture toxicity from knowledge of the components, there are cases of synergism or antagonism between contaminants, that is, clear deviations from the predictions derived from CA or IA. These may be caused by physico-chemical interactions or by interactions in the toxicokinetic or toxicodynamic phases, as has been discussed for metal mixture effects [27, 28]. Until now, we have not succeeded in deriving principles that would allow the prediction of such deviations and consequently prevent an underestimation of mixture effects in humans and the environment.
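
Such deviations are often quantified by a model deviation ratio (a metric commonly used in the mixture literature, mentioned here for illustration rather than taken from the cited studies), $\mathrm{MDR} = \mathrm{EC50}_{\mathrm{predicted}} / \mathrm{EC50}_{\mathrm{observed}}$, with values substantially greater than 1 indicating synergism and values substantially smaller than 1 indicating antagonism relative to the chosen reference model.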

Supplemental Data

Table S1. (49 KB PDF).
