In the 20th century, predicting ecological risk from the use of certain chemicals relied on testing programs that directly measured adverse outcomes (death, disease, reproductive failure, or developmental dysfunction) using in vivo toxicity tests. Extrapolation from these tests—from one species to another or from controlled laboratory tests to uncontrolled real-world environments—was based largely on conservative assumptions or arbitrary uncertainty factors. The result? Costly, time-consuming, unfocused, and contentious assessments that often failed to inspire public confidence in related regulatory and policy decisions.

But in the 21st century, risk assessment can benefit from our increased understanding of how biological systems respond to subtle perturbation, from our increased capacity to measure and monitor biological states, and from our improved ability to analyze, integrate, manage, and model complex data using modern computers. Advances in 21st-century biological methods, including transcriptomics, proteomics, metabolomics, in vitro assays, and high-throughput technologies, enable us to simultaneously examine effects at the cell and tissue levels, while improving the efficiency, economy, and reliability of the data and the methods used to collect them. The dramatic increase in the amount of toxicological data we can collect and analyze is complemented by our improved ability to integrate, synthesize, and manage complex data sets through the power of computational biology and bioinformatics. Capitalizing on these 21st-century tools can accelerate our rate of discovery and enhance our ability to classify and characterize chemicals according to the ways in which they disrupt normal biological activity (their modes of action). The result is potentially more cost-effective, timely, and informative assessments, as well as regulations and policies with greater transparency and validity, thereby inspiring greater public confidence in regulatory approaches for human and environmental protection.

The prospect of supporting ecotoxicology's transition from the 20th to the 21st century drew 44 expert scientists to a SETAC Pellston workshop in Forest Grove, Oregon, USA, in April 2009. They represented academia, government, and business in seven countries as well as diverse disciplines such as biochemistry, ecology, molecular biology, toxicology, bioinformatics, and health and environmental risk assessment. Their work resulted in five papers that are included in this issue of Environmental Toxicology and Chemistry, each addressing a critical aspect of the transition from the traditional 20th-century paradigm of ecological risk assessment (ERA) to an evolving, groundbreaking new approach.

The History


Most current toxicity and ecotoxicity testing programs performed by regulatory agencies or by regulated parties continue to focus on directly measuring traditional adverse outcomes from in vivo toxicity tests. In contrast, 21st-century mechanistic and alternative data may be considered in some ERA scenarios, but we lack formal regulatory structures for generating and applying such data routinely. In addition, the extrapolation of data—from one species to another, from concentrations that cause significant responses in the laboratory to realistic exposure concentrations in the environment, from controlled laboratory conditions to complex uncontrolled environments, to name a few—has been and continues to be characterized by conservative assumptions and arbitrary uncertainty factors.


What 21st-century ecotoxicology can do

Researchers have, for the first time, identified the mechanisms of action of two toxins released by certain microalgae, which contaminate fish and shellfish and then become toxic to humans.

These findings obtained in vitro explain the neurotoxicity of the phycotoxins, constitute a first step toward antidotes that might become a sanitary and economic necessity, and raise hopes for new, reliable, inexpensive tests that could detect the toxins in shellfish offered to consumers.

—CNRS (Délégation Paris Michel-Ange) (March 17, 2010). The mode of action of certain toxins that accumulate in seafood. ScienceDaily. Retrieved August 5, 2010 from http://www.sciencedaily.com/releases/2010/03/100311092118.htm

The traditional toxicity-testing paradigm has certainly played an important role in addressing many contaminant-related issues that plagued the latter half of the 20th century. Based on the best available and practical science and technology of its time, it is viewed by many as the “gold standard” by which all other approaches should be measured. Nevertheless, if we are to address the nuanced and diverse contaminant issues of the 21st century, our toxicity-testing paradigm must evolve 2–6. In many respects, in vivo toxicity testing and the concept of “one problem, one test” 7, which purports to design a single definitive toxicity test for each adverse outcome, have proved unwieldy, particularly as the inventory of chemicals requiring assessment has grown. That inventory has grown as international public concern about chemicals in the environment has expanded and as legislative requirements have increased in response: REACH legislation in Europe, the U.S. Food Quality Protection Act and Safe Drinking Water Act of 1996, and the Canadian Environmental Protection Act of 1999. Voluntary programs, such as the U.S. Environmental Protection Agency's High Production Volume Challenge Program, have added to testing demands as well.

Under these pressures, testing and data needs have expanded dramatically for most industrial chemicals. However, in vivo testing is time-consuming and costly, in terms of both money and animals. As a result, only certain high-risk classes of chemicals have been subjected to intensive testing. For example, pesticides long have been subjected to significant amounts of testing and assessment under legislative requirements such as the U.S. Federal Insecticide, Fungicide and Rodenticide Act, whereas little or no toxicity data exist for a large percentage of the chemicals currently in commerce 2, 3, 8, 9.

In the absence of more efficient and cost-effective routine testing methods, risk–benefit decisions often are based on sparse data and predictive models of questionable accuracy and applicability 8, 10. Even under regulatory programs that require substantial testing, often only a subset of the data drives the final risk assessment. The classic example is pesticide registration in the United States and many other countries, where a defined battery of toxicity studies must be submitted to a regulatory authority, but the final risk assessment may be based on only the few most sensitive toxicity tests or endpoint responses. This suggests that efficiency could be improved by incorporating both mechanistic data that could focus testing 3–5 and bioinformatic approaches that more effectively integrate and use available data.

If we could effectively predict the most likely sensitive endpoints, we could potentially eliminate the need to conduct the entire battery of tests, thereby saving time, resources, and animals. It comes as no surprise, then, that calls are increasingly heard to apply innovative approaches and new paradigms to the long-standing challenges of toxicological risk assessment and regulation 2.

Impetus for Change


Paradigm Shift for Human Health Toxicity

In 2007, the U.S. National Research Council (NRC) Committee on Toxicity Testing and Assessment of Environmental Agents proposed a significant paradigm shift in the way human health-oriented toxicity testing should be done in the future 2. Specifically, the NRC concluded that a need existed to “transform toxicity testing from a system based on whole animal testing to one founded primarily on in vitro methods,” envisioning “a new toxicity testing system that evaluates biologically significant perturbations in key toxicity pathways” using a combination of computational biology and a comprehensive array of high-throughput in vitro tests, preferably with cells and tissues of human origin 2.

Although the NRC did not advocate eliminating all in vivo testing, the committee recommended relegating it to a supporting role, focused on addressing specific uncertainties and risk assessment questions. Instead, the NRC recommended that mechanistically based, high-throughput in vitro testing become the primary foundation for characterizing dose–response relationships and for assessing hazards. Models with quantifiable uncertainty would gradually replace many of the assumptions and arbitrary uncertainty factors of traditional 20th-century risk assessment 2, 11.

The NRC acknowledges that substantial resources will be required to fully understand toxicity pathways and to develop predictive models that support this new paradigm 2. However, this approach calls for a reallocation of resources rather than a large net increase: fewer resources will be invested in generating data specific to relatively few chemicals, whereas greater resources will be invested in making more effective use of available data and in developing knowledge that can be generalized to many chemicals.

Our increasing ability to measure tens, hundreds, or thousands of biological endpoints simultaneously and to conduct targeted, mechanistically based, in vitro assays in a high-throughput manner gives us the potential to generate a wealth of toxicologically relevant data in an efficient manner. Can ecotoxicology capitalize on this ability and adopt new approaches to manage and interpret these data to quickly and effectively screen large numbers of chemicals and inform risk management decisions?

Challenges of a Paradigm Shift


Two challenges impede ecotoxicology's transition to a 21st-century paradigm in ERA:

  • Linkage: We must establish credible links between responses measured at the cell or tissue level and adverse outcomes traditionally measured at the whole-animal or population level.

  • Extrapolation: We must develop biologically based, quantitative extrapolation tools or models that allow us to apply cell- or tissue-level data to individuals, or individual-level data to entire populations.

If we are to successfully incorporate wider use of suborganismal tests or alternative data types into ERA, we will need to address both challenges, as the SETAC workshop participants did in their plenary discussions and workgroups. Ultimately, their intent was not to consider the entire breadth of scientific development needed for ecotoxicology to achieve something akin to the NRC's vision for human health toxicology; rather, they focused on key challenges that limit the use of alternative endpoints and data in ERA and on how 21st-century science and technology can address those challenges.

Meeting the Challenges


Meeting the challenges of a paradigm shift requires moving beyond simply determining the concentration of a chemical that causes a specific adverse outcome we wish to prevent. We also must understand how chemicals adversely affect cellular mechanisms (i.e., toxicity pathways), and how those perturbations can extend to individuals and ultimately to the primary unit of concern for ERA—populations. Meeting the challenges also requires that we use knowledge from these adverse outcome pathways (AOPs) to develop more efficient testing and assessment methods. Finally, meeting the challenges will require that we develop models to support credible extrapolation beyond the constraints of the tests themselves.

Linkage: Adverse Outcome Pathways

Adverse outcome pathways represent a sequence of events that begins with a molecular initiating event, spans multiple levels of biological organization, and ends with an adverse outcome 12. By describing not just the organism- or population-level outcome but also the initiating event and key events in between, AOPs build on 20th-century toxicity paradigms to provide a conceptual framework that can facilitate the use of alternative endpoints and data in ERA 12.
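To make that structure concrete, the sketch below (a minimal illustration, not code from the workshop papers) represents an AOP as an ordered chain of key events spanning levels of biological organization; the class names and the example events are hypothetical, loosely patterned on the domoic acid case discussed later in this article.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyEvent:
    """A measurable event at one level of biological organization."""
    description: str
    level: str  # e.g., "molecular", "cellular", "organ", "individual"

@dataclass
class AdverseOutcomePathway:
    """Minimal AOP: initiating event -> ordered key events -> adverse outcome."""
    molecular_initiating_event: str
    key_events: List[KeyEvent] = field(default_factory=list)
    adverse_outcome: str = ""

    def describe(self) -> str:
        steps = [f"[molecular] {self.molecular_initiating_event}"]
        steps += [f"[{e.level}] {e.description}" for e in self.key_events]
        steps.append(f"[individual/population] {self.adverse_outcome}")
        return " -> ".join(steps)

# Hypothetical chain for illustration only.
aop = AdverseOutcomePathway(
    molecular_initiating_event="activation of glutamate receptors",
    key_events=[
        KeyEvent("sustained neuronal excitation", "cellular"),
        KeyEvent("excitotoxic damage in the central nervous system", "organ"),
    ],
    adverse_outcome="impaired behavior, reduced survival",
)
print(aop.describe())
```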

A fully developed AOP is synonymous with a mechanism of action—a complete and detailed understanding of each and every step in the sequence of events leading to a toxic outcome (http://www.ecetoc.org; 4, 12, 13). Where gaps exist in an AOP, weight of evidence or statistical correlations can establish links between exposure and adversity 12.

Defining AOPs and their associated toxicity pathways can inform the development of effective alternative endpoints and data for use in hazard assessments 2, 12. For example, if we identify a molecular initiating event in an AOP 12, we have defined a critical interaction that can then be modeled to develop quantitative structure–activity relationships (QSARs) that predict the likelihood of a chemical interacting with a specific target. Similarly, defining a key toxicity pathway provides us with information critical to developing predictive in vitro tests. Biomarker responses that can be contextualized in an AOP gain both predictive and diagnostic credibility through their links to both the initiating event and the outcome. Although defining AOPs will not render all proposed QSARs, in vitro tests, or biomarkers relevant for ERA, the process will greatly aid our identification and development of those few responses that may be truly informative.
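As a toy illustration of the QSAR idea described above (not any specific published model), one could regress a potency measure on simple molecular descriptors and use the fitted relationship to screen untested chemicals. All descriptor values and potencies below are invented for the sketch.

```python
import numpy as np

# Hypothetical training set: each row is a chemical described by two
# simple descriptors (logKow and molecular weight / 100); the response is
# an invented log(potency) toward a specific molecular target.
descriptors = np.array([
    [1.5, 0.9],
    [2.1, 1.2],
    [3.0, 1.8],
    [3.8, 2.3],
])
log_potency = np.array([0.4, 0.9, 1.7, 2.2])

# Ordinary least squares with an intercept column.
X = np.hstack([np.ones((descriptors.shape[0], 1)), descriptors])
coef, *_ = np.linalg.lstsq(X, log_potency, rcond=None)

# Screen an untested (hypothetical) chemical: [intercept, logKow, MW/100].
new_chemical = np.array([1.0, 2.6, 1.5])
print("predicted log(potency):", new_chemical @ coef)
```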

At the SETAC workshop, two groups developed complementary approaches for constructing AOPs that support predictive ecotoxicology. Watanabe et al. (p. 9) developed strategies for deriving AOPs and designing associated computational models using data from current scientific literature. Participants chose domoic acid as a case study, given its ecotoxicological and human health significance and its vast body of available literature, and synthesized relevant published data into a conceptual AOP model. The well-known cause of domoic acid poisoning facilitated the identification of potential molecular initiating events, a toxicity pathway, and associated adverse outcomes (Fig. 1). Furthermore, they showed how knowledge of that AOP could be used to design 21st-century in vitro assays for prospective assessments as well as appropriate biomarkers for diagnostic assessments. Finally, they considered how mechanistic models could support the predictive use of in vitro or biomarker responses in quantitative risk assessments.

Figure 1. Adverse outcome pathway of domoic acid. (Photo of Pseudo-nitzschia courtesy of Rafael Kudela)

Another workgroup, Perkins et al. (p. 22), focused on new technologies and the application of ecotoxicogenomics. They demonstrated how ecotoxicogenomics can be combined with bioinformatics and computational biology to enable the unsupervised discovery of key nodes (genes, proteins, or metabolites) and other physiological functions impacted by chemical stressors. This threefold approach forms the basis for reverse-engineering elements that are important in responses to stressors and for the discovery of AOPs not previously elucidated through traditional hypothesis-based experimentation. Although Watanabe et al. use and build on a tremendous foundation of existing knowledge, the reverse-engineering approach employed by Perkins et al. is poised to facilitate new discoveries. Thus, the two general approaches for AOP discovery are complementary and should both be brought to bear on the challenge of the paradigm shift (Fig. 2).
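The sketch below conveys the unsupervised flavor of this approach using synthetic data and off-the-shelf clustering tools; it is a minimal stand-in for the workgroup's actual pipeline, which operates on real transcriptomic, proteomic, or metabolomic measurements.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Synthetic gene-by-condition expression matrix: 20 genes across 6 exposure
# levels. Eight genes are given a shared dose-dependent trend so that
# clustering can rediscover them without any prior hypothesis.
rng = np.random.default_rng(seed=1)
expression = rng.normal(0.0, 0.3, size=(20, 6))
expression[:8] += np.linspace(0.0, 2.0, 6)

# Agglomerative clustering on correlation distance between gene profiles;
# genes with similar dose-response shapes fall into the same cluster, and
# such co-responding groups are candidate key nodes of a pathway.
tree = linkage(expression, method="average", metric="correlation")
labels = fcluster(tree, t=2, criterion="maxclust")
print("cluster label per gene:", labels)
```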

Figure 2. Proposed schema for prospective ecological risk assessment in the 21st century. Dashed red lines indicate dominant traditional approach. Black lines indicate potential 21st-century approaches. AOP = adverse outcome pathway; PBTK = physiologically based toxicokinetics; BBDR = biologically based dose–response.

Extrapolation: Tools and Models to Support Quantitative ERA

Adverse outcome pathways, and the toxicity pathways they encompass, are based on the assumption that an environmental or chemical disturbance is severe enough to overwhelm an organism's adaptive mechanisms and drive the response trajectory to adversity 2, 12. As such, AOPs are informative for hazard assessment but more limited in their application to risk assessment. An AOP does not account for contributions of chemical dose or concentration, timing and duration of exposure, biotransformation, or an organism's adaptive capacity. Furthermore, when an adverse outcome is predicted at the level of an individual, most ERAs require the extrapolation of that outcome to populations and other species. Consequently, developing appropriate extrapolation models constitutes a second, key scientific need if alternative, suborganismal endpoints are to play an important role in 21st-century ERA.

As the NRC suggests 2, prospective risk assessments need to be based on chemical characterization and exposure modeling to provide an estimated likelihood of a chemical reaching a target organism. A vast body of existing research addresses questions of environmental fate, distribution, and exposure that would facilitate this first level of characterization. Once it has been judged likely both that exposure occurs and that the resulting chemical disruption leads to an adverse outcome, the next logical question becomes “Under what types of conditions?” Figure 2 depicts a strategy that will help answer this question by outlining steps for prospective ERA that define the necessary extrapolation tools and models.

Initial attempts to answer these questions could consist of toxicity pathway tests or QSARs that determine the likely cellular perturbations that a stressor may cause. However, predicting events at higher levels of biological organization (e.g., tissues and organs) becomes complicated by the intricacies of biological systems. In addition, many animals are able to withstand environmental and chemical changes, and this adaptability further confounds our ability to develop simple predictive toxicity models and establish acceptable levels of chemical exposure. One key to effectively harnessing suborganismal, non-apical response data is to better understand the robustness of biological systems—the mechanisms as well as the limitations—in relation to varying exposure regimens.

The workgroup of Nichols et al. (p. 39) addressed this challenge by examining features of the vertebrate endocrine system that facilitate adjustment to and recovery from chemical disturbances. The workgroup emphasized the need to incorporate mathematical models as a means to understand the physiological mechanisms of adjustment and recovery. Especially pertinent to the concept of AOPs are biologically based dose–response (BBDR) models that account for major mechanisms of homeostasis and allostasis. Unlike AOP models that restrict their focus to perturbations and events that lead to an adverse outcome, BBDR models incorporate biological processes and variables that may determine whether those apical adverse outcomes occur. In addition, risk assessments need to consider multiple potential disruption sites, each of which would constitute a unique AOP. Identifying and parameterizing the relevant biology and incorporating it into a BBDR model for a single AOP, let alone for multiple interacting AOPs, are daunting tasks. For that reason, the NRC views the development of BBDR models as a “longer-term goal” 2. Nonetheless, the literature reviewed by Nichols et al. reveals that BBDR models for fishes have already predicted vitellogenin concentrations after exposure to cadmium, ethinylestradiol, and polychlorinated biphenyls; in mammals, BBDR models can simulate the effects of dietary iodide deficiency. Such examples provide optimism that we can develop models to address the predictive challenges posed by dose–duration–response relationships.
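A toy example of the compensation such models capture (with invented parameter values, not those of the models Nichols et al. review): a hormone pool whose synthesis is under negative feedback and is partially inhibited by a chemical.

```python
# Minimal BBDR-flavored sketch: hormone pool H with feedback-regulated
# synthesis and first-order clearance; a chemical inhibits synthesis.
def steady_state_hormone(inhibition, k_syn=1.0, k_clear=0.5,
                         setpoint=2.0, dt=0.01, t_end=50.0):
    h = setpoint
    for _ in range(int(t_end / dt)):
        # Negative feedback: synthesis drive rises as H falls below setpoint.
        feedback = setpoint / (setpoint + h)
        dh = k_syn * feedback * (1.0 - inhibition) - k_clear * h
        h += dh * dt  # forward Euler integration
    return h

for inhibition in (0.0, 0.3, 0.6, 0.9):
    print(f"synthesis inhibited by {inhibition:.0%} -> "
          f"steady-state hormone {steady_state_hormone(inhibition):.2f}")
```

Because the feedback term strengthens as the hormone level falls, mild inhibition is partially offset while strong inhibition overwhelms the compensation, which is the qualitative behavior that separates a BBDR-style description from a bare dose-response curve.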

Ecological risk assessment also has long needed to extrapolate from organisms with available toxicity test data to those that have not been or cannot be tested. As we identify important mediators of effect through AOPs and associated models, predictive genomic approaches to species extrapolation can be brought to bear (Fig. 2). The coupling of advances in DNA sequencing technologies and readily accessible sequence databases with the identification of key proteins that regulate the toxicokinetics or toxicodynamics of various chemical classes creates opportunities for new, quantitative approaches to species extrapolation. Genetic polymorphisms in humans have been proposed as potential biomarkers of susceptibility to certain types of environmental contaminants. Similarly, by comparing gene or protein sequence conservation among species, we are increasingly able to infer an organism's potential susceptibility or resistance to specific classes of contaminants such as dioxins (e.g., Head et al. 14, Gunnarsson et al. 15). The workgroup of Celander et al. (p. 52) considered value-added applications of 21st-century technologies to the challenge of species extrapolation, while recognizing the current limitations associated with inferring function from sequence. As a way to explore the extrapolation of evidence-based data across different species, workgroup participants used the inhibition of steroid production in fish as a case study. Similar physiological disruptions were observed in fathead minnow, medaka, and zebrafish after chemical exposure. These correspondences were further underscored by a high degree of genetic homology among the three species, demonstrating, in principle, a basis on which to build a genetic model for cross-species extrapolation.
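A minimal sketch of the sequence-comparison step follows; the sequence fragments are invented, real analyses would align full orthologs retrieved from public databases, and, as the workgroup cautions, sequence similarity alone does not guarantee functional similarity.

```python
# Rank untested species by conservation of a putative chemical target,
# using simple percent identity over pre-aligned, equal-length fragments.
def percent_identity(seq_a: str, seq_b: str) -> float:
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

tested_species_target = "MKTLLVAGGW"  # hypothetical fragment, tested species
candidates = {
    "species_A": "MKTLLVAGGW",
    "species_B": "MKSLLVTGGW",
    "species_C": "MRDIFVSAGW",
}

# Higher identity suggests (but does not prove) similar susceptibility.
for name, seq in sorted(candidates.items(),
                        key=lambda kv: -percent_identity(tested_species_target, kv[1])):
    print(f"{name}: {percent_identity(tested_species_target, seq):.0f}% identity")
```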

Finally, as suggested by the Kramer et al. workgroup (p. 64), models can predict potential population impact once the available data are transformed into a prediction of an adverse outcome of demographic significance at the individual level (Fig. 2). Although traditional apical endpoints in the current testing paradigm (survival, growth, development, and reproduction) are measured at the individual level, they possess demographic relevance and are readily used by risk assessors to estimate population or ecosystem impacts. If alternative endpoints and data types such as in vitro toxicity pathway assays, biomarkers, or computational models are to be useful for ERA, their outputs also must be translated into demographic currencies relevant for population projection modeling. This translation step is critical for establishing the relevance of AOPs and the key events they capture in an ecological risk context. Thus, Kramer et al. focused on the challenge of extrapolating to the relevant level of organization or impact for ecological risk, that is, at the population level.
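As a minimal illustration of that translation step (all vital rates invented for the sketch), a Leslie matrix converts an individual-level fecundity reduction into a projected population growth rate:

```python
import numpy as np

# Three age classes: the first row holds per-class fecundities, the
# subdiagonal holds survival probabilities between age classes.
def population_growth_rate(fecundity_scale: float) -> float:
    leslie = np.array([
        [0.0, 2.0 * fecundity_scale, 4.0 * fecundity_scale],
        [0.5, 0.0,                   0.0],
        [0.0, 0.4,                   0.0],
    ])
    # Dominant eigenvalue (lambda) of the projection matrix.
    return max(abs(np.linalg.eigvals(leslie)))

for scale in (1.0, 0.75, 0.5, 0.25):
    print(f"fecundity x{scale:.2f} -> lambda = {population_growth_rate(scale):.2f}")
```

Here lambda above 1 indicates a growing population and below 1 a declining one, so a toxicant-induced fecundity effect measured (or predicted) at the individual level becomes a demographic currency that risk assessors can use directly.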

Coming to terms…

Adverse outcome pathway— A conceptual framework that portrays the linkage between a molecular initiating event and an adverse outcome 12

Allostasis— Achieving stability through physical or behavioral change

Alternative data/endpoints— Suborganismal or in vitro responses, biomarkers, QSARs, and genomics data

Apical endpoint— Traditional, directly measured whole-organism outcomes of exposure in in vivo tests, generally death, reproductive failure, or developmental dysfunction

Bioinformatics— Use of information science to integrate diverse, complex data generated by life sciences and organize it in an understandable context

Biomarker— A biochemical, physiological, or histological change or aberration in an organism that can be used to estimate either exposure to stressors or resultant effects

Computational biology— Addresses theoretical and experimental questions by using mathematical models to understand and predict responses of biological systems or key variables to varying conditions or perturbations

Diagnostic assessment— Addresses chemicals already released into the environment, aiming to determine the cause of a problem that is already being observed or the risk of a potential problem

Ecotoxicogenomics— Broadly, transcriptomics, proteomics, and metabolomics applied to ecotoxicological research

Global analysis— Analysis with the theoretical potential to simultaneously measure the entire complement of a particular type of analyte found within a sample (e.g., transcriptomics has potential to measure the entire complement of messenger RNA transcripts).

Hazard assessment— Evaluation of a chemical's capability to cause adverse effects

Homeostasis— Ability to maintain internal equilibrium, stability

Levels of biological organization— Atom, molecule, cell, tissue, organ, organ system, organism (individual), population, community, ecosystem, biosphere

Mechanism of action— Complete, detailed sequence of events that leads to a toxic outcome 4

Mechanistic approach— Ability to identify not only what concentration of a chemical causes a particular effect but also how the effect is caused

Metabolomics— Global analysis of small molecule metabolites and their relative abundance, generally through nuclear magnetic resonance and mass spectrometry

Mode of action— Physiological or behavioral signs characterizing an adverse response in which major, but not all, biochemical steps are understood 4

Molecular initiating event— Direct interaction of a chemical with specific biomolecules

Non-apical endpoint— Alternative suborganism-level or in vitro responses, biomarkers, QSARs, and genomics data

Pathway perturbation motifs— Recurring patterns of biological response associated with specific types of perturbation

Perturbation— Disturbance that causes deviation from normal state

Proteomics— Global analysis of proteins in a sample and their relative abundance or modifications

Prospective risk assessment— Conducted before a chemical is approved for use, to identify or minimize potential hazards before the chemical is released into the environment

QSARs— Correlation of ecological or toxicological activity with chemical structure to understand or predict toxicity

Risk assessment— Evaluation of probability of adverse effects under defined circumstances

Suborganismal— At the level of a molecule, cell, tissue, organ, or organ system

Systems biology— Study of relationships and flow of biological information between elements of biological systems, with the goal of understanding and predicting emergent properties of those systems 1

Toxicity pathway— Cellular response pathways that, when sufficiently perturbed, are expected to result in adverse health effects 2. Can be viewed as a trigger that initiates a trajectory toward an adverse outcome

Traditional data/endpoints— Apical, in vivo outcomes of exposure, such as death, disease, or reproductive or developmental impairment

Unsupervised discovery— Method that reveals new relationships and functions that were not necessarily predicted or hypothesized from previous knowledge

21st-Century Ecotoxicology


At their core, AOPs, extrapolation tools, and models aim to systematically and effectively maximize the use of existing biological and toxicological knowledge to translate available chemical-specific data into information that is useful for ERA. Although chemical- and organism-specific data may contribute the knowledge needed to develop such tools, the ultimate application of those tools should provide information that transcends the limits of data generation. From a pragmatic perspective, this means transitioning from a chemical-by-chemical approach to regulatory testing and risk assessment, toward a more integrated system focused on cross-chemical understanding of pathway perturbation motifs.

Such a transition has substantial implications in terms of both focusing risk assessment and directing research. Through the appropriate application of AOPs and the development of biologically based extrapolation approaches, we can envision a schema that increasingly uses QSAR, in vitro, or other non-apical toxicity data as a foundation for credible and scientifically defensible prospective or predictive ecological risk assessment.

This is not to say that such a schema will be perfect. Like the current toxicity testing paradigm, the application of AOPs and extrapolation models still will be subject to error and uncertainty. Unlike the traditional paradigm, however, the tools of 21st-century predictive ecotoxicology will generate estimates of uncertainty through science-based methods rather than arbitrary factors, while building an information database to support environmental decisions through more cost-effective means.

As with any scientific endeavor, our hopes for progress and innovation are tempered by skepticism and a respect for the complexities of the systems we investigate. As Hartung 7 points out, the science of the new paradigm may never be sufficiently developed, let alone validated, to become the universally accepted best way to approach risk assessment. Advances toward the NRC's vision 2, or some version thereof in ecotoxicology, will continue to develop over time.

Nonetheless, risk management decisions are being made every day, regardless of whether perfect data sets are available. Therefore, the challenge we face is not to generate perfect predictions; rather, it is to apply today's technology to do a better job than we have done previously, and to continue making incremental progress toward developing the scientific foundations that foster confidence in alternative endpoints that will predict risk in the 21st century. Where there are no data to support a decision, our hope is to provide at least an estimate and a guide for optimal data collection. Where data are available, our vision is to harness state-of-the-art science and technology to yield the maximum amount of information from the available data.

Ultimately the vision for the 21st century is not to obviate the past and declare it obsolete, but to bring new science to bear on the problems and challenges that we continue to face. In the articles that follow, workshop participants speak to that vision while providing realistic guidance on how to achieve it in ecotoxicology.

Acknowledgements


This work was supported by the Society of Environmental Toxicology and Chemistry (SETAC), the U.S. Army Corps of Engineers Engineer Research and Development Center Program in Environmental Quality and Installations, the Natural Environment Research Council (NERC), the U.S. Environmental Protection Agency, and Procter & Gamble. The authors thank the organizing committee (Gerald Ankley, Malin Celander, Kevin Chipman, Jeremy Edwards, Vincent Kramer, John Nichols, Edward Perkins, Irvin Schultz, Karen Watanabe, and James Wheeler), as well as all the Pellston workshop participants, for making the workshop successful. Special thanks also go to the SETAC office, particularly Greg Schiefer, Nikki Turman, and Daniel Hatcher; without their help this workshop would not have been possible. We thank G. Ankley and E. Perkins for comments on an earlier version of this manuscript, as well as Mimi Meredith, Jenny Shaw, Daniel Hatcher, Herb Ward, and Diana Freeman for editorial and graphics assistance in preparing this Focus article. This document has been subjected to review and has been approved for publication by the EPA, National Health and Environmental Effects Research Laboratory. Approval does not signify that the contents reflect Agency views or policies.

References

1. Hood L, Perlmutter RM. 2004. The impact of systems approaches on biological problems in drug discovery. Nat Biotechnol 22:1215–1217.
2. Committee on Toxicity Testing and Assessment of Environmental Agents, National Research Council. 2007. Toxicity testing in the 21st century: A vision and a strategy. National Academies Press, Washington, DC.
3. Bradbury SP, Feijtel TC, Van Leeuwen CJ. 2004. Meeting the scientific needs of ecological risk assessment in a regulatory context. Environ Sci Technol 38:463A–470A.
4. ECETOC. 2007. Intelligent testing strategies in ecotoxicology: Mode of action approach for specifically acting chemicals. Technical Report 102. Brussels, Belgium.
5. Ahlers J, Stock F, Werschkun B. 2008. Integrated testing and intelligent assessment: New challenges under REACH. Environ Sci Pollut Res Int 15:565–572.
6. Stokstad E. 2009. Putting chemicals on a path to better risk assessment. Science 325:694–695.
7. Hartung T. 2009. A toxicology for the 21st century: Mapping the road ahead. Toxicol Sci 109:18–23.
8. Stephenson JB. 2006. Chemical regulation: Actions are needed to improve the effectiveness of EPA's chemical review program. U.S. Government Accountability Office, Washington, DC, pp 1–14.
9. National Research Council. 2006. Toxicity testing for the assessment of environmental agents: Interim report. National Academies Press, Washington, DC.
10. Walker JD, de Wolf W. 2003. QSARs on the world wide web: A need for quality assurance to prevent misuse. In Walker JD, ed, QSARs for Pollution Prevention, Toxicity Screening, Risk Assessment, and Web Applications. SETAC, Pensacola, FL, USA.
11. Andersen ME, Krewski D. 2009. Toxicity testing in the 21st century: Bringing the vision to life. Toxicol Sci 107:324–330.
12. Ankley GT, Bennett RS, Erickson RJ, Hoff DJ, Hornung MW, Johnson RD, Mount DR, Nichols JW, Russom CL, Schmieder PK, Serrano JA, Tietge JE, Villeneuve DL. 2010. Adverse outcome pathways: A conceptual framework to support ecotoxicology research and risk assessment. Environ Toxicol Chem 29:730–741.
13. Borgert CJ, Quill TF, McCarty LS, Mason AM. 2004. Can mode of action predict mixture toxicity for risk assessment? Toxicol Appl Pharmacol 201:85–96.
14. Head JA, Hahn ME, Kennedy SW. 2008. Key amino acids in the aryl hydrocarbon receptor predict dioxin sensitivity in avian species. Environ Sci Technol 42:7535–7541.
15. Gunnarsson L, Jauhiainen A, Kristiansson E, Nerman O, Larsson DG. 2008. Evolutionary conservation of human drug targets in organisms used for environmental risk assessments. Environ Sci Technol 42:5807–5813.