In vivo animal studies help achieve international consensus on standards and guidelines for health risk estimates for chronic exposure to low levels of tritium in drinking water

Existing and future nuclear fusion technologies involve the production and use of large quantities of tritium, a highly volatile, but low toxicity beta‐emitting isotope of hydrogen. Tritium has received international attention because of public and scientific concerns over its release to the environment and the potential health impact of its internalization. This article provides a brief summary of the current state of knowledge of both the biological and regulatory aspects of tritium exposure; it also explores the gaps in this knowledge and provides recommendations on the best ways forward for improving our understanding of the health effects of low‐level exposure to it. Linking health effects specifically to tritium exposure is challenging in epidemiological studies due to high uncertainty in tritium dosimetry and often suboptimal cohort sizes. We therefore argue that limits for tritium in drinking water should be based on evidence derived from controlled in vivo animal tritium toxicity studies that use realistically low levels of tritium. This article presents one such mouse study, undertaken within an international collaboration, and discusses the implications of its main findings, such as the similarity of the biokinetics of tritiated water (HTO) and organically bound tritium (OBT) and the higher biological effectiveness of OBT. This discussion is consistent with the position expressed in this article that in vivo animal tritium toxicity studies carried out within large, multi‐partner collaborations allow evaluation of a great variety of health‐related endpoints and are essential to the development of international consensus on the regulation of tritium levels in the environment. Environ. Mol. Mutagen. 59:586–594, 2018. © 2018 The Authors Environmental and Molecular Mutagenesis published by Wiley Periodicals, Inc. on behalf of Environmental Mutagen Society


INTRODUCTION
Internationally, including within the European Union, it is accepted that both nuclear fission and, in the future, fusion technologies will play a significant role in the ongoing provision of low-carbon energy (OECD, 2013; European Commission, 2014; Edenhofer et al., 2014; UN, 2015b; Editorial, 2016). It is also widely accepted that many nations will need nuclear power to achieve the climate change goals they agreed to at the Paris Conference of Parties to the United Nations Framework Convention on Climate Change, in December 2015 (UN, 2015a).
Some existing and all future nuclear fusion technologies, however, involve the production or use of large quantities of tritium, a highly volatile, but low-toxicity beta-emitting isotope of hydrogen (³H). Although this radionuclide is not considered highly toxic, tritium has received international attention because of public concerns over its release, mostly in water, to the environment and any resulting potential health impacts (AGIR, 2007; CNSC, 2008; ASN, 2010; Stack, 2016). Because of the prevailing public perception of nuclear (but not natural and medical) radiation as "a mysterious, unequivocally hazardous and deadly force that needs to be avoided at any cost" (Jordan, 2016), public tolerance of radiation releases is poor.
Currently, environmental release of tritium is regulated both by discharge limits and application of the ALARA principle (As Low As Reasonably Achievable), which takes social and economic factors into consideration. Together, these ensure that environmental tritium levels and public exposure are minimized. Because the current radiation protection system is risk-based, these release limits and the application of ALARA should, it is generally agreed, be based on a secure scientific understanding of tritium's environmental behavior and true toxicity.

TRITIUM REGULATION AND DOSIMETRY: ISSUES
Dose limits and associated standards, including the drinking water standards for radionuclides adopted by most nations, are largely based on the recommendations of the International Commission on Radiological Protection (ICRP) and of the World Health Organization (WHO). These recommend an annual dose limit of 0.1 mSv from tritium (CNSC, 2008; WHO, 2011). By comparison, the average annual dose to the public from natural tritium is estimated at 0.01 mSv and the total effective dose due to natural radiation at 2.4 mSv/year (UNSCEAR, 2008, 2016). According to ICRP dosimetry models, the 0.1 mSv limit corresponds to a tritium level in drinking water of 7.61 kBq/L. Consequently, WHO has recommended a rounded-up guidance level of 10 kBq/L (WHO, 2011). These guidelines are not mandated limits, but are provided for the development of national drinking water standards. Countries remain free to set their own limits, although, in most cases, these accord with those recommended by the ICRP and the WHO. For tritium, however, national drinking water standards vary by several orders of magnitude (CNSC, 2008; WHO, 2011; Brooks et al., 2013; IWA, 2017) (Fig. 1). This disparity is due in part to minor variations in how tritium activity is calculated by national bodies (CNSC, 2008). It also reflects differences in the application of ALARA, unique national priorities, and dosimetric uncertainties. These uncertainties result from the diverse speciation of tritium in the environment, due to its potential for incorporation into all types of organic molecules. Given the very low levels of tritium in global drinking water supplies, this wide variation in the limits presents few compliance problems (CNSC, 2008). Nonetheless, it is difficult to rationalize, induces controversy, reduces public confidence, and increases public concern. An obvious question must be asked: is there any scientific toxicological evidence to justify any specific limit?
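The arithmetic behind the 7.61 kBq/L figure can be sketched as a back-of-envelope calculation. The dose coefficient and water intake below are assumptions on our part (the standard ICRP ingestion dose coefficient for HTO in adults and a reference intake of 2 L/day), not values stated in this article:

```python
# Sketch of the ICRP-style conversion from an annual dose limit to a
# drinking water concentration. Assumed reference values, not article data:
#   - ICRP ingestion dose coefficient for HTO, adult: 1.8e-11 Sv/Bq
#   - reference drinking water intake: 2 L/day
DOSE_LIMIT_SV = 0.1e-3          # annual dose limit from tritium, 0.1 mSv
DOSE_COEFF_SV_PER_BQ = 1.8e-11  # assumed HTO ingestion dose coefficient
WATER_L_PER_YEAR = 2 * 365      # assumed 2 L/day intake

limit_bq_per_l = DOSE_LIMIT_SV / (DOSE_COEFF_SV_PER_BQ * WATER_L_PER_YEAR)
print(f"{limit_bq_per_l / 1e3:.2f} kBq/L")  # ~7.61 kBq/L
```

Under these assumptions the result reproduces the 7.61 kBq/L level quoted above, which WHO rounds up to a 10 kBq/L guidance level.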
As indicated above, the root cause of many problems with tritium dosimetry is the complex speciation of this radioactive isotope of hydrogen. Tritium (³H) gas readily oxidizes to form tritiated water molecules (¹H³H¹⁶O) and can subsequently enter biological pathways, becoming part of the organic molecules inside tissues and cells. Consequently, most tritium in the environment exists as either tritiated water (HTO) or organically bound tritium (OBT). In the special case of heavy water reactors, tritiated heavy water (²H³H¹⁶O) is produced by neutron capture by deuterium (²H). For dosimetry purposes, it is assumed that tritium behaves the same in light and heavy water.
As a low-energy beta (β)-emitter, tritium poses a health risk only if internalized. The biokinetics of HTO, but not OBT, have been well characterized. Due to the very short average track length of tritium β-particles in water/tissues (mean <1 µm), tritium must enter the nucleus/cell to damage DNA and/or produce other biological effects. Consequently, it is expected that forms of OBT fixed in the proximity of nuclear chromatin may cause more damage to the genome than HTO, which may either fail to enter a cell or transit through it quickly. Overall, it is clear that the toxicity of different tritium species will differ according to their location and retention time. Thus, a radiation weighting factor (wR) of 1 for β-particles of all energies (currently assigned for calculating the committed dose from chronic internal tritium exposure) may not always be appropriate (ICRP, 2007). Indeed, our own research has indicated that OBT, in the form of ingested amino acids, has greater biological activity than ingested HTO, as discussed below.
Relative biological effectiveness (RBE), a parameter in the derivation of wR, is measured experimentally as the ratio of the dose of a reference radiation to the dose of the β-radiation producing an equal biological effect. Photon radiation types, most commonly gamma (γ)- and X-radiation, are used as references (ICRP, 2007). Importantly, this empirical value may depend on the absorbed dose, the endpoint being measured, the type of reference radiation, and the experimental model. The wide range of these variables and uncertainties has led to varying assessments of the RBE for tritium and makes them difficult to interpret, as pointed out by Little and Lambert (2008). RBE estimates in their review ranged from 1.2 to 2.5 for carcinogenesis endpoints, from 0.43 to 2.65 for chromosomal aberrations, and from 0.43 to 3.5 for all endpoints and models considered.
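The iso-effect ratio defined above can be illustrated with a minimal numerical sketch. The dose-response values below are invented purely for illustration; they are not data from this or any other tritium study:

```python
import numpy as np

# Hypothetical dose-response data (dose in Gy vs. effect level, e.g.
# aberrations per cell). All numbers are illustrative only.
doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
effect_gamma = np.array([0.0, 0.05, 0.10, 0.20, 0.40])  # reference radiation
effect_beta = np.array([0.0, 0.10, 0.20, 0.40, 0.80])   # tritium beta

target = 0.20  # iso-effect level at which RBE is read off

# Interpolate the dose each radiation needs to reach the target effect.
d_ref = np.interp(target, effect_gamma, doses)   # dose of reference radiation
d_beta = np.interp(target, effect_beta, doses)   # dose of test beta radiation

rbe = d_ref / d_beta
print(rbe)  # 2.0: here the beta radiation is twice as effective per unit dose
```

As the text notes, the value obtained this way shifts with the chosen endpoint, effect level, and reference radiation, which is one reason published tritium RBEs scatter so widely.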
In response to public and professional concerns, many study groups and regulatory organizations, including the Canadian Nuclear Safety Commission (CNSC) and the French Nuclear Safety Authority (ASN), have acknowledged that a higher wR may be applied, at least for the purpose of sensitivity analysis in radiological impact assessments of tritium releases (AGIR, 2007; Lebaron-Jacobs et al., 2007; CNSC, 2008, 2010; ASN, 2010). Application of modified weighting factors, however, does not conform to the ICRP dosimetry paradigm, which recommends weighting factors for radiation types, not for specific radionuclides. Using a higher value of wR for very low energy β-emitters would impact the dosimetry not only of tritium but also of a wide range of other radionuclides, for example, ²⁴¹Pu.
A science-based international consensus on regulatory limits for tritium in drinking water is necessary. Given the limitations of epidemiological studies, this consensus should, in our opinion, be primarily supported by the results of thoroughly planned toxicological studies in laboratory animals. Past in vivo and in vitro studies using acutely delivered tritium at high doses characterized the levels of tritium required to produce acute lethality, carcinogenicity, and teratogenic, reproductive, and cytogenetic effects (Cronkite et al., 1973; Cahill et al., 1975; Brooks et al., 1976; Carsten et al., 1978). The activities used (GBq range) were several orders of magnitude higher than those relevant to average public exposure. Studies of the biological effects of low tritium levels (<1 MBq/L) are extremely rare.

TRITIUM TOXICITY STUDIES
A previous review of studies examining the RBE of tritium (Little and Lambert, 2008) showed that several issues remain unresolved. These include: (1) the use of inappropriately high doses and dose rates, which are irrelevant to occupational and public exposure doses, (2) the use of X-rays as reference radiation with an undefined spectrum of energy, and (3) the use of unmatched dose rates for tritium and a reference radiation.
To address these gaps, as well as the regulatory uncertainties described above, a collaboration was established between the Canadian Nuclear Laboratories (CNL, Canada), formerly Atomic Energy of Canada Limited (AECL, Canada), and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN, France), to conduct a large-scale in vivo mouse study examining both the biokinetics and the biological effects of low-level tritium exposure. The study was designed based on the recommendations of the UK Health Protection Agency (now Public Health England) to use chronic exposures, very low dose rates, and equivalent dose rates of γ-radiation (AGIR, 2007). We took advantage of a globally unique facility at CNL, which allows low-level chronic life-long radiation exposure studies on a large scale (thousands of mice), with both external and internal radiation emitters. Both cancer and non-cancer related endpoints were explored in this study. The wide range of measurements and variety of endpoints required the study to be divided into multiple tasks. More than 5,300 mice were used, as follows:

Tasks 1 and 2: biokinetics of ³H (7 groups of n = 6, 11 time-points, 462 mice in total)
Task 3: non-cancer toxicity markers (9 groups of n = 12, 2 time-points, 216 mice in total)
Task 4: genotoxicity markers (9 groups of n = 12, 2 time-points, 216 mice in total)
Task 5: genotoxicity, with the pKZ1 model (7 groups of n = 12, 2 time-points, 168 mice in total)
Task 6: life span and tumorigenesis (12 groups of n = 300 and 1 control group of n = 660, 4,260 mice in total)

Briefly, mice (C57Bl/6J males) were treated for either 1 or 8 months of continuous exposure to either HTO or OBT (a blend of three common tritiated amino acids: glycine, proline, and alanine) in drinking water (0.01, 1, and 20 MBq/L dose levels) and to chronic ⁶⁰Co γ-irradiation delivered at the same dose rates (1.4 and 31 µGy/h).
Although a great variety of OBT molecule classes exist (Kim et al., 2013), amino acids were chosen because they are an important component of cellular metabolism and are not preferentially fixed within the DNA compartment, as tritiated thymidine is. The lowest dose of 0.01 MBq/L (0.014 µGy/h) is consistent with the WHO guidance level for tritium in drinking water. This dose rate could not be matched by γ-radiation, since it is less than one tenth of the natural background radiation (range 0.11-1.25 µGy/h) (UNSCEAR, 2008). It is also 100-1000× higher than typical tritium levels in drinking water (CNSC, 2008). Levels of 1 and 20 MBq/L are found only at nuclear weapon test sites, where thermonuclear explosions have contaminated surface waters.

Although the results of this study are being or will be published in a series of peer-reviewed experimental reports elsewhere (Bannister et al., 2016; Priest et al., 2017; Roch-Lefèvre et al., in press), we present here a high-level summary of the findings. We feel that this generalization is important to support the opinion expressed in this article that animal studies are key sources of information to address regulatory inconsistencies related to tritium. Overall, the results indicate that (Fig. 2, Table I): (1) the biokinetics of HTO and the form of OBT we used are similar (Priest et al., 2017), in contrast to the existing ICRP model (ICRP, 1989); (2) HTO produced a larger dose-dependent biological response than equivalent doses of γ-radiation, and OBT exposures a still larger response (γ < HTO < OBT); (3) neither HTO nor OBT produced detectable biological effects at the WHO guidance level of 0.01 MBq/L; (4) some molecular and cellular responses to tritium seemed to be protective, while others were most likely detrimental; however, no pathological outcomes were observed macroscopically at any dose used; and (5) responses to tritium were tissue-specific.

Fig. 2. Summary of mouse tissue responses to chronic external low-level γ- or internal β-radiation exposures. In the joint CNL-IRSN study, a variety of molecular and cellular endpoints representing different biological processes, including DNA damage and repair, inflammation, and oxidative stress, were evaluated in the indicated organs/systems of laboratory mice. The tissue responses are shown schematically in the diagram, which presents typical results for all endpoints and all doses (0.01, 1, or 20 MBq/L) of either HTO or OBT in drinking water, or equivalent doses and dose rates of ⁶⁰Co γ-radiation. Detailed information is provided in Table I.

Many of these observations appear to provide important new information that must be considered in assessing the risk of tritium intake for radiological protection. For example, the similarity of the biokinetics of HTO and OBT may support the need for adjustments to dosimetry for tritium exposure in humans. Neither the lack of effects at 0.01 MBq/L, suggesting a threshold rather than a linear dose response, nor the greater biological effectiveness of OBT at higher levels (1 MBq/L) seems to support current practices in radiation protection. Lastly, since the main health effect considered in radiological protection is cancer, it is necessary to explore in greater detail the links between the physiological/genotoxicity markers measured here and the biological outcomes used in radiological protection. Follow-up studies using additional mouse models and treatments and different toxicological and biological endpoints appear necessary.

REMAINING QUESTIONS AND OUTLOOK
The study has produced interesting results that contribute to our understanding of the effects of low doses of tritium on processes that may impact cancer and non-cancer diseases. Experimental animal studies in the low-dose exposure domain are clearly the type of scientific research most likely to be relevant to the radioprotection of workers and populations and to provide answers to the questions raised by society. The exposures employed in this study delivered total chronic doses of less than 200 mGy and clearly fall within the low-dose exposure domain (Table II) (Roch-Lefèvre et al., in press).
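As a quick arithmetic check on where these exposures sit, the average dose rate implied by a 200 mGy total chronic dose accumulated over 8 months of continuous exposure (the longest treatment duration used) can be computed; this is a rough sketch that assumes 30.4 days per month:

```python
# Average dose rate implied by a 200 mGy total dose accrued continuously
# over ~8 months. Rough arithmetic only; assumes 30.4 days/month.
total_dose_mgy = 200
hours = 8 * 30.4 * 24            # ~8 months expressed in hours

rate_ugy_per_h = total_dose_mgy * 1000 / hours
print(f"{rate_ugy_per_h:.0f} µGy/h")  # ~34 µGy/h
```

A ceiling in the tens of µGy/h confirms that even the highest exposure groups remained far below dose rates associated with acute radiation effects.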
This project required collaboration because of the wide range of endpoints to be studied and the need for chronic irradiation. The partnership between AECL/CNL and IRSN brought together the necessary groups specializing in genetics, epigenetics, immunology, toxicology, and cancer biology, and their combined expertise facilitated our exploration of the complex interplay of pathways and networks that produce systemic outcomes. The study showed that for internal-emitter exposure, such as that from tritium, the dose received by the tissues is highly dependent on the emitter's speciation, distribution, and elimination. Microdosimetry or subcellular dosimetry and biochemistry experiments are needed to clarify in more detail why OBT produces a greater biological response than equivalent organ doses produced by HTO. For example, how much of the tritium, administered as tritiated amino acids, is incorporated into cell proteins?
The regulation of tritium exposure can be improved by generating additional accurate scientific knowledge of systemic biological responses to it (such as cancer and other aging-associated diseases). Unfortunately, human epidemiological studies examining the health effects of tritium in workers or the general public are of limited help in deriving specific risk estimates, due to their low statistical power and the lack of tritium-specific absorbed doses to tissues (Little and Wakeford, 2008; ASN, 2010; UNSCEAR, 2016). Given the complexity of the interacting systemic responses, studies using in vitro methods are unlikely to be predictive of health outcomes. The challenges presented by the toxicology of tritium are likely to be solved most effectively by thoroughly planned, large-scale, collaborative in vivo research studies utilizing the large spectrum of available biological methodologies. Due to the scale and associated high costs of such mouse studies (e.g., the study examining cancer incidence and lifespan in CBA mice, still underway at CNL, involves >3,300 mice and will take 6 years in total), funding by multiple national and international stakeholders is likely to be required. In contrast to Europe, which has the MELODI platform (Editorial, 2012), North America has no funding programs specifically dedicated to low-dose radiation research. Through an open letter to the White House Office of Science and Technology Policy (Abbott, 2015) and a corresponding bill introduced in Congress (Conover, 2015), U.S. researchers have attempted to revive the low-dose radiation research program previously funded by the U.S. Department of Energy (DOE), but debates and progress in this area have remained dormant since June 2015 (U.S. Congress, 2015).
This research program on the biological effects of low-dose radiation, funded by the Government of Canada (co-funded by the CNSC and the CANDU Owners Group) and maintained at CNL, is accordingly unique and represents a great opportunity to bridge research performed in Europe with that in non-European countries.
As an essential part of our low-carbon energy future, both fission and future fusion face many acceptability challenges related to either radiological protection standards or public risk perception. Research to address these requires support from all stakeholders, including government agencies, the nuclear industry, and academia. It is our contention that an open discussion of the topic in major science publishing media, such as this one, will facilitate sufficient interest to initiate action and allow the international radiological protection system to continue to evolve based on sound scientific evidence.

AUTHOR CONTRIBUTIONS
YG, JRJ, and DK proposed the concept of the manuscript. NDP, ID, JRJ, DK, and YG designed the experimental study. HW applied for research ethics approval and provided veterinary care and oversight. YG, LB, CD, TE, EG, SG, CI, AL, PL, SRL, and DK performed the experiments and collected data. All authors contributed to data analyses and interpretation. DK and YG wrote the manuscript with important intellectual input from ID, NDP, and SRL. All authors read and provided input in finalizing the manuscript.