Establishing pathogen log reduction value targets for direct potable reuse in the United States

Communities are now turning to potable reuse to augment their water supply portfolios in response to increasing demand and climate uncertainty. One barrier to broader implementation is a lack of regulations for direct potable reuse (DPR) in some locations. An incomplete understanding of the foundation of existing DPR frameworks may be contributing to this barrier. The objective of this study was to use a publicly available quantitative microbial risk assessment (QMRA) tool—DPRisk—to explain the basis behind California's existing indirect potable reuse regulations, California's draft DPR regulations, and an Expert Panel's response to those draft regulations. Then, leveraging a robust raw wastewater pathogen dataset from the literature, DPRisk was used to justify two alternatives: one based on maximum simulated pathogen concentrations and the other based on 97.4th percentile concentrations. The latter represents an effort to seek equivalency between “raw wastewater” (i.e., California) and “treated effluent” (i.e., Texas) approaches. Using justified QMRA assumptions, the baseline log reduction value (LRV) targets were determined to be 15/11/11 (maximum) or 13/10/10 (97.4th percentile) for viruses, Giardia, and Cryptosporidium. Additionally, instead of augmenting the baseline LRVs to account for undetected treatment process failures, tolerances for off‐specification conditions (e.g., up to 3 logs for 3–12 days per year) were characterized. With this foundational knowledge, stakeholders can better understand and adopt these frameworks or use DPRisk to establish a new framework that better addresses their unique constraints, including considerations for preferred treatment paradigms and capital and operational costs.

Article Impact Statement
This study evaluates direct potable reuse frameworks through quantitative microbial risk assessment to better communicate underlying assumptions and their implications for log reduction value targets.


| INTRODUCTION
In recent decades, climate change and climate uncertainty have led to significant capital investments aimed at augmenting water resource portfolios and ensuring continued access to existing water supplies. In the southwestern United States (U.S.), the continued aridification of the Colorado River basin has led to historically low elevations in Lake Powell and Lake Mead, which are critically important reservoirs for tens of millions of people in the U.S. and Mexico. In August 2021, the U.S. Bureau of Reclamation announced the first official shortage in the Lower Basin, leading to mandatory cuts in Colorado River allocations (AWWA, 2021). This resulted in renewed negotiations between major stakeholders, notably the affected water agencies, and prioritization of mitigation measures (James, 2022).
Potable reuse has emerged as a key component of many water resource portfolios due to decades of demonstrated success for indirect potable reuse (IPR), with increasing rates of implementation in recent years (Gerrity et al., 2013). Despite its positive track record, widespread adoption of potable reuse is still hindered by practical considerations (Dow et al., 2019) and/or the absence of regulatory frameworks for certain applications, notably direct potable reuse (DPR). DPR systems must protect public health from chemical and microbiological hazards with reduced response retention time, specifically because of the absence (or reduced role) of an environmental buffer. Outside of Texas, which initially permitted DPR at two locations (Big Spring and Wichita Falls) and is now designing another installation in El Paso, there are few established regulatory frameworks for DPR in the U.S. However, in response to current and projected conditions in the Colorado River basin, there are new efforts to develop DPR regulations in western states, namely California, Colorado, and Arizona. In fact, Colorado officially approved a DPR rule in October 2022 (CDPHE, 2022).
Although the U.S. Environmental Protection Agency (EPA) recently launched a National Water Reuse Action Plan (WRAP) along with several guiding documents (EPA, 2012, 2017), there are no potable reuse regulations at the federal level in the U.S. (Nappier et al., 2018). Instead, interested municipalities or water agencies must comply with state-specific regulations or frameworks, and only when they are available. When a given application is regulated, the corresponding treatment requirements may vary considerably between states depending on the underlying benchmarks and assumptions used to develop the regulations. Moreover, these benchmarks and assumptions are not always apparent, so the level of inherent conservatism may be unclear.
Given the increasing interest in potable reuse and parallel efforts to develop state DPR regulations, the intent of this study was to evaluate recently proposed frameworks, in addition to potential alternatives, through quantitative microbial risk assessment (QMRA). This builds on the wealth of literature on the topic of QMRA for DPR (Amoueyan et al., 2019; Pecson et al., 2017; Soller et al., 2017) but provides context that is specifically relevant to the regulatory rulemaking process.

| QMRA TOOL
QMRA was performed in this study to estimate the risk of infection due to exposure to a range of pathogens commonly identified as hazards in potable reuse systems, specifically enteric viruses, Giardia, and Cryptosporidium. All scenarios were modeled in DPRisk (Gerrity et al., 2022; State Water Resources Control Board [SWRCB], 2021), a publicly available QMRA tool developed using the R statistical language. Additional background and details for accessing DPRisk are available in Text S1 in the Supplementary Information, and input parameters for the various scenarios described in this study are summarized in Tables S1–S5.

| HISTORICAL BASIS OF CALIFORNIA'S IPR REGULATIONS
Log reduction value (LRV) targets for viruses, Giardia, and Cryptosporidium were originally promulgated through California's Groundwater Replenishment Reuse Regulations (DDW, 2014). These initial LRVs for IPR were calculated assuming an annual risk of infection of 10^−4, a benchmark that was originally proposed by the U.S. EPA for control of Giardia in drinking water applications (Regli et al., 1991). California's deterministic (i.e., point estimate) approach assumed a single drinking water ingestion event per day with a volume of 2 L, which is the 90th percentile across all ages in the U.S. based on the Continuing Survey of Food Intake by Individuals (CSFII; EPA, 2019). Constant raw wastewater pathogen concentrations were based on rounded maximum reported values from the literature: 10^5 most probable number (MPN)/L for viruses (Asano et al., 2007), 10^5 cysts/L for Giardia (Asano et al., 2007), and 10^4 oocysts/L for Cryptosporidium (Robertson et al., 2006; Tetra Tech, 2011). Finally, California assumed a beta-Poisson dose-response model previously developed for rotavirus (α ≈ 0.26 and β ≈ 0.42; Ward et al., 1986), an exponential dose-response model for Giardia (r = 0.02; Regli et al., 1991), and an exponential dose-response model for Cryptosporidium (r = 0.09; EPA, 2005). A byproduct of using point estimates for every input is that each day results in the same risk (2.7 × 10^−7 infections per person per day), with the sum over 365 days yielding the annual benchmark of 10^−4 infections per person per year.
Under the probabilistic assessment of treatment train performance (PATTP) output tab in DPRisk, this framework (defined in Table S1) results in benchmark LRVs of 11.6/10.2/9.8 for viruses, Giardia, and Cryptosporidium, respectively, corresponding with acceptable drinking water concentrations of 10^−6.6 MPN/L, 10^−5.2 cysts/L, and 10^−5.8 oocysts/L. After rounding, California selected 12/10/10 for the state's original IPR regulations. In its recently approved rule (CDPHE, 2022), Colorado also adopted these baseline LRV targets for DPR but provided flexibility for interested agencies to pursue a treated wastewater effluent alternative with minimum LRVs of 8/6/5.5 (V/G/C). This was the original approach implemented for DPR in Texas (TCEQ, 2022; TWDB, 2015), in which the LRV calculation starts with treated wastewater effluent and is specific to the advanced water treatment (AWT) train.
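To trace the arithmetic behind these benchmarks, the deterministic calculation can be reproduced with a few lines of code. The sketch below is illustrative only (it is not the DPRisk implementation); it relies on the fact that, at the very small doses implied by a 2.7 × 10^−7 daily risk target, each dose-response model is approximately linear in dose, so the acceptable dose is simply the target risk divided by the low-dose slope.

```python
import math

DAILY_RISK = 1e-4 / 365   # 10^-4 per person per year spread evenly over 365 days
VOLUME_L = 2.0            # single daily ingestion volume assumed for IPR

def benchmark_lrv(raw_conc_per_liter, low_dose_slope):
    """LRV needed so the dose from treated water meets the daily risk target.

    At very low doses, P(infection) ~ slope * dose, so the acceptable dose is
    DAILY_RISK / slope and the acceptable concentration is that dose / VOLUME_L.
    """
    acceptable_conc = DAILY_RISK / (low_dose_slope * VOLUME_L)
    return math.log10(raw_conc_per_liter / acceptable_conc)

# Low-dose slopes: exponential model P = 1 - exp(-r*d) has slope r;
# approximate beta-Poisson P = 1 - (1 + d/beta)^-alpha has slope alpha/beta.
print(benchmark_lrv(1e5, 0.26 / 0.42))  # viruses (rotavirus beta-Poisson)  -> ~11.7
print(benchmark_lrv(1e5, 0.02))         # Giardia (exponential, r = 0.02)   -> ~10.2
print(benchmark_lrv(1e4, 0.09))         # Cryptosporidium (exponential)     -> ~9.8
```

With the point estimates above, the printed values reproduce the 10.2 and 9.8 benchmarks and come within about 0.1 log of the reported 11.6 for viruses; the small difference for viruses likely reflects rounding of the published rotavirus parameters.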

| CALIFORNIA'S DRAFT DPR REGULATIONS

| Background
In developing its draft DPR regulations (DDW, 2021), California made several adjustments to its deterministic approach based on more recent QMRA literature. Several studies noted the significance of short-term off-specification conditions (Soller, Parker, & Salveson, 2018) or daily spikes in risk (Soller et al., 2017), which could drive annual risk estimates. Coupled with decreased response retention time relative to IPR applications, this resulted in California explicitly shifting from a 10^−4 annual risk benchmark to a 2.7 × 10^−7 daily risk benchmark. As noted in the previous section, this has no impact on the risk calculation when using an entirely deterministic approach, specifically when calculating baseline LRV requirements. However, it can become a factor as the QMRA increases in complexity. For example, the draft DPR regulations included an LRV adjustment to account for potential treatment process failures, and this adjustment is impacted by the decision to regulate based on daily versus annual risk (described later). Lastly, California included norovirus based on recent studies highlighting its potential significance in driving risk estimates (Soller, Eftim, & Nappier, 2018) and also incorporated new, more conservative dose-response models for norovirus and Cryptosporidium.

| Baseline LRVs
For DPR, California still assumed a daily ingestion volume of 2 L, but that volume was divided equally over 96 ingestion events (i.e., every 15 min). It was assumed that online monitoring of surrogate water quality parameters should provide data at least every 15 min to ensure rapid response to off-specification conditions (Pecson et al., 2017), hence the adoption of this time interval for risk assessment. This had no impact in calculating the baseline LRVs but was a factor for the failure scenario. Constant raw wastewater pathogen concentrations were again assumed based on precedent (Giardia at 10^5 cysts/L and Cryptosporidium at 10^4 oocysts/L) and a rounded maximum reported value from the literature for norovirus (10^9 gene copies [gc]/L; Eftim et al., 2017). The same exponential dose-response model with r = 0.02 was used for Giardia (Regli et al., 1991), and conservative dose-response models were assumed for norovirus (i.e., hypergeometric with α = 0.04 and β = 0.055; Teunis et al., 2008) and Cryptosporidium (i.e., beta-Poisson with α = 0.116 and β = 0.121; Messner & Berger, 2016). Under the PATTP output tab in DPRisk, this framework (defined in Table S1) results in benchmark LRVs of 15.5/10.2/10.8 for viruses, Giardia, and Cryptosporidium, respectively, corresponding with acceptable drinking water concentrations of 10^−6.5 gc/L (norovirus), 10^−5.2 cysts/L, and 10^−6.8 oocysts/L. After rounding, California selected 16/10/11 as the baseline LRVs for the state's draft DPR regulations (see next section for justification of the final LRVs of 20/14/15). Additional stipulations related to minimum/maximum LRV crediting for each treatment process, the minimum number of barriers/mechanisms for each pathogen, and so forth, are noted in the draft regulations (DDW, 2021) but are not addressed here.
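The same low-dose arithmetic reproduces these draft DPR benchmarks once the norovirus point estimate and the newer dose-response slopes are substituted. As before, this is an illustrative check rather than the DPRisk calculation, and the low-dose slope used here for the hypergeometric (exact beta-Poisson) model, α/(α + β), is only valid at the very small doses involved.

```python
import math

DAILY_RISK = 1e-4 / 365   # equivalent to the 2.7e-7 daily benchmark
VOLUME_L = 2.0

def benchmark_lrv(raw_conc_per_liter, low_dose_slope):
    # Same logic as the IPR sketch: required LRV so the daily dose meets the risk target.
    return math.log10(raw_conc_per_liter * low_dose_slope * VOLUME_L / DAILY_RISK)

print(benchmark_lrv(1e9, 0.04 / (0.04 + 0.055)))  # norovirus, hypergeometric      -> ~15.5
print(benchmark_lrv(1e5, 0.02))                   # Giardia, exponential           -> ~10.2
print(benchmark_lrv(1e4, 0.116 / 0.121))          # Cryptosporidium, beta-Poisson  -> ~10.8
```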

| Treatment failure considerations
A notable addition relative to the original IPR regulations was the explicit consideration of treatment failure in the draft DPR regulations. In DPRisk, failures can be defined based on their magnitude (0%–100%), duration (15 min to 24 h), and frequency (stochastic or deterministic).
California assumed a worst-case failure (i.e., 6 logs for each pathogen) of the ultraviolet (UV) advanced oxidation process (AOP), which is mandated for full advanced treatment (FAT). The failure was assumed to last 15 min with a deterministic frequency of 1 time per year. Because drinking water ingestion was divided equally over 96 events per day, the ingestion event that coincided with the 15-min failure (i.e., ~1% of the day) was offset by the other 95 ingestion events that occurred during normal operation (i.e., ~99% of the day). This effectively equates to a 2-log buffer or an implicit LRV of 2.0 (i.e., −log10(0.01)). As a result, California requires only 4-log treatment redundancy for each pathogen to adequately compensate for a 6-log failure, which raises the required LRVs from 16/10/11 to 20/14/15 for viruses, Giardia, and Cryptosporidium. After inputting the scenarios into DPRisk (defined in Table S1), the QMRA output tab indicates that the required LRVs generally satisfy both the daily and annual risk benchmarks, with only Giardia exceeding the daily risk benchmark by a factor of 1.7 because its baseline LRV had been rounded down.
Interestingly, if the regulation were based on a 10^−4 annual risk, this would equate to an additional 2.6-log implicit "calculation buffer" (i.e., −log10(0.0027)), as the 364 days (99.73%) without failure would offset the 1 day (0.27%) with failure. Considering the "rounding buffers" (i.e., 0.5/−0.2/0.2) already incorporated into the no-failure baseline LRVs, the treatment redundancy required to compensate for a 6-log failure could be reduced considerably, as summarized in Table 1. This would result in revised LRV targets of 17/12/12 instead of 20/14/15 for viruses, Giardia, and Cryptosporidium, which could significantly reduce capital and O&M costs for potable reuse systems. Even though the annual risk for each pathogen would remain below 10^−4, the maximum daily risks (i.e., on the day of UV AOP failure) would exceed the 2.7 × 10^−7 benchmark by approximately two orders of magnitude, which California would not consider acceptable.
Another perspective on the California approach is that because of the point estimate assumption for pathogen concentrations, the LRV required for compliance with the daily risk benchmark is actually independent of the number of 15-min, 6-log failures each year, as long as there is a maximum of one per day. Simply put, California's 4-log redundancy achieves an acceptable daily and annual risk, even if a 15-min, 6-log failure occurs every day. The caveat is that if a failure occurs every day, the annual risk estimate increases by 2.6 logs because the single failure day is no longer "buffered" by the other 364 days of normal operation.
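The implicit buffers discussed above reduce to simple log arithmetic. The sketch below follows the reasoning in the text (not a DPRisk routine) and shows how the 2-log ingestion buffer leads to California's 4-log redundancy, and how an annual benchmark plus the rounding buffer would shrink the required redundancy for viruses.

```python
import math

FAILURE_LOGS = 6.0  # assumed worst-case, undetected UV AOP failure

# A 15-min failure coincides with only 1 of 96 equal ingestion events, so under the
# point-estimate assumptions the daily dose (and risk) scales by 1/96.
daily_buffer = -math.log10(1 / 96)     # ~2.0 logs
print(FAILURE_LOGS - daily_buffer)     # ~4.0 logs of required treatment redundancy

# If compliance were instead judged against the annual benchmark, the single failure
# day would also be offset by the other 364 days of normal operation.
annual_buffer = -math.log10(1 / 365)   # ~2.6 logs

# Example for viruses (Table 1): the 0.5-log rounding buffer (16 vs. 15.5) further
# reduces the redundancy needed to cover the 6-log failure.
rounding_buffer = 16 - 15.5
print(FAILURE_LOGS - daily_buffer - annual_buffer - rounding_buffer)  # ~1 log, i.e., 17 instead of 20
```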
Despite its potential link to online monitoring frequencies, it is unclear whether the assumption of 96 ingestion events per day is appropriate because individual consumers will not visit the tap that frequently. Across all age groups in the U.S., people average 4.4 ± 3.2 ingestion events per day in the summer and 4.1 ± 3.1 ingestion events per day in the winter (EPA, 2019). Historically, QMRAs have often assumed one ingestion event per day, although Soller, Parker, and Salveson (2018) assumed eight ingestion events per day in a QMRA incorporating short-duration, off-specification treatment conditions. Therefore, 96 ingestion events per day would be a significant departure from real-world conditions and precedent, potentially leading to nonrepresentative daily risk estimates for individuals. However, this assumption may have value in terms of capturing (1) the range of daily conditions experienced across a large community or (2) the effects of pathogen concentration "averaging," which might occur due to hydraulic considerations within a wastewater collection system or treatment facility (Gerrity et al., 2022). Moreover, a recent reliability assessment of a large-scale UV AOP system estimated that a catastrophic undetected failure that would significantly reduce pathogen inactivation is highly unlikely, with a probability on the order of 10^−12 (Pecson et al., 2018). While additional studies are needed to further characterize potential failure conditions, their probabilities, and their implications for pathogen attenuation, Pecson et al. (2018) and Soller, Parker, and Salveson (2018) provide valuable summaries of the industry's existing knowledge on this topic. Based on this current literature, it can generally be assumed that public health is protected even with the 2-log implicit "calculation buffer" related to ingestion frequency.

TABLE 1 California direct potable reuse (DPR) treatment redundancy alternative based on an annual rather than daily risk benchmark. Note: The implicit rounding buffer, implicit daily risk calculation, implicit annual risk calculation, and treatment redundancy log reduction values (LRVs) can be combined to compensate for a potential 6.0-log failure of the ultraviolet (UV) advanced oxidation process (AOP). Treatment redundancy LRVs (representing the difference between the required 6.0-log redundancy and the sum of the implicit LRVs) can then be added to the no-failure baseline LRVs to determine final LRV targets.

| NWRI EXPERT PANEL RESPONSE

| Background
Throughout the process of developing potable reuse regulations in California, there have been significant parallel efforts sponsored by the California SWRCB, the National Water Research Institute (NWRI), and the Water Research Foundation (WRF), among others. Notably, SWRCB tasked NWRI with organizing an Expert Panel to evaluate California's draft IPR/DPR frameworks and make recommendations to ensure that the final criteria and regulations adequately protected public health (SWRCB, 2014). In addition, SWRCB and WRF coordinated multiple research projects aimed at addressing critical knowledge gaps for potable reuse. Two projects relevant to the DPR regulatory effort included "DPR-1" (Pecson, Ashbolt, et al., 2021), which developed a tool for conducting QMRA and PATTP (i.e., DPRisk; SWRCB, 2021), and "DPR-2" (Pecson, di Giovanni, et al., 2021), which resulted in a comprehensive raw wastewater pathogen concentration dataset that could be adopted for QMRA (Pecson et al., 2022). Leveraging relevant literature and these recent research projects, the NWRI Expert Panel published a report responding to California's draft DPR regulations (NWRI, 2022). The Expert Panel's overarching criticism was related to the compounding of conservative assumptions in the development of California's draft DPR regulations, which could result in unsustainable and overdesigned potable reuse systems (NWRI, 2022). For example, the Expert Panel was concerned with the use of (1) point estimate maximum concentrations rather than recently developed statistical distributions, (2) molecular concentrations (e.g., for norovirus) without adjusting for infectivity with gene copy to infectious unit (GC:IU) ratios, and (3) potentially conservative dose-response models (e.g., hypergeometric vs. fractional Poisson for norovirus), among others. The Expert Panel estimated that these factors alone accounted for inherent conservatism of 9-11 logs (NWRI, 2022). In response, the Expert Panel proposed its own framework that accounted for baseline public health protection during normal operation and also for relatively high frequency off-specification treatment conditions. This alternative framework leveraged the recently developed statistical distributions for recovery-adjusted raw wastewater pathogen concentrations (Pecson et al., 2022), which are summarized in the context of DPRisk in Table 2.

TABLE 2 Summary of statistical distributions of recovery-adjusted raw wastewater pathogen concentrations developed from 24 grab samples over 14 months from five treatment facilities in California (n = 120 total samples).

| Baseline LRVs
The Expert Panel conducted its own QMRA and identified norovirus and enterovirus scenarios that were adequately protective of public health and resulted in the same baseline virus LRV; this approach also provided adequate protection against adenovirus. The norovirus assumptions were as follows: (1) lognormal distribution of molecular norovirus genotype II (GII) concentrations (highest of the three genotypes; Pecson et al., 2022); (2) a uniform distribution of GC:IU ratios ranging from 1:1 to 200:1 (Donia et al., 2010); and (3) the more conservative hypergeometric dose-response function (Teunis et al., 2008).
For this study, adjusting the norovirus GII distribution for the GC:IU ratios involved inputting the baseline distribution parameters (μ = 9.2 and σ = 2.8; base e) into DPRisk to develop a dataset of 10,000 concentrations. These concentrations were then randomly paired with a set of 10,000 GC:IU ratios that followed a uniform distribution from 1:1 to 200:1, allowing for the calculation of infectivity-adjusted norovirus concentrations. This revised concentration dataset was then input into DPRisk as a .csv data file, which was assumed to follow a lognormal distribution. The QMRA output tab then identified the corresponding parameters of the revised lognormal distribution as μ = 6.5 and σ = 3.2 (base e); this revised distribution was used for the subsequent QMRA.
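A minimal sketch of this pairing procedure, outside of DPRisk, is shown below. It assumes a simple random pairing of lognormal concentration draws with ratios sampled uniformly between 1 and 200 and a moment-based lognormal refit; because the fitted parameters reported above (μ = 6.5 and σ = 3.2) depend on how the ratios are sampled and on DPRisk's internal fitting routine, the sketch illustrates the workflow rather than reproducing those exact values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Baseline norovirus GII distribution: lognormal with mu = 9.2 and sigma = 2.8 on
# the natural-log scale (gc/L), per Pecson et al. (2022).
log_conc = rng.normal(loc=9.2, scale=2.8, size=N)

# GC:IU ratios drawn uniformly between 1:1 and 200:1 and paired randomly with the
# simulated concentrations (the Expert Panel assumption).
gc_to_iu = rng.uniform(1.0, 200.0, size=N)

# Infectivity-adjusted concentrations (infectious units per liter, natural-log scale).
log_adjusted = log_conc - np.log(gc_to_iu)

# Refit a lognormal to the adjusted dataset, analogous to importing a .csv into DPRisk.
mu_fit, sigma_fit = log_adjusted.mean(), log_adjusted.std(ddof=1)
print(mu_fit, sigma_fit)
```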
The Expert Panel also assumed a constant drinking water ingestion volume of 2 L/day divided equally over 96 ingestion events.For this scenario, it is not possible to use DPRisk's "benchmark" LRVs under the PATTP output tab to identify the baseline LRV targets for each pathogen.This is because DPRisk's benchmark LRV calculation does not account for the averaging effect of multiple ingestion events per day when also using a distribution for pathogen concentrations and/or ingestion volumes.Thus, a full QMRA must be conducted by assuming a range of overall treatment train LRVs (e.g., 11, 12, or 13) and then evaluating the resulting daily and annual risks from the QMRA output tab.The various scenarios used to validate the NWRI assumptions are summarized in Table S2, and the resulting DPRisk output is summarized in Figure 1.
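The structure of such a full QMRA can be outlined as follows. This sketch is not DPRisk; the lognormal parameters shown are placeholders for the Table 2 values, the independent per-interval concentration draws are a simplifying assumption, and a Giardia-like exponential dose-response is used so the example stays self-contained.

```python
import numpy as np

rng = np.random.default_rng(1)
YEARS, EVENTS = 100, 96          # simulated years and ingestion events per day
VOLUME_L = 2.0                   # total daily ingestion, split evenly over events
LRV = 10.0                       # candidate treatment-train LRV under evaluation
MU, SIGMA = 2.0, 1.5             # PLACEHOLDER lognormal parameters (base e); the actual
                                 # raw wastewater parameters come from Table 2
R = 0.02                         # exponential dose-response (e.g., Giardia)

# Independent concentration draw for each 15-min interval (a simplifying assumption),
# so the daily dose averages over 96 draws -- the "averaging" effect noted above.
conc = np.exp(rng.normal(MU, SIGMA, size=(YEARS, 365, EVENTS)))     # organisms/L
dose = conc * 10.0 ** (-LRV) * (VOLUME_L / EVENTS)                   # per-event dose
p_event = 1.0 - np.exp(-R * dose)
p_day = 1.0 - np.prod(1.0 - p_event, axis=2)                         # daily risk
p_year = 1.0 - np.prod(1.0 - p_day, axis=1)                          # annual risk

print(np.quantile(p_day, 0.9999))   # compare with the 2.7e-7 daily benchmark
print(np.quantile(p_year, 0.99))    # compare with the 1e-4 annual benchmark
```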
To achieve >99.99% compliance (NWRI, 2022) with the 2.7 × 10^−7 daily risk benchmark, LRVs of 13/10/10 would be required for viruses, Giardia, and Cryptosporidium, respectively. Relative to California's LRVs for DPR, the Expert Panel approach reduces the baseline LRV for viruses from 16 to 13 primarily because of the "averaging" effect of using a distribution of concentrations over 96 daily ingestion events rather than a high-end norovirus point estimate for all ingestion events. The slightly lower maximum virus concentrations yielded by the distributions (approximately 8 log10 MPN/L or gc/L; Table 2) also contributed to the LRV reduction. In contrast, the baseline LRV requirements for Giardia and Cryptosporidium were more consistent with the California framework because of their flatter concentration distributions and dampened averaging effect. Coincidentally, the distributions yielded maximum concentrations for Giardia (5.5 log10 cysts/L) and Cryptosporidium (4.2 log10 oocysts/L; Table 2) that were comparable to but slightly higher than California's assumed point estimates. Another distinction is that the Expert Panel approach suggests an approximate one order of magnitude safety factor for annual risk, whereas the California approach yields flat risk estimates that are essentially equivalent to the benchmarks at all times. Again, this is because of the use of concentration distributions (Expert Panel) versus point estimates (California).

| Treatment failure considerations
The Expert Panel evaluated a failure framework that assumed full treatment on 90% of simulated days; a 24-h, 3-log reduction in treatment (i.e., off-specification) on 9% of days; and a 24-h, 6-log reduction in treatment (e.g., UV AOP failure) on 1% of days (NWRI, 2022). As noted earlier, California assumed a 6-log failure that lasted 15 min and occurred only once per year, but because of California's point estimate assumption for pathogen concentrations, the required treatment redundancy could easily be calculated. In contrast, the Expert Panel's adoption of statistical distributions for pathogen concentration requires a more in-depth evaluation of redundancy. In this study, DPRisk was used to assess the adequacy of 3-log (16/13/13), 4-log (17/14/14), and 5-log redundancy (18/15/15) for viruses, Giardia, and Cryptosporidium. These scenarios are summarized in Table S2, and the resulting DPRisk outputs are summarized in Figure 2. With a 5-log redundancy, the Giardia and Cryptosporidium scenarios satisfied the 10^−4 annual risk benchmark in all simulations, while enterovirus (after the 10× modification) and norovirus exceeded the risk benchmark but only beyond the 99th percentile, or less than once every 100 years (NWRI, 2022). Inflection points representing the stepwise reduction in treatment were more apparent in the daily risk curves, particularly for the protozoa that had flatter baseline distributions for raw wastewater concentration. A 5-log redundancy was generally sufficient to achieve the 2.7 × 10^−7 daily risk benchmark at the 99th percentile, except for Giardia with a daily risk estimate of 4.8 × 10^−7. Based on these results, the Expert Panel's final recommendation was a 5-log redundancy that raised the baseline LRVs to 18/15/15 for viruses, Giardia, and Cryptosporidium. Although a 6-log failure would technically drop the LRVs to 12/9/9, the Expert Panel recommended that utilities never knowingly drop below the initial baseline LRVs of 13/10/10. Thus, the LRV tiers representing 90%, 9%, and 1% of days were established at 18/15/13 for viruses and 15/12/10 for Giardia and Cryptosporidium (NWRI, 2022).
FIGURE 2 Pathogen-specific annual and daily risk profiles for the National Water Research Institute (NWRI) Expert Panel's treatment failure scenarios. The Expert Panel assumed full treatment on 90% of simulated days, a 3-log reduction in treatment on 9% of days, and a 6-log reduction in treatment on 1% of days (see Table S2 for scenario definitions). Each panel includes three risk profiles representing the target log reduction values (LRVs; baseline + 3-log, 4-log, or 5-log redundancy) and a "no failure" risk profile assuming 5-log redundancy. Achieving >99% compliance with the daily and annual risk benchmarks (red dashed lines) requires at least 5-log redundancy beyond the baseline LRVs of 13/10/10, resulting in new target LRVs of 18/15/15 for viruses, Giardia, and Cryptosporidium, respectively. Note that the actual Expert Panel approach does not include compound failures at the upper percentiles (see main text for explanation). Abbreviation: GC:IU = gene copy to infectious unit ratio.

With respect to Figure 2, one point of departure from the Expert Panel is that DPRisk's probabilistic failure framework also reflects the possibility of a compound failure (i.e., 3 logs + 6 logs = 9 logs) occurring 0.09% of the time (9% × 1% = 0.09%), which was not the intent of the Expert Panel. Therefore, the daily risk profiles in Figure 2 exhibit an additional inflection point near the 99.9th percentile. These high consequence, low frequency compound failures inflate the annual risk for each pathogen beyond the Expert Panel's estimates, although they do not affect the final conclusion from the QMRA.
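The difference between mutually exclusive failure tiers (the Expert Panel's intent) and independently sampled failures (which produce the compound 9-log days noted above) can be illustrated with a short simulation. The tier probabilities below are those stated in the text; everything else is illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
DAYS = 100_000

# Expert Panel failure tiers: full treatment 90% of days, a 3-log reduction 9% of
# days, and a 6-log reduction 1% of days, applied as mutually exclusive categories.
tier_deficit = rng.choice([0.0, 3.0, 6.0], size=DAYS, p=[0.90, 0.09, 0.01])

# Sampling the 3-log and 6-log events independently (as described for DPRisk's
# probabilistic failures) allows a compound 9-log day about 0.09% of the time.
deficit_3 = rng.random(DAYS) < 0.09
deficit_6 = rng.random(DAYS) < 0.01
independent_deficit = 3.0 * deficit_3 + 6.0 * deficit_6

print(np.mean(independent_deficit == 9.0))   # ~0.0009 (compound failures)
print(np.mean(tier_deficit == 9.0))          # 0.0 (no compound failures by construction)
```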

| BEYOND CALIFORNIA: ALTERNATIVE FRAMEWORKS FOR DPR
As more states launch efforts to implement DPR, they will have to weigh the benefits and challenges posed by existing regulatory frameworks and determine what components may or may not make sense given their constraints (e.g., lack of access to coastal waters for reverse osmosis (RO) brine discharge). The following sections offer guidance for the development of DPR regulations in other states by recommending components of existing frameworks while also offering flexible but justifiable alternatives to facilitate widespread implementation.

| Pathogen considerations

Scallan et al. (2011) identified norovirus, Giardia, and Salmonella as the top three pathogens contributing to gastrointestinal episodes in the U.S., thereby highlighting the potential significance of viruses, protozoa, and bacteria when characterizing raw wastewater pathogen concentrations for potable reuse. Although NWRI (2013) proposed an LRV of 9 for total coliform as a surrogate for Salmonella, potable reuse regulations rarely specify LRVs for bacteria, in part because the QMRA literature demonstrates that potable reuse systems designed for viruses and protozoa should be highly protective against bacteria as well (Amoueyan et al., 2019; Soller et al., 2017). This was also noted in Regli et al. (1991) in the initial assessment of public health protection in the context of the U.S. EPA's Safe Drinking Water Act (SDWA). Other microbiological agents such as antimicrobial resistance (AMR; Liguori et al., 2022) or amoebae (e.g., as a host for Legionella) may also be relevant to public health, but these hazards have not yet been regulated in drinking water and are not necessarily unique to potable reuse systems. Therefore, it seems justifiable to focus on viruses, Giardia, and Cryptosporidium in developing DPR regulations.

| Exposure considerations
In a drinking water context, pathogen exposure is the product of ingestion volume, ingestion frequency, and pathogen concentration. California and many QMRA studies in the literature have assumed a daily ingestion volume of 2 L/day. However, other studies have used 2.5 L/day (Soller et al., 2017), which is the 90th percentile for adults (≥21 years of age) based on the National Health and Nutrition Examination Survey (NHANES; EPA, 2011) and was recently used by the U.S. EPA in evaluating health advisories for perfluorooctane sulfonate (PFOS) and perfluorooctanoic acid (PFOA). Assuming 2 versus 2.5 L/day will generally have minimal impact on the final conclusions from a QMRA, but 2.5 L/day provides additional conservatism and appears to be more consistent with the U.S. EPA's recent assumptions.
Ingestion frequency has recently been identified as an important variable in potable reuse QMRAs, specifically as a means of incorporating failure scenarios (Pecson et al., 2017) and off-specification conditions (Soller, Parker, & Salveson, 2018). As noted earlier, assuming multiple ingestions per day in conjunction with statistical distributions for pathogen concentration leads to an averaging effect, which can mitigate the impacts of low frequency extreme events. California assumed 96 ingestion events per day for its draft DPR regulations, while the average across all age groups in the U.S. is approximately 4 ingestion events per day (EPA, 2019). Assuming one ingestion event per day sometimes yields a worst-case scenario because it eliminates any averaging benefit, but it can also lead to more favorable conditions when that single ingestion helps avoid a low frequency extreme event (e.g., when ingestion and treatment failure occur at different times on a given day). Therefore, sensitivity analyses on ingestion frequency are recommended to better understand the implications of this parameter on final risk conclusions.
As noted by the Expert Panel (NWRI, 2022), the statistical distributions for raw wastewater pathogen concentration reported by Pecson et al. (2022) represent a robust dataset for developing DPR regulations. However, there is a perception that LRV determinations from raw wastewater are overly conservative. This is particularly relevant to projects that are unable to seek LRV credits for conventional (i.e., primary, secondary, and/or tertiary) wastewater treatment, in part because the industry currently lacks a full mechanistic understanding and relevant surrogates to justify and assign credits to these upstream treatment processes. While this perception of conservatism may be valid in some instances, a "treated effluent" approach is highly site specific and may not always lead to an overall reduction in the LRV requirement when it is possible to seek credit for conventional wastewater treatment (Table S7). Also, AWT-specific LRVs may not be able to compensate for upsets in upstream conventional wastewater treatment, a scenario that might have minimal impact on a "raw wastewater" approach. But considering other potential benefits of the treated effluent approach (Text S2), providing flexibility for either option (e.g., Colorado) may be justifiable.
Text S3 offers an adaptation of Texas' monitoring and LRV requirements for its treated effluent approach (TCEQ, 2022), which could be adopted as a complement to the raw wastewater framework described here.
A potential consideration when allowing Colorado-type flexibility is consistency between the alternatives. For example, the default raw wastewater approach would allow a utility to implement a potable reuse project without the need for a costly, time-intensive monitoring campaign. But depending on how the regulations are written, there is potential for the raw wastewater option to be at a fundamental disadvantage, specifically if compliance is based on the maximum concentration from 10,000 simulations (raw wastewater; Table 2) versus the maximum concentration from a more limited number of actual samples (treated effluent; Text S3). One possibility for achieving consistency would be targeting compliance based on equivalent percentiles, according to the following equation (Blom, 1958; Pecson et al., 2022):

percentile = (rank − 0.375) / (N + 0.25)        (1)

where rank = ordered number of samples in question and N = total number of samples. Consistent with the U.S. EPA's Long Term 2 Enhanced Surface Water Treatment Rule (LT2) and the raw wastewater monitoring campaign in Pecson et al. (2022), characterization of pathogen concentrations for a given location is often based on 24 samples (N = 24). In the Texas treated effluent framework (TCEQ, 2022), the required LRVs are calculated based on the maximum concentrations observed during the monitoring campaign, which yields a percentile value of 0.974 when rank = 24 and N = 24 (Equation 1). In contrast, the maximum value from the full distribution in Table 2, which is based on 10,000 simulations, would have a percentile value of 0.9999, hence the potential disparity between the two approaches. Instead, the 97.4th percentile concentrations from the Pecson et al. (2022) distributions could be selected to achieve greater comparability (Table 3).
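As a quick check of Equation 1, a short snippet shows the percentiles assigned to the maximum of 24 samples and the maximum of 10,000 simulations:

```python
def blom_percentile(rank, n):
    """Blom (1958) plotting position for the sample with the given rank out of n."""
    return (rank - 0.375) / (n + 0.25)

print(blom_percentile(24, 24))          # ~0.974: maximum of a 24-sample monitoring campaign
print(blom_percentile(10_000, 10_000))  # ~0.9999: maximum of 10,000 simulated concentrations
```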
Coincidentally, the 97.4th percentile enterovirus (baseline) and Giardia concentrations are consistent with those used by California for IPR and IPR/DPR, respectively. Thus, a 10× adjustment for enterovirus, as proposed by the NWRI Expert Panel, would yield a more conservative virus LRV than California's IPR regulations for groundwater replenishment. The proposed 97.4th percentile Cryptosporidium concentration is approximately an order of magnitude lower than the California assumption, but it is consistent with the maximum concentration observed in Pecson et al. (2022). This consistency also applies for enterovirus and Giardia but not for norovirus, which was detected at just under 8.0 log10 gc/L in one sample from the Orange County Sanitation District (Pecson, Ashbolt, et al., 2021; Pecson, di Giovanni, et al., 2021). For comparison, the 97.4th percentile and maximum combined norovirus concentrations from the statistical distribution are 6.5 and 8.6 log10 gc/L, respectively (Table 3). Therefore, regulating raw wastewater based on the 97.4th percentile is highly protective, but this approach may not always reflect the extreme concentrations that might occur in some systems, although the discrepancy may be more applicable to molecular methods.
Based on the concerns of Gerba and Betancourt (2019) regarding the limitations of current culture methods, adjusting culture-based enterovirus concentrations by a factor of 10 seems warranted to better protect public health. Adjusting molecular concentrations to better reflect infectivity is less clear, however. Figure S3 shows the Pecson et al. (2022) distributions with several GC:IU modifications. None of the proposed GC:IU assumptions reproduce the corresponding culture-based concentrations, particularly in the case of adenovirus. Therefore, the use of molecular data, even with GC:IU modifications, may not be advisable when culture data are available (e.g., enterovirus and adenovirus), although this is still not possible for norovirus because of its lack of a robust infectivity assay.
Another consideration for norovirus is its multiple molecular assays (i.e., GIA, GIB, and GII), which theoretically capture distinct strains that are all capable of causing infection. A portion of the general population is resistant to certain norovirus strains because these individuals lack complementary histo-blood group antigens, but it is hypothesized that the genetic variability of norovirus makes all people susceptible to at least one or more strains (Atmar, 2010; Barker et al., 2013). GII is believed to account for most norovirus gastroenteritis cases worldwide, which presumably explains its greater abundance in raw sewage in California (Table 2), but GI still accounts for a considerable number of infections (da Silva et al., 2007). Therefore, the various genotypes may exhibit different dose-response curves that have not yet been characterized, but it may still be justifiable to sum the concentrations of the genotypes (i.e., [GIA] + [GIB] + [GII]) when such data are available. It is important to note that some strains might be captured by both GIA and GIB, so this would represent another potential source of conservatism.
TABLE 3 Summary of log reduction value (LRV) requirements for the alternative framework assuming a daily ingestion volume of 2.5 L and the specified point estimates for raw wastewater pathogen concentration. Note: The two sets of LRVs reflect compliance with the (left) 97.4th percentiles and (right) maximum values from the distributions in Table 2. The full scenarios are defined in Table S3. The recommended LRVs based on this analysis would be (left) 13/10/10 and (right) 15/11/11 for viruses, Giardia, and Cryptosporidium, respectively (see values in bold). Risks associated with the maximum concentrations are illustrated in Figures 3 and 4. Risks associated with the 97.4th percentile concentrations are illustrated in Figures S1 and S2. Abbreviation: GC:IU = gene copy to infectious unit ratio.
Another question for the development of DPR regulations is whether outbreak conditions should explicitly be considered when establishing LRV requirements. The extensive wastewater-based epidemiology (WBE) research conducted during the COVID-19 pandemic has shown that raw sewage virus concentrations can be directly linked to infection incidence in local communities, with concentrations of genomic targets sometimes spanning orders of magnitude during infection surges (Gerrity et al., 2022; Vo et al., 2022). This has also been demonstrated for enteric pathogens during confirmed (da Silva et al., 2007) and simulated (Barker et al., 2013) outbreaks. However, da Silva et al. (2007) also noted that prolonged shedding (e.g., norovirus) may result in cumulative concentrations peaking after an outbreak has already been detected through syndromic surveillance efforts. This highlights the need for robust communication between the public health and water sectors, particularly in the context of DPR. This increased communication might obviate the need for costly treatment redundancy, which is implemented only to mitigate these low frequency but potentially high consequence events. For example, any indication of off-specification treatment during an outbreak condition might warrant product water diversion (if practical), even if diversion is not mandated by the regulatory framework. As noted in Barker et al. (2013), additional precautionary measures may be needed in smaller systems with limited dilution (e.g., building scale with ~100 people).

| Dose-response considerations
Recent QMRAs and supporting literature discuss a number of dose-response models for pathogens of interest in potable reuse applications, and some explicitly quantify the implications of choosing different dose-response parameters (Amoueyan et al., 2017) or models (Amoueyan et al., 2019; Soller et al., 2017). The alternative frameworks described in the next section adopt the most conservative dose-response models from these studies, which are largely consistent with those used by California and the NWRI Expert Panel (Tables S1 and S2). The one exception is that the following analysis also includes the ingestion-based, exact beta-Poisson (i.e., hypergeometric) dose-response model for adenovirus (α = 5.11 and β = 2.8; Teunis et al., 2016).

| Baseline LRVs
Scenarios were developed in DPRisk to identify baseline LRV requirements for two different compliance targets: (1) the 97.4th percentile and (2) maximum concentrations from the Pecson et al. (2022) statistical distributions (Table S3). This deterministic QMRA indicated that baseline LRVs of 13/10/10 and 15/11/11, respectively, would be appropriate to achieve adequate public health protection against enterovirus (10× culture), adenovirus (culture), norovirus (combined molecular), Giardia, and Cryptosporidium (Table 3). The corresponding acceptable drinking water concentrations are 10^−6.7 MPN/L for enterovirus, 10^−6.8 MPN/L for adenovirus, 10^−6.6 gc/L for norovirus, 10^−5.3 cysts/L for Giardia, and 10^−7.0 oocysts/L for Cryptosporidium. The unadjusted adenovirus and enterovirus molecular data require higher baseline LRVs, but again, the corresponding culture-based data are recommended due to GC:IU uncertainty. Because these LRV targets were determined with point estimates for all relevant parameters, they satisfy both the daily and annual risk benchmarks, and they are independent of ingestion frequency. It is important to note that for ingestion frequencies of 1 or 4 per day, the full raw wastewater distributions from Table 2 also result in recommended LRVs of 15/11/11 (Table S8). This is because the maximum concentration still drives the daily risk calculation, and 4 ingestion events per day do not yield a significant averaging effect.
The more conservative 15/11/11 LRV targets are consistent with the recommendations of Soller, Eftim, and Nappier (2018). Although lower, the 13/10/10 LRVs still provide a justifiable level of public health protection while also providing comparability with the treated effluent alternative (Text S3). As noted earlier, 13/10/10 would generally be protective against the highest observed pathogen concentrations in Pecson et al. (2022), while 15/11/11 would protect against the maximum simulated pathogen concentrations based on the statistical distributions from that same study. The one exception for 13/10/10 is the aforementioned peak norovirus concentration reported in Pecson et al. (2022), but it is important to note that this QMRA assumes a GC:IU ratio of 1:1. With observed GC:IU ratios as high as 10,000:1 for enterovirus and 100,000:1 for adenovirus (Pecson et al., 2022), regulating based on the 97.4th percentile might also be protective against the highest observed norovirus concentration assuming similar infectivity ratios.
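As an illustrative check (not the Table 3 calculation itself), the maximum simulated concentrations quoted in the text and the most conservative dose-response slopes reproduce the 15/11/11 targets using the same low-dose arithmetic and a 2.5 L/day ingestion volume:

```python
import math

DAILY_RISK = 1e-4 / 365
VOLUME_L = 2.5    # daily ingestion volume assumed by the alternative framework

def required_lrv(log10_raw_conc, low_dose_slope):
    acceptable_conc = DAILY_RISK / (low_dose_slope * VOLUME_L)   # organisms/L in finished water
    return log10_raw_conc - math.log10(acceptable_conc)

# Maximum simulated concentrations quoted in the text (from the Table 2 distributions)
# and the most conservative dose-response slopes discussed above.
print(required_lrv(8.6, 0.04 / (0.04 + 0.055)))   # norovirus (gc/L, GC:IU = 1:1)  -> ~15.2
print(required_lrv(5.5, 0.02))                     # Giardia (cysts/L)              -> ~10.8
print(required_lrv(4.2, 0.116 / 0.121))            # Cryptosporidium (oocysts/L)    -> ~11.1
# Rounded, these support the 15/11/11 baseline LRVs; repeating the calculation with
# the 97.4th percentile concentrations (Table 3) gives the 13/10/10 alternative.
```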

| Tolerance for off-specification conditions
Rather than explicitly augmenting these baseline LRVs to compensate for treatment failure, an alternative approach is to characterize their tolerance for failures or off-specification conditions. To quantify this tolerance, a sensitivity analysis was performed in DPRisk using the full raw wastewater distributions from Table 2 to identify failure magnitudes, durations, and frequencies that would still satisfy the public health benchmarks. Based on how the baseline LRVs were determined for the "equivalency" approach, the daily risk benchmark is already exceeded beyond the 97.4th percentile, even under nominal conditions; exacerbating those daily risks can ultimately jeopardize the annual risk benchmark as well. For the maximum concentration approach, treatment trains satisfying the 15/11/11 baseline LRVs have tolerance for both the daily and annual risk benchmarks. Tolerances for the 97.4th percentile approach (Table S5) are illustrated in Figures S1 and S2, while tolerances for the more conservative maximum concentration approach (Table S4) are described here.
To narrow the range of possible scenarios, the tolerance framework aimed to satisfy the following criteria: allow a >1-log decrease in LRV (magnitude) that could occur multiple times per year (frequency), each over a conservative 24-h period (duration) to provide sufficient time for identifying, responding to, and correcting the off-specification condition. After conducting preliminary model simulations (data not shown), the baseline tolerance scenario was defined as a 24-h, 3-log off-specification event; ≥4 logs could not be tolerated even once for protozoa. An initial sensitivity analysis was then performed (scenarios defined in Table S4) to identify off-specification frequencies that would still achieve >99% compliance with the daily and annual risk benchmarks (NWRI, 2022), assuming a single daily ingestion volume of 2.5 L. The results indicated that approximately 12 (viruses) and 3 (protozoa) off-specification events per year could be tolerated under these conditions (Figure 3).
In reality, off-specification conditions would likely be identified and resolved rapidly (e.g., within 15 min or 1 h), particularly with online surrogate monitoring of critical control points. Also, an important factor not captured by the first sensitivity analysis is the impact of ingestion frequency (Soller, Parker, & Salveson, 2018). By visiting the tap more frequently, people might increase their probability of being exposed to an off-specification event (for failure durations shorter than 24 h), but multiple daily ingestions also produce some degree of averaging, which can help mitigate the impacts of high concentrations and/or LRV deficits. These issues were assessed through additional sensitivity analyses on failure duration (15 min, 1 h, and 24 h) and ingestion frequency (1, 4, or 96 per day). The corresponding scenarios are defined in Table S4, and the results are summarized in Figure 4. Shorter duration off-specification conditions pushed noncompliance with the daily risk benchmark to even higher percentiles and decreased the corresponding annual risk estimates by orders of magnitude. Consistent with the hydraulics analysis in Gerrity et al. (2022), the averaging effect from multiple ingestions was most pronounced at the lower percentiles where risk increased considerably, but for 96 ingestion events per day, it also reduced risks at the highest percentiles that drive regulatory decisions.
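The tolerance analysis can be approximated outside of DPRisk with a simulation along the following lines; the lognormal parameters are placeholders for the Table 2 distributions, a Giardia-like exponential dose-response keeps the example self-contained, and the off-specification days are placed at random within each simulated year.

```python
import numpy as np

rng = np.random.default_rng(3)
YEARS, DAYS = 1_000, 365
VOLUME_L = 2.5                         # single daily ingestion event
BASE_LRV = 11.0                        # e.g., Giardia under the maximum-concentration approach
OFFSPEC_LOGS, OFFSPEC_DAYS = 3.0, 3    # candidate tolerance: 3-log deficit on 3 days/year
MU, SIGMA = 2.0, 1.5                   # PLACEHOLDER lognormal parameters (base e); see Table 2
R = 0.02                               # exponential dose-response (e.g., Giardia)

# Raw wastewater concentrations and the LRV actually delivered on each simulated day.
conc = np.exp(rng.normal(MU, SIGMA, size=(YEARS, DAYS)))             # organisms/L
lrv = np.full((YEARS, DAYS), BASE_LRV)
offspec_days = np.argsort(rng.random((YEARS, DAYS)), axis=1)[:, :OFFSPEC_DAYS]
np.put_along_axis(lrv, offspec_days, BASE_LRV - OFFSPEC_LOGS, axis=1)

p_day = 1.0 - np.exp(-R * conc * 10.0 ** (-lrv) * VOLUME_L)          # daily risk
p_year = 1.0 - np.prod(1.0 - p_day, axis=1)                          # annual risk

print(np.quantile(p_day, 0.99))    # compare with the 2.7e-7 daily benchmark
print(np.quantile(p_year, 0.99))   # compare with the 1e-4 annual benchmark
```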
FIGURE 3 Pathogen-specific annual and daily risk profiles for the alternative (max concentration) framework's off-specification treatment scenarios. A 24-h, 3-log reduction in treatment relative to the 15/11/11 baseline log reduction values (LRVs) for viruses, Giardia, and Cryptosporidium was assumed to occur the specified number of times each year. The entire daily ingestion volume of 2.5 L was assumed to occur during a single ingestion event (see Table S4 for scenario definitions). Achieving >99% compliance with the daily and annual risk benchmarks (red dashed lines) required the treatment train to experience no more than 12 and 3 off-specification days each year for viruses and protozoa, respectively.

FIGURE 4 Annual and daily risk profiles for enterovirus and Cryptosporidium for the alternative (max concentration) framework's off-specification treatment scenarios. A 3-log reduction in treatment relative to the baseline log reduction values (LRVs) was assumed to occur for varying durations (15 min, 1 h, or 24 h) or in combination with a daily ingestion volume of 2.5 L divided over varying ingestion frequencies (1/day, 4/day, or 96/day; see Table S4 for scenario definitions). Shorter durations shift the risk profiles down (lower risk) and to the right (lower frequency), while varying ingestion frequencies exhibit a more complex relationship with risk.

| Summary
The alternative frameworks can be summarized as follows (Figure 5):

• For the 97.4th percentile approach, baseline LRVs of 13/10/10 should be targeted for viruses, Giardia, and Cryptosporidium, and potable reuse systems may be allowed to operate down to 11/8/8 for up to 4 h and 4 times per 365-day period.

• For the maximum concentration approach, baseline LRVs of 15/11/11 should be targeted for viruses, Giardia, and Cryptosporidium, and potable reuse systems may be allowed to operate down to 12/8/8 for 3 (protozoa) to 12 (viruses) days per year (i.e., 24-h duration).
These frameworks allow for systems to fall below the baseline LRVs for short periods of time while maintaining continuity of service. Despite these allowances, potable reuse systems should be designed to exceed the baseline LRVs at all times and should be operated with online surrogate monitoring at critical control points (Nappier et al., 2018) to verify process integrity and rapidly identify and respond to off-specification conditions. If feasible in the future, high frequency monitoring of raw wastewater pathogen concentrations could help inform operations at the conventional wastewater treatment and AWT facilities, in addition to providing valuable WBE data for public health officials. Robust monitoring would supplement the inherent conservatism of regulations that often use the lower bounds of observed treatment for crediting (e.g., 5th percentile LRV; DDW, 2021), a practice that has been shown to overestimate public health risk (Amoueyan et al., 2020). Potable reuse systems may also incorporate blending, small environmental or engineered storage buffers to increase response retention time, and product water diversion points. Collectively, these measures can help minimize treatment redundancy that may increase capital and operational costs without clear benefits for public health protection (NWRI, 2022). This overall approach would be consistent with the U.S. EPA's SDWA framework, which relies on multiple barriers employing diverse mechanisms but does not explicitly call for treatment redundancy to compensate for the possibility of treatment failure. That being said, even if the proposed tolerance criteria are technically satisfied, a system that repeatedly falls below the baseline LRVs should be audited to identify and resolve underlying problems.

| PATH FORWARD FOR DPR
The California, NWRI Expert Panel, and alternative frameworks all rely on the same basic approach for developing DPR regulations, but the assumptions used in each QMRA differ significantly, resulting in a wide range of target LRVs and compliance criteria (Figure 5). California's deterministic approach and 4-log redundancy requirement yield the highest LRV for viruses, but the LRVs for Giardia and Cryptosporidium are comparable to those of the NWRI Expert Panel. The Expert Panel's use of statistical distributions for raw wastewater concentration yields a lower baseline LRV for viruses, and although the Panel proposed a greater level of treatment redundancy (5 logs), it also recommended greater flexibility in terms of off-specification allowances. Finally, the alternative frameworks rely on the same statistical distributions as the NWRI Expert Panel but assume a single ingestion event. One alternative also offers equivalency between raw wastewater and treated effluent approaches. Notably, instead of proposing treatment redundancy, the alternative frameworks consider tolerance for off-specification conditions while still maintaining a minimum level of treatment.
The take-home message is that there is no single correct approach for protecting public health in DPR systems, with each approach being a product of its assumptions. Importantly, these frameworks are all protective of public health, in part because they are inherently conservative in various ways. However, they each have important differences with respect to their complexity and/or implications for capital and operational costs upon implementation. Therefore, it is important for stakeholders to understand these nuances when adopting a framework or framework elements to ensure the final regulation is adequately protective of public health while still satisfying goals related to policy and practice.

FIGURE 1 Pathogen-specific annual and daily risk profiles for the National Water Research Institute (NWRI) Expert Panel's baseline log reduction value (LRV) scenarios (see Table S2 for scenario definitions). Comparing the risk profiles against the annual and daily risk benchmarks (red dashed lines) yields target baseline LRVs of 13/10/10 for viruses, Giardia, and Cryptosporidium, respectively. Abbreviation: GC:IU = gene copy to infectious unit ratio.

FIGURE 5 Summary schematic of the California indirect potable reuse (IPR; groundwater), California direct potable reuse (DPR), National Water Research Institute (NWRI) Expert Panel DPR, and alternative DPR frameworks (97.4th percentile and maximum concentration approaches). Adapted from DDW (2021) and NWRI (2022). Created with BioRender.com.