Open Access Article
Emily Clements† ab, Katherine Crank† a, Deena Hannoun a and Daniel Gerrity* a
aSouthern Nevada Water Authority, P.O. Box 99954, Las Vegas, NV 89193, USA. E-mail: daniel.gerrity@snwa.com
bCarollo Engineers, Inc, 901 N Stuart Street, Suite 403, Arlington, Virginia 22203, USA
First published on 22nd December 2025
De facto reuse (DFR) refers to the incidental or unintentional incorporation of treated wastewater into natural water bodies used as a source of drinking water. Increasing recognition of this practice has highlighted a potential risk of human exposure to various chemicals and pathogens originating from wastewater. In this study, quantitative microbial risk assessment (QMRA) was used to determine the infection risks associated with norovirus, adenovirus, enterovirus, Cryptosporidium, and Giardia for DFR in Southern Nevada (i.e., Lake Mead). Scenarios included three lake levels to encompass current conditions (329 m) and possible future conditions associated with continued drought (312 m and 297 m). Starting with observed raw wastewater pathogen concentrations at local wastewater treatment plants, risks were estimated after accounting for facility-specific wastewater treatment trains, discharge-specific dilution and decay in the environmental buffers (based on hydrodynamic modeling), and drinking water treatment. Log reduction values (LRVs) for wastewater treatment were also calibrated to observed Cryptosporidium concentrations in the environment to characterize ‘gaps’ in crediting (LRVgap = 1.97). For the baseline lake level, the median cumulative risk of gastrointestinal infection from all pathogens was 10−4.59 infections per person per year, with Cryptosporidium as the primary driver of risk. Risks increased significantly for the lower lake elevations but still satisfied the annual risk benchmark of 10−4. The impacts of seasonality were also studied for norovirus, indicating increased risks during fall and spring. Overall, this study demonstrates that the current design and operation of the Southern Nevada DFR system is protective of public health with respect to enteric pathogen exposure, even if the current Colorado River Basin drought continues or worsens.
Water impact: Enteric pathogen concentrations were modeled through Southern Nevada's de facto reuse system. By coupling quantitative microbial risk assessment (QMRA) with an existing hydrodynamic model, the effects of seasonality, lake level decline, engineered treatment, and environmental fate (i.e., decay) and transport were considered. The Lake Mead de facto reuse system was determined to be protective of human health and resilient to lake level decline.
Quantitative microbial risk assessment (QMRA) serves as a valuable tool in evaluating the potential pathogen risks in these de facto reuse applications. Previous potable reuse QMRAs have incorporated seasonality in terms of its impact on norovirus concentrations,11 but none have directly assessed how seasonality and drought impact risk estimates through hydrodynamic changes in pathogen travel times and relative water supply contribution from DFR (i.e., DFR percentage).12 These considerations are important because of the potential time dependence of pathogen loads at wastewater treatment plants (WWTPs)13,14 and differences in dilution and decay in the environmental buffer. While QMRAs and regulatory frameworks for potable reuse often assume that influent pathogen concentrations are random (i.e., not autocorrelated), seasonality and outbreaks may result in elevated or peak concentrations for sustained periods of time. Beyond seasonality, drought conditions can also alter the DFR percentage at a drinking water intake,1 which impacts the level of pathogen attenuation from dilution.
With climate change, drought frequencies and intensities are worsening, impacting reservoirs around the world.15 DFR percentages in reservoirs can increase with drought, which has been shown to impact water quality.15–17 Drought can also impact how water, including treated wastewater, travels through reservoirs. Certain attributes of wastewater effluent (e.g., salinity and temperature) may differ from ambient conditions in a receiving lake, thereby causing the discharge to travel as a distinct plume along the thermocline, particularly under stratified conditions.18,19 The depth of the thermocline, which is impacted by solar radiation, air/water temperature, and lake elevation, and its relation to the drinking water intake can have a large impact on the DFR percentage.20 This has not yet been integrated into potable reuse QMRAs.
As regulations for indirect potable reuse (IPR) and direct potable reuse (DPR) continue to be developed and implemented in states across the U.S.,21–26 it becomes critically important to understand the risks associated with potable reuse and how those risks change under shifting conditions. With these emerging regulatory frameworks, there has been intense focus on their stringent pathogen log reduction value (LRV) targets and the corresponding credits for advanced water treatment (AWT) processes. This potentially leads to the perception that DFR is inherently unsafe since some of the LRVs occurring within the overall DFR system, particularly in the environmental buffer(s), are not explicitly characterized. However, the combination of conventional engineered treatment, including both wastewater and drinking water treatment, coupled with natural attenuation in DFR systems could be as protective as AWT.27
Therefore, the goal of this study was to quantify drinking water infection risks due to potential exposure to five pathogens (norovirus, adenovirus, enterovirus, Cryptosporidium, and Giardia) in the Southern Nevada DFR system associated with Lake Mead. The novelty of this study relates to its incorporation of robust, geographically-linked raw wastewater pathogen concentrations and the effects of seasonality and long-term drought, as these environmental factors affect pathogen concentrations, travel times (i.e., pathogen decay), and dilution in DFR systems.
Daily risks were estimated for 10 950 simulated days using a Monte Carlo approach. Annual risk distributions were then developed by dividing the 10 950 daily risks into 365-day subgroups to generate 30 annual datasets. The union probabilities were calculated, and the resultant annual risk distribution was compared against an annual risk benchmark of 10−4. The model was implemented in R 4.4.1 using RStudio 2024.09.1.28 The QMRA components are described in greater detail in the following sections.
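For illustration, the aggregation of daily risks into annual risk distributions can be sketched as follows; the daily risks below are random placeholders, whereas in the full model (provided as an R script in the SI) they are produced by the exposure and dose–response calculations.

```r
# Sketch of the annual aggregation step; p_daily is a placeholder for the
# simulated daily infection probabilities produced by the full QMRA chain.
set.seed(1)
n_days  <- 10950                       # 365 days x 30 simulated years
p_daily <- 10^runif(n_days, -12, -6)   # hypothetical daily risks

years    <- rep(1:30, each = 365)      # assign each daily risk to a simulated year
p_annual <- tapply(p_daily, years, function(p) 1 - prod(1 - p))   # union probability

summary(p_annual)
mean(p_annual <= 1e-4)                 # fraction of years meeting the 1e-4 annual benchmark
```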
To maximize “return flow credits” to the Colorado River system, Southern Nevada prioritizes reductions in consumptive water use (e.g., outdoor irrigation and cooling towers) and aims to maximize discharge of treated wastewater effluent from four WWTPs (hereafter, WWTP-1, WWTP-2, WWTP-3, and WWTP-4) into the Las Vegas Wash.4 The Las Vegas Wash ultimately discharges into Boulder Basin, which is the most downstream basin in Lake Mead and the location of the drinking water intakes for the Southern Nevada Water Authority (SNWA). Under typical (i.e., dry) flow conditions, the Las Vegas Wash carries approximately 700 000 m3 per day of treated wastewater effluent, which represents approximately 90% of its total flow.32 This results in approximately 1.4% DFR at the drinking water intake,33 though that number changes seasonally and at different lake levels.20 By prioritizing conservation efforts to maximize return flows to Lake Mead, Southern Nevada has significantly reduced its annual consumptive water use, reaching as low as 230 million m3 in 2023. This is well below Nevada's baseline Colorado River allocation of 370 million m3, despite a population increase of 829 000 residents between 2002 and 2024.
Raw wastewater pathogen concentrations were drawn from a previous monitoring study of the Southern Nevada WWTPs (Crank et al.13), which reported culture- and molecular-based concentrations as well as gene copy to infectious unit (GC : IU) ratios. Here, we used the recovery-corrected data from the previous study across both pandemic and non-pandemic conditions, but focused only on facilities that discharged to LVW. Recovery-corrected data was chosen in accordance with the requirements for high-quality pathogen datasets set forth by Darby et al.35 for use in studies within regulatory contexts. Recovery percentages ranged from 0 to 100% and are described in detail in Crank et al.13 The overall pathogen concentration dataset was also subdivided by WWTP to allow coupling of site-specific concentrations, LRVs, and travel times within LVW. The site-specific datasets were fit to new log10-normal distributions using ‘fitdistcens’ in R (Table S1). In Crank et al.,13 a strong seasonal effect was observed only for norovirus, which Eftim et al.11 also found, so new season-specific distributions were also fit to the norovirus data for this QMRA. Seasons were defined as follows: fall = September, October, November; winter = December, January, February; spring = March, April, May; and summer = June, July, and August. For each simulation, norovirus GI and GII concentrations from the strain-specific distributions were summed and treated as an overall norovirus hazard, consistent with recent literature.24
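For illustration, the censored fitting step can be sketched with fitdistrplus::fitdistcens(), treating non-detects as left-censored at an assumed reporting limit. The concentrations and reporting limit below are hypothetical placeholders; the actual fits are provided in Table S1 and the SI R script.

```r
# Sketch of fitting a site-specific log10-normal distribution with censoring.
# Concentrations and the reporting limit below are hypothetical placeholders.
library(fitdistrplus)

conc_gc_per_L   <- c(2.1e5, 7.4e4, NA, 5.6e5, NA, 1.3e5, 9.8e4)  # NA = non-detect
reporting_limit <- 1e3                                           # assumed (GC per L)

cens <- data.frame(
  left  = ifelse(is.na(conc_gc_per_L), NA, log10(conc_gc_per_L)),  # NA = left-censored
  right = ifelse(is.na(conc_gc_per_L), log10(reporting_limit), log10(conc_gc_per_L))
)

fit <- fitdistcens(cens, "norm")   # normal fit on the log10 scale
fit$estimate                       # mean and sd of the log10-normal distribution

# Stochastic draws for the QMRA; norovirus GI and GII draws would be summed
# (on the arithmetic scale) to form the overall norovirus hazard.
draws <- 10^rnorm(10950, fit$estimate["mean"], fit$estimate["sd"])
```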
In addition to modeling molecular virus concentrations with no infectivity adjustments, the previously reported GC : IU ratio distributions from Crank et al.13 (Table S2) were applied to the molecular enterovirus and adenovirus concentrations (i.e., in units of GC L−1) to convert them to infectious units (i.e., IU L−1). These adjusted molecular concentrations were compared alongside the directly measured culture concentrations. For norovirus, the GC : IU relationship is not well elucidated in the literature, so a uniform distribution between 1 : 1 and 200 : 1 was assumed (implemented as a log10-uniform distribution from 0.00 to 2.30).24 Additional discussion related to the norovirus GC : IU assumption is included in Table S2. Final pathogen scenarios included WWTP-specific concentrations of Giardia (microscopy), Cryptosporidium (microscopy), adenovirus (cell culture and molecular with/without GC : IU ratio adjustment), enterovirus (cell culture and molecular with/without GC : IU ratio adjustment), and norovirus (molecular GI and GII summed with/without a GC : IU ratio adjustment).
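For illustration, the adjustment amounts to subtracting the log10 GC : IU ratio from the log10 molecular concentration. The norovirus range (0.00–2.30 log10) is taken from the text above, while the enterovirus/adenovirus ratio parameters shown are hypothetical stand-ins for the distributions in Table S2.

```r
# Sketch of the GC:IU adjustment: infectious units = gene copies / (GC:IU ratio).
set.seed(2)
n <- 10950

# Hypothetical molecular concentrations (GC per L) drawn from a log10-normal
log10_gc <- rnorm(n, mean = 4.5, sd = 0.8)

# Norovirus: GC:IU ratio assumed log10-uniform between 1:1 and 200:1 (0.00-2.30 in log10)
log10_ratio_nov <- runif(n, 0, 2.30)
log10_iu_nov    <- log10_gc - log10_ratio_nov       # division on the log10 scale

# Enterovirus/adenovirus: placeholder log10-normal GC:IU ratio standing in for Table S2
log10_ratio_env <- rnorm(n, mean = 1.5, sd = 0.5)   # hypothetical parameters
log10_iu_env    <- log10_gc - log10_ratio_env
```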
| Process | Virus | Giardia | Cryptosporidium | WWTP/DWTP | Ref. |
|---|---|---|---|---|---|
| Wastewater treatment plants | | | | | |
| Primary | 0 | 0 | 0 | 2, 3 | NA |
| Membrane bioreactor (MBR) | 1.5 | 2 | 2 | 1 | 38 |
| Conventional secondary^a | AdV: m = 2.3, s = 0.8, a = 0.6, b = 3.6; EnV: m = 1.3, s = 0.8, a = 0.3, b = 3.5; NoV:^b m = 1.1, s = 0.7, a = 0, b = 2.7 | 2.5156 + Z^h × 0.1070 | 2.0962 + Z^h × 0.1085 | 2, 3, 4 | 39, 40 |
| Granular media filtration (GMF) | 0 | 0 | 0 | 2, 3,^g 4 | NA |
| Ultrafiltration (UF) | 0 | 4 | 4 | 3^g | 41 |
| Ultraviolet (UV) disinfection^c | WWTP-3 = 0.4; WWTP-4 = 0.7 | 6 | 6 | 3,^g 4 | 42, 43 |
| Ozone^d | 6 | 6 | 0.6 | 3^g | 42, 44, 45 |
| Chlorine^e | 4^f | WWTP-1 = 0.3; WWTP-2 = 6 | 0 | 1, 2 | 42, 46 |
| Drinking water treatment plants^i | | | | | |
| Ozone^j | 6 | 5.77 | 0.47 | 1, 2 | 42, 44, 45 |
| Direct GMF | 1 | 2 | 2.5 | 1, 2 | 45 |
| Individual filter effluent^k | 0 | 0 | 0.5 | 1, 2 | 45 |
| Chlorine^l | 6 | 6 | 0 | 1, 2 | 45 |

^a Assumes a truncated normal distribution for virus LRVs with a = minimum LRV and b = maximum LRV.
^b Assumes NoV GII from Hill et al. (2025).39
^c UV: WWTP-3 = 34 mJ cm−2 and WWTP-4 = 47 mJ cm−2.
^d Ozone: WWTP-3 = 1.2 mg min L−1 CT and 26.8 °C at outfall.
^e Chlorine: WWTP-1 = 6 mg min L−1 CT and WWTP-2 = 450 mg min L−1 CT.
^f The chlorine virus LRV was capped at 4 (maximum in the US EPA Guidance Manual on Disinfection Profiling and Benchmarking, Table B-2 (ref. 46)) for the WWTPs.
^g Relative flow split at WWTP-3: 83% = GMF + UV and 17% = UF + ozone.
^h Z-score/value from standardized normal distribution.
^i The LRVs were from the SCADA software for DWTP-1, which produces more water and has higher chlorine CT values than DWTP-2, though this would only impact Giardia chlorine LRVs.
^j Ozone: DWTP-1 CT = 2.30 mg min L−1 for virus and 2.12 mg min L−1 for Giardia from SCADA calculations, and CT = 3.37 mg min L−1 for Cryptosporidium based on the log integration method.
^k The Long Term 2 Enhanced Surface Water Treatment Rule allows plants to claim an additional 0.5 LRVs if they meet turbidity requirements (40 CFR 141.718(b)).
^l Chlorine: DWTP-1 CT = 242 mg min L−1 at pH ∼8 and temperature ∼12.5 °C, which yields an LRV > 200 that was capped at 6.
For the dilution calculations, WWTP-1 had a flow rate of 64 000 m3 per day, WWTP-2 had a flow rate of 150 000 m3 per day, WWTP-3 had a flow rate of 360 000 m3 per day, and WWTP-4 had a flow rate of 83 000 m3 per day. The combined flow from the four WWTPs (657 000 m3 per day) was assumed to account for 90% of the total flow in LVW,32 resulting in an overall LVW flow rate of 730 000 m3 per day. In other words, the non-effluent base flow of LVW was assumed to be 73 000 m3 per day. The percent contributions to overall flow in LVW were then converted to LRVs, which ranged from 0.31 to 1.06 (Table S3), to account for dilution.
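These dilution LRVs can be reproduced from the flow rates above, assuming each WWTP's LRV is the negative log10 of its fractional contribution to total LVW flow, which reproduces the reported 0.31–1.06 range.

```r
# Dilution LRVs in the Las Vegas Wash, computed as -log10 of each WWTP's
# fractional contribution to total LVW flow (values in m3 per day).
q_wwtp <- c("WWTP-1" = 64000, "WWTP-2" = 150000, "WWTP-3" = 360000, "WWTP-4" = 83000)
q_lvw  <- sum(q_wwtp) / 0.90          # effluent assumed to be 90% of total LVW flow

dilution_lrv <- -log10(q_wwtp / q_lvw)
round(dilution_lrv, 2)                # approximately 1.06, 0.69, 0.31, 0.94
```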
In addition to dilution, pathogen decay in LVW was included as an environmental buffer LRV and was determined based on published first order decay rate constants coupled with WWTP-specific travel times to the LVW discharge point to Lake Mead. The assumed travel times ranged from approximately 7–15 hours (Table S4), as determined by the rhodamine tracer study data in Blasius et al.48 Each pathogen was assigned a distribution of decay rate constants based on published systematic reviews for protozoa49 and viruses.50 These literature reviews encompassed multiple enumeration methods, so new log10-normal distributions were fit to the published base e rate constants (Table S5),49,50 to focus only on microscopy for protozoa and cell culture for viruses. This is because microscopy and cell culture methods, as opposed to molecular methods, better describe decay/inactivation of infectious pathogens. LRVs were then calculated using the stochastic pathogen-specific first order decay rate constants (k, d−1) and the deterministic WWTP-specific travel times for LVW (t, days) (eqn (1)). LRVs for decay/inactivation were not capped at a maximum value. Example decay LRVs as a function of travel time are illustrated in Fig. S2.
| Decay LRV = −log10(e^(−kt)) | (1) |
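A minimal sketch of eqn (1) as applied in LVW is shown below, using a hypothetical log10-normal distribution of decay rate constants in place of the pathogen-specific fits in Table S5 and an example travel time within the reported 7–15 h range.

```r
# Decay LRV in LVW per eqn (1): LRV = -log10(exp(-k * t)) = k * t / log(10).
set.seed(3)
n <- 10950

# Hypothetical log10-normal distribution of base-e decay rate constants (per day),
# standing in for the pathogen-specific fits in Table S5.
k <- 10^rnorm(n, mean = -0.8, sd = 0.4)

t_lvw     <- 10 / 24                     # example WWTP-specific travel time (10 h, in days)
decay_lrv <- -log10(exp(-k * t_lvw))     # equivalently k * t_lvw / log(10)
quantile(decay_lrv, c(0.05, 0.5, 0.95))
```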
The LMM was also used to create an empirical distribution of travel times from the LVW discharge point into Lake Mead to SNWA's “Intake 3”. Consistent with Marti et al.,51 a tracer was added to the model at the LVW discharge point for each of the four seasons. The likelihood of a specific travel time was determined by the relative concentration of the tracer at the drinking water intake at that time; travel time distributions were developed for the aforementioned lake levels (329 m, 312 m, and 297 m). For the non-seasonal analyses, the tracer concentrations at each time point for each season were summed to create a total relative concentration (or relative weight), which was used to determine the likelihood of a specific travel time. Using the decay rate constants described earlier, the empirical travel times were randomly sampled—either from the overall dataset or the season-specific datasets—to calculate corresponding LRVs (not capped at a maximum value). Fig. S2 also characterizes LRVs for the longer travel times within Lake Mead.
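The tracer-weighted sampling of Lake Mead travel times can be sketched as below; the travel-time grid and relative tracer concentrations are placeholders for the hydrodynamic model output.

```r
# Sketch of sampling Lake Mead travel times, weighted by the relative tracer
# concentration observed at the intake (hypothetical values shown here).
set.seed(4)
travel_time_h <- seq(44, 17507, by = 24)                           # candidate travel times (hours)
rel_conc      <- dlnorm(travel_time_h, meanlog = 7, sdlog = 0.8)   # placeholder tracer weights

t_lake_d <- sample(travel_time_h, 10950, replace = TRUE,
                   prob = rel_conc / sum(rel_conc)) / 24           # sampled travel times (days)

k        <- 10^rnorm(10950, mean = -0.8, sd = 0.4)                 # decay rates as above
lake_lrv <- k * t_lake_d / log(10)                                 # uncapped decay LRV in the lake
```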
Settling was also considered as a potential removal mechanism; however, the lack of particle data precluded reliable estimation. Future work characterizing particle-associated protozoa and virus settling could help quantify this reduction more accurately.
An additional analysis was conducted to assess risks with and without ozone. Since the DWTPs are “Bin 1” public water systems according to the U.S. EPA's Long Term 2 Enhanced Surface Water Treatment Rule (LT2), no additional Cryptosporidium removal/inactivation is required. In other words, ozonation is voluntarily employed for added public health protection and for its ancillary benefits, including algal mitigation, oxidation of taste and odor compounds and other trace organics, and reductions in disinfection byproduct formation potential. For this particular analysis, pathogen risks were compared against the daily risk benchmark of 2.7 × 10−7 infections per person to characterize the contribution of ozonation to overall public health protection.
| [equation image not reproduced in this version] | (2) |
The daily risk calculations were performed for each of the 10 950 simulations. The underlying dataset was then divided into 30 years of simulations, with 365 daily simulations per year (365 × 30 = 10 950). Pathogen-specific annual risk was then calculated according to eqn (3); eqn (3) was also used to combine pathogen-specific risks into a cumulative risk of gastrointestinal infection (i.e., simultaneously accounting for all pathogens). For the cumulative risk calculation, molecular virus concentrations were used to avoid redundancy for adenovirus and enterovirus. This resulted in a more conservative estimate of cumulative risk since molecular enterovirus and adenovirus concentrations were higher than the corresponding culture-based concentrations, even after accounting for GC : IU ratios.
| P_tot = 1 − ∏(1 − P_individual) | (3) |
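Eqn (3) is applied both across the 365 days within a simulated year and across pathogens. A minimal sketch of the latter, using hypothetical annual pathogen-specific risks:

```r
# Cumulative risk of gastrointestinal infection per eqn (3), combining
# pathogen-specific annual risks (hypothetical values for illustration).
set.seed(5)
p_annual <- cbind(
  norovirus       = 10^runif(30, -7.4, -5.3),
  adenovirus      = 10^runif(30, -8.0, -7.0),
  enterovirus     = 10^runif(30, -9.5, -6.2),
  giardia         = 10^runif(30, -14.1, -14.0),
  cryptosporidium = 10^runif(30, -4.6, -4.5)
)

p_cumulative <- 1 - apply(1 - p_annual, 1, prod)   # eqn (3) across pathogens
all(p_cumulative <= 1e-4)                          # compare to the annual benchmark
```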
For Lake Mead, travel times were computed for each lake level (329 m, 312 m, and 297 m) and ranged between 44 and 17 507 hours (or 729 days) (Fig. S4). Lake Mead travel time significantly impacts pathogen decay/inactivation (Fig. S2), particularly for enterovirus because of its relatively high first order decay rate constant. The median travel time across all seasons decreased with declining lake level: 1306 h or 54 days at 329 m; 1050 h or 44 days at 312 m; and 684 h or 29 days at 297 m. This was presumably due to the declining lake volume (V) coupled with the fixed LVW flow rate (Q), resulting in a shorter theoretical hydraulic retention time (τ = V/Q). It should be noted that the lake level is not determined by inflow from the LVW, which accounts for only ∼2% of the inflow to Lake Mead.29 At 329 m, the seasonal travel times were also evaluated. The median travel time was shortest in the fall (712 h), then the summer (1144 h), winter (2488 h), and spring (4249 h). The travel time was shortest in the fall and summer likely due to lake stratification,20 which reduces mixing. Even for Giardia and Cryptosporidium, the long travel times in Lake Mead can provide a valuable barrier for added public health protection (e.g., LRV > 5 with 1 year of travel time; Fig. S2), but short travel times coupled with slow decay/inactivation can also lead to elevated public health risk.
With respect to engineered treatment, free chlorine is known to be effective for viruses and Giardia but is ineffective against Cryptosporidium.42,46 On the other hand, UV is effective against protozoa but is less effective for some viruses, notably adenovirus.42,43 Thus, based on the assumptions in this QMRA, there was little attenuation of Cryptosporidium for WWTP-2 due to its use of chlorination, moderate attenuation for WWTP-1 due to the MBR, and higher levels of attenuation for WWTP-3 and WWTP-4 due to their use of UV disinfection. WWTP-3 also included UF and ozonation (LRVUF+O3 = 4.6) but only on 17% of its flow. The opposite was shown for adenovirus, with high levels of attenuation at WWTP-1 and WWTP-2 due to free chlorine disinfection and lower levels of attenuation at WWTP-3 and WWTP-4 with UV disinfection. These differences across WWTPs were further impacted by differences in discharge location (i.e., affecting decay) and relative contribution to LVW flow rate (i.e., affecting dilution).
The overall median LRVs (not counting GC : IU adjustment as an LRV) were highest for enterovirus (LRV ≈ 34–43, depending on WWTP source), primarily because it decayed more in the environmental buffers than the other pathogens. Adenovirus (LRV ≈ 27–31) and norovirus (LRV ≈ 20–24) also differed from each other due to differences in environmental decay. Giardia had relatively high median LRVs of ∼20–26, with ∼14 log10 reduction provided by the DWTPs (primarily from ozonation and chlorination). In contrast, Cryptosporidium had the lowest overall median LRVs of ∼11–17, even with the LRVgap, because of its limited attenuation at some WWTPs, slow environmental decay, and minimal DWTP LRV of ∼3.5 (despite the use of ozonation).
As demonstrated by the LRVgap for Cryptosporidium, it is important to reiterate that the LRVs for engineered treatment incorporated into this QMRA may be overly conservative in some cases. For example, the UV dose of 47 mJ cm−2 at WWTP-4 results in a calculated LRV of 8.7 for Cryptosporidium, but the LRV is capped at 6. The difference between calculated and assumed LRVs was even more substantial for viruses at the DWTPs, with calculated LRVs for ozone and chlorine of 13 and 206, respectively. For Cryptosporidium, uncredited sedimentation and filtration at the WWTPs and deposition within LVW were presumably accounted for with the LRVgap, but deposition was not considered for Lake Mead. Due to a lack of quantitative data, a similar calibration to observed Cryptosporidium concentrations at the drinking water intake was not possible. In over 10 years of intake monitoring by SNWA's Compliance Laboratory, Cryptosporidium has never exceeded the reporting limit of 1 oocyst per L, hence SNWA's Bin 1 designation.
Omitting the GC : IU adjustment significantly increased the risk of infection for norovirus (p < 0.0001, Wilcoxon test), but the maximum value still maintained a ∼20-fold safety factor relative to the 10−4 benchmark. Giardia yielded the lowest risks for all pathogen scenarios considered. Annual pathogen-specific risks are also illustrated in Fig. 4.
| Pathogen scenario | Mean | 50th | 95th | 99th | Max |
|---|---|---|---|---|---|
| Cumulative | −4.59 | −4.59 | −4.51 | −4.49 | −4.49 |
| Norovirus | | | | | |
| Molecular | −6.29 | −6.57 | −5.89 | −5.43 | −5.33 |
| Molecular (GC : IU) | −7.01 | −7.35 | −6.34 | −6.19 | −6.18 |
| Enterovirus | | | | | |
| Culture | −10.72 | −10.97 | −10.23 | −10.19 | −10.17 |
| Molecular | −7.68 | −9.46 | −8.74 | −6.37 | −6.22 |
| Molecular (GC : IU) | −9.19 | −11.44 | −10.31 | −7.86 | −7.72 |
| Adenovirus | | | | | |
| Culture | −11.40 | −11.44 | −11.25 | −10.90 | −10.82 |
| Molecular | −7.71 | −7.87 | −7.33 | −7.08 | −7.01 |
| Molecular (GC : IU) | −10.39 | −10.60 | −10.01 | −9.67 | −9.59 |
| Protozoa | | | | | |
| Giardia | −14.05 | −14.05 | −14.01 | −14.00 | −14.00 |
| Cryptosporidium | −4.59 | −4.59 | −4.51 | −4.49 | −4.49 |
Due to the difficulty in culturing norovirus, its GC : IU ratio must be assumed,24 which also prevents direct comparisons of GC : IU-adjusted molecular concentrations with corresponding culture-based concentrations. On the other hand, GC : IU ratios for adenovirus and enterovirus have recently been well described using qPCR and cell culture on paired wastewater samples.9,13 However, their GC : IU ratios still span several orders of magnitude, and this variability has not yet been fully explained. Even after GC : IU adjustment in the current study, the molecular data for enterovirus and adenovirus sometimes yielded orders of magnitude higher risks than the corresponding culture-based scenarios. This suggests that molecular concentrations are highly conservative inputs when developing regulatory targets or characterizing risk more generally.
In this study, we assumed one ingestion event per day, which is commonly done in potable reuse QMRAs, though some other studies have used as many as 96 ingestion events per day (i.e., one ingestion event every 15 minutes). Multiple daily ingestion events lead to higher probabilities of infection at lower percentiles but lower maxima, and the maxima might be more important from a public health or regulatory perspective.9,12,59,60 Thus, a single daily ingestion event might be considered more conservative since a rare but high-consequence scenario is not ‘averaged out’ by other nominal or low-consequence ingestions.
due to the shorter travel time caused by lake level decline from 329 m to 297 m. Conversely, Cryptosporidium experiences the slowest decay, so its median lake LRV decreased by only 0.44. A post hoc analysis confirmed that the baseline Lake Mead elevation of 329 m had the lowest risk profile (p < 0.0001), but there was no significant difference (p = 0.60) in the cumulative annual risk profiles for 312 m vs. 297 m (Fig. 5). The highest median log10 probability of infection was for lake level 312 m (−4.43) followed by 297 m (−4.44) and then 329 m (−4.59) (Table 3). Importantly, even when considering continued lake level decline, none of the simulated annual risks exceeded the 10−4 annual risk benchmark.
| Lake level (m) | Mean | 50th | 95th | 99th | Max |
|---|---|---|---|---|---|
| 329 | −4.59 | −4.59 | −4.51 | −4.49 | −4.49 |
| 312 | −4.42 | −4.43 | −4.33 | −4.31 | −4.31 |
| 297 | −4.44 | −4.44 | −4.36 | −4.35 | −4.35 |
This analysis demonstrates that there are public health implications related to drought due to climate change and the corresponding impacts on source waters. This QMRA illustrates how declining lake elevation can lead to shorter travel/storage times in an environmental buffer like Lake Mead (Fig. S4), which then reduces natural die-off/inactivation of pathogens and increases risk of gastrointestinal infection. Other adverse water quality impacts due to climate change are also possible; for SNWA, rising water temperatures combined with higher concentrations of dissolved organic matter and total dissolved solids can potentially lead to higher concentrations of disinfection byproducts. However, it should also be reiterated that risk was not perfectly correlated with lake level in this QMRA. The initial drop in elevation from 329 m to 312 m led to a significant increase in risk, but the additional drop to 297 m led to no significant change. The shorter travel times for lake elevation of 297 m led to less pathogen decay (i.e., higher risks) than at 312 m, but the longer travel times for 312 m were offset by higher percent DFR, ultimately leading to less dilution and slightly higher risk predictions. Overall, percent DFR demonstrated a complex relationship with lake level, with the highest lake elevation sometimes yielding the highest percent DFR during certain times of the year due to stratification and hydrodynamics (Table S6). Because the LVW discharge into Lake Mead differs considerably from the ambient water quality in Lake Mead, there is a distinct plume at the confluence,18–20 and this necessitates a complex 3D hydrodynamic model to understand the interplay of bathymetry, meteorological conditions, and DFR parameters. Future QMRAs should also consider incorporating hydrodynamic modeling in reservoir augmentation applications, because only accounting for dilution is an oversimplification that can impact the accuracy of risk estimates and ultimately risk management decisions.
Seasonality had a significant effect on the GC : IU ratio-adjusted norovirus risk estimates (p < 0.0001) at a Lake Mead elevation of 329 m, with the highest risk occurring during the fall (median log10 daily risk of −12.83), followed by summer (−14.26), winter (−16.15), and then spring (−19.66) (Fig. 6). Daily risk simulations exceeded the daily risk benchmark of 2.7 × 10−7 at the 99.93rd percentile in the fall, and at the 99.96th percentile in the spring. Maximum daily risks in fall reached as high as −4.57. Variables that impact seasonality include percent DFR at the drinking water intake (i.e., dilution), Lake Mead travel times (i.e., decay/inactivation), and the WWTP-specific norovirus concentrations, although worst-case conditions for each parameter did not always align. For example, for a Lake Mead elevation of 329 m, there was a higher percent DFR observed during fall (Table S6), but higher norovirus concentrations occurred in the winter for all WWTPs in Southern Nevada (Table S1). Moreover, the season with the highest percent DFR depended on lake level due to the impact of varying temperature gradients. For example, the highest percent DFR was observed in fall at 329 m but then in summer for lake elevations of 312 m or 297 m. Therefore, the impact of seasonality was also linked to lake level.
Travel times also varied seasonally throughout the lake, due primarily to lake stratification in the fall and summer that resulted in faster travel time from the confluence to the drinking water intake. Additional mixing in the winter resulted in slower travel times and greater decay/inactivation, which effectively countered the higher raw wastewater norovirus concentrations assumed for the winter. In fact, daily risk followed the order of slowest (lowest risk) to fastest (highest risk) seasonal travel times, indicating that travel times impact norovirus risk more significantly than raw wastewater concentration. This analysis highlights how dilution and decay/inactivation in the environmental buffer can somewhat attenuate higher wastewater concentrations during seasonal fluctuations, showing that norovirus outbreaks in the winter do not necessarily correspond to higher DFR risks. Additionally, since the intake is deep in Lake Mead, the temperature of the water at the intake is relatively stable throughout the year, varying by only approximately 2 °C, reducing the variation in decay coefficients due to temperature fluctuations.
During normal operation, ozone at the DWTPs accounted for point estimate LRVs of 6 for viruses, 5.77 for Giardia, and 0.47 for Cryptosporidium. As shown in Fig. 7, GC : IU-adjusted norovirus and Cryptosporidium exceeded the 2.7 × 10−7 daily risk benchmark at the upper percentiles even with ozonation included. However, as noted earlier, the system still achieved the 10−4 annual risk benchmark that is typically assumed for conventional public water systems due to ‘averaging’ of daily risks throughout the year. The assumed 24 h downtime for ozonation resulted in increased risks for all pathogens, but most notably for the viruses (Table S8 and Fig. 7). Omitting the ozone LRVs resulted in an equivalent log10-increase in risk for all pathogens (i.e., NoV risks increased by 6 log10) because the dose–response models are approximately linear at low doses. Giardia still achieved the daily risk benchmark for all scenarios, but for adenovirus and enterovirus, regardless of enumeration method (i.e., GC : IU-adjusted molecular or culture), risks exceeded the daily benchmark at the very upper percentiles (i.e., >97th percentile) (Table S8). This highlights the value of ozonation in terms of robust disinfection efficacy and its corresponding risk reduction potential. This can be beneficial to water utilities looking to minimize pathogen risks, even if supplemental disinfection is not mandated by federal regulations.
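The low-dose linearity argument can be illustrated with a simple exponential dose–response model, used here only for illustration and not necessarily the model applied in this QMRA.

```r
# Low-dose linearity illustration with an exponential dose-response model,
# P(d) = 1 - exp(-r * d); the value of r is arbitrary for this example.
r    <- 0.02
dose <- 1e-9                                  # organisms ingested per day with ozone credited

p_with_ozone    <- 1 - exp(-r * dose)
p_without_ozone <- 1 - exp(-r * dose * 1e6)   # dose increases 10^6-fold without the 6-log credit

log10(p_without_ozone / p_with_ozone)         # approximately 6 in the linear (low-dose) regime
```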
For GC : IU-adjusted norovirus with ozone omitted, even the median daily risk exceeded the benchmark at the lowest lake level of 297 m; higher lake elevations resulted in exceedances at the 50th–70th percentiles (Table S8). Therefore, virus selection during the hazard identification step of a QMRA can have significant implications when extending the results to regulatory development and/or real-world systems. There is still debate regarding the appropriateness of using molecular norovirus data, even with GC : IU adjustment, rather than culture-based enterovirus or adenovirus data when developing potable reuse regulations,24 or, in this case, operational criteria for an existing DWTP in a DFR system. That being said, omitting ozonation also dropped the exceedance percentiles for Cryptosporidium from approximately the 80th–90th percentiles down to the 60th–70th percentiles. Thus, ozonation may not be mandated by U.S. EPA regulations, but this analysis demonstrates its benefits for public health protection in DFR systems.
With respect to pathogen concentrations, various approaches have been used, including maximum observed values, maximum values from 10 000 random samplings of a concentration distribution, and various percentiles of a concentration distribution. However, the maximum is a poor choice for statistical comparisons because it is unstable, depends on the number of samples, and is influenced by the random number generator used. An alternative metric that we have previously proposed is the 97.4th percentile based on Blom's equation.62 This percentile was selected in Gerrity et al.24 to harmonize standard source water characterization data, such as that required under the U.S. EPA's LT2, with simulated data from a distribution. Specifically, the 97.4th percentile from a simulated 10 000-concentration dataset corresponds to the expected maximum value across 24 real-world samples. A sample size of N = 24 aligns with the standard source water characterization requirements under the U.S. EPA's LT2.24
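Blom's plotting position makes this equivalence explicit: for the maximum (i = n) of n = 24 samples, the plotting position is (n − 3/8)/(n + 1/4) ≈ 0.974. A minimal check using hypothetical log10-normal concentrations:

```r
# Blom's plotting position for the maximum of n samples: (i - 3/8) / (n + 1/4) with i = n.
n_samples <- 24
p_blom    <- (n_samples - 3/8) / (n_samples + 1/4)   # = 0.9742..., i.e., the 97.4th percentile

# Check: the 97.4th percentile of a large simulated dataset should approximate
# the average maximum of repeated 24-sample draws (on the log10 scale).
set.seed(7)
sim_log10 <- rnorm(10000, mean = 4, sd = 1)          # hypothetical log10 concentrations
q_974     <- quantile(sim_log10, p_blom)
mean_max  <- mean(replicate(5000, max(rnorm(24, mean = 4, sd = 1))))
c(q_974 = unname(q_974), mean_max_24 = mean_max)     # approximately equal
```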
This discussion is relevant to Nevada, as the state currently has IPR regulations with LRTs of 12/10/10 for viruses, Giardia, and Cryptosporidium26 but has not yet established DPR regulations. Using the 97.4th percentile approach coupled with the pathogen concentration distributions (Table S1), exposure assumptions, and dose–response models from this study, the LRTs could be as high as 16/10/10 for viruses, Giardia, and Cryptosporidium (Table 4). The virus LRT of 16 is driven by the high norovirus concentrations (without GC : IU adjustment) observed in Southern Nevada from 2021 to 2024.13 It is important to note that the virus LRT could be justifiably lower depending on the choice of reference virus; as noted earlier, there is still debate regarding the appropriateness of molecular data. For instance, using culturable enterovirus with a 10× correction factor63 as the reference virus, the LRTs could be reduced to 13/10/10 (consistent with Gerrity et al.), or to 12/10/10 without the 10× correction factor (consistent with the existing IPR regulations in Nevada). If maximum simulated values are assumed, the LRTs would increase to 18/11/11.
Table 4 Pathogen concentrations and corresponding LRTs based on the 97.4th percentile and maximum values from 10 000 Monte Carlo simulations
| Scenario | 97.4th percentile: Conc. (log10 L−1) | 97.4th percentile: LRT | Maximum: Conc. (log10 L−1) | Maximum: LRT |
|---|---|---|---|---|
| Viruses (culture) | | | | |
| Enterovirus culture (10×) | 6.5 | 13.2 | 8.0 | 14.7 |
| Adenovirus culture (baseline) | 4.7 | 11.5 | 6.4 | 13.2 |
| Viruses (molecular) | | | | |
| Enterovirus molecular (GC : IU) | 5.8 | 12.5 | 8.2 | 14.9 |
| Norovirus molecular combined | 9.5 | 16.1 | 11.6 | 18.2 |
| Norovirus molecular (GC : IU) | 7.8 | 14.4 | 10.4 | 17.0 |
| Adenovirus molecular (GC : IU) | 5.9 | 12.7 | 8.4 | 15.2 |
| Protozoa | | | | |
| Giardia (baseline) | 4.5 | 9.8 | 5.3 | 10.6 |
| Cryptosporidium (baseline) | 3.1 | 10.1 | 4.1 | 11.1 |
An alternative approach involves stepping through candidate LRVs (e.g., in 1 log10 increments) and then evaluating the percentage of simulations that fall below the acceptable risk threshold. This is a variation of the “bottom-up” QMRA described by Clements et al.,12 and this approach was also used by Soller et al.,56 who found that LRTs of 16/11/11 resulted in 100% of their simulations having cumulative annual risks less than 10−4. This approach also provides an inherent sensitivity analysis showing the relative impact of small changes in virus and protozoa LRVs on final risk estimates, as is shown for the Southern Nevada system in Fig. 8. For example, holding the protozoa LRV constant at 9 and increasing the norovirus LRV, the percentage of simulations achieving the annual risk benchmark increases slightly but then plateaus at an LRV of 16, with marginal gains beyond that point. Similarly, when increasing protozoa LRVs from 9 to 10 while holding the norovirus LRV constant, there is a considerable jump in compliance (e.g., from 5.0% to 49.1% for a norovirus LRV of 13 and from 13.7% to 95.0% for a norovirus LRV of 18). Beyond a protozoa LRV of 10, the gains diminish unless there is a corresponding increase in virus LRV. Stepwise increases in LRVs for all pathogen targets eventually lead to 100% compliance at 17/11/11 (Fig. 8), which is consistent with the maximum GC : IU-adjusted norovirus scenario in Table 4.
Fig. 8 Percent of simulations with cumulative annual risk of infection less than the annual risk benchmark of 10−4. Norovirus is adjusted for GC : IU ratio here, whereas both non-adjusted and adjusted scenarios are included in Table 4.
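A hedged sketch of this “bottom-up” grid search is shown below; the concentration, ingestion, and dose–response assumptions are simplified placeholders for the full QMRA chain, so the resulting percentages are illustrative only.

```r
# Sketch of the "bottom-up" grid search: for each pair of candidate virus and
# protozoa LRVs, compute the percentage of simulated years with cumulative
# annual risk below 1e-4. All inputs are simplified placeholders.
set.seed(6)
annual_compliance <- function(lrv_virus, lrv_protozoa, n_years = 30) {
  p_annual <- replicate(n_years, {
    c_virus <- 10^rnorm(365, 7.0, 1.5)        # hypothetical raw concentrations (per L)
    c_proto <- 10^rnorm(365, 2.5, 1.0)
    v_ing   <- 2                              # assumed ingestion volume (L per day)
    d_virus <- c_virus * 10^(-lrv_virus) * v_ing
    d_proto <- c_proto * 10^(-lrv_protozoa) * v_ing
    p_virus <- 1 - exp(-0.5  * d_virus)       # placeholder dose-response models
    p_proto <- 1 - exp(-0.05 * d_proto)
    p_day   <- 1 - (1 - p_virus) * (1 - p_proto)
    1 - prod(1 - p_day)                       # eqn (3) across the 365 days
  })
  mean(p_annual <= 1e-4)                      # fraction of compliant years
}

grid <- expand.grid(lrv_virus = 12:18, lrv_protozoa = 9:12)
grid$pct_compliant <- 100 * mapply(annual_compliance, grid$lrv_virus, grid$lrv_protozoa)
grid
```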
For the Southern Nevada de facto reuse system, this analysis demonstrates that public health is adequately protected on an annual risk basis. Simulated declines in Lake Mead elevation led to a higher cumulative annual risk of infection, but none of the simulations exceeded the annual risk benchmark of 10−4. While lower lake levels may impact water quality in other ways, our findings suggest that the multi-barrier approach encompassing wastewater treatment, natural attenuation, and drinking water treatment is sufficient to manage pathogen risks in this system.
However, the analysis also highlighted important considerations for risk management in Southern Nevada and elsewhere. Fall exhibited the highest risk of norovirus infection, exceeding the daily risk threshold at the 99.93rd percentile, specifically due to the combined effect of increased norovirus concentrations in wastewater, shorter travel times due to lake stratification (i.e., less die-off), and higher percent DFR (i.e., less dilution). Thus, it is important to develop a comprehensive understanding of any public water system to identify high-risk scenarios that might warrant closer attention or even operational modifications. This QMRA also demonstrated that ozonation, which is not required at the DWTPs, provides consequential reductions in risk, particularly for viruses and Cryptosporidium. Therefore, even if ozonation is not required for regulatory compliance, efforts should be made to ensure nominal operation at all times.
Overall, this QMRA demonstrates that public health should be adequately protected even under prolonged drought conditions, at least from gastrointestinal microbial hazards, but this is due to the resilience of the overall multi-barrier system. Finally, as Nevada pursues direct potable reuse regulations, this QMRA also highlights the impacts of critical assumptions, including choice of reference pathogen and concentration distributions, with deterministic LRTs ranging from 12/10/10 to 18/11/11.
Supplementary information: model parameters and R script. See DOI: https://doi.org/10.1039/d5ew00514k.
Footnote
† These authors contributed equally.
This journal is © The Royal Society of Chemistry 2026