Open Access Article. This article is licensed under a Creative Commons Attribution 3.0 Unported Licence.

Quantitative microbial risk assessment of the impact of drought and seasonality on a de facto reuse system in Southern Nevada, USA

Emily Clements,ab Katherine Crank,a Deena Hannoun,a and Daniel Gerrity*a
aSouthern Nevada Water Authority, P.O. Box 99954, Las Vegas, NV 89193, USA. E-mail: daniel.gerrity@snwa.com
bCarollo Engineers, Inc., 901 N Stuart Street, Suite 403, Arlington, Virginia 22203, USA

Received 5th June 2025, Accepted 1st December 2025

First published on 22nd December 2025


Abstract

De facto reuse (DFR) refers to the incidental or unintentional incorporation of treated wastewater into natural water bodies used as a source of drinking water. Increasing recognition of this practice has highlighted a potential risk of human exposure to various chemicals and pathogens originating from wastewater. In this study, quantitative microbial risk assessment (QMRA) was used to determine the infection risks associated with norovirus, adenovirus, enterovirus, Cryptosporidium, and Giardia for DFR in Southern Nevada (i.e., Lake Mead). Scenarios included three lake levels to encompass current (329 m) and possible scenarios associated with continued drought conditions (312 m and 297 m). Starting with observed raw wastewater pathogen concentrations at local wastewater treatment plants, risks were estimated after accounting for facility-specific wastewater treatment trains, discharge-specific dilution and decay in the environmental buffers (based on hydrodynamic modeling), and drinking water treatment. Log reduction values (LRVs) for wastewater treatment were also calibrated to observed Cryptosporidium concentrations in the environment to characterize ‘gaps’ in crediting (LRVgap = 1.97). For the baseline lake level, the median cumulative risk of gastrointestinal infection from all pathogens was 10−4.59 infections per person per year, with Cryptosporidium as the primary driver of risk. Risks increased significantly for the lower lake elevations but still satisfied the annual risk benchmark of 10−4. The impacts of seasonality were also studied for norovirus, indicating increased risks during fall and spring. Overall, this study demonstrates that the current design and operation of the Southern Nevada DFR system is protective of public health with respect to enteric pathogen exposure, even if the current Colorado River Basin drought continues or worsens.



Water impact

Enteric pathogen concentrations were modeled through Southern Nevada's de facto reuse system. By coupling quantitative microbial risk assessment (QMRA) with an existing hydrodynamic model, the effects of seasonality, lake level decline, engineered treatment, and environmental fate (i.e., decay) and transport were considered. The Lake Mead de facto reuse system was determined to be protective of human health and resilient to lake level decline.

Introduction

De facto reuse (DFR) refers to the incidental or unintentional incorporation of treated wastewater into natural water bodies used as a source of drinking water. Awareness of this practice has increased throughout the world,1,2 particularly in regions experiencing drought. For example, Lake Mead, a critically important drinking water reservoir in the southwestern United States (U.S.), has experienced ongoing drought conditions since the early 2000s.3 Southern Nevada, which includes the Las Vegas metropolitan area, receives the majority of its drinking water from Lake Mead and is also reliant on DFR to maximize its limited Colorado River allocation, a framework described locally as “return flow credits”.4 While treated wastewater can be a valuable resource in areas facing drought, its presence at drinking water intakes can raise important public health concerns, for both chemical contaminants1,5–7 and pathogens.8–10

Quantitative microbial risk assessment (QMRA) serves as a valuable tool in evaluating the potential pathogen risks in these de facto reuse applications. Previous potable reuse QMRAs have incorporated seasonality in terms of its impact on norovirus concentrations,11 but none have directly assessed how seasonality and drought impact risk estimates through hydrodynamic changes in pathogen travel times and relative water supply contribution from DFR (i.e., DFR percentage).12 These considerations are important because of the potential time dependence of pathogen loads at wastewater treatment plants (WWTPs)13,14 and differences in dilution and decay in the environmental buffer. While QMRAs and regulatory frameworks for potable reuse often assume that influent pathogen concentrations are random (i.e., not autocorrelated), seasonality and outbreaks may result in elevated or peak concentrations for sustained periods of time. Beyond seasonality, drought conditions can also alter the DFR percentage at a drinking water intake,1 which impacts the level of pathogen attenuation from dilution.

With climate change, drought frequencies and intensities are worsening, impacting reservoirs around the world.15 DFR percentages in reservoirs can increase with drought, which has been shown to impact water quality.15–17 Drought can also impact how water, including treated wastewater, travels through reservoirs. Certain attributes of wastewater effluent (e.g., salinity and temperature) may differ from ambient conditions in a receiving lake, thereby causing the discharge to travel as a distinct plume along the thermocline, particularly under stratified conditions.18,19 The depth of the thermocline, which is impacted by solar radiation, air/water temperature, and lake elevation, and its relation to the drinking water intake can have a large impact on the DFR percentage.20 This has not yet been integrated into potable reuse QMRAs.

As regulations for indirect potable reuse (IPR) and direct potable reuse (DPR) continue to be developed and implemented in states across the U.S.,21–26 it becomes critically important to understand the risks associated with potable reuse and how those risks change under shifting conditions. With these emerging regulatory frameworks, there has been intense focus on their stringent pathogen log reduction value (LRV) targets and the corresponding credits for advanced water treatment (AWT) processes. This potentially leads to the perception that DFR is inherently unsafe since some of the LRVs occurring within the overall DFR system, particularly in the environmental buffer(s), are not explicitly characterized. However, the combination of conventional engineered treatment, including both wastewater and drinking water treatment, coupled with natural attenuation in DFR systems could be as protective as AWT.27

Therefore, the goal of this study was to quantify drinking water infection risks due to potential exposure to five pathogens (norovirus, adenovirus, enterovirus, Cryptosporidium, and Giardia) in the Southern Nevada DFR system associated with Lake Mead. The novelty of this study relates to its incorporation of robust, geographically-linked raw wastewater pathogen concentrations and the effects of seasonality and long-term drought, as these environmental factors affect pathogen concentrations, travel times (i.e., pathogen decay), and dilution in DFR systems.

Methods

Quantitative microbial risk assessment (QMRA)

QMRA includes hazard identification (i.e., which pathogens are of interest in the target scenario), exposure assessment (i.e., what is the pathogen dose), dose–response modeling (i.e., what is the risk of infection given that dose), risk characterization (i.e., how does the calculated risk compare to a given benchmark), and risk management (i.e., how can that risk be reduced, if necessary). Fig. 1 illustrates the overall approach for the QMRA in this study. Specifically, the QMRA involved estimating the attenuation of raw wastewater pathogen concentrations through (1) engineered treatment at four different WWTPs, (2) discharge-specific dilution and decay (i.e., die-off) in the Las Vegas Wash, (3) further dilution and decay in Lake Mead (based on hydrodynamic modeling), and (4) engineered treatment at two nearly identical drinking water treatment plants (DWTPs). The hydrodynamic modeling paired the percentages of DFR at the intake from a previous study20 with new modeling to compute travel times, accounting for differences in lake levels and seasons. Risk of infection to consumers was estimated based on once daily ingestion of 2.5 L of finished drinking water containing the simulated pathogen concentrations. Pathogen-specific risks of infection were calculated from 10,950 simulated days using a Monte Carlo approach. Annual risk distributions were then developed by dividing the 10,950 daily risks into 365-day subgroups to generate 30 annual datasets. The union probabilities were calculated, and the resultant annual risk distribution was compared against an annual risk benchmark of 10−4. The model was implemented in R 4.4.1 using RStudio 2024.09.1.28 The QMRA components are described in greater detail in the following sections.
Fig. 1 Flow of the QMRA. Site-specific raw wastewater pathogen concentrations were reduced to account for (1) the log reduction values (LRVs) assumed for each wastewater treatment plant (WWTP), (2) a calibrated LRV for Cryptosporidium based on observed concentrations in the Las Vegas Wash (LVW), (3) environmental buffer dilution and decay in LVW and Lake Mead (LM), and (4) LRVs assumed for the drinking water treatment plants (DWTPs). Risk of infection due to ingestion was calculated using the final estimated pathogen concentrations in treated drinking water, a single daily ingestion volume of 2.5 L per day, and pathogen-specific dose–response relationships.
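To make this daily-to-annual aggregation concrete, the following minimal R sketch reproduces the roll-up with placeholder daily risks (the full model first applies the site-specific LRVs, dilution, and decay described below):

```r
# Minimal sketch of the daily-to-annual roll-up (placeholder daily risks;
# the full model first applies site-specific LRVs, dilution, and decay)
set.seed(1)
n_days  <- 10950                        # 30 years x 365 days
p_daily <- runif(n_days, 1e-9, 1e-6)    # placeholder daily infection risks

daily_by_year <- matrix(p_daily, nrow = 365, ncol = 30)  # 365-day subgroups
p_annual <- apply(daily_by_year, 2,
                  function(p) 1 - prod(1 - p))           # union probability

quantile(p_annual, c(0.50, 0.95, 0.99))  # compare against the 10^-4 benchmark
```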

Study area

Lake Mead is the largest reservoir by volume in the U.S., providing water to 25 million people in the Lower Basin states.29 It is formed by impoundment of the Colorado River by Hoover Dam, on the border between Nevada and Arizona in the southwestern U.S. (Fig. 2). Lake Mead is in an arid region, averaging only 10.6 cm of precipitation per year spread across 21 days with precipitation.30 Because of sustained drought conditions, Lake Mead decreased in volume by 71% between 2000 and 2022,3 reaching its lowest point in 2022.31 There are four inputs to Lake Mead: the Colorado River, which accounts for approximately 97% of the inflow; the Virgin and Muddy Rivers, which account for approximately 1% of the inflow; and the Las Vegas Wash (LVW), which makes up approximately 2% of the inflow.29
Fig. 2 Map of study area, including the wastewater treatment plants (WWTPs), drinking water treatment plants (DWTPs), Las Vegas Wash (LVW), and Lake Mead. The map also indicates the location of the Cryptosporidium sampling point within LVW. Note that LVW travels in an enclosed pipe under Lake Las Vegas (independent water body) so there is no need to account for pathogen attenuation in Lake Las Vegas.

To maximize “return flow credits” to the Colorado River system, Southern Nevada prioritizes reductions in consumptive water use (e.g., outdoor irrigation and cooling towers) and aims to maximize discharge of treated wastewater effluent from four WWTPs (hereafter, WWTP-1, WWTP-2, WWTP-3, and WWTP-4) into the Las Vegas Wash.4 The Las Vegas Wash ultimately discharges into Boulder Basin, which is the most downstream basin in Lake Mead and the location of the drinking water intakes for the Southern Nevada Water Authority (SNWA). Under typical (i.e., dry) flow conditions, the Las Vegas Wash carries 7 × 10^5 m3 per day of treated wastewater effluent, which represents approximately 90% of its total flow.32 This results in approximately 1.4% DFR at the drinking water intake,33 though that number changes seasonally and at different lake levels.20 By prioritizing conservation efforts to maximize return flows to Lake Mead, Southern Nevada has significantly reduced its annual consumptive water use, reaching as low as 230 million m3 in 2023. This is well below Nevada's baseline Colorado River allocation of 370 million m3, despite a population increase of 829,000 residents between 2002 and 2024.

Raw wastewater pathogen concentrations

Pathogens identified as hazards in potable reuse systems often include Giardia, Cryptosporidium, and enteric viruses,24,34 specifically enterovirus, adenovirus, and norovirus (GI and GII). All of these pathogens were previously measured in the raw wastewater of the four WWTPs in the study area.13 In the previous study, raw wastewater samples were collected from each of the WWTPs over four years and analyzed for viral targets using qPCR (N ≈ 1000 in total for each target). During one of the four years, raw wastewater was also collected for enumeration of Giardia, Cryptosporidium, culturable enterovirus, and culturable adenovirus (N = 73 for protozoa and 56 for viral culture). Paired enterovirus and adenovirus wastewater samples were analyzed for both culturable and molecular concentrations to develop a distribution of gene copy to infectious unit (GC:IU) ratios. Here, we used the recovery-corrected data from the previous study across both pandemic and non-pandemic conditions, but focused only on facilities that discharged to LVW. Recovery-corrected data were chosen in accordance with the requirements for high-quality pathogen datasets set forth by Darby et al.35 for use in studies within regulatory contexts. Recovery percentages ranged from 0–100% and are described in detail in Crank et al.13 The overall pathogen concentration dataset was also subdivided by WWTP to allow coupling of site-specific concentrations, LRVs, and travel times within LVW. The site-specific datasets were fit to new log10-normal distributions using ‘fitdistcens’ in R (Table S1), as sketched below. In Crank et al.,13 a strong seasonal effect was observed only for norovirus, consistent with Eftim et al.,11 so new season-specific distributions were also fit to the norovirus data for this QMRA. Seasons were defined as follows: fall = September, October, November; winter = December, January, February; spring = March, April, May; and summer = June, July, August. For each simulation, norovirus GI and GII concentrations from the strain-specific distributions were summed and treated as an overall norovirus hazard, consistent with recent literature.24
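As a minimal illustration of this censored fitting step, the sketch below fits a log10-normal distribution to hypothetical left-censored concentrations with ‘fitdistcens’ (the values shown are placeholders, not the Table S1 data):

```r
library(fitdistrplus)

# Hypothetical log10 concentrations (log10 GC per L); non-detects are
# left-censored, so 'left' is NA and 'right' is the log10 detection limit
cens <- data.frame(
  left  = c(3.2, 4.1, NA,  3.8, NA,  4.5),
  right = c(3.2, 4.1, 2.0, 3.8, 2.0, 4.5)
)

# Fitting a normal distribution to log10 data yields the log10-normal model
fit <- fitdistcens(cens, "norm")
summary(fit)   # mean and sd on the log10 scale (cf. Table S1)
```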

In addition to modeling molecular virus concentrations with no infectivity adjustments, the previously reported GC:IU ratio distributions from Crank et al.13 (Table S2) were applied to the molecular enterovirus and adenovirus concentrations (i.e., in units of GC L−1) to convert them to infectious units (i.e., IU L−1). These adjusted molecular concentrations were compared alongside the directly measured culture concentrations. For norovirus, the GC:IU relationship is not well elucidated in the literature, so a uniform distribution between 1:1 and 200:1 was assumed (implemented as a log10-uniform distribution from 0.00 to 2.30).24 Additional discussion related to the norovirus GC:IU assumption is included in Table S2. Final pathogen scenarios included WWTP-specific concentrations of Giardia (microscopy), Cryptosporidium (microscopy), adenovirus (cell culture and molecular with/without GC:IU ratio adjustment), enterovirus (cell culture and molecular with/without GC:IU ratio adjustment), and norovirus (molecular GI and GII summed with/without a GC:IU ratio adjustment).

Log reduction values for engineered wastewater treatment processes

Facility-specific LRVs for engineered wastewater treatment were then applied to the facility-specific distributions of raw wastewater pathogen concentrations. The treatment trains are illustrated in Fig. S1, and the corresponding LRVs are summarized in Table 1. Unless otherwise specified, the maximum LRV assumed for any engineered treatment process was 6, consistent with California's potable reuse regulatory framework,22,23 and there was no minimum LRV threshold for crediting (i.e., LRVs < 0.5 were also considered). In Nevada, regulations for IPR via groundwater replenishment were promulgated in 2016,4,26,36 but “return flow credits” in Southern Nevada are exempt from the pathogen LRV targets stipulated in those IPR regulations. Thus, assumed LRVs for the unit processes at each WWTP were derived from various sources, including truncated normal distributions for conventional secondary treatment and point estimates for all other processes based on operational conditions (i.e., disinfectant CTs or UV dose) or industry practice. As illustrated in Fig. S1, the WWTPs employ some combination of secondary biological treatment, consisting of activated sludge and secondary clarification or a membrane bioreactor (MBR); tertiary treatment with granular media filtration (GMF) or ultrafiltration (UF); and disinfection with low-dose UV (i.e., <100 mJ cm−2), ozone, and/or free chlorine. All WWTPs in Southern Nevada target full nitrification at all times, thus chlorine disinfection was assumed to be with free chlorine, and disinfection is employed throughout the year (i.e., no seasonal adjustments).37 Only WWTP-2 and WWTP-3 include primary treatment, although no pathogen LRVs were assumed for primary treatment.
Table 1 Log reduction values (LRVs) for the engineered unit treatment processes at the wastewater treatment plants (WWTPs; 1–4) and drinking water treatment plants (DWTPs; 1–2)
Process Virus Giardia Cryptosporidium WWTP/DWTP Ref.
a Assumes a truncated normal distribution for virus LRVs with a = minimum LRV and b = maximum LRV.
b Assumes NoV GII from Hill et al. (2025).39
c UV: WWTP-3 = 34 mJ cm−2 and WWTP-4 = 47 mJ cm−2.
d Ozone: WWTP-3 = 1.2 mg min L−1 CT and 26.8 °C at outfall.
e Chlorine: WWTP-1 = 6 mg min L−1 CT and WWTP-2 = 450 mg min L−1 CT.
f The chlorine virus LRV was capped at 4 (maximum in the US EPA Guidance Manual on Disinfection Profiling and Benchmarking table B-2 (ref. 46)) for the WWTPs.
g Relative flow split at WWTP-3: 83% = GMF + UV and 17% = UF + Ozone.
h Z-score/value from standardized normal distribution.
i The LRVs were from the SCADA software for DWTP-1, which produces more water and has higher chlorine CT values than DWTP-2, though this would only impact Giardia chlorine LRVs.
j Ozone: DWTP-1 CT = 2.30 mg min L−1 for virus and 2.12 mg min L−1 for Giardia from SCADA calculations, and CT = 3.37 mg min L−1 for Cryptosporidium based on the log integration method.
k The Long Term 2 Enhanced Surface Water Treatment Rule allows plants to claim an additional 0.5 LRVs if they meet turbidity requirements (40 CFR 141.718(b)).
l Chlorine: DWTP-1 CT = 242 mg min L−1 at pH of ∼8 and temperature of ∼12.5 °C, which yields an LRV > 200 that was capped at 6.
Wastewater treatment plants
Primary 0 0 0 2, 3 NA
Membrane bioreactor (MBR) 1.5 2 2 1 38
Conventional secondarya AdV: m = 2.3, s = 0.8, a = 0.6, b = 3.6 2.5156 + Zh × 0.1070 2.0962 + Zh × 0.1085 2, 3, 4 39, 40
EnV: m = 1.3, s = 0.8, a = 0.3, b = 3.5
NoV:b m = 1.1, s = 0.7, a = 0, b = 2.7
Granular media filtration (GMF) 0 0 0 2, 3,g 4 NA
Ultrafiltration (UF) 0 4 4 3g 41
Ultraviolet (UV) disinfectionc WWTP-3 = 0.4 6 6 3,g 4 42, 43
WWTP-4 = 0.7
Ozoned 6 6 0.6 3g 42, 44, 45
Chlorinee 4f WWTP-1 = 0.3 0 1, 2 42, 46
WWTP-2 = 6
Drinking water treatment plantsi
Ozonej 6 5.77 0.47 1, 2 42, 44, 45
Direct GMF 1 2 2.5 1, 2 45
Individual filter effluentk 0 0 0.5 1, 2 45
Chlorinel 6 6 0 1, 2 45


Dilution and pathogen decay in the Las Vegas wash

The average treated effluent flow rate for each WWTP, as determined by Thompson et al.,47 was assumed for the discharges to LVW. WWTP-1 had a flow rate of 64,000 m3 per day, WWTP-2 had a flow rate of 150,000 m3 per day, WWTP-3 had a flow rate of 360,000 m3 per day, and WWTP-4 had a flow rate of 83,000 m3 per day. The combined flow from the four WWTPs (657,000 m3 per day) was assumed to account for 90% of the total flow in LVW,32 resulting in an overall LVW flow rate of 730,000 m3 per day. In other words, the non-effluent base flow of LVW was assumed to be 73,000 m3 per day. The percent contributions to overall flow in LVW were then converted to LRVs, which ranged from 0.31 to 1.06 (Table S3), to account for dilution, as sketched below.
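Because the dilution LRVs follow directly from this flow balance, they can be reproduced with a few lines of R (flows as assumed above; the rounded values bracket the reported 0.31–1.06 range):

```r
# Dilution LRVs in the Las Vegas Wash from the assumed flow balance (m3/day)
q_wwtp <- c(`WWTP-1` = 64000, `WWTP-2` = 150000,
            `WWTP-3` = 360000, `WWTP-4` = 83000)
q_lvw  <- sum(q_wwtp) / 0.90     # effluent assumed to be 90% of total LVW flow

lrv_dilution <- -log10(q_wwtp / q_lvw)
round(lrv_dilution, 2)           # ~1.06, 0.69, 0.31, 0.94 (cf. Table S3)
```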

In addition to dilution, pathogen decay in LVW was included as an environmental buffer LRV and was determined based on published first order decay rate constants coupled with WWTP-specific travel times to the LVW discharge point to Lake Mead. The assumed travel times ranged from approximately 7–15 hours (Table S4), as determined from the rhodamine tracer study data in Blasius et al.48 Each pathogen was assigned a distribution of decay rate constants based on published systematic reviews for protozoa49 and viruses.50 Because these literature reviews encompassed multiple enumeration methods, new log10-normal distributions were fit to the published base e rate constants (Table S5)49,50 to focus only on microscopy for protozoa and cell culture for viruses; these methods, as opposed to molecular methods, better describe decay/inactivation of infectious pathogens. LRVs were then calculated using the stochastic pathogen-specific first order decay rate constants (k, d−1) and the deterministic WWTP-specific travel times for LVW (t, days) (eqn (1)). LRVs for decay/inactivation were not capped at a maximum value. Example decay LRVs as a function of travel time are illustrated in Fig. S2.

 
Decay LRV = −log10(e^(−kt)) (1)
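In code, eqn (1) reduces to kt/ln(10); the sketch below applies it with a placeholder rate constant distribution and an LVW-scale travel time:

```r
# Eqn (1) as a function: decay LRV from a first order rate constant k (1/day)
# and travel time t (days); note -log10(exp(-k*t)) = k*t/ln(10)
decay_lrv <- function(k, t) k * t / log(10)

set.seed(2)
k <- rlnorm(10000, meanlog = log(0.1), sdlog = 0.5)  # placeholder k values
summary(decay_lrv(k, t = 10/24))    # e.g., a 10 hour travel time in LVW
```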

Dilution and pathogen decay in Lake Mead

Also described as the recycled water contribution (RWC) in California, percent DFR can be viewed as a dilution factor, with lower percent DFR equating to greater dilution (i.e., higher LRVs) and lower concentrations of wastewater-derived constituents (e.g., pathogens, bulk organic matter, trace organics, nutrients, and total dissolved solids). For public water systems, characterizing percent DFR can be useful for short- and long-term strategic planning and operations. van der Nagel et al.20 characterized percent DFR at the drinking water intakes in Lake Mead using an established three-dimensional hydrodynamic model in AEM3D (the Lake Mead model; LMM).33 This mechanistic model includes a bathymetric map of Lake Mead, with inflows from the Colorado River, LVW, and the Virgin and Muddy Rivers. It incorporates a large amount of data collected from over 20 years of sampling campaigns on the lake, as well as weather data to simulate the lake conditions as closely as possible. It can also be used to investigate alternative scenarios, such as decreased lake levels due to climate change. van der Nagel et al.20 used this model to evaluate lake levels of 329 m (baseline condition), 312 m (prolonged drought condition 1), and 297 m (prolonged drought condition 2) for the meteorological conditions observed between 2019 and 2021. The resulting percent DFR values, which ranged from as low as 1.9% to as high as 6.9%, were integrated into this QMRA as uniform distributions using the minimum and maximum percent DFR values for the overall or season-specific scenarios (Table S6).

The LMM was also used to create an empirical distribution of travel times from the LVW discharge point into Lake Mead to SNWA's “Intake 3”. Consistent with Marti et al.,51 a tracer was added to the model at the LVW discharge point for each of the four seasons. The likelihood of a specific travel time was determined by the relative concentration of the tracer at the drinking water intake at that time; travel time distributions were developed for the aforementioned lake levels (329 m, 312 m, and 297 m). For the non-seasonal analyses, the tracer concentrations at each time point for each season were summed to create a total relative concentration (or relative weight), which was used to determine the likelihood of a specific travel time. Using the decay rate constants described earlier, the empirical travel times were randomly sampled—either from the overall dataset or the season-specific datasets—to calculate corresponding LRVs (not capped at a maximum value). Fig. S2 also characterizes LRVs for the longer travel times within Lake Mead.
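A minimal sketch of this weighted sampling step is shown below; the travel times, tracer weights, and decay parameters are hypothetical stand-ins for the LMM outputs:

```r
# Sampling empirical Lake Mead travel times weighted by relative tracer
# concentration at the intake (hypothetical times and weights)
set.seed(3)
travel_h <- c(44, 200, 700, 1300, 4000)      # candidate travel times (hours)
weight   <- c(0.05, 0.20, 0.40, 0.25, 0.10)  # relative tracer concentrations

t_days <- sample(travel_h, 10000, replace = TRUE, prob = weight) / 24
k      <- rlnorm(10000, meanlog = log(0.05), sdlog = 0.5)  # placeholder k (1/day)

lrv_lake <- k * t_days / log(10)             # eqn (1), not capped
quantile(lrv_lake, c(0.05, 0.50, 0.95))
```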

Settling was also considered as a potential removal mechanism; however, the lack of particle data precluded reliable estimation. Future work characterizing particle-associated protozoa and virus settling could help quantify this reduction more accurately.

Log reduction values for engineered drinking water treatment processes

The public water system in Southern Nevada largely relies on two DWTPs (DWTP-1 and DWTP-2 in Fig. 1), both of which employ identical treatment trains that include ozone, ferric chloride coagulation/flocculation, direct filtration (granular media), and free chlorine disinfection. Deterministic LRVs for ozonation and chlorination were determined using the SCADA software for DWTP-1, which treats the majority of the water45 (Table 1). The SCADA software incorporates online measurements of water temperature, pH, and disinfectant residual to calculate pathogen LRVs based on approaches previously described in the literature.42,44–46 These point estimate LRVs represent the operational targets for the DWTPs; failure and off-specification conditions were not considered in this QMRA. Additional LRVs were included for direct filtration, in accordance with the U.S. EPA's Surface Water Treatment Rules. The maximum LRV for any engineered treatment process was capped at 6,22,23 even when the SCADA-reported LRVs were higher, but there was no minimum LRV threshold. These assumptions resulted in point estimate drinking water LRVs of 13/13.77/3.47 for viruses, Giardia, and Cryptosporidium, respectively.

An additional analysis was conducted to assess risks with and without ozone. Since the DWTPs are “Bin 1” public water systems according to the U.S. EPA's Long Term 2 Enhanced Surface Water Treatment Rule (LT2), no additional Cryptosporidium removal/inactivation is required. In other words, ozonation is voluntarily employed for added public health protection and for its ancillary benefits, including algal mitigation, oxidation of taste and odor compounds and other trace organics, and reductions in disinfection byproduct formation potential. For this particular analysis, pathogen risks were compared against the daily risk benchmark of 2.7 × 10−7 infections per person to characterize the contribution of ozonation to overall public health protection.

Cryptosporidium model calibration data

The SNWA Compliance Laboratory conducts routine monitoring of Cryptosporidium in LVW using EPA Method 1623 (ref. 52) with 10 L sample volumes. We obtained historical recovery-corrected Cryptosporidium concentrations for 185 samples collected between 2016 and 2024 from the sample site shown in Fig. 1. Although 92% of samples were below the method detection limit of 1 oocyst/10 L, the remaining detections were sufficient to apply regression on order statistics (ROS) to the dataset using the NADA package in R53 to estimate quantiles, providing a baseline for comparison against the model-estimated concentrations for LVW. The model estimates accounted for the WWTP-specific influent Cryptosporidium concentrations, LRVs, and relative flow contributions to LVW, in addition to environmental decay up to the Cryptosporidium sampling point in LVW. For this analysis, low-dose UV at WWTP-3 and WWTP-4 was omitted based on the assumption that UV would not affect microscopy-based Cryptosporidium detection. We determined a point-value difference in modeled vs. observed Cryptosporidium concentrations and converted this difference to an apparent LRV (or LRVgap) using eqn (2).54 This supplemental analysis could only be performed for Cryptosporidium due to scarcity of data for other pathogens.
 
LRVgap = log10(Cmodeled/Cobserved) (2)
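For illustration, the sketch below applies ROS to hypothetical left-censored data with the NADA package and then back-calculates eqn (2) using the medians reported in this study:

```r
library(NADA)

# ROS fit for left-censored Cryptosporidium data (hypothetical values in
# oocysts per L; TRUE = non-detect reported at the 0.1 oocysts/L limit)
conc     <- c(0.10, 0.10, 0.02, 0.10, 0.05, 0.10, 0.014, 0.10)
censored <- c(TRUE, TRUE, FALSE, TRUE, FALSE, TRUE, FALSE, TRUE)

fit <- ros(conc, censored)   # regression on order statistics
median(fit)                  # ROS-estimated median concentration

# Eqn (2) converts modeled vs. observed medians into an LRVgap;
# back-calculated illustration using the values reported in this study:
c_obs     <- 0.014                 # observed ROS median (oocysts per L)
c_modeled <- c_obs * 10^1.97       # implied modeled median
log10(c_modeled / c_obs)           # LRVgap = 1.97
```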

Exposure assessment and dose–response relationships

The dose–response relationships for this QMRA (Table S7) were adopted for conservatism and consistency with California's regulatory approach for DPR.22,55 Daily ingestion volume was assumed to be 2.5 L, occurring once per day.24,56,57 This is higher than the mean ingestion rate recommended by the EPA and slightly under the 95th percentile, providing a conservative ingestion estimate, though at low doses the impact is likely negligible.58 Although assuming a single daily ingestion event decreases the lower percentile risks, it also increases the higher percentile risks, including the extreme percentiles that often drive regulatory determinations. Rare, high consequence events are less likely to be captured on any given day when assuming a single daily ingestion event. However, when they are captured, they are more consequential because they are not countered by many other nominal or low-risk ingestion events.12,59
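A minimal sketch of the exposure and dose–response step follows; the exponential model and its parameter are illustrative placeholders for the models listed in Table S7:

```r
# Sketch of the exposure and dose-response step (illustrative parameters;
# the models actually adopted in this QMRA are listed in Table S7)
dose_response_exp <- function(dose, r) 1 - exp(-r * dose)  # exponential model

c_finished <- 10^-8      # modeled finished-water concentration (organisms/L)
dose <- c_finished * 2.5 # single daily ingestion of 2.5 L

dose_response_exp(dose, r = 0.09)   # illustrative r; ~2.2e-9 daily risk
```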

Risk calculation and characterization

For consistency with recent potable reuse QMRAs, calculated risks of infection were compared against two different public health benchmarks: an annual risk of infection of 10−4 and a daily risk of infection of 2.7 × 10−7 (or 10−4/365).22 Annual risk of infection was used for the baseline analysis, and daily risk of infection was used for the seasonal and supplemental ozone analyses. To calculate risk, a Monte Carlo analysis was first used to generate an underlying dataset of 10,950 simulations. The underlying dataset was then divided into 30 years of simulations, with 365 daily simulations per year (365 × 30 = 10,950). Pathogen-specific annual risk was then calculated according to eqn (3); eqn (3) was also used to combine pathogen-specific risks into a cumulative risk of gastrointestinal infection (i.e., simultaneously accounting for all pathogens). For the cumulative risk calculation, molecular virus concentrations were used to avoid redundancy for adenovirus and enterovirus. This resulted in a more conservative estimate of cumulative risk since molecular enterovirus and adenovirus concentrations were higher than the corresponding culture-based concentrations, even after accounting for GC:IU ratios.
 
Ptot = 1 − ∏(1 − Pindividual) (3)
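As a worked example of eqn (3), the sketch below combines the median annual risks from Table 2 (molecular virus scenarios) and confirms that the cumulative risk is dominated by Cryptosporidium:

```r
# Eqn (3) applied across pathogens: cumulative probability of infection
# using the median annual risks from Table 2 as example inputs
p_individual <- c(noro = 10^-6.57, entero = 10^-9.46, adeno = 10^-7.87,
                  giardia = 10^-14.05, crypto = 10^-4.59)

p_total <- 1 - prod(1 - p_individual)
log10(p_total)   # ~ -4.59; cumulative risk is dominated by Cryptosporidium
```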

Statistical analysis

Multiple variable significance testing was performed using the Kruskal–Wallis test with Dunn's multiple comparisons test as a post hoc analysis. Statistical analyses were performed using GraphPad Prism version 10.4.1 for Windows (GraphPad Software, Boston, Massachusetts, USA). For reported statistics, the calculations were performed before log transformation.

Results and discussion

Cryptosporidium concentrations in the Las Vegas Wash

Using ROS, the 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of the observed Cryptosporidium concentrations in LVW were estimated to be 0.001, 0.002, 0.005, 0.014, 0.034, 0.077, and 0.128 oocysts per L, respectively (Fig. S3). In calibrating to the ROS-estimated median of 0.014 oocysts per L in LVW, we observed a point-value difference between modeled and observed Cryptosporidium concentrations equivalent to an LRV of 1.97. This LRVgap is likely attributable to undercrediting at the WWTPs, specifically in relation to settling during clarification and physical removal in the MBR, GMF, and UF systems. These treatment processes are often uncredited or undercredited in potable reuse regulatory frameworks. There may also be deposition of Cryptosporidium oocysts within LVW prior to reaching the sampling point. For the QMRA, Cryptosporidium concentrations were further modified with this LRVgap of 1.97 to calibrate to the LVW data. It is likely that an LRVgap also exists for the other pathogen targets, but these could not be determined due to a lack of corresponding LVW data, presumably leading to additional conservatism in the risk estimates.

Travel times and decay in Las Vegas Wash and Lake Mead

Pathogen decay was characterized using stochastic first order decay rate constants coupled with deterministic travel times for LVW and stochastic travel times for Lake Mead. With the relatively short travel times for LVW, ranging from approximately 7 to 15 hours (Table S4), there was minimal decay/inactivation expected for the various pathogens, with typical LRVs ranging from <0.05 for norovirus, Giardia, and Cryptosporidium to <0.3 for adenovirus and enterovirus (Fig. S2).

For Lake Mead, travel times were computed for each lake level (329 m, 312 m, and 297 m) and ranged between 44 and 17,507 hours (or 729 days) (Fig. S4). Lake Mead travel time significantly impacts pathogen decay/inactivation (Fig. S2), particularly for enterovirus because of its relatively high first order decay rate constant. The median travel time across all seasons decreased with declining lake level: 1306 h (54 days) at 329 m; 1050 h (44 days) at 312 m; and 684 h (29 days) at 297 m. This was presumably due to the declining lake volume (V) coupled with the fixed LVW flow rate (Q), resulting in a shorter theoretical hydraulic retention time (τ = V/Q). It should be noted that the lake level is not determined by flow from the LVW, which accounts for only ∼2% of the inflow to Lake Mead.29 At 329 m, the seasonal travel times were also evaluated. The median travel time was shortest in the fall (712 h), followed by summer (1144 h), winter (2488 h), and spring (4249 h). The travel time was shortest in the fall and summer likely due to lake stratification,20 which reduces mixing. Even for Giardia and Cryptosporidium, the long travel times in Lake Mead can provide a valuable barrier for added public health protection (e.g., LRV > 5 with 1 year of travel time; Fig. S2), but short travel times coupled with slow decay/inactivation can also lead to elevated public health risk.

Modeled overall pathogen log reduction values

The overall pathogen LRVs accounted for engineered treatment at the WWTPs, dilution and decay/inactivation in the environmental buffers (LVW and Lake Mead), engineered treatment at the DWTPs, and the LRVgap for Cryptosporidium. The relative contribution of each engineered or natural barrier varied considerably by pathogen and WWTP due to the different treatment trains and varying pathogen susceptibility to treatment and environmental decay (Fig. 3). For example, accounting for stochasticity in rate constants and travel time, the median LRV for decay/inactivation of enterovirus in Lake Mead was 22, but only 1.1 for Cryptosporidium, and these values also varied seasonally due to differences in travel time.
Fig. 3 Median contributions of each pathogen reduction barrier to the overall log reduction value (LRV); overall LRVs are shown relative to each wastewater treatment plant (WWTP), numbered 1–4. The “observed LRV” (LRVgap = 1.97) only applies to Cryptosporidium, and the GC:IU LRVs only apply to the molecular virus scenarios (i.e., LRVGC:IU omitted when using culture-based concentrations).

With respect to engineered treatment, free chlorine is known to be effective for viruses and Giardia but is ineffective against Cryptosporidium.42,46 On the other hand, UV is effective against protozoa but is less effective for some viruses, notably adenovirus.42,43 Thus, based on the assumptions in this QMRA, there was little attenuation of Cryptosporidium for WWTP-2 due to its use of chlorination, moderate attenuation for WWTP-1 due to the MBR, and higher levels of attenuation for WWTP-3 and WWTP-4 due to their use of UV disinfection. WWTP-3 also included UF and ozonation (LRVUF+O3 = 4.6) but only on 17% of its flow. The opposite was shown for adenovirus, with high levels of attenuation at WWTP-1 and WWTP-2 due to free chlorine disinfection and lower levels of attenuation at WWTP-3 and WWTP-4 with UV disinfection. These differences across WWTPs were further impacted by differences in discharge location (i.e., affecting decay) and relative contribution to LVW flow rate (i.e., affecting dilution).

The overall median LRVs (not counting the GC:IU adjustment as an LRV) were highest for enterovirus (LRV ≈ 34–43, depending on WWTP source), primarily because it decayed more in the environmental buffers than the other pathogens. Adenovirus (LRV ≈ 27–31) and norovirus (LRV ≈ 20–24) also differed from each other due to differences in environmental decay. Giardia had relatively high median LRVs of ∼20–26, with ∼14 log10 reduction provided by the DWTPs (primarily from ozonation and chlorination). In contrast, Cryptosporidium had the lowest overall median LRVs of ∼11–17, even with the LRVgap, because of its limited attenuation at some WWTPs, slow environmental decay, and minimal DWTP LRV of ∼3.5 (despite the use of ozonation).

As demonstrated by the LRVgap for Cryptosporidium, it is important to reiterate that the LRVs for engineered treatment incorporated into this QMRA may be overly conservative in some cases. For example, the UV dose of 47 mJ cm−2 at WWTP-4 results in a calculated LRV of 8.7 for Cryptosporidium, but the LRV is capped at 6. The difference between calculated and assumed LRVs was even more substantial for viruses at the DWTPs, with calculated LRVs for ozone and chlorine of 13 and 206, respectively. For Cryptosporidium, uncredited sedimentation and filtration at the WWTPs and deposition within LVW were presumably accounted for with the LRVgap, but deposition was not considered for Lake Mead. Due to a lack of quantitative data, a similar calibration to observed Cryptosporidium concentrations at the drinking water intake was not possible. In over 10 years of intake monitoring by SNWA's Compliance Laboratory, Cryptosporidium has never exceeded the reporting limit of 1 oocyst per L, hence SNWA's Bin 1 designation.

Risk of gastrointestinal infection

The cumulative annual risk (i.e., simultaneously considering all pathogens) never exceeded 10−4 for the baseline Lake Mead elevation of 329 m, with a maximum cumulative annual risk of 10−4.49 driven by Cryptosporidium (Table 2). In fact, cumulative risk was driven by and essentially equivalent to Cryptosporidium risk at all percentiles. Norovirus had the next highest risks, with a median of 10−7.35 and maximum of 10−6.18. Omitting the GC:IU adjustment significantly increased risk of infection for norovirus (p < 0.0001, Wilcoxon test) but still maintained a ∼20-fold safety factor relative to the 10−4 benchmark for the maximum value. Giardia yielded the lowest risks for all pathogen scenarios considered. Annual pathogen-specific risks are also illustrated in Fig. 4.
Table 2 Annual log10 transformed probabilities of infection for individual pathogens and all pathogens combined (i.e., cumulative) at Lake Mead elevation of 329 m
Pathogen scenario Mean 50th 95th 99th Max
Cumulative −4.59 −4.59 −4.51 −4.49 −4.49
Norovirus
Molecular −6.29 −6.57 −5.89 −5.43 −5.33
Molecular (GC:IU) −7.01 −7.35 −6.34 −6.19 −6.18
Enterovirus
Culture −10.72 −10.97 −10.23 −10.19 −10.17
Molecular −7.68 −9.46 −8.74 −6.37 −6.22
Molecular (GC:IU) −9.19 −11.44 −10.31 −7.86 −7.72
Adenovirus
Culture −11.40 −11.44 −11.25 −10.90 −10.82
Molecular −7.71 −7.87 −7.33 −7.08 −7.01
Molecular (GC:IU) −10.39 −10.60 −10.01 −9.67 −9.59
Protozoa
Giardia −14.05 −14.05 −14.01 −14.00 −14.00
Cryptosporidium −4.59 −4.59 −4.51 −4.49 −4.49



Fig. 4 Annual pathogen-specific probabilities of infection over 30 years at Lake Mead elevation of 329 m. The red dotted line denotes the annual risk benchmark of 10−4. For adenovirus (AdV), enterovirus (EnV), and norovirus (NoV), risks were calculated based on molecular concentrations adjusted for GC:IU ratio or culture-based concentrations.

Due to the difficulty in culturing norovirus, its GC:IU ratio must be assumed,24 which also prevents direct comparisons of GC:IU-adjusted molecular concentrations with corresponding culture-based concentrations. On the other hand, GC:IU ratios for adenovirus and enterovirus have recently been well described using qPCR and cell culture on paired wastewater samples.9,13 However, their GC:IU ratios still span several orders of magnitude, and this variability has not yet been fully explained. Even after GC:IU adjustment in the current study, the molecular data for enterovirus and adenovirus sometimes yielded orders of magnitude higher risks than the corresponding culture-based scenarios. This suggests that molecular concentrations are highly conservative inputs when developing regulatory targets or generally characterizing risk.

In this study, we assumed one ingestion event per day, which is common in potable reuse QMRAs, though some other studies have used as many as 96 ingestion events per day (equivalent to one ingestion event every 15 minutes). Multiple daily ingestion events lead to higher probabilities of infection at lower percentiles but lower maxima, and the maxima might be more important from a public health or regulatory perspective.9,12,59,60 Thus, a single daily ingestion event might be considered more conservative since a rare but high consequence scenario is not ‘averaged out’ by other nominal or low consequence ingestions.

Impact of lake level

Varying lake levels resulted in statistically significant changes in cumulative annual probabilities of infection (p < 0.0001) (Table 3 and Fig. 5), although individual pathogen risks were not all impacted to the same extent. This is because some pathogens were more sensitive to lake level changes due to differential decay (i.e., faster first order decay rate constants). For example, enterovirus experiences the fastest decay/inactivation in the environment, so its median LRV due to environmental decay in the lake decreased by 9.8 due to the shorter travel time caused by lake level decline from 329 m to 297 m. Conversely, Cryptosporidium experiences the slowest decay, so its median lake LRV decreased by only 0.44. A post hoc analysis confirmed that the baseline Lake Mead elevation of 329 m had the lowest risk profile (p < 0.0001), but there was no significant difference (p = 0.60) in the cumulative annual risk profiles for 312 m vs. 297 m (Fig. 5). The highest median log10 probability of infection was for lake level 312 m (−4.43), followed by 297 m (−4.44) and then 329 m (−4.59) (Table 3). Importantly, even when considering continued lake level decline, none of the simulated annual risks exceeded the 10−4 annual risk benchmark.
Table 3 Cumulative annual log10 transformed probabilities of infection as a function of Lake Mead elevation
Lake level (m) Mean 50th 95th 99th Max
329 −4.59 −4.59 −4.51 −4.49 −4.49
312 −4.42 −4.43 −4.33 −4.31 −4.31
297 −4.44 −4.44 −4.36 −4.35 −4.35



Fig. 5 Probability plots of cumulative annual risk of infection for a baseline Lake Mead elevation of 329 m and under prolonged drought conditions resulting in continued decline down to 312 m and 297 m. The red dotted line indicates the annual risk benchmark of 10−4.

This analysis demonstrates that there are public health implications related to drought due to climate change and the corresponding impacts on source waters. This QMRA illustrates how declining lake elevation can lead to shorter travel/storage times in an environmental buffer like Lake Mead (Fig. S4), which then reduces natural die-off/inactivation of pathogens and increases risk of gastrointestinal infection. Other adverse water quality impacts due to climate change are also possible; for SNWA, rising water temperatures combined with higher concentrations of dissolved organic matter and total dissolved solids can potentially lead to higher concentrations of disinfection byproducts. However, it should also be reiterated that risk was not perfectly correlated with lake level in this QMRA. The initial drop in elevation from 329 m to 312 m led to a significant increase in risk, but the additional drop to 297 m led to no significant change. The shorter travel times for lake elevation of 297 m led to less pathogen decay (i.e., higher risks) than at 312 m, but the longer travel times for 312 m were offset by higher percent DFR, ultimately leading to less dilution and slightly higher risk predictions. Overall, percent DFR demonstrated a complex relationship with lake level, with the highest lake elevation sometimes yielding the highest percent DFR during certain times of the year due to stratification and hydrodynamics (Table S6). Because the LVW discharge into Lake Mead differs considerably from the ambient water quality in Lake Mead, there is a distinct plume at the confluence,18–20 and this necessitates a complex 3D hydrodynamic model to understand the interplay of bathymetry, meteorological conditions, and DFR parameters. Future QMRAs should also consider incorporating hydrodynamic modeling in reservoir augmentation applications, because only accounting for dilution is an oversimplification that can impact the accuracy of risk estimates and ultimately risk management decisions.

Impact of seasonality

Seasonality had a significant impact on GC:IU ratio-adjusted norovirus risk estimates (p < 0.0001) at a Lake Mead elevation of 329 m, with the highest risk occurring during the fall (median log10 daily risk of −12.83), followed by summer (−14.26), winter (−16.15), and then spring (−19.66) (Fig. 6). Daily risk simulations exceeded the daily risk benchmark of 2.7 × 10−7 at the 99.93rd percentile in the fall, and at the 99.96th percentile in the spring. Maximum log10 daily risks in fall reached as high as −4.57. Variables that impact seasonality include percent DFR at the drinking water intake (i.e., dilution), Lake Mead travel times (i.e., decay/inactivation), and the WWTP-specific norovirus concentrations, although worst-case conditions for each parameter did not always align. For example, for a Lake Mead elevation of 329 m, there was a higher percent DFR observed during fall (Table S6), but higher norovirus concentrations occurred in the winter for all WWTPs in Southern Nevada (Table S1). Moreover, the season with the highest percent DFR depended on lake level due to the impact of varying temperature gradients. For example, the highest percent DFR was observed in fall at 329 m but then in summer for lake elevations of 312 m or 297 m. Therefore, the impact of seasonality was also linked to lake level.
Fig. 6 Daily probabilities of infection for GC:IU ratio-adjusted norovirus as a function of season. White dotted lines represent the 1st and 3rd quartiles, and the solid white lines indicate the medians. The red dotted line represents the daily risk benchmark of 2.7 × 10−7 (10−4/365).

Travel times also varied seasonally throughout the lake, due primarily to lake stratification in the fall and summer that resulted in faster travel time from the confluence to the drinking water intake. Additional mixing in the winter resulted in slower travel times and greater decay/inactivation, which effectively countered the higher raw wastewater norovirus concentrations assumed for the winter. In fact, daily risk followed the order of slowest (lowest risk) to fastest (highest risk) seasonal travel times, indicating that travel times impact norovirus risk more significantly than raw wastewater concentration. This analysis highlights how dilution and decay/inactivation in the environmental buffer can somewhat attenuate higher wastewater concentrations during seasonal fluctuations, showing that norovirus outbreaks in the winter do not necessarily correspond to higher DFR risks. Additionally, since the intake is deep in Lake Mead, the temperature of the water at the intake is relatively stable throughout the year, varying by only approximately 2 °C, reducing the variation in decay coefficients due to temperature fluctuations.

Impact of ozonation at the drinking water treatment plants

As mentioned earlier, ozone is not required at the DWTPs in Southern Nevada because of their “Bin 1” designation for Cryptosporidium. Thus, it is assumed that the system can achieve a 10−4 annual risk of infection61 without additional treatment. Because ozonation is included on a voluntary basis, the drinking water system could theoretically still be operated during periods of ozone downtime. Therefore, this QMRA included a characterization of ozone's contribution to daily risk mitigation. We considered the impact of ozone being offline for a single ingestion event, which equates to a 24 hour (daily) risk scenario that was compared against the daily risk benchmark of 2.7 × 10−7 (or −6.57 log10).

During normal operation, ozone at the DWTPs accounted for point estimate LRVs of 6 for viruses, 5.77 for Giardia, and 0.47 for Cryptosporidium. As shown in Fig. 7, GC:IU-adjusted norovirus and Cryptosporidium exceeded the 2.7 × 10−7 daily risk benchmark at the upper percentiles even with ozonation included. However, as noted earlier, the system still achieved the 10−4 annual risk benchmark that is typically assumed for conventional public water systems due to ‘averaging’ of daily risks throughout the year. The assumed 24 h downtime for ozonation resulted in increased risks for all pathogens, but most notably for the viruses (Table S8 and Fig. 7). Omitting the ozone LRVs resulted in an equivalent log10-increase in risk for all pathogens (i.e., NoV risks increased by 6 log10) because the dose–response models are effectively linear at low doses. Giardia still achieved the daily risk benchmark for all scenarios, but for adenovirus and enterovirus, regardless of enumeration method (i.e., GC:IU-adjusted molecular or culture), risks exceeded the daily benchmark at the very upper percentiles (i.e., >97th percentiles) (Table S8). This highlights the value of ozonation in terms of robust disinfection efficacy and its corresponding risk reduction potential. This can be beneficial to water utilities looking to minimize pathogen risks, even if supplemental disinfection is not mandated by federal regulations.


Fig. 7 Daily probabilities of infection for each pathogen with and without ozonation at the drinking water treatment plants (DWTPs). The red dotted line represents the daily risk benchmark of 2.7 × 10−7 (10−4/365).

For GC:IU-adjusted norovirus with ozone omitted, even the median daily risk exceeded the benchmark at the lowest lake level of 297 m; higher lake elevations resulted in exceedances at the 50th–70th percentiles (Table S8). Therefore, virus selection during the hazard identification step of a QMRA can have significant implications when extending the results to regulatory development and/or real-world systems. There is still debate regarding the appropriateness of using molecular norovirus data, even with GC:IU adjustment, rather than culture-based enterovirus or adenovirus data when developing potable reuse regulations,24 or, in this case, operational criteria for an existing DWTP in a DFR system. That being said, omitting ozonation also dropped the exceedance percentiles for Cryptosporidium from approximately the 80th–90th percentiles down to the 60th–70th percentiles. Thus, ozonation may not be mandated by U.S. EPA regulations, but this analysis demonstrates its benefits for public health protection in DFR systems.

Log reduction value targets

Using dose–response models, deterministic pathogen concentrations, and an acceptable annual risk threshold of 10−4, LRV targets (or LRTs) can be derived using a straightforward ‘top-down’ QMRA.12 Depending on the regulatory crediting framework, these LRTs could be inclusive of any pathogen reduction occurring throughout the system, including wastewater treatment, dilution and/or inactivation in an environmental buffer, drinking water treatment, and even engineered blending of diverse sources. Top-down QMRAs are traditionally used to determine LRTs during the regulatory development process for potable reuse applications, as in California for their DPR regulation.22

With respect to pathogen concentrations, various approaches have been used, including maximum observed values, maximum values from 10,000 random samplings of a concentration distribution, and various percentiles of a concentration distribution. However, the maximum is a poor choice for statistical comparisons because it is unstable, depends on the number of samples, and is influenced by the random number generator used. An alternative metric that we have previously proposed is the 97.4th percentile based on Blom's equation.62 This percentile was selected in Gerrity et al.24 to harmonize standard source water characterization data, such as that required under U.S. EPA's LT2, with simulated data from a distribution. Specifically, the 97.4th percentile from a simulated 10,000-concentration dataset corresponds to the maximum observed value across 24 real-world samples. A sample size of N = 24 aligns with the standard source water characterization requirements under the U.S. EPA's LT2.24
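The link between N = 24 samples and the 97.4th percentile follows directly from Blom's plotting position, as the short sketch below illustrates (the simulated distribution is a placeholder):

```r
# Blom plotting position: the expected percentile of the i-th of n ordered
# samples is approximately (i - 3/8)/(n + 1/4)
blom <- function(i, n) (i - 3/8) / (n + 1/4)
blom(24, 24)    # 0.9742: the max of 24 samples ~ the 97.4th percentile

# Applied to a simulated 10,000-value dataset (placeholder distribution):
set.seed(4)
sim_log10 <- rnorm(10000, mean = 5, sd = 1)
quantile(sim_log10, probs = blom(24, 24))
```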

This discussion is relevant to Nevada, as the state currently has IPR regulations with LRTs of 12/10/10 for viruses, Giardia, and Cryptosporidium26 but has not yet established DPR regulations. Using the 97.4th percentile approach coupled with the pathogen concentration distributions (Table S1), exposure assumptions, and dose–response models from this study, the LRTs could be as high as 16/10/10 for viruses, Cryptosporidium, and Giardia (Table 4). The virus LRT of 16 is driven by the high norovirus concentrations (without GC:IU adjustment) observed in Southern Nevada from 2021 to 2024.13 It is important to note that the virus LRT could be justifiably lower depending on the choice of reference virus; as noted earlier, there is still debate regarding the appropriateness of molecular data. For instance, using culturable enterovirus with a 10× correction factor63 as the reference virus, the LRTs could be reduced to 13/10/10 (consistent with Gerrity et al.24) or 12/10/10 without the 10× correction factor (consistent with the existing IPR regulations in Nevada). If maximum simulated values are assumed, the LRTs would increase to 18/11/11.

Table 4 Log reduction value (LRV) targets (or LRTs) for potable reuse applications based on the Southern Nevada raw wastewater pathogen concentration dataset. LRTs are reported for 97.4th percentile and maximum simulated values based on 10,000 Monte Carlo simulations
Scenario 97.4th percentile Maximum
Viruses (culture) Conc. (Log10 L−1) LRT Conc. (Log10 L−1) LRT
Enterovirus culture (10×) 6.5 13.2 8.0 14.7
Adenovirus culture (baseline) 4.7 11.5 6.4 13.2

Viruses (molecular) Conc. (Log10 L−1) LRT Conc. (Log10 L−1) LRT
Enterovirus molecular (GC:IU) 5.8 12.5 8.2 14.9
Norovirus molecular combined 9.5 16.1 11.6 18.2
Norovirus molecular (GC:IU) 7.8 14.4 10.4 17.0
Adenovirus molecular (GC:IU) 5.9 12.7 8.4 15.2

Protozoa Conc. (Log10 L−1) LRT Conc. (Log10 L−1) LRT
Giardia (baseline) 4.5 9.8 5.3 10.6
Cryptosporidium (baseline) 3.1 10.1 4.1 11.1


Comparison to bottom-up QMRA

Another way to identify LRTs is by systematically varying LRV totals (e.g., in 0.5 log10 increments) and then evaluating the percentage of simulations that fall below the acceptable risk threshold. This is a variation of the “bottom-up” QMRA described by Clements et al.,12 and this approach was also used by Soller et al.,56 who found that LRTs of 16/11/11 resulted in 100% of their simulations having cumulative annual risks less than 10−4. This approach also provides an inherent sensitivity analysis showing the relative impact of small changes in virus and protozoa LRVs on final risk estimates, as is shown for the Southern Nevada system in Fig. 8. For example, holding the protozoa LRV constant at 9 and increasing the norovirus LRV, the percentage of simulations achieving the annual risk benchmark increases slightly but then plateaus at an LRV of 16, with marginal gains beyond that point. Similarly, when increasing protozoa LRVs from 9 to 10 while holding the norovirus LRV constant, there is a considerable jump in compliance (e.g., from 5.0% to 49.1% for a norovirus LRV of 13 and from 13.7% to 95.0% for a norovirus LRV of 18). Beyond a protozoa LRV of 10, the gains diminish unless there is a corresponding increase in virus LRV. Stepwise increases in LRVs for all pathogen targets eventually lead to 100% compliance at 17/11/11 (Fig. 8), which is consistent with the maximum GC:IU-adjusted norovirus scenario in Table 4.
Fig. 8 Percent of simulations with cumulative annual risk of infection less than the annual risk benchmark of 10−4. Norovirus is adjusted for the GC:IU ratio here, whereas both non-adjusted and adjusted scenarios are included in Table 4.
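A compact sketch of this bottom-up grid search is shown below, assuming a simplified two-pathogen system (norovirus plus a single protozoan target), illustrative lognormal concentrations, illustrative exponential dose–response parameters, a constant daily exposure held over a full year, and an approximate additive aggregation of per-pathogen risks. The actual analysis behind Fig. 8 aggregates all five reference pathogens with their specific dose–response models.

```r
# Bottom-up LRT sketch: sweep LRV combinations and tally benchmark compliance
set.seed(1)
n_sim <- 10000
V     <- 2.5                                   # L per day ingested (illustrative)

# Hypothetical raw wastewater concentrations (organisms per L)
c_noro  <- 10^rnorm(n_sim, mean = 7.5, sd = 1.5)
c_proto <- 10^rnorm(n_sim, mean = 2.5, sd = 0.8)

annual_risk <- function(conc, lrv, k) {
  dose  <- conc * V / 10^lrv                   # daily dose after total LRV
  p_day <- 1 - exp(-k * dose)                  # exponential dose-response
  1 - (1 - p_day)^365                          # constant daily exposure over a year
}

grid <- expand.grid(lrv_noro  = seq(12, 18, by = 0.5),
                    lrv_proto = seq(8, 12, by = 0.5))
grid$pct_pass <- mapply(function(lv, lp) {
  total <- annual_risk(c_noro, lv, k = 0.10) + # approximate additive aggregation
           annual_risk(c_proto, lp, k = 0.09)
  100 * mean(total < 1e-4)
}, grid$lrv_noro, grid$lrv_proto)

head(grid[order(-grid$pct_pass), ])            # best-performing LRV combinations
```

Plotting pct_pass over the LRV grid yields a compliance surface analogous to Fig. 8, making the plateau and step-change behavior described above directly visible.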

Limitations

For large-scale systems such as the Southern Nevada watershed, QMRAs can capture varying levels of complexity, but limitations inevitably remain in the model, assumptions, and results. WWTP LRVs were characterized using various literature sources, including conservative regulatory crediting frameworks, so the assumed LRVs likely underestimate true pathogen reduction. For example, WWTP processes such as sedimentation and filtration were not directly credited, although an "observed" LRV was incorporated for Cryptosporidium. Other pathogens likely benefit from similar uncredited attenuation, including deposition within Lake Mead. Literature decay rates were fit to new distributions to isolate relevant enumeration methods, but these literature values still encompassed potentially non-relevant conditions (e.g., saline and non-saline environments, dark and light exposure). Site-specific decay studies (e.g., in LVW and Lake Mead) would help reduce uncertainty around decay estimates. Percent DFR was characterized based on point values from van der Nagel et al.20 and was set to a uniform distribution, so seasonal variability in percent DFR was unaccounted for, except during the dedicated seasonal analysis. There is also considerable uncertainty in commonly adopted dose–response models. Norovirus has a newly fitted model that may provide better estimates,64 but in the low dose range relevant to many potable reuse QMRAs, the impact on risk is minimal. Future studies should consider the implications of using different dose–response models, particularly as updates become available. In general, dose–response relationships are highly uncertain at low doses: most dose–response studies do not administer doses below 10 organisms, including the study used to develop the Cryptosporidium dose–response model.65 The doses considered here are far below one organism per event, so extrapolating into this low-dose range has an unknown impact on the true probability of infection.
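The low-dose concern can be made concrete with a short sketch: under an exponential dose–response model (the parameter value below is illustrative, not the Messner and Berger fit65), the modeled probability of infection at sub-organism doses is simply linear in dose, i.e., a straight-line extrapolation through a region with no supporting experimental data.

```r
# Exponential dose-response behavior at sub-organism doses (k is illustrative)
k     <- 0.09
doses <- 10^seq(-7, 2, by = 1)        # organisms per exposure event

p_inf <- 1 - exp(-k * doses)          # modeled probability of infection
cbind(dose = doses, modeled = p_inf, linear = k * doses)
# Below ~1 organism, modeled ~= linear: the curve is extrapolated
# through a dose range never tested in feeding studies.
```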

Conclusions

This QMRA demonstrates the use of best practices for evaluating risk in potable reuse applications while also integrating novel components, such as robust, site-specific pathogen concentration distributions and 3D hydrodynamic modeling of environmental buffers. With this approach, we demonstrated that the risk of infection by gastrointestinal pathogens can increase due to seasonal fluctuations in various input parameters, including dilution at the drinking water intake and pathogen concentrations in raw wastewater. Risk of infection is also linked to climatic changes, such as prolonged drought driving lake level decline, which can shorten pathogen travel times in the environmental buffer and reduce die-off.

For the Southern Nevada de facto reuse system, this analysis demonstrates that public health is adequately protected on an annual risk basis. Simulated declines in Lake Mead elevation led to a higher cumulative annual risk of infection, but none of the simulations exceeded the annual risk benchmark of 10−4. While lower lake levels may impact water quality in other ways, our findings suggest that the multi-barrier approach encompassing wastewater treatment, natural attenuation, and drinking water treatment is sufficient to manage pathogen risks in this system.

However, the analysis also highlighted important considerations for risk management in Southern Nevada and elsewhere. Fall exhibited the highest risk of norovirus infection, exceeding the daily risk threshold at the 99.93rd percentile due to the combined effect of increased norovirus concentrations in wastewater, shorter travel times from lake stratification (i.e., less die-off), and higher percent DFR (i.e., less dilution). Thus, it is important to develop a comprehensive understanding of any public water system to identify high-risk scenarios that might warrant closer attention or even operational modifications. This QMRA also demonstrated that ozonation, which is not required at the DWTPs, provides consequential reductions in risk, particularly for viruses and Cryptosporidium. Therefore, even if ozonation is not required for regulatory compliance, efforts should be made to ensure nominal operation at all times.

Overall, this QMRA demonstrates that public health should be adequately protected even under prolonged drought conditions, at least with respect to gastrointestinal microbial hazards, owing to the resilience of the overall multi-barrier system. Finally, as Nevada pursues direct potable reuse regulations, this QMRA also highlights the impacts of critical assumptions, including the choice of reference pathogen and concentration distributions, with deterministic LRTs ranging from 12/10/10 to 18/11/11.

Conflicts of interest

There are no conflicts of interest to declare.

Data availability

The data supporting this article have been included as part of the supplementary information (SI).

Supplementary information: model parameters and R script. See DOI: https://doi.org/10.1039/d5ew00514k.

Acknowledgements

This project was funded in part by the WaterSMART Applied Sciences Program through the United States Bureau of Reclamation (BOR) (Grant No. R22AP00236). This study was also supported by Water Research Foundation (WRF) project 5197, funded by the U.S. Environmental Protection Agency (EPA) under Assistance Agreement No. 84046201. The views expressed in this paper are solely the responsibility of the authors and do not necessarily represent the official views of SNWA, BOR, EPA, or WRF.

References

1. T. Nguyen, P. Westerhoff, E. T. Furlong, D. W. Kolpin, A. L. Batt, H. E. Mash, K. M. Schenck, J. S. Boone, J. Rice and S. T. Glassmeyer, Modeled De Facto Reuse and Contaminants of Emerging Concern in Drinking Water Source Waters, J. AWWA, 2018, 110, E2–E18.
2. Z. Wang, X. Li and S. Xiang, Assessing de facto wastewater reuse and its implications for water quality in Yangtze Basin (2014–2021), Heliyon, 2024, 10(22), e40275.
3. D. Hannoun and T. Tietjen, Lake management under severe drought: Lake Mead, Nevada/Arizona, J. Am. Water Resour. Assoc., 2023, 59, 416–428.
4. C. Dow, S. Ahmad, K. Stave and D. Gerrity, Evaluating the sustainability of indirect potable reuse and direct potable reuse: a southern Nevada case study, AWWA Water Sci., 2019, 1, e1153.
5. E. Garner, N. Zhu, L. Strom, M. Edwards and A. Pruden, A human exposome framework for guiding risk management and holistic assessment of recycled water quality, Environ. Sci.: Water Res. Technol., 2016, 2, 580–598.
6. S. J. Khan, R. Fisher and D. J. Roser, Potable reuse: Which chemicals to be concerned about, Curr. Opin. Environ. Sci. Health, 2019, 7, 76–82.
7. P. Manyepa, K. M. Gani, M. Seyam, I. Banoo, B. Genthe, S. Kumari and F. Bux, Removal and risk assessment of emerging contaminants and heavy metals in a wastewater reuse process producing drinkable water for human consumption, Chemosphere, 2024, 361, 142396.
8. E. Amoueyan, S. Ahmad, J. N. S. Eisenberg and D. Gerrity, Equivalency of indirect and direct potable reuse paradigms based on a quantitative microbial risk assessment framework, Microb. Risk Anal., 2019, 12, 60–75.
9. B. Pecson, A. Kaufmann, D. Gerrity, C. N. Haas, E. Seto, N. J. Ashbolt, T. Slifko, E. Darby and A. Olivieri, Science-based pathogen treatment requirements for direct potable reuse, Environ. Sci.: Water Res. Technol., 2023, 9, 3377–3390.
10. J. A. Soller, S. E. Eftim and S. P. Nappier, Comparison of Predicted Microbiological Human Health Risks Associated with de Facto, Indirect, and Direct Potable Water Reuse, Environ. Sci. Technol., 2019, 53, 13382–13389.
11. S. E. Eftim, T. Hong, J. Soller, A. Boehm, I. Warren, A. Ichida and S. P. Nappier, Occurrence of norovirus in raw sewage – A systematic literature review and meta-analysis, Water Res., 2017, 111, 366–374.
12. E. Clements, C. van der Nagel, K. Crank, D. Hannoun and D. Gerrity, Review of quantitative microbial risk assessments for potable water reuse, Environ. Sci.: Water Res. Technol., 2025, 11, 542–559.
13. K. Papp, C. Barber, K. Chung, E. Clements, W. Frehner, D. Hannoun, T. Lane, C. Morrison, B. Mull, E. Oh, P. Wang and D. Gerrity, Pathogen and indicator trends in southern Nevada wastewater during and after the COVID-19 pandemic, Environ. Sci.: Water Res. Technol., 2025, 11, 262–280.
14. M. F. Smith, R. Maqsood, R. A. Sullins, E. M. Driver, R. U. Halden and E. S. Lim, Seasonality of respiratory, enteric, and urinary viruses revealed by wastewater genomic surveillance, mSphere, 2024, 9, e00105-24.
15. L. M. Mosley, Drought impacts on the water quality of freshwater systems; review and integration, Earth-Sci. Rev., 2015, 140, 203–214.
16. S. A. Snyder and M. J. Benotti, Endocrine disruptors and pharmaceuticals: implications for water sustainability, Water Sci. Technol., 2010, 61, 145–154.
17. M. T. H. van Vliet and J. J. G. Zwolsman, Impact of summer droughts on the water quality of the Meuse river, J. Hydrol., 2008, 353, 1–17.
18. F. Bonvin, R. Rutler, N. Chèvre, J. Halder and T. Kohn, Spatial and Temporal Presence of a Wastewater-Derived Micropollutant Plume in Lake Geneva, Environ. Sci. Technol., 2011, 45, 4702–4709.
19. N. Goldscheider, L. Haller, J. Poté, W. Wildi and J. Zopfi, Characterizing Water Circulation and Contaminant Transport in Lake Geneva Using Bacteriophage Tracer Experiments and Limnological Methods, Environ. Sci. Technol., 2007, 41, 5252–5258.
20. C. van der Nagel, E. Clements, C. Wilkerson, D. Hannoun and T. Tietjen, Impact of drought on de facto reuse and water quality: Insights from hydrodynamic modeling versus machine learning, Environ. Model. Softw., 2025, 193, 106649.
21. Colorado Department of Public Health and Environment, Direct Potable Reuse Policy, 2023.
22. Division of Drinking Water, Direct Potable Reuse, California, 2023.
23. Division of Drinking Water, Surface Water Augmentation Using Recycled Water, California, 2017.
24. D. Gerrity, K. Crank, E. Steinle-Darling and B. M. Pecson, Establishing pathogen log reduction value targets for direct potable reuse in the United States, AWWA Water Sci., 2023, 5, e1353.
25. Texas Commission on Environmental Quality, Direct Potable Reuse for Public Water Systems, 2022.
26. US EPA, Summary of Nevada's Water Reuse Guideline or Regulation for Potable Water Reuse, https://www.epa.gov/waterreuse/summary-nevadas-water-reuse-guideline-or-regulation-potable-water-reuse, (accessed 20 May 2024).
27. E. Amoueyan, S. Ahmad, J. N. S. Eisenberg and D. Gerrity, A dynamic quantitative microbial risk assessment for norovirus in potable reuse systems, Microb. Risk Anal., 2020, 14, 100088.
28. R Core Team, R: A language and environment for statistical computing, R Foundation for Statistical Computing, Vienna, Austria, 2023.
29. G. C. Holdren and K. Turner, Characteristics of Lake Mead, Arizona–Nevada, Lake Reservoir Manage., 2010, 26, 230–239.
30. Weather averages Las Vegas, Nevada, https://www.usclimatedata.com/climate/las-vegas/nevada/united-states/usnv0049, (accessed 23 October 2025).
31. Bureau of Reclamation, Lower Colorado River Operations, https://www.usbr.gov/lc/region/g4000/hourly/mead-elv.html, (accessed 26 March 2025).
32. D. Gerrity, K. Papp, E. Dickenson, M. Ejjada, E. Marti, O. Quinones, M. Sarria, K. Thompson and R. A. Trenholm, Characterizing the chemical and microbial fingerprint of unsheltered homelessness in an urban watershed, Sci. Total Environ., 2022, 840, 156714.
33. D. Hannoun, T. Tietjen and K. Brooks, The potential effects of climate change and drawdown on a newly constructed drinking water intake: Study case in Las Vegas, NV, USA, Water Util. J., 2021, 1–13.
34. P. Jeffrey, Z. Yang and S. J. Judd, The status of potable water reuse implementation, Water Res., 2022, 214, 118198.
35. E. Darby, A. Olivieri, C. Haas, G. D. Giovanni, W. Jakubowski, M. Leddy, K. L. Nelson, C. Rock, T. Slifko and B. M. Pecson, Identifying and aggregating high-quality pathogen data: a new approach for potable reuse regulatory development, Environ. Sci.: Water Res. Technol., 2023, 9, 1646–1653.
36. Nevada Division of Environmental Protection, Revised proposed regulation of the state environmental commission (LCB File No. R101-16), Carson City, NV, 2016.
37. D. Gerrity, K. Papp, E. Dickenson, M. Ejjada, E. Marti, O. Quinones, M. Sarria, K. Thompson and R. A. Trenholm, Characterizing the chemical and microbial fingerprint of unsheltered homelessness in an urban watershed, Sci. Total Environ., 2022, 840, 156714.
38. WaterSecure, Membrane Bioreactor WaterVal Validation Protocol, Australian WaterSecure Innovations Ltd, Brisbane, 2017.
39. T. Hill, P. Wang, A. Olivieri, J. Batista and D. Gerrity, Assessing the basis for regulatory crediting of virus LRVs for secondary biological wastewater treatment: A systematic review, Water Res., 2025, 271, 122886.
40. G. Tchobanoglous, J. Kenny and H. Leverenz, Rationale for constant flow to optimize wastewater treatment and advanced water treatment performance for potable reuse applications, Water Environ. Res., 2021, 93, 1231–1242.
41. A. Olivieri, J. Crook, M. Anderson, R. Bull, J. Drewes, C. Haas, W. Jakubowski, P. McCarty, K. Nelson, J. Rose, D. Sedlak and T. Wade, Expert Panel Final Report: Evaluation of the Feasibility of Developing Uniform Water Recycling Criteria for Direct Potable Reuse, California State Water Resources Control Board, Sacramento, CA, 2016.
42. B. Pecson, N. Ashbolt, C. Haas, T. Slifko, A. Kaufmann, D. Gerrity, E. Seto and A. Olivieri, Tools to Evaluate Quantitative Microbial Risk and Plant Performance/Reliability, The Water Research Foundation, 2021.
43. US EPA, Ultraviolet Disinfection Guidance Manual for the Final Long Term 2 Enhanced Surface Water Treatment Rule, United States Environmental Protection Agency, Washington, DC, USA, 2006.
44. S. Gamage, D. Gerrity, A. N. Pisarenko, E. C. Wert and S. A. Snyder, Evaluation of Process Control Alternatives for the Inactivation of Escherichia coli, MS2 Bacteriophage, and Bacillus subtilis Spores during Wastewater Ozonation, Ozone: Sci. Eng., 2013, 35, 501–513.
45. US EPA, Long Term 2 Enhanced Surface Water Treatment Rule Toolbox Guidance Manual, United States Environmental Protection Agency, Washington, DC, USA, 2010.
46. US EPA, Disinfection Profiling and Benchmarking: Technical Guidance, 2020.
47. K. A. Thompson, H. Ray, D. Gerrity, O. Quiñones, E. Dano, J. Prieur, B. Vanderford, E. Steinle-Darling and E. R. V. Dickenson, Sources of per- and polyfluoroalkyl substances in an arid, urban, wastewater-dominated watershed, Sci. Total Environ., 2024, 940, 173361.
48. B. Blasius, J. Kirsch and A. Danner, Las Vegas Wash Time-of-Travel Study, United States Bureau of Reclamation, 2016.
49. A. B. Boehm, K. E. Graham and W. C. Jennings, Can We Swim Yet? Systematic Review, Meta-Analysis, and Risk Assessment of Aging Sewage in Surface Waters, Environ. Sci. Technol., 2018, 52, 9634–9645.
50. A. B. Boehm, A. I. Silverman, A. Schriewer and K. Goodwin, Systematic review and meta-analysis of decay rates of waterborne mammalian viruses and coliphages in surface waters, Water Res., 2019, 164, 114898.
51. C. L. Marti, R. Mills and J. Imberger, Pathways of multiple inflows into a stratified reservoir: Thomson Reservoir, Australia, Adv. Water Resour., 2011, 34, 551–561.
52. US EPA, Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005.
53. L. Lee, NADA: Nondetects and Data Analysis for Environmental Data, R package version 1.6-1.1, Comprehensive R Archive Network, 2020.
54. M. Burke, E. Wells, C. Larison, G. Rao, M. J. Bentley, Y. S. Linden, P. Smeets, J. DeFrance, J. Brown and K. G. Linden, Systematic Review of Microorganism Removal Performance by Physiochemical Water Treatment Technologies, Environ. Sci. Technol., 2025, 59(41), 21763–21775.
55. SWRCB, DPRisk, https://cawaterdatadive.shinyapps.io/DPRisk/.
56. J. A. Soller, S. E. Eftim and S. P. Nappier, Direct potable reuse microbial risk assessment methodology: Sensitivity analysis and application to State log credit allocations, Water Res., 2018, 128, 286–292.
57. J. A. Soller, S. E. Eftim, I. Warren and S. P. Nappier, Evaluation of microbiological risks associated with direct potable reuse, Microb. Risk Anal., 2017, 5, 3–14.
58. US EPA, Update for Chapter 3 of the Exposure Factors Handbook: Ingestion of Water and Other Select Liquids, 2019.
59. C. H. Jones, V. Wylie, H. Ford, J. Fawell, M. Holmer and K. Bell, A robust scenario analysis approach to water recycling quantitative microbial risk assessment, J. Appl. Microbiol., 2023, 134, lxad029.
60. B. M. Pecson, S. C. Triolo, S. Olivieri, E. C. Chen, A. N. Pisarenko, C.-C. Yang, A. Olivieri, C. N. Haas, R. S. Trussell and R. R. Trussell, Reliability of pathogen control in direct potable reuse: Performance evaluation and QMRA of a full-scale 1 MGD advanced treatment train, Water Res., 2017, 122, 258–268.
61. S. Regli, J. B. Rose, C. N. Haas and C. P. Gerba, Modeling the Risk From Giardia and Viruses in Drinking Water, J. AWWA, 1991, 83, 76–84.
62. G. Blom, Statistical estimates and transformed Beta-variables, John Wiley & Sons, 1958.
63. C. P. Gerba and W. Q. Betancourt, Assessing the Occurrence of Waterborne Viruses in Reuse Systems: Analytical Limits and Needs, Pathogens, 2019, 8, 107.
64. P. F. M. Teunis, F. S. Le Guyader, P. Liu, J. Ollivier and C. L. Moe, Noroviruses are highly infectious but there is strong variation in host susceptibility and virus pathogenicity, Epidemics, 2020, 32, 100401.
65. M. J. Messner and P. Berger, Cryptosporidium Infection Risk: Results of New Dose-Response Modeling, Risk Anal., 2016, 36, 1969–1982.

Footnote

These authors contributed equally.

This journal is © The Royal Society of Chemistry 2026