Mike Sharpe
For this special issue on drinking water quality, Focus takes a look at the regulatory frameworks for drinking water in Europe and the US and the role of environmental analysis in their future evolution.
In developed countries we tend to take the ready availability of drinking water for granted. But even here safe water supplies are a relatively recent phenomenon. Historically, the quality of drinking water was assessed only in terms of aesthetic aspects such as appearance, taste and smell. As recently as the mid-nineteenth century the sources and effects of drinking water contamination were poorly understood and waterborne diseases, such as typhoid, dysentery and cholera, were widespread.1 Only in 1855, when epidemiologist Dr. John Snow linked an outbreak of cholera in London to a public well that was contaminated by sewage, did people begin to appreciate that water could be a source of illness as well as a giver of life. In the 1880s this understanding was advanced further by Louis Pasteur, whose “germ theory” of disease explained how microscopic organisms could transmit disease through media such as water.
During the late nineteenth and early twentieth centuries, concerns regarding drinking water quality, particularly disease-causing microbes (pathogens), led to the development of sophisticated drinking water treatment systems. These focused primarily on reducing turbidity through slow sand filtration. Disinfectants also began to be used; initially chlorine and later other disinfectants such as ozone. The first regulations on drinking water quality were also introduced in Europe and in the US at about this time.
By the late 1960s it was apparent that aesthetic problems, pathogens, and naturally-occurring chemicals were not the only concerns in relation to drinking water quality. Industrial and agricultural advances and the creation of new man-made chemicals also had negative impacts on the environment and public health. Many of these new chemicals were finding their way into water supplies through factory discharges, street and farm field runoff, and leaking underground storage and disposal tanks. Although treatment techniques such as aeration, flocculation, and granular activated carbon (GAC) adsorption (for removal of organic contaminants) existed at the time, they were either underutilised by water operators or ineffective at removing some new contaminants.
This wider recognition of drinking water issues has led to the more comprehensive regulatory frameworks we see today. While regulations in Europe and the US differ in their detail and implementation (see below), they are predicated on the same approach: the development of standards for individual contaminants based on environmental and health risk assessments, and set to protect the most vulnerable population groups.
Enforceable “at tap” quality standards backed by analytical monitoring are an important aspect of the regulatory regime, but they are just one element of a multilayered approach to drinking water safety.2 Other safeguards include: protecting drinking water sources to prevent contamination; controlling the quality of underground wells; efficient operation of water distribution systems; and public education on drinking water quality.
The health effects of such contaminants manifest themselves at two levels.3 Acute effects occur within hours or days of the time that a person consumes a contaminant. In drinking water, microbes, such as bacteria and viruses, are the contaminants with the greatest chance of reaching levels high enough to cause acute reactions, such as vomiting or diarrhoea. The most vulnerable group is people with weakened immune systems due to HIV/AIDS, chemotherapy, steroid use, or other reasons. Children may also be susceptible, since their immune systems are not fully developed and they drink more fluid per unit of body weight than adults.2
Long-term exposure to some contaminants at levels above standards may cause chronic health effects such as gastrointestinal problems, skin irritation, cancer, and reproductive and developmental disorders. Potential contaminants include organic chemicals, disinfection by-products, radionuclides, and metals, especially lead and arsenic.
Drinking water standards aim to safeguard against both acute and chronic effects. The key international reference here is the World Health Organisation (WHO), which provides national governments with the extensive guidance necessary to establish national drinking water quality standards [Box 1]. WHO's risk-benefit approach is now internationally recognised and forms the basis for drinking water quality standards in both developed and developing countries.
Box 1: Guidance on drinking water quality
The World Health Organisation has been concerned with drinking water quality and its effects on human health for over 45 years. WHO first codified its guidance in a publication entitled International Standards for Drinking Water, issued in 1958, and has updated it several times since. During the 1970s the underlying philosophy was revised to emphasise a risk-benefit approach in the formulation and enforcement of national standards. This new approach was contained in the 1984 publication, Guidelines for Drinking Water Quality, the most recent version of which (2nd edition) was published in 1993. This highly detailed publication covers potential microbiological, chemical, and radiological contaminants. For each it presents the following information:
The knowledge base on water and health continues to expand in the light of research findings and field experience. To ensure that new information is disseminated as quickly as possible, WHO has made its Guidelines database available online at: www.who.int/water_sanitation_health/GDWQ/
The Safe Drinking Water Act (SDWA) requires the Environmental Protection Agency (EPA) to regulate contaminants that present health risks and are known to, or are likely to, occur in public drinking water supplies. For each contaminant requiring regulation, EPA sets a legal limit on the concentrations allowed in drinking water; individual states may also set their own limits, provided these are at least as strict as EPA's. The most recent (1996) amendments to the SDWA emphasise sound science and risk-based standard setting. New measures were also introduced in relation to small water supply systems, the assessment and protection of source waters, public right-to-know, and assistance for states in upgrading the water system infrastructure.
Since 1974, the number of contaminants regulated under the SDWA has more than quadrupled, from around 20 to over 90. Seven of these are new standards that have entered into force over the last three years, including a highly controversial standard for arsenic.5 Standards are of two types: legally-enforceable primary standards for contaminants known or anticipated to affect public health, and non-enforceable guidelines, or secondary standards, for contaminants that may cause cosmetic or aesthetic effects.
The SDWA requires EPA to review the regulations at least every six years. Such revisions will depend mainly on the re-evaluation of exposure and occurrence data, health effects, and technological advances in relation to analytical measurements and water treatment techniques. Candidates for potential regulation are listed in the National Drinking Water Contaminant Candidate List (CCL), published by EPA in 1998.6 Contaminants on the CCL are prioritised within three categories: regulation, health research, and occurrence data collection. Data on the latter is stored within the National Contaminant Occurrence Database.
Where health effects studies suggest regulation presents a meaningful opportunity to reduce health risk, EPA will propose a primary standard (called a National Primary Drinking Water Regulation). This is set in terms of a Maximum Contaminant Level Goal (MCLG), the maximum level of a contaminant in drinking water at which no known or anticipated adverse effect on human health should occur. The determination considers the risk to sensitive subpopulations (children, the elderly, and those with compromised immune systems) of experiencing a variety of adverse health effects. Since MCLGs consider only public health and not the limits of detection and treatment technologies, sometimes they are set at a level which water systems cannot meet. The variation (if any) between the goals and the capabilities of existing technology is reflected in the Maximum Contaminant Level (MCL), the maximum permissible level of a contaminant in water delivered through the public water system.
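The distinction between the health-based MCLG and the enforceable MCL can be illustrated with a minimal sketch. All contaminant names and numeric values below are hypothetical illustrations, not actual EPA limits:

```python
# Sketch of the MCLG vs MCL logic described above.
# Names and values are hypothetical, not actual EPA regulatory limits.

# MCLG: health-based goal; may be set below what treatment can achieve.
# MCL: enforceable limit, set as close to the MCLG as is feasible.
standards = {
    # contaminant: (mclg_mg_l, mcl_mg_l)
    "contaminant_a": (0.0, 0.005),   # goal of zero; enforceable limit higher
    "contaminant_b": (0.05, 0.05),   # goal achievable, so MCL equals MCLG
}

def check_compliance(contaminant: str, measured_mg_l: float) -> bool:
    """Compliance is judged against the enforceable MCL, not the MCLG."""
    _mclg, mcl = standards[contaminant]
    return measured_mg_l <= mcl

print(check_compliance("contaminant_a", 0.003))  # True: below the MCL
print(check_compliance("contaminant_a", 0.007))  # False: exceeds the MCL
```

The point of the sketch is that a water system can fall short of the health-based goal (here, a measured 0.003 mg/l against a goal of zero) while still being fully compliant with the enforceable standard.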
EPA's current priorities for new or strengthened regulations are: microbials, disinfectants and DBPs as a group (the so-called M/DBP cluster); radon; radionuclides; groundwater; and arsenic.7
The revision of the EU Drinking Water Directive, adopted in November 1998, introduced major changes both in terms of the scientific and technical basis and the managerial approach. Specifically, it aimed to improve the transparency of regulation by emphasising that the point of compliance with the quality standard is at the consumer's point of use (rather than at the treatment works).
The 1998 Directive introduced new mandatory standards for microbiological and chemical parameters in line with WHO Guidelines and other scientific advice, while removing others. The eight new parameters were: acrylamide, epichlorohydrin, vinyl chloride, benzene, bromate, 1,2-dichloroethane, and two radioactive measures, tritium and total indicative dose. Nine parameters were tightened: antimony, arsenic, boron, chloride, copper, lead, nickel, PAHs, and tri- and tetrachloroethene. Quantified standards for colour and turbidity were also removed. Member States must comply with most of the measures by December 2003, except for bromate and trihalomethanes (by 2008) and lead (by 2013). The long transition period on the latter is to allow for the major upgrading needed to replace lead distribution pipes.
As in the US, drinking water legislation in the EU allows for a mixture of mandatory and non-mandatory standards. For non-mandatory standards the 1998 Directive introduced a new concept known as indicator parameters.9 These are parameters for which values only need to be fixed for monitoring purposes and to enable Member States to take remedial action to restore water quality. The Directive specifies aluminium, iron, manganese, colour, odour, taste and turbidity as indicator parameters, although Member States are free to retain them as mandatory parameters within national legislation, as the UK has chosen to do. Other parameters specified in terms of non-mandatory indicator values are: ammonium, chloride, colony counts, conductivity, sulfate, total indicative dose for radioactivity, total organic carbon, tritium, and turbidity (for water leaving the treatment works).
The directive is part of a fundamental overhaul of European Water Policy which aims towards a more integrated approach to water management and regulation.8 The most important element of this is the Water Framework Directive, adopted in December 2000. This introduced a new approach, requiring water management in the EU to be organised by river basin – the natural geographical and hydrological unit – instead of according to administrative or political boundaries. Under this approach, numerous objectives, such as protection of the aquatic ecology, protection of drinking water resources, and protection of bathing water, must be integrated for each river basin.
Current issues for some key contaminants are discussed briefly below, focusing on those species for which regulations and supporting analytical methods are still evolving. Arsenic is also a high priority, with permitted levels having been cut in both the US and the EU in recent years.5 Other concerns are more localised. For instance, perchlorate, a component of solid rocket fuel, has been detected in drinking water supplies in the US and is now the subject of intensive research.10
Control strategies for these problem microbes have progressed significantly over the last ten years.11 There is now a better understanding of issues such as: removal and inactivation during water treatment; infectious doses; and techniques for sampling and analysis. Major advances have also been made in the understanding of the microbiology. The characteristics of the parasite that confer resistance to chlorine, and its susceptibility to ozone and UV light, are being investigated. Researchers are also beginning to develop genetic fingerprinting techniques that should lead to improvements in assessment and control.
A number of epidemiological studies have reported a relationship between consumption of chlorinated drinking water and small increases in the incidence of certain types of cancer. Additional studies are underway to investigate this further. There are significant difficulties with the design and interpretation of drinking water epidemiological studies, however. Everyone drinks water and it is very difficult to isolate the contribution of DBPs from that of other sources of carcinogenic chemicals, such as smoking and workplace exposure. It is also very difficult to estimate either the drinking water consumption or the concentration of DBP in the water consumed over a particular period. Weighed against this, the value of disinfection in preventing the spread of waterborne diseases is undisputed. Potential alternatives to chemical disinfection are UV irradiation and membrane processes (such as microfiltration and reverse osmosis).
In the light of increasing concern about MTBE and related compounds, EPA established a Blue Ribbon Panel of experts charged with investigating the public health and environmental effects.13 Following publication of the Panel's report in 1999, EPA developed non-regulatory guidance on levels of MTBE contamination that would affect the taste and odour of drinking water and hence its acceptability to consumers. This guidance encourages early monitoring of MTBE under the Unregulated Contaminants Monitoring Rule and the incorporation of MTBE sources into source water assessments. EPA has also placed MTBE on the Contaminant Candidate List for further evaluation to determine whether or not a primary regulation is necessary. Around 40 research projects are currently ongoing or anticipated to support the risk assessment process.
In Europe MTBE is less of an issue, for two reasons. Firstly, concentrations in petrol are generally lower than in the US. Secondly, the EU does not rely so heavily for drinking water on shallow aquifers, which are susceptible to MTBE pollution. In areas where shallow aquifers are used, such as Mediterranean countries, population and industrial concentrations tend to be low. However, EU experts have recently stressed the importance of assessing airborne exposures as well as those from drinking water.
Since the mid-1970s, it has been known that compounds such as aspirin, caffeine, nicotine, and clofibric acid (a heart drug) were present in sewage influent and effluent.15 Although these findings were not pursued further at the time, improvements in detection technology since then have revealed that raw sewage and treated wastewater can contain numerous pharmaceutical and personal care products (PPCPs) at concentrations of ng l−1 to µg l−1 (ppt to ppb). These include not just drugs and their metabolites, but also synthetic fragrances, detergents and veterinary products, all of which can find their way into the environment after excretion or disposal by end-users. Particular concern has been expressed about substances such as oestrogens, constituents of the birth control pill, that have been implicated as potential endocrine disrupters.
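As a rough illustration of the scales involved, the equivalence between mass-per-volume units and parts-per notation for dilute aqueous solutions (assuming 1 l of water weighs approximately 1 kg) can be sketched as:

```python
# Sketch: trace-concentration unit equivalences for dilute aqueous solutions.
# Assumes 1 l of water ~ 1 kg, so 1 ug/l ~ 1 ppb and 1 ng/l ~ 1 ppt (by mass).

def ng_per_l_to_ppt(conc_ng_l: float) -> float:
    """Nanograms per litre to parts per trillion (by mass)."""
    # 1 kg of water = 1e12 ng, so 1 ng/l is one part in 10^12: a 1:1 ratio.
    return conc_ng_l

def ng_per_l_to_ppb(conc_ng_l: float) -> float:
    """Nanograms per litre to parts per billion (by mass)."""
    return conc_ng_l / 1000.0

# A hypothetical PPCP detected at 250 ng/l:
print(ng_per_l_to_ppt(250.0))  # 250.0 ppt
print(ng_per_l_to_ppb(250.0))  # 0.25 ppb
```

The 1:1 correspondence holds only under the dilute-solution assumption; for the ng l−1 to µg l−1 concentrations discussed here, it is an entirely adequate approximation.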
Recent studies indicate the presence of a large number of PPCP contaminants. For example, a study at a wetland complex in Missouri identified 22 such compounds, including ethynyl oestradiol, the main component of the birth control pill, and nonyl phenol, a principal ingredient in detergents.15 Other compounds identified were atrazine, hydroxyatrazine, ibuprofen, and caffeine. Similar work in Canada has reported traces of many common drugs, including significant levels of naproxen, an anti-inflammatory, and carbamazepine, an anti-epileptic.16
Conventional disinfection systems (chlorination, ozonation) and physical treatment systems (sedimentation, filtration) offer a degree of protection.14 The more sophisticated treatment processes, such as ozonation or granular activated carbon (GAC) adsorption, that have been installed to remove pesticides and other organic substances from source waters should also be effective at removing PPCPs. However, as yet the practical performance of these systems is largely unproven.
A great many analytical techniques have been developed over the years and these are now readily accessible to laboratories and other interested parties. Certified methods developed by EPA, ISO and CEN are available (including over the internet) for a wide range of contaminants and analytical approaches.18 Further research is still required, however, for two reasons.
Firstly, new methods help to improve the efficiency of compliance monitoring. As regulations embrace ever greater numbers of contaminants across an increasing number of sampling points, operators are seeking to improve the efficiency and throughput of the analytical process. To achieve this they need to be able to use broadly similar techniques and equipment for individual contaminant species. Thus, there is a continuing need for new and improved monitoring techniques for recognised contaminants.
Secondly, monitoring helps determine the scope of legislation. As additional substances are identified as potential candidates for regulation, so appropriate analytical techniques are required to inform the necessary health effects and occurrence studies. Where health risks are identified, practical analytical tests are essential for enforceable regulations. The main issues here are in two areas: analytical techniques for monitoring microbiological agents; and robust methods for measuring trace chemical contaminants, such as PPCPs and endocrine disruptors, at very low concentrations.
Microbiological monitoring poses a particular challenge. Microbial agents are genetically diverse, a characteristic that has complicated the development of a comprehensive set of analytical techniques. Contaminants are often difficult to detect by laboratory cultivation, and detection of their genetic material by polymerase chain reaction (PCR) techniques is unreliable because of the detection of dead cells or their remnants. The development of reliable detection techniques will be essential for mandatory standards.
DNA microarrays represent a potentially significant analytical technique for the simultaneous detection of multiple pathogens in a single water sample. Live and dead organisms can be discriminated through mRNA analysis. Current research includes investigations in relation to Cryptosporidium, Helicobacter pylori, human caliciviruses, and cyanobacteria.19 The use of genetic fingerprinting techniques is another research interest.11 The aim here is to develop a means to identify the exact source of an infectious parasite at the molecular level, as a basis for improved surveillance and control programmes.
Recognised analytical methods and techniques are essential for operators in assessing how well systems are running. In particular they are needed to assess the effectiveness of treatment technologies in combating emerging contaminants, such as UV disinfection for microbiological agents, and granular activated carbon adsorption for trace organics and PPCPs.
From an operational point of view, the issue of disinfection by-products is particularly complex and much further work is needed. The formation of DBPs is influenced both by the quality of the source waters and by the type of treatment equipment used.20 Proteins, peptides, and amino acids in non-humic source waters have been implicated as potential precursors of halosubstituted nitriles and cyanogen halides. In addition, the shift away from chlorine-based disinfection to ozonation and to mixed ozonation/disinfection creates new types of by-products. Changes in disinfection practices can also influence the responses of microbial agents.21 New analytical methods are required to identify these DBPs and to help determine water quality, treatment, and distribution system conditions which influence their relative concentrations. Coherent risk assessment frameworks for these complex situations are also needed.22
Analysis also has important applications both upstream and downstream of the water treatment plant. In the assessment of source water quality, for example, molecular tracers offer a potential means to identify the sources of chemical contaminants in surface water drinking supplies. This approach assumes that the contribution of point and non-point sources to contaminant budgets in drinking water supplies is reflected by unique compounds that act as molecular tracers. Although the in-stream fate of contaminants and corresponding molecular tracers may differ, selection of more than one tracer can strengthen quantification of contaminant sources. Researchers at the Stroud Water Research Center in the US are investigating threshold values for the molecular tracers that are predictive for unacceptable levels of contamination.23
A similar approach can be applied in investigating an individual's actual exposure. In this case the tracers are biochemical species that act as markers for the bioavailability of and exposure to particular contaminants. For instance, recent work in the US has investigated various metabolites as biomarkers for exposure to halogenated DBPs.24
This journal is © The Royal Society of Chemistry 2002