Overview of the Microbiological Aspects of Drinking Water Quality
Health Canada
Ottawa, Ontario
March 2021
This document may be cited as follows:
Health Canada (2021). Guidance document: Overview of the Microbiological Aspects of Drinking Water Quality. Water and Air Quality Bureau, Healthy Environments and Consumer Safety Branch, Health Canada, Ottawa, Ontario. (Catalogue No H144-79/1-2021E-PDF).
The document was prepared in collaboration with the Federal-Provincial-Territorial Committee on Drinking Water of the Federal-Provincial-Territorial Committee on Health and the Environment.
Any questions or comments on this document may be directed to:
Tel.: 1-833-223-1014 (toll free)
E-mail: hc.water-eau.sc@canada.ca
Other documents developed as part of the Guidelines for Canadian Drinking Water Quality can be found at: Water Quality - Reports and Publications.
Table of Contents
- 1.0 Introduction
- 2.0 Guideline technical documents and guidance documents
- 3.0 Source-to-tap approach for microbiological risks
- 4.0 References
- 5.0 Further reading
- Appendix A: List of abbreviations
1.0 Introduction
The provision of reliable, safe drinking water requires water utilities to take a holistic approach to the management of drinking water supplies. A holistic approach considers risks from all potential contaminants (microbiological, chemical and radiological) in a supply and allows the development of risk management strategies that prioritize risks and minimize potential impacts on human health. There are many examples of risk management approaches for drinking water systems, such as the source-to-tap approach used in the Guidelines for Canadian Drinking Water Quality (GCDWQ) and the water safety plan approach developed by the World Health Organization (WHO) (WHO, 2004). These approaches can be applied to both municipal and residential-scale systems. Microbiological risks are widely recognized as a top priority in drinking water management, and the microbiological quality of drinking water should never be compromised. This document integrates the relevant microbiological considerations found in the various documents developed as part of the GCDWQ in order to illustrate their use as part of a source-to-tap approach.
The guideline technical documents and guidance documents should be consulted for detailed information on using the parameters related to microbiological quality to understand water quality in each component of the drinking water system: the source water, the treatment processes, and the treated and distributed drinking water. The list of documents that should be consulted for further information can be found in section 5.0.
2.0 Guideline technical documents and guidance documents
Specific guideline technical documents and guidance documents developed as part of the GCDWQ focus on microbiological risks. They address methods and parameters that are used to assess water quality, as well as the removal and inactivation of specific groups of human pathogens. These include documents on Escherichia coli (E. coli), total coliforms, enterococci, enteric protozoa, enteric viruses, other waterborne pathogens, quantitative microbial risk assessment (QMRA), natural organic matter (NOM), turbidity, and various parameters for monitoring the biological stability of drinking water distribution systems. These documents are briefly discussed below.
2.1 E. coli and total coliforms
E. coli and total coliforms are bacterial indicators that are used to verify the quality of the drinking water. The primary role of E. coli is as an indicator of fecal contamination, whereas total coliforms are used to indicate changes in water quality. Monitoring for these indicators is one of the measures used to determine groundwater vulnerability and surface water quality, and to verify that water has been adequately treated and safely distributed. Enterococci are complementary bacterial indicators of fecal contamination. They can be used to supplement E. coli and total coliform monitoring programs, particularly for groundwater sources that are vulnerable to fecal contamination and in distribution systems that experience frequent disruptions.
2.2 Pathogenic microorganisms
Health-based treatment goals are set for enteric protozoa and enteric viruses. The health-based treatment goal for the enteric protozoa Giardia and Cryptosporidium (oo)cysts is a minimum 3 log removal and/or inactivation in surface waters and in groundwater under the direct influence of surface waters (GUDI). The health-based treatment goal for enteric viruses is a minimum 4 log removal and/or inactivation for all water sources, including groundwater sources. The responsible authority may choose to allow a groundwater source to have less than the recommended minimum 4 log reduction if the assessment of the drinking water system meets the provincial or territorial requirements in place to ensure that the risk of enteric virus presence is minimal. Source water assessments may determine that log reductions greater than the minimum requirements are necessary to produce water of an acceptable microbiological quality.
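For context, log removal and inactivation credits follow the standard log reduction relation below (a general definition, not specific to any one GCDWQ document):

$$\text{log reduction} = \log_{10}\left(\frac{N_0}{N}\right)$$

where N₀ and N are the concentrations of an organism before and after treatment. A minimum 3 log reduction therefore corresponds to removing or inactivating at least 99.9% of the organisms present, and a 4 log reduction to at least 99.99%.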
Guidance on risks from numerous other pathogens that may be found in drinking water sources (e.g., Legionella, Mycobacteria), and on how these risks can be minimized through the application of a source-to-tap approach, is also available in the guidance document on waterborne pathogens.
2.3 Quantitative microbial risk assessment (QMRA)
QMRA is used to support the health-based treatment goals for enteric protozoa and enteric viruses. QMRA uses mathematical modelling together with source water quality data, treatment barrier information and pathogen-specific characteristics to estimate the risk from pathogenic microorganisms in a drinking water source. Using Canadian data, this process shows that the health-based treatment goals for enteric protozoa and enteric viruses are the minimum reductions necessary to meet the tolerable or acceptable level of risk of 10⁻⁶ disability-adjusted life years (DALYs) per person per year. In many source waters, log reductions greater than the minimum requirements may be necessary. The benefit of using a QMRA approach is that assessments can include site-specific information to investigate how changes in the source water quality or the addition or optimization of treatment barriers can impact the microbiological quality of the drinking water being produced. Although QMRA approaches can range from screening assessments that use simple point estimates to full probabilistic risk assessments that include uncertainty analysis, the approach used should only be as complicated as necessary to make decisions on the risk management options.
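To illustrate the screening end of that range, the sketch below chains the usual point-estimate QMRA steps: dose after treatment, an exponential dose-response model, annualization, and conversion to DALYs. All numbers (source concentration, dose-response parameter, consumption volume, burden per case) are hypothetical placeholders, not values from the GCDWQ documents:

```python
import math

# Assumed inputs -- all values are hypothetical, for illustration only
source_conc = 1.0        # Cryptosporidium oocysts per 100 L of source water
log_reduction = 3.0      # treatment log removal/inactivation credit
consumption_L = 1.0      # unboiled tap water consumed per person per day (L)
r = 0.2                  # exponential dose-response parameter (assumed)
p_ill_given_inf = 0.7    # probability of illness given infection (assumed)
dalys_per_case = 1.5e-3  # disease burden per case of illness (assumed)

# Concentration after treatment, then daily dose
treated_conc = (source_conc / 100.0) * 10 ** (-log_reduction)  # oocysts/L
dose = treated_conc * consumption_L                            # oocysts/day

# Exponential dose-response model: probability of infection per day
p_inf_daily = 1 - math.exp(-r * dose)

# Annualize the daily risk, then convert to a burden of disease
p_inf_annual = 1 - (1 - p_inf_daily) ** 365
burden = p_inf_annual * p_ill_given_inf * dalys_per_case

print(f"Annual infection risk: {p_inf_annual:.1e}")
print(f"Burden: {burden:.1e} DALYs/person/year (reference level: 1e-6)")
```

Varying log_reduction or source_conc in such a sketch shows directly how changes in source water quality or additional treatment barriers shift the estimated risk.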
2.4 Natural organic matter (NOM)
NOM is a mixture of organic compounds found in all surface and subsurface waters. It has significant impacts on drinking water treatment processes and drinking water quality. NOM exerts a demand on coagulants and chemical disinfectants and interferes with ultraviolet (UV) disinfection, all of which can lead to a deterioration of pathogen log removal and inactivation capability. Seasonal and weather-related events can significantly affect the concentration and character of NOM. The NOM guidance document presents tools to monitor raw and treated water. Recognizing and responding to changes in NOM helps to ensure the production of water of acceptable microbiological quality.
2.5 Turbidity
Turbidity measurements have different implications and limits depending on the nature of the particle/organic matter and whether it is being monitored in the source water, during treatment processes or in the distribution system. High turbidity measurements or measurement fluctuations can indicate changes in source water quality, inadequate water treatment, or disturbances in the distribution system that may impact the microbiological quality or safety of the water.
2.6 Biological stability of drinking water distribution systems
The guidance document on monitoring the biological stability of drinking water distribution systems discusses the causes of deterioration in microbiological quality in the distribution system. The guidance document presents methods and parameters that can be used to assess biological stability. A multi-parameter approach to monitoring is suggested. This may include traditional bacteriological indicators, disinfectant residual concentrations, adenosine triphosphate (ATP) measurements, as well as other physical and chemical parameters.
3.0 Source-to-tap approach for microbiological risks
Monitoring water for all pathogens that could be present in a drinking water system is not technically or economically feasible. Instead, a source-to-tap approach is used to remove or inactivate microbiological contaminants and thereby reduce the risk of exposure to pathogenic microorganisms to an acceptable level. This approach can be applied to both municipal and residential-scale systems.
A source-to-tap approach for the management of drinking water requires a system assessment that involves: characterizing the source water; describing the treatment barriers that prevent or reduce contamination; highlighting the conditions that can result in contamination; and identifying control measures to mitigate those risks through the treatment and distribution systems to the consumer. The complexity of the system assessment will depend on the size and type of system. This approach is discussed in detail in "From source to tap: guidance on the multi-barrier approach to safe drinking water" (CCME, 2004). The present overview document does not provide a comprehensive list of activities and measures that should be implemented as part of a source-to-tap approach. Its purpose is to summarize how the Guidelines for Canadian Drinking Water Quality guideline technical documents and guidance documents should be used throughout the source-to-tap process to provide microbiologically safe drinking water.
3.1 Source water quality
Source water protection and management is an important first step to providing safe and reliable drinking water in a source-to-tap approach. To adequately understand a water source, it is important to carry out a source water assessment.
Source water assessments include the identification of current and potential fecal contamination sources in the watershed/aquifer (e.g., land uses), potential pathways and events by which the microbiological pathogens can make their way into the source water (e.g., surface runoff), and conditions likely to lead to peak concentrations (e.g., rainfall events). Depending on the water source, assessments should also include some source water monitoring for enteric protozoa and enteric viruses. Testing for microbiological indicators can also be used to provide information on fecal impacts. Given the impact that NOM can have on the treatment processes designed to produce microbiologically safe drinking water, the assessment should also include an understanding of NOM sources in the watershed/aquifer and the conditions that lead to changes in the concentration and/or character of NOM (e.g., precipitation/snowmelt events, algal blooms, drought, fire - see the guidance document on NOM).
Subsurface sources should undergo a comprehensive assessment to determine if the source is vulnerable to contamination from fecal sources. This assessment should ideally include a hydrogeological assessment and, at a minimum, an evaluation of aquifer vulnerability and well integrity, and a survey of activities and physical features in the area that could result in fecal contamination. Individual households with private wells should also assess well vulnerability to fecal contamination to determine if treatment is necessary. General guidance on well construction, maintenance, protection and testing is typically available from provincial/territorial authorities. All subsurface sources should be periodically reassessed.
3.1.1 Sources and transport of enteric pathogens
In Canadian drinking water sources, the pathogens of greatest concern to human health are those that result from fecal contamination. Sources of human fecal matter, such as sewage treatment plant effluents, sewage lagoon discharges, combined sewer overflows, and improperly maintained septic systems have the potential to be significant sources of enteric protozoa, particularly Giardia and Cryptosporidium species, enteric viruses, and many bacterial pathogens. Fecal matter from animals is also considered an important source of enteric protozoa and bacterial pathogens, but is not a significant source of human pathogenic enteric viruses. Pathogens can be transported overland or through the subsurface to impact surface and groundwater sources. The extent of transport can be impacted by numerous factors such as land use, type of overlying soil and subsurface materials, and climatic factors such as drought, rainfall, and snowmelt.
3.1.2 Source water monitoring
Source waters may be contaminated with any or all categories of enteric pathogens (protozoa, viruses, or bacteria), and monitoring may include E. coli, Giardia, Cryptosporidium, enteric viruses, NOM and turbidity. Monitoring should reflect both normal pathogen loads in the water source and peak events. The conditions that are likely to lead to peak events can be identified using information about sources of fecal contamination from a source water assessment, together with historical data on rainfall, snowmelt, river flow, NOM and turbidity measurements.
Surface water and GUDI sources may be contaminated with protozoa, viruses and bacteria. Protocols for assessing whether subsurface sources are GUDI or non-GUDI may differ between jurisdictions, and guidance should be obtained from the responsible drinking water authority. In surface water and GUDI sources, E. coli, turbidity and NOM measurements are used to signal changing conditions, such as a decline in source water quality, higher loadings of pathogens and increased challenges to filtration and/or disinfection. They are typically measured on an ongoing basis (e.g., continuous, daily or weekly). If used, enterococci could also be monitored on a routine basis or during periods of targeted sampling, depending on the goal of the monitoring. Giardia, Cryptosporidium and enteric virus monitoring data help determine the types and level of treatment that should be in place to produce water that meets an acceptable level of risk from these pathogens. Monitoring for these pathogens is not practical on a routine basis but can be carried out less frequently (e.g., monthly, or during peak events).
Where source water sampling and analysis for Giardia, Cryptosporidium, and enteric viruses are not feasible (e.g., small supplies), pathogen concentrations can be estimated by taking into account information obtained from the source water assessment along with other water quality parameters that can provide information on the risk and/or level of fecal contamination in the water. To ensure that the drinking water is microbiologically safe, the application of disinfection or filtration processes should account for the variability and uncertainty in the source water quality data.
Subsurface sources that are found to be non-GUDI and that have a protection and management plan in place should not have protozoa or enteric bacteria present. However, these sources may still be vulnerable to contamination with enteric viruses. The occurrence of enteric viruses is not generally continuous and can vary greatly over time. Groundwater should be routinely monitored for fecal indicators, such as E. coli, along with total coliforms and turbidity. These data are used to help identify any changes occurring in the water that may indicate it is being influenced by surface water. Enterococci can also be used for this purpose in systems that choose to include additional non-regulatory monitoring. Testing for indicator bacteria and turbidity alone is generally not sufficient to fully assess the potential for fecal contamination. The indicator data should be considered in conjunction with a site-specific vulnerability assessment, to determine the risk of fecal contamination. Where feasible, monitoring for enteric viruses is also recommended for these sources. A subsurface source that is not GUDI does not need to be monitored for Giardia or Cryptosporidium.
For private well owners, the bacteriological indicators (e.g., total coliforms and E. coli) should be monitored at least twice per year (Health Canada, 2019). Enterococci monitoring may also be recommended by the province/territory. More frequent testing should be conducted if the well is shallow or in bedrock and there is a septic system on the property or nearby.
3.1.3 Source water protection and management
Where source waters are well characterized, it may be possible to implement barriers or risk management measures to help reduce the levels of pathogens in the water source. For example, in surface water sources this may include protection measures such as ensuring wastewater discharges are treated, or reducing or eliminating combined sewer overflows. Further information on source water protection measures can be found in "From source to tap: Guidance on the multi-barrier approach to safe drinking water" (CCME, 2004). Water utilities may also implement management strategies to address changes in their source water quality. For example, managers of surface water and GUDI sources may be able to limit capture of raw water during high-risk events, selectively operate an additional barrier during high-risk events, use an alternative source of water, or blend varying sources (groundwater and surface water), to lower the level of pathogens in the water they are treating. For groundwater sources, protection measures may include ensuring the well is appropriately sited and constructed and that activities such as ongoing well maintenance are being carried out to protect groundwater quality and maintain system integrity.
As a follow-up to any measures implemented, the water utilities should conduct an assessment to determine the impact that the measures have had on the quality of the water entering the treatment process.
3.2 Drinking water treatment
The primary goal of drinking water treatment is to reduce the presence of disease-causing organisms and associated health risks to an acceptable or tolerable level. For drinking water treatment processes to be effective, they need to be appropriately designed, operated, and optimized for the quality of the source water. Process control measures also need to be in place to ensure that the treatment processes are working, and verification monitoring should be conducted to confirm that the system is operating as expected.
3.2.1 Selection and optimization of treatment barriers
In general, all water supplies derived from surface water or GUDI sources should include adequate filtration (or equivalent technologies) and disinfection to meet the minimum health-based treatment goals for enteric viruses and protozoa. Source water assessments may determine that log reductions greater than the minimum requirements are necessary to produce water of an acceptable microbiological quality. For sources that are not GUDI but are determined to be vulnerable to viruses, adequate treatment should be in place to achieve the minimum health-based treatment goal for enteric viruses.
Some water systems will have multiple redundant barriers, so that even if a given barrier fails, the water will continue to be adequately treated. In the case of systems with non-redundant or single barriers, all barriers must be working well to provide the required level of treatment. For these systems, failure of a single treatment barrier could lead to an increased risk of a waterborne disease outbreak. It is important to note that many treatment processes are interdependent and rely on optimal conditions upstream in the treatment process for efficient operation of subsequent treatment steps. For example, coagulation and flocculation should be optimized to effectively account for changes in NOM in order to adequately remove particles by filtration. Optimization may also include adjustments for seasonal variations.
There are numerous types and combinations of treatment processes that can be used to reduce the concentration of pathogens in the treated water to an acceptable level. The treatment barrier(s) selected may be relatively simple, consisting of only disinfection, or they can be complex, using combinations of pre-treatments, filtration, and disinfection. The choice of treatment processes will depend on many factors, including:
- the type and concentration of pathogens in the source water (including short-term water quality degradation)
- the type of source water
- the physical and chemical qualities of the source water, in particular the concentration and character of NOM
- the variability of the raw water quality, which may require seasonal adjustments of the treatment process to ensure optimal treatment performance at all times
- other practical or operational considerations
Where possible, water utilities should evaluate the performance of existing and proposed treatment barriers/processes. Exact pathogen removal efficiencies will depend on the particulars of the water to be treated (e.g., the source water quality) and the treatment processes, including optimization. Specific log reduction rates can be established on the basis of demonstrated performance or pilot studies. Demonstration and challenge testing, using pilot plant trials or challenge trials with microbiological surrogates (such as Bacillus spores, bacteriophages and microspheres), provides site-specific information on treatment plant performance and aids in optimization of the system. This can also help assess the performance of treatment barrier(s) under a variety of operating conditions. In addition to challenge testing, water systems can use approaches such as QMRA to assess how variations in treatment performance contribute to the overall risks in their water system.
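As a simple illustration, a log reduction value (LRV) from paired influent/effluent surrogate counts in a challenge test is the difference of their log-transformed concentrations; the counts below are hypothetical:

```python
import math

def log_reduction_value(influent: float, effluent: float) -> float:
    """LRV from paired surrogate concentrations (e.g., spores/100 mL).
    Both values must be positive; non-detects in the effluent require a
    detection-limit convention before this calculation applies."""
    return math.log10(influent) - math.log10(effluent)

# Hypothetical challenge-test pair: 1e6 spores in, 320 spores out
print(f"LRV = {log_reduction_value(1e6, 3.2e2):.1f}")  # -> LRV = 3.5
```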
Table 1 provides a summary of filtration technologies that are used for pathogen removal and the average pathogen log removal credits when these technologies are meeting the treatment limits for turbidity. As shown in the table, filtration can be a very effective method for removing protozoan pathogens. Many types of filtration, however, are less effective at removing enteric viruses due to their small size. Although not included in the table, waterborne bacterial pathogens can also be effectively removed using filtration technologies. Further information is available in the guidance document on waterborne pathogens.
Table 1: Treatment limits for turbidity and average log removal credits for various treatment barriers as specified in the Guidelines for Canadian Drinking Water Quality
| Treatment barrier | Treatment limits for turbidity | Cryptosporidium removal credit (average) | Giardia removal credit (average) | Virus removal credit (average) |
|---|---|---|---|---|
| Conventional filtration | ≤ 0.3 NTUᵃ; never to exceed 1.0 NTU | 3.0 log | 3.0 log | 2.0 log |
| Direct filtration | ≤ 0.3 NTUᵃ; never to exceed 1.0 NTU | 2.5 log | 2.5 log | 1.0 log |
| Slow sand filtration | ≤ 1.0 NTUᵃ; never to exceed 3.0 NTU | 3.0 log | 3.0 log | 2.0 log |
| Diatomaceous earth filtration | ≤ 1.0 NTUᵃ; never to exceed 3.0 NTU | 3.0 log | 3.0 log | 1.0 log |
| Microfiltration | ≤ 0.1 NTUᵇ | Removal efficiency demonstrated through challenge testing and verified by direct integrity testing | Removal efficiency demonstrated through challenge testing and verified by direct integrity testing | May be eligible for virus removal credit when preceded by a coagulation step; removal efficiency demonstrated through challenge testing and verified by direct integrity testingᶜ |
| Ultrafiltration | ≤ 0.1 NTUᵇ | Removal efficiency demonstrated through challenge testing and verified by direct integrity testing | Removal efficiency demonstrated through challenge testing and verified by direct integrity testing | Removal efficiency demonstrated through challenge testing and verified by direct integrity testingᶜ |
| Nanofiltration and reverse osmosis | ≤ 0.1 NTUᵇ | Removal efficiency demonstrated through challenge testing and verified by direct integrity testingᵈ | Removal efficiency demonstrated through challenge testing and verified by direct integrity testingᵈ | Removal efficiency demonstrated through challenge testing and verified by direct integrity testingᵈ |
| Riverbank filtration/in-situ filtration | N/A | Site-specific determinationᵉ | Site-specific determinationᵉ | Site-specific determinationᵉ |
Footnotes:
- ᵃ Must meet this value in 95% of measurements per filter cycle or per month.
- ᵇ Must meet this value in 99% of measurements per operational filter period or per month. If a measurement exceeds 0.1 nephelometric turbidity units (NTU) for more than 15 minutes, membrane integrity should be investigated.
- ᶜ Current direct integrity testing technologies for virus removal may not be able to verify greater than 2 log removal. Acceptable verification methods should be approved by the responsible drinking water authority.
- ᵈ Nanofiltration/reverse osmosis membranes do not currently come equipped with direct integrity testing capability; acceptable verification methods should be approved by the responsible drinking water authority.
- ᵉ As required by the responsible drinking water authority.
Disinfection is used for two different objectives. The goal of primary disinfection is to inactivate microorganisms before the water enters the distribution system. The effectiveness of primary disinfection depends on the physical characteristics of the water, such as temperature, pH, NOM and turbidity, and on the pathogen and the type of disinfectant. For example, chlorine is highly effective at inactivating enteric viruses and enteric bacterial pathogens, and may be used as a primary disinfectant to meet the treatment goals for these groups of pathogens. On the other hand, it is not effective for Cryptosporidium inactivation in a drinking water system, and other disinfectants, such as ozone and UV light, are better choices than chlorine for meeting the health-based treatment goals for this organism.
The efficacy of chemical disinfectants is commonly described using the CT concept. CT is the product of "C" (the residual concentration of disinfectant, measured in mg/L) and "T" (the disinfectant contact time, measured in minutes) for a specific microorganism under defined conditions (e.g., temperature and pH). The contact time T is often calculated using a T₁₀ value, which is defined as the detention time at which 90% of the water meets or exceeds the required contact time. To account for disinfectant decay, the residual concentration is usually determined at the exit of the contact chamber rather than using the applied dose or initial concentration. UV disinfection is described using the IT concept. IT is the product of light intensity "I" (measured in mW/cm² or W/m²) and time "T" (measured in seconds), which results in a computed dose (fluence) in mJ/cm² for a specific microorganism.
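A minimal arithmetic sketch of both concepts follows. The residual, T₁₀, intensity and required values are hypothetical; in practice, required CT and fluence values are taken from the responsible authority's tables for a given organism, temperature and pH:

```python
# CT: residual disinfectant concentration at the contact chamber exit x T10
residual_mg_L = 0.8   # free chlorine residual at contact chamber exit (mg/L)
t10_min = 25.0        # T10: detention time met or exceeded by 90% of the water
ct = residual_mg_L * t10_min   # mg*min/L
required_ct = 15.0             # hypothetical required CT (mg*min/L)
print(f"CT = {ct:.0f} mg*min/L; requirement met: {ct >= required_ct}")

# IT: average UV intensity x exposure time gives the fluence (dose)
intensity_mW_cm2 = 5.0   # average UV intensity (mW/cm^2)
exposure_s = 8.0         # exposure time (s)
fluence = intensity_mW_cm2 * exposure_s   # mJ/cm^2 (mW x s = mJ)
required_fluence = 40.0                   # hypothetical required fluence (mJ/cm^2)
print(f"Fluence = {fluence:.0f} mJ/cm^2; requirement met: {fluence >= required_fluence}")
```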
Disinfectants commonly used for primary disinfection include chlorine, ozone, chlorine dioxide and UV. Chloramine is generally not used for primary disinfection as it has a lower disinfecting power and requires very high CT values. The guideline technical documents for enteric viruses and enteric protozoa, as well as the guidance documents on NOM and waterborne pathogens, should be consulted for further information.
Secondary disinfection is practiced to provide a disinfectant residual throughout the distribution system. This has two main benefits: (1) maintaining a disinfectant residual helps limit the growth of biofilms within the distribution system; and (2) a drop in disinfectant residual concentration can serve as a sentinel for water quality changes, such as increased microbiological activity or physical integrity issues. Although no log inactivation credits are awarded for secondary disinfection processes, disinfectant residual monitoring can provide an early warning that contaminants have entered the distribution system. Variability in disinfectant residual concentrations (measured as the coefficient of variation) can be a useful indicator of biological stability. The only disinfectants that provide a disinfectant residual are chlorine-based, namely chlorine and chloramine.
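As a simple illustration of that indicator, the coefficient of variation of a monitoring site's residual measurements is its standard deviation divided by its mean; the readings below are hypothetical:

```python
from statistics import mean, stdev

# Hypothetical free chlorine residuals (mg/L) from one distribution system site
residuals = [0.62, 0.58, 0.65, 0.55, 0.60, 0.31, 0.59]

cv = stdev(residuals) / mean(residuals)
print(f"Mean residual: {mean(residuals):.2f} mg/L, CV: {cv:.2f}")
# A rising CV at a site can flag deteriorating biological stability
# and warrant investigation.
```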
For those water utilities using chloramination, it is important to optimize the process so that monochloramine formation is favoured and the formation of di- and trichloramine is minimized. When measuring the residual concentration, it is important to determine the percentage of monochloramine relative to total chloramine to check whether organic chloramines are present; organic chloramines are undesirable because they provide little to no disinfection. By evaluating trends, water utilities can quickly observe any increase or decrease in concentrations and take appropriate action, if required (e.g., enhance monitoring, reform chloramines, boost residual). More information can be found in the guideline technical document on chloramines.
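A minimal sketch of that check, using hypothetical residual measurements:

```python
# Hypothetical chloraminated-system residuals, both expressed as mg/L Cl2
monochloramine = 1.8    # measured monochloramine residual
total_chloramine = 2.1  # measured total chloramine residual

pct_mono = 100 * monochloramine / total_chloramine
print(f"Monochloramine: {pct_mono:.0f}% of total chloramine")
# A low percentage suggests di-/trichloramine or organic chloramines are
# present; organic chloramines provide little to no disinfection.
```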
3.2.2 Assessing water treatment processes
As part of a robust treatment system that reliably produces microbiologically safe drinking water, process control measures are needed. Monitoring of NOM, turbidity and disinfectant residuals is used as part of these control measures. Disinfection effectiveness will be impacted by these factors, as well as by pH and temperature.
Parameters related to NOM should be monitored to identify water quality changes and ensure that optimal coagulation conditions exist for pathogen removal. Other parameters are also helpful (e.g., total organic carbon for biostability control or chemical oxygen demand for oxidizability). More information is available in the guidance document on NOM.
Turbidity monitoring of the treated water is used for assessing the removal credits for the various filtration technologies. To achieve the average pathogen log removals for these technologies, turbidity measurements must, at a minimum, meet the treatment limits recommended in the guideline technical document for turbidity. Systems that already meet the applicable treatment limits should strive for a treated water turbidity target of less than 0.1 NTU to ensure production of the highest water quality possible. Systems that are using filtration for reasons other than pathogen removal do not need to meet the treatment limits. However, it is good practice to ensure that water entering the distribution system has turbidity levels below 1.0 NTU. Systems may be allowed to exceed 1.0 NTU based on their site-specific risk characterization. More information can be found in the guideline technical document on turbidity and in the guidance document on NOM.
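As an illustration, checking a filter's performance against the "≤ 0.3 NTU in 95% of measurements, never to exceed 1.0 NTU" limit for conventional filtration reduces to simple counting; the readings below are hypothetical:

```python
# Hypothetical filtered-water turbidity readings (NTU) over one filter cycle
readings = [0.12, 0.15, 0.11, 0.22, 0.35, 0.14, 0.13, 0.21, 0.16, 0.12,
            0.15, 0.13, 0.18, 0.14, 0.12, 0.11, 0.13, 0.15, 0.14, 0.12]

limit, never_exceed = 0.3, 1.0
frac_within = sum(r <= limit for r in readings) / len(readings)
compliant = frac_within >= 0.95 and max(readings) <= never_exceed
print(f"{frac_within:.0%} of readings <= {limit} NTU; compliant: {compliant}")
```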
Control, optimization and monitoring of disinfection processes are essential to ensure that the pathogens of concern are being inactivated to the level necessary, while minimizing the formation of disinfection by-products as much as possible. The health risks from consuming water that has not been adequately treated for pathogens are much higher than any health risk associated with disinfection by-products. Thus, disinfection must never be compromised by efforts to minimize disinfection by-products. It is also essential that the reduction in pathogen levels is achieved before drinking water reaches the first consumer in the distribution system.
The primary disinfection process should be continuously monitored to verify that the required CT/IT is met at all times. Disinfectant residuals (e.g., free chlorine or monochloramine) should also be monitored throughout the distribution system, either continuously or through frequent grab samples.
3.2.3 Verification monitoring
An important component of a source-to-tap approach is verification monitoring. Verification monitoring involves routinely monitoring the treated drinking water for an indicator organism that provides assurance that the treatment system is operating as expected.
E. coli and total coliforms are used as indicator organisms. They are routinely monitored as confirmation that the treatment process control measures are working and that the water has been adequately treated and is, therefore, of an acceptable microbiological quality. Enterococci can also be used for these purposes when jurisdictions wish to include supplemental indicators. E. coli, total coliforms and enterococci are more susceptible than some pathogens to many of the disinfectants commonly used in drinking water. Therefore, these indicators need to be used in conjunction with information on treatment performance to reliably produce drinking water of an acceptable quality. When used as part of a source-to-tap approach, the presence of these indicators in the water leaving the treatment plant indicates a serious breach in the treatment process. Immediate action should be taken (e.g., notifications, corrective actions) and the cause of the breach investigated. The guideline technical documents on E. coli and total coliforms, and the guidance document on enterococci, provide further information on the actions that should be taken.
3.2.4 Trained personnel
The successful operation of any drinking water supply system (from private wells to large complex treatment plants) depends on the skills, abilities, and knowledge of the responsible owners and operators. Individuals need to understand the impact their activities and decisions can have on the quality and safety of the water being produced. It is therefore important that these individuals have the appropriate type and level of training for their systems. Training can include studies at post-secondary institutions, water association training courses, in-house training and mentoring programs, on-the-job experience in consultation with other trained operators or government specialists, workshops, seminars, courses, and conferences (CCME, 2004). Training should be an ongoing process to ensure owners and operators maintain and update their skills, and are kept informed of new regulatory requirements. It is recommended that jurisdictions have in place programs that ensure responsible individuals are properly trained and certified.
3.3 Distribution system integrity
Properly managed distribution systems are important for maintaining the water quality as it is delivered to consumers. Ideally, there should be minimal change in water quality in the distribution system, a concept referred to as maintaining biological stability. To maintain biological stability, a distribution system management plan should be in place. This may include: identification of a system's vulnerabilities and the corresponding risk management plan for preventing and handling contamination events; required maintenance activities; frequencies and locations for monitoring water quality parameters; and documentation requirements for adverse events and corrective actions. For water utilities that are using chloramination, as well as utilities with ammonia in the source water, a nitrification monitoring program is also recommended. Further information can be found in the guideline technical documents on chloramines, nitrate and nitrite, and ammonia.
3.3.1 Distribution system monitoring
A distribution system monitoring program should be designed and implemented to establish baseline conditions, monitor changes and detect ongoing or potential contamination events.
The parameters included for routine monitoring should be based on a system-specific assessment and meet the requirements of the responsible drinking water authority. Monitoring should be carried out throughout the entire distribution system, particularly in areas with long retention times (e.g., dead-ends and areas with poor hydraulics) or areas that have demonstrated deteriorating water quality. The frequency of monitoring should be based on system-specific characteristics such as the type and size of the system and the vulnerability of the system to changes in water quality. Changes in the trend of distribution system parameters should trigger more frequent monitoring. The guidance document on monitoring the biological stability of drinking water in distribution systems discusses a full range of options, including rapid methods (e.g., ATP, turbidity), laboratory methods (e.g., bacterial indicators, heterotrophic plate counts) and advanced methods (e.g., flow cytometry, molecular methods).
3.3.2 Distribution system operation and management
A well-maintained and operated distribution system is a critical component of providing safe drinking water. Examples of best management practices for minimizing risks in the distribution system are discussed in the various guideline and guidance documents and include:
- treatment optimization to minimize nutrients entering into the system
- managing water age
- managing impacts of water temperature
- maintaining an effective disinfectant residual
- maintaining pH within ±0.2 units
- keeping the distribution system clean
- maintaining positive pressure
- minimizing physical and hydraulic disturbances
Monitoring the water quality in the distribution system is used to determine when water quality changes occur. If water quality changes lead to the release of contaminants at concentrations that may cause public health problems, corrective actions, which may include the issuance of a boil water advisory, should be implemented. A guidance document on issuing and rescinding boil water advisories is available as part of the GCDWQ.
3.4 Communication and public education
An important component of the source-to-tap approach is helping consumers understand the quality of their drinking water and the role they play in maintaining this quality. This could include informing the public of activities such as service disruptions, water quality testing results, and boil water advisories. It can also include engaging the public in aspects of the drinking water program such as source water protection and the development of infrastructure projects. Further information can be found in CCME (2004).
Although it is generally outside the scope of drinking water utilities, premise plumbing is included in the overall source-to-tap approach. It is important that the quality of the water delivered to consumers not result in premise plumbing issues, such as leaching of lead, and that activities within a consumer location not adversely affect the distribution system, such as contamination through cross-connections. Consumers should also be made aware of how to avoid deterioration of water quality within their premise plumbing, including maintaining hot water temperatures sufficient to minimize the growth of opportunistic pathogens such as Legionella. The key to managing many premise plumbing concerns is consumer education. This becomes particularly important as premise plumbing becomes more complex with dual plumbing and water reuse systems.
4.0 References
CCME (2004). From source to tap: Guidance on the multi-barrier approach to safe drinking water. Produced jointly by the Federal-Provincial-Territorial Committee on Drinking Water and the Canadian Council of Ministers of the Environment Water Quality Task Group. Available at www.ccme.ca/assets/pdf/mba_guidance_doc_e.pdf.
Health Canada (2019). Be Well Aware - Information for private well owners. Government of Canada. Available at https://www.canada.ca/en/health-canada/services/publications/healthy-living/water-talk-information-private-well-owners.html.
WHO (2004). Managing microbial water quality in piped distribution systems. Ainsworth, R. (ed.). Published by IWA Publishing on behalf of the World Health Organization, Geneva. Available at http://www.who.int/entity/water_sanitation_health/dwq/en/safepipedwater.pdf.
5.0 Further reading
Guideline Technical Documents (GTD) and Guidance Documents (GD):
- Ammonia (GTD - 2013)
- Bromate (GTD - 2018)
- Chloral hydrate (GD - 2008)
- Chloramines (GTD - 2020)
- Chlorine (GTD - 2009)
- Chlorite and chlorate (GTD - 2008)
- Enteric protozoa: Giardia and Cryptosporidium (GTD - 2019)
- Enteric viruses (GTD - 2019)
- Escherichia coli (GTD - 2020)
- Haloacetic acids (GTD - 2008)
- Issuing and rescinding boil water advisories in Canadian drinking water supplies (GD - 2015)
- Monitoring the biological stability of drinking water in distribution systems (for public consultation) (GD - 2020)
- Natural organic matter in drinking water (2020)
- N-Nitrosodimethylamine (GTD - 2011)
- Nitrate and nitrite (GTD - 2013)
- The use of enterococci bacteria as indicators in Canadian drinking water supplies (GD - 2020)
- The use of quantitative microbial risk assessment in drinking water (GD - 2019)
- Total coliforms (GTD - 2020)
- Trihalomethanes (GTD - 2006)
- Turbidity (GTD - 2012)
- Waterborne pathogens (for public consultation) (GD - 2020)
All of the "Further reading" documents are available at Water Quality - Reports and Publications.
Appendix A: List of abbreviations
ATP adenosine triphosphate
CT concentration × time
DALYs disability-adjusted life years
E. coli Escherichia coli
GCDWQ Guidelines for Canadian Drinking Water Quality
GUDI groundwater under the direct influence of surface water
IT ultraviolet light intensity "I" × time "T"
NOM natural organic matter
NTU nephelometric turbidity unit
QMRA quantitative microbial risk assessment
UV ultraviolet
WHO World Health Organization