Your search
Results: 16 resources
-
A framework is proposed that uses multi-stream phycocyanin probe readings to manage cyanobacterial risks in source water and across drinking water treatment processes. In situ phycocyanin (PC) probes have been deployed as a cost-effective and efficient way to monitor cyanobacterial (CB) abundance in drinking water sources and to identify periods of potential risk at drinking water treatment plants (DWTPs). Monitoring CB removal efficacies in near real time by sequentially using a single probe across multiple streams of the treatment plant provides a more useful assessment of the risk of CB breakthrough in treated water. Removal efficacies were measured in three DWTPs using integrated mass fluxes estimated from PC readings and grab-sample total CB biovolume estimations in raw, clarified, filtered, and treated water. Selective removal of CB species during the treatment processes was also evaluated. In addition, relationships between physico-chemical parameters (turbidity, pH, dissolved oxygen, conductivity, chlorophyll-a, and temperature) and PC across the treatment processes were investigated. Finally, a framework using in situ multi-stream PC monitoring, gathering data across the treatment chain, is proposed to manage the risk of CB cell breakthrough in treated water. Two-hour moving-average PC metrics are proposed to provide short-term alerts in raw water and to establish periods of treatment vulnerability or dysfunction, while daily mean PC values can be used to estimate total and process-specific log removals to justify treatment adjustments. Benefits and limitations of the tested in situ probes for the application of alert levels are discussed, and key knowledge gaps for future research and guidance are identified.
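As an illustration of the metrics described in this abstract, the sketch below computes a 2-hour moving average of raw-water PC readings and a daily, process-specific log removal from daily mean PC values. It is a minimal sketch, assuming hypothetical file and column names (`pc_readings.csv`, `timestamp`, `pc_raw`, `pc_filtered`) and an illustrative alert threshold; it is not the authors' implementation.

```python
import numpy as np
import pandas as pd

# Hypothetical PC time series from probes on the raw and filtered water streams;
# the file and column names are assumptions, not the study's data.
df = pd.read_csv("pc_readings.csv", parse_dates=["timestamp"]).set_index("timestamp")

# 2-h moving average of raw-water PC: a short-term alert / vulnerability indicator.
df["pc_raw_2h"] = df["pc_raw"].rolling("2h").mean()

# Daily mean PC per stream, then a process-specific log removal (raw -> filtered),
# using log removal = log10(PC_in / PC_out).
daily = df[["pc_raw", "pc_filtered"]].resample("1D").mean()
daily["log_removal_filtration"] = np.log10(daily["pc_raw"] / daily["pc_filtered"])

# Flag times when the 2-h moving average exceeds a site-specific alert threshold
# (the threshold value is purely illustrative).
ALERT_THRESHOLD = 1.0  # relative fluorescence units, illustrative
alert_times = df.index[df["pc_raw_2h"] > ALERT_THRESHOLD]
```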
-
Worldwide, there has been an increase in potentially toxic cyanobacterial blooms in drinking water sources and within drinking water treatment plants (DWTPs). The objective of this study was to validate the use of in situ probes for the detection and management of cyanobacterial breakthrough in high- and low-risk DWTPs. In situ YSI EXO2 phycocyanin probes were configured for remote control and data logging to monitor cyanobacteria in raw, clarified, filtered, and treated water in three full-scale DWTPs. An additional probe was installed in the sludge holding tank of the high-risk DWTP to monitor water quality at the surface of the stored sludge. Simultaneous grab sampling was carried out for taxonomic cell counts and toxin analysis. A total of 23, 9, and 4 field visits were conducted at the three DWTPs. Phycocyanin readings showed a 93-fold fluctuation within 24 h in the raw water of the high-risk plant, with higher phycocyanin levels during the afternoon. These data provide new information on the limitations of weekly or daily grab sampling. Moving averages computed over different windows of the phycocyanin probe readings can also be used to improve the interpretation of phycocyanin signal trends. The in situ probe successfully detected high cyanobacterial biovolumes entering the clarification process in the high-risk plant. Grab sampling results revealed high cyanobacterial biovolumes in the sludge of both the high- and low-risk plants.
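To complement the alert-metric sketch above, the following sketch illustrates how moving averages over different windows and the within-day fluctuation (such as the 93-fold range over 24 h reported here) could be derived from a raw-water PC series. The file name, column name, and window lengths are assumptions for illustration, not the study's code.

```python
import pandas as pd

# Hypothetical raw-water phycocyanin series indexed by timestamp.
pc = pd.read_csv("raw_water_pc.csv", parse_dates=["timestamp"],
                 index_col="timestamp")["pc_raw"]

# Moving averages over different windows to smooth probe noise and
# reveal diel trends (window lengths are illustrative).
smoothed = pd.DataFrame({
    "pc_2h": pc.rolling("2h").mean(),
    "pc_6h": pc.rolling("6h").mean(),
    "pc_24h": pc.rolling("24h").mean(),
})

# Within-day fluctuation: ratio of the daily maximum to the daily minimum,
# analogous to the 93-fold range observed in the high-risk plant's raw water.
daily = pc.resample("1D").agg(["min", "max"])
daily["fold_fluctuation"] = daily["max"] / daily["min"]
```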
-
Freshwater bodies and, consequently, the source waters of drinking water treatment plants (DWTPs) are increasingly affected by toxic cyanobacterial blooms. Even though conventional treatment processes, including coagulation, flocculation, sedimentation, and filtration, can control cyanobacteria and cell-bound cyanotoxins, these processes may encounter challenges such as inefficient removal of dissolved metabolites and cyanobacterial cell breakthrough. Furthermore, conventional treatment processes may lead to the accumulation of cyanobacterial cells and cyanotoxins in sludge. Pre-oxidation can enhance coagulation efficiency, provides the first barrier against cyanobacteria and cyanotoxins, and decreases cell accumulation in DWTP sludge. This critical review aims to: (i) evaluate the state of the science of cyanobacteria and cyanotoxin management throughout DWTPs and their associated sludge, and (ii) develop a decision framework to manage cyanobacteria and cyanotoxins in DWTPs and sludge. The review identified that pre-oxidation studies based on laboratory cultures may not represent pre-oxidation efficacy during real blooms. Moreover, the application of a common exposure unit, CT (residual oxidant concentration × contact time), provides a proper basis for understanding cyanobacteria pre-oxidation efficiency. Recently reported challenges concerning cyanobacterial survival and growth in sludge, alongside cell lysis and cyanotoxin release, have raised health and technical concerns regarding sludge storage and the recycling of sludge supernatant to the head of DWTPs. According to the review, oxidation has not been identified as a feasible option for handling cyanobacteria-laden sludge, owing to low cell and cyanotoxin removal efficacy. Based on the reviewed literature, a decision framework is proposed to manage cyanobacteria, cyanotoxins, and the associated sludge in DWTPs.
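Since the review relies on CT as a common exposure unit, a short worked sketch may help: CT is the product of the residual oxidant concentration and the contact time, and when the residual decays it becomes the integral of the residual over the contact time. The decay constant, doses, and times below are illustrative values, not figures from the review.

```python
import numpy as np

# CT (mg·min/L) for a constant residual: CT = C_residual * contact_time.
c_residual = 1.0   # mg/L, illustrative pre-oxidant residual
t_contact = 20.0   # min, illustrative contact time
ct_constant = c_residual * t_contact  # 20 mg·min/L

# With first-order oxidant decay C(t) = C0 * exp(-k*t), CT is the integral of the
# residual over the contact time, which evaluates to (C0/k) * (1 - exp(-k*T)).
c0, k = 1.5, 0.05  # mg/L and 1/min, illustrative
ct_decaying = (c0 / k) * (1.0 - np.exp(-k * t_contact))

print(f"CT (constant residual): {ct_constant:.1f} mg·min/L")
print(f"CT (first-order decay): {ct_decaying:.1f} mg·min/L")
```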
-
Waterborne pathogens are heterogeneously distributed across various spatiotemporal scales in water resources, and representative sampling is therefore crucial for accurate risk assessment. Since regulatory monitoring of microbiological water quality is usually conducted at fixed time intervals, it can miss short-term fecal contamination episodes and underestimate the underlying microbial risks. In the present paper, we developed a new automated sampling methodology based on near real-time measurement of a biochemical indicator of fecal pollution. Online monitoring of β-D-glucuronidase (GLUC) activity was used to trigger an automated sampler during fecal contamination events in a drinking water supply and at an urban beach. Significant increases in protozoan parasites, microbial source tracking markers, and E. coli were measured during short-term (<24 h) fecal pollution episodes, emphasizing the intermittent nature of their occurrence in water. Synchronous triggering of the automated sampler with online GLUC activity measurements further revealed a tight association between the biochemical indicator and culturable E. coli. The proposed event sampling methodology is versatile; in addition to the two triggering modes validated here, others can be designed based on specific needs and local settings. In support of regulatory monitoring schemes, it should ultimately help gather crucial data on waterborne pathogens more efficiently during episodic fecal pollution events.
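A threshold-based trigger of the kind described above can be sketched as follows. The GLUC alert level, polling interval, and sampler interface are hypothetical placeholders, since the paper does not specify its control code; the sketch only shows the trigger-and-rearm logic.

```python
import time

# Hypothetical alert level and polling interval for online GLUC activity
# (units and value are illustrative, not the study's thresholds).
GLUC_ALERT = 5.0        # illustrative GLUC activity threshold
POLL_SECONDS = 900      # check the online analyser every 15 min

def read_gluc_activity() -> float:
    """Placeholder for a driver call returning the latest online GLUC reading."""
    raise NotImplementedError

def trigger_autosampler() -> None:
    """Placeholder for the command that starts sequential grab sampling."""
    raise NotImplementedError

def event_sampling_loop() -> None:
    """Trigger the automated sampler once when GLUC activity crosses the alert level."""
    triggered = False
    while True:
        gluc = read_gluc_activity()
        if gluc >= GLUC_ALERT and not triggered:
            trigger_autosampler()   # start event-based sequential sampling
            triggered = True
        elif gluc < GLUC_ALERT:
            triggered = False       # re-arm once the signal drops below the threshold
        time.sleep(POLL_SECONDS)
```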
-
Temporal variations in concentrations of pathogenic microorganisms in surface waters are well known to be influenced by hydrometeorological events. Methods that account for microbial peaks when quantifying drinking water treatment requirements are therefore needed. Here, we applied a novel method for data collection and model validation to explicitly account for weather events (rainfall, snowmelt) when estimating pathogen concentrations in source water. Online in situ β-D-glucuronidase activity measurements were used to trigger sequential grab sampling of source water to quantify Cryptosporidium and Giardia concentrations during rainfall and snowmelt events at an urban and an agricultural drinking water treatment plant in Quebec, Canada. We then evaluated whether mixed Poisson distributions fitted to monthly sampling data (n = 30 samples) could accurately predict daily mean concentrations during these events. We found that the gamma distribution underestimated the high Cryptosporidium and Giardia concentrations measured with routine or event-based monitoring, whereas the log-normal distribution accurately predicted these high concentrations. Selecting a log-normal distribution in preference to a gamma distribution increased the annual mean concentration by less than 0.1-log but increased the upper bound of the 95% credibility interval on the annual mean by about 0.5-log. Therefore, considering parametric uncertainty in an exposure assessment is essential to account for microbial peaks in risk assessment.
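The distributional comparison described above can be illustrated with a simplified sketch that fits gamma and log-normal distributions directly to measured concentrations and compares their means and upper tails. It ignores the Poisson counting layer and recovery correction of the full mixed-Poisson analysis, and the data file and values are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical monthly Cryptosporidium concentrations (oocysts/L), one value
# per line; n = 30 samples as in the study, but the data file is assumed.
conc = np.loadtxt("monthly_crypto_conc.txt")

# Fit candidate parametric distributions (location fixed at zero).
gamma_params = stats.gamma.fit(conc, floc=0)
lognorm_params = stats.lognorm.fit(conc, floc=0)

# Compare means and upper tails: the gamma fit tends to give a lighter upper
# tail than the log-normal fit, which matters for event concentrations.
for name, dist, params in [("gamma", stats.gamma, gamma_params),
                           ("log-normal", stats.lognorm, lognorm_params)]:
    mean = dist.mean(*params)
    p99 = dist.ppf(0.99, *params)
    print(f"{name:10s} mean = {mean:8.3f}  99th percentile = {p99:8.3f}")
```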
-
Agricultural activities can result in the contamination of surface runoff with pathogens, pesticides, and nutrients. These pollutants can enter surface water bodies in two ways: by direct discharge into surface waters, or by infiltration and recharge into groundwater followed by release to surface waters. A lack of financial resources makes risk assessment through the analysis of drinking water pollutants challenging for drinking water suppliers, and failure to identify high-risk agricultural lands and implement action measures might lead to public health issues. As a result, it is essential to identify hazards and conduct risk assessments even with limited data. This study proposes a risk assessment model for agricultural activities that is based on available data and integrates various types of knowledge, including expert and literature knowledge, to estimate the levels of hazard and risk that different agricultural activities could pose to the quality of withdrawn water. To accomplish this, we built a Bayesian network with continuous and discrete inputs capturing raw water quality and land use upstream of drinking water intakes (DWIs). This probabilistic model integrates DWI vulnerability, threat exposure, and the threats from agricultural activities, including the animal and crop production inventoried in drainage basins. The probabilistic dependencies between model nodes are established through a novel adaptation of a mixed aggregation method. Mixed aggregation, a traditional approach used in ecological assessments within a deterministic framework, relies on fixed assumptions and parameters to estimate ecological outcomes for a specific case without considering the inherent randomness and uncertainty of the system. After validation, the probabilistic model was applied to four water intakes in a heavily urbanized watershed with agricultural activities in southern Quebec, Canada. The findings imply that this methodology can assist stakeholders in directing their efforts and investments toward at-risk locations by identifying agricultural areas that could pose a risk to DWIs.
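To give a concrete sense of the deterministic mixed aggregation idea that the Bayesian network adapts, the sketch below combines hypothetical hazard scores for agricultural threats upstream of an intake, mixing an averaging operator (cumulative pressure) with a maximum operator (so that a single severe threat is not diluted). The scores, weights, and category names are illustrative, not the study's parameters or its probabilistic model.

```python
from statistics import mean

# Hypothetical hazard scores (0 = negligible, 1 = severe) for agricultural
# threats inventoried upstream of a drinking water intake (DWI).
threat_scores = {
    "animal_production": [0.8, 0.3, 0.5],  # e.g., individual livestock operations
    "crop_production":   [0.4, 0.2],       # e.g., individual field parcels
}

def mixed_aggregation(scores: list[float]) -> float:
    """Mix averaging (cumulative pressure) with the maximum (worst single threat)."""
    return 0.5 * mean(scores) + 0.5 * max(scores)

# Aggregate within each threat category, then take the worst category as the
# overall threat level feeding the intake's risk estimate.
category_levels = {cat: mixed_aggregation(s) for cat, s in threat_scores.items()}
overall_threat = max(category_levels.values())

# A simple deterministic combination with intake vulnerability and threat exposure
# (both illustrative); the study replaces such fixed rules with a Bayesian network.
vulnerability, exposure = 0.6, 0.7
risk = overall_threat * exposure * vulnerability
print(category_levels, round(overall_threat, 3), round(risk, 3))
```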
-
Conventional processes (coagulation, flocculation, sedimentation, and filtration) are widely used in drinking water treatment plants and are considered a good treatment strategy to eliminate cyanobacterial cells and cell-bound cyanotoxins. The diversity of cyanobacteria was investigated using taxonomic cell counts and shotgun metagenomics over two seasons in a drinking water treatment plant before, during, and after a bloom. Changes in community structure over time at the phylum, genus, and species levels were monitored in samples retrieved from raw water (RW), sludge in the holding tank (ST), and sludge supernatant (SST). Aphanothece clathrata brevis, Microcystis aeruginosa, Dolichospermum spiroides, and Chroococcus minimus were the predominant species detected in RW by taxonomic cell counts. Shotgun metagenomics revealed that Proteobacteria was the predominant phylum in RW before and after the cyanobacterial bloom. Taxonomic cell counts and shotgun metagenomics showed that a Dolichospermum bloom occurred inside the plant. Cyanobacteria and Bacteroidetes were the major bacterial phyla during the bloom. Shotgun metagenomics also showed that Synechococcus, Microcystis, and Dolichospermum were the predominant cyanobacterial genera detected in the samples. Conventional treatment removed more than 92% of cyanobacterial cells but led to cell accumulation in the sludge of up to 31 times the RW influent level. Coagulation/sedimentation selectively removed more than 96% of Microcystis and Dolichospermum. The cyanobacterial community in the sludge diverged from that of the raw water during sludge storage (1–13 days). This variation was due to selective removal by coagulation/sedimentation as well as the accumulation of captured cells over the storage period. However, predicting the cyanobacterial community composition in the SST remained a challenge. Among nutrient parameters, orthophosphate availability was related to the community profile in RW samples, whereas communities in ST were influenced by total nitrogen, Kjeldahl nitrogen (N-Kjeldahl), total and particulate phosphorus, and total organic carbon (TOC). No trend was observed for the impact of nutrients on SST communities. This study highlights new health-related, environmental, and technical challenges for the production of drinking water arising from the complex fate of cyanobacteria in cyanobacteria-laden sludge and its supernatant.
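The combination of >92% cell removal and roughly 31-fold accumulation in the sludge follows from a simple mass balance: the removed cells are concentrated into a sludge stream whose volume is a small fraction of the treated flow. The sketch below illustrates that arithmetic under assumed flow fractions; only the 92% removal figure comes from the study, and the other numbers are illustrative, not plant measurements.

```python
# Simple mass balance: cells removed from the treated flow end up in the sludge.
raw_cell_conc = 1.0e5        # cells/mL in raw water (illustrative)
removal_fraction = 0.92      # fraction of cells captured by treatment (from the study)
sludge_flow_fraction = 0.03  # sludge flow as a fraction of raw water flow (assumed)

# Concentration factor in the sludge relative to the raw water influent.
sludge_cell_conc = raw_cell_conc * removal_fraction / sludge_flow_fraction
accumulation_factor = sludge_cell_conc / raw_cell_conc
print(f"Accumulation factor: {accumulation_factor:.0f}x")  # ~31x with these assumptions
```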
-
Health-related concerns about the cyanobacteria-laden sludge of drinking water treatment plants (DWTPs) have been raised in the past few years. Microscopic taxonomy, shotgun metagenomic sequencing, and microcystin (MC) measurements were applied to study the fate of cyanobacteria and cyanotoxins after controlled sludge storage (stagnation) in the dark for 7 to 38 days in a full-scale drinking water treatment plant. On four of eight dates, cyanobacterial cell growth during sludge stagnation was observed by total taxonomic cell counts. The highest observed cell growth was 96% after 16 days of stagnation. Cell growth was dominated by potential MC producers such as Microcystis, Aphanocapsa, Chroococcus, and Dolichospermum. Shotgun metagenomic sequencing revealed that stagnation stress shifted the cyanobacterial communities from the stress-sensitive order Nostocales (e.g., Dolichospermum) towards less compromised orders and potential MC producers such as Chroococcales (e.g., Microcystis) and Synechococcales (e.g., Synechococcus). The relative increase of cyanotoxin producers presents a health challenge when the supernatant of the stored sludge is recycled to the head of the DWTP or discharged into the source. These findings emphasize the importance of a strategy for managing cyanobacteria-laden sludge and suggest that practical approaches should be adopted to control the health and environmental impacts of cyanobacteria and cyanotoxins in sludge.
-
Urbanization is leading to more frequent flooding as cities gain more impervious surfaces and runoff exceeds the capacity of combined sewer systems. During heavy rainfall, contaminated excess water is discharged into the natural environment, damaging ecosystems and threatening drinking water sources. To address these challenges, which are aggravated by climate change, urban blue-green water management systems such as bioretention cells are increasingly being adopted. Bioretention cells use substrate and plants adapted to the local climate to manage rainwater. They form shallow depressions that allow infiltration, storage, and gradual evacuation of runoff. In 2018, the City of Trois-Rivières (Québec, Canada) installed 54 bioretention cells along a residential street, several of which were equipped with access points to monitor performance. Groundwater quality was monitored through piezometers installed to detect potential contamination. This large-scale project aimed to improve stormwater quality and reduce sewer flows. The studied bioretention cells reduced flows and, with some exceptions, generally improved the quality of the water entering the sewer system as well as the quality of the stormwater. Higher outflow concentrations were observed for some contaminants, such as manganese and nitrate. The results of this initiative provide useful recommendations for similar urban climate change adaptation projects.
-
Wastewater surveillance for SARS-CoV-2 RNA is a relatively recent adaptation of long-standing wastewater surveillance for infectious and other harmful agents. Individuals infected with SARS-CoV-2 were found to shed the virus in their faeces, and researchers around the world confirmed that SARS-CoV-2 RNA fragments could be detected and quantified in community wastewater. Canadian academic researchers, largely through volunteer initiatives, reported proof of concept by April 2020. National collaboration was initially facilitated by the Canadian Water Network. Many public health officials were initially skeptical that wastewater surveillance could provide actionable information, even though experience has shown that public health surveillance for a pandemic has no single, perfect approach. Rather, different approaches provide different insights, each with its own strengths and limitations. Public health science must triangulate among different forms of evidence to maximize understanding of what is happening or may be expected. Well-conceived, resourced, and implemented wastewater-based platforms can provide a cost-effective approach to support other, conventional lines of evidence. Sustaining wastewater monitoring platforms for future surveillance of other disease targets and health states is a challenge. Canada can benefit from the lessons learned during the COVID-19 pandemic to develop forward-looking interpretive frameworks and the capacity to implement, adapt, and expand such public health surveillance capabilities.
-
Wastewater-based epidemiology has emerged as a promising tool to monitor pathogens in a population, particularly when clinical diagnostic capacities become overwhelmed. During the ongoing COVID-19 pandemic caused by Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2), several jurisdictions have tracked viral concentrations in wastewater to inform public health authorities. While some studies have also sequenced SARS-CoV-2 genomes from wastewater, there have been relatively few direct comparisons between viral genetic diversity in wastewater and matched clinical samples from the same region and time period. Here we report sequencing and inference of SARS-CoV-2 mutations and variant lineages (including variants of concern) in 936 wastewater samples and thousands of matched clinical sequences collected between March 2020 and July 2021 in the cities of Montreal, Quebec City, and Laval, which together represent almost half the population of the Canadian province of Quebec. We benchmarked our sequencing and variant-calling methods on known viral genome sequences to establish thresholds for inferring variants in wastewater with confidence. We found that variant frequency estimates in wastewater and clinical samples are correlated over time in each city, with similar dates of first detection. Across all variant lineages, wastewater detection is more concordant with targeted outbreak sequencing than with semi-random clinical swab sampling. Most variants were first observed in the clinical and outbreak data because of their higher sequencing rates; however, wastewater sequencing is highly efficient, detecting more variants for a given sampling effort. This shows the potential of wastewater sequencing to provide useful public health data, especially in places or at times when sufficient clinical sampling is infrequent or infeasible.
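As a rough illustration of the wastewater-versus-clinical comparison described above, the sketch below aligns weekly variant frequency estimates from the two sources and computes a per-lineage correlation over time and the first detection date in each source. The file layout, column names, and use of a Pearson correlation are assumptions for illustration, not the study's pipeline.

```python
import pandas as pd

# Hypothetical long-format tables: one row per (week, city, lineage) with a
# frequency estimate from wastewater sequencing or from clinical sequences.
ww = pd.read_csv("wastewater_lineage_freqs.csv", parse_dates=["week"])
clin = pd.read_csv("clinical_lineage_freqs.csv", parse_dates=["week"])

# Align the two sources on week, city, and lineage.
merged = ww.merge(clin, on=["week", "city", "lineage"], suffixes=("_ww", "_clin"))

# Correlation of frequency estimates over time, per city and lineage,
# analogous to the wastewater-versus-clinical comparison in the study.
corr = (merged.groupby(["city", "lineage"])
              .apply(lambda g: g["freq_ww"].corr(g["freq_clin"]))
              .rename("pearson_r"))

# First detection dates in each source, per city and lineage.
first_ww = merged[merged["freq_ww"] > 0].groupby(["city", "lineage"])["week"].min()
first_clin = merged[merged["freq_clin"] > 0].groupby(["city", "lineage"])["week"].min()
print(corr.head(), first_ww.head(), first_clin.head(), sep="\n")
```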