Your search
Results: 584 resources
-
Abstract. The ongoing warming of cold regions is affecting hydrological processes and causing profound changes, such as a widespread increase in winter river discharge. The drivers of this increase are not yet fully identified, mainly because of the lack of observations and field measurements in cold, remote environments. To provide new insights into the sources generating winter runoff, the present study explores the possibility of extracting information from icings that form over the winter and are often still present in early summer. Detection of the primary sources was performed using time-lapse camera images of icings found in both proglacial fields and upper alpine meadows in June 2016 in two subarctic glacierized catchments in the upper part of the Duke watershed in the St. Elias Mountains, Yukon. As images alone are not sufficient to cover a large and hydrologically complex area entirely, we explore the possibility of compensating for that limitation by using four supplementary methods based on natural tracers: (a) stable water isotopes, (b) water ionic content, (c) dissolved organic carbon, and (d) cryogenic precipitates. The interpretation of the combined results shows a complex hydrological system in which multiple sources contribute to icing growth over the studied winter. Glaciers of all sizes, directly or through the aquifer, represent the major parent water source for icing formation in the studied proglacial areas. Groundwater-fed hillslope tributaries, possibly connected to suprapermafrost layers, make up the other detectable sources in icing remnants. If similar results are confirmed in other cold regions, they would together support a multi-causal hypothesis for the general increase in winter discharge in glacierized catchments.
More generally, this study shows the potential of using icing formations as a new, barely explored source of information on cold region winter hydrological processes that can contribute to overcoming the paucity of observations in these regions.
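Source attribution from stable water isotopes, as used above, commonly relies on a two-component end-member mixing model. The sketch below illustrates the arithmetic with hypothetical δ18O values; the function name and numbers are illustrative, not taken from the study:

```python
def mixing_fraction(delta_sample, delta_source_a, delta_source_b):
    """Fraction of source A in a two-component isotopic mixture.

    Solves delta_sample = f * delta_a + (1 - f) * delta_b for f,
    where all deltas are isotopic ratios in per mil (e.g. delta-18O).
    """
    if delta_source_a == delta_source_b:
        raise ValueError("end-members must be isotopically distinct")
    return (delta_sample - delta_source_b) / (delta_source_a - delta_source_b)

# Hypothetical per-mil values: glacier melt (-24.0) vs. groundwater (-19.0)
f_glacier = mixing_fraction(-22.0, -24.0, -19.0)  # fraction of glacier melt
```

With these numbers the icing sample would be 60% glacier-derived; in practice, uncertainty in the end-member signatures dominates the error budget.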
-
Abstract This study aims to isolate and quantify the role of shrinking glaciers in recent hydrological changes in eight watersheds in the southwestern Yukon (Canada) by using an original dual approach that consists of (i) watershed hydrological regime identification, followed by a trend analysis of discharge time series, and (ii) a model‐based peak water (PW) analysis using glacier cover change measurements. A distinction between hydrological regimes is a necessary addition to commonly used trend attribution methods, as the lake runoff regime shares common characteristics with the glacier regime. Results show a link between shrinking glaciers and hydrological changes in the region, but the link is complex, and glacier retreat does not explain all the observed changes. Model outputs show that the two watersheds with a glacierized area exceeding 30% and one watershed with 2.9% glacierized area have not reached PW, whereas a 9.2% glacierized watershed and another watershed with 2.1% glacierized area have already passed it. These results suggest that glacierized area alone cannot explain short‐term changes related to a watershed's current position in terms of PW, and the rate of glacier retreat must be considered. By contrast, the actual rate of glacier retreat does not influence long‐term changes, such as the magnitude of PW and of the consequent drop in discharge. Once glaciers have retreated to a point close to extinction, declines in summer discharge from 10% to 70%, proportional to the actual glacier cover, are anticipated at watersheds that are currently more than 9% glacierized.
Plain Language Summary In this study, we aim to understand how shrinking glacier cover affects river discharges. In conditions of continuous retreat, glaciers produce an initial increase in runoff as they lose mass. The discharge then reaches a turning point, a plateau called peak water, and subsequently declines as the volume of glacial ice continues to decrease.
When analyzing eight watersheds with different glacier covers in the southwestern Yukon, we found that two watersheds that are 30% covered by glaciers have not yet reached this plateau and that their discharge will therefore continue to increase. Several watersheds with smaller glacierized portions have passed peak water, which means that their discharge will now continue to decrease. We were also able to estimate the magnitudes of these changes in discharge. We show that the two watersheds with 30% glacierized area can still experience a 1.5‐ to 2‐fold increase in discharge and that watersheds currently more than 9% glacierized are predicted to show noticeable changes after peak water, with the possibility of discharge decreasing by a factor of 3 to 5 by the time glaciers have retreated to a point where their hydrological influence at the watershed scale becomes insignificant.
Key Points Noticeable acceleration of glacier retreat has occurred in the southwestern Yukon since 1999, with measured consequences for the regional hydrology. Various hydrological changes have been detected at the study watersheds; glacier retreat explains many but not all of those changes. Long‐term hydrological changes are glacier cover dependent, while decadal‐scale changes are driven by glacier retreat rate.
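The rise-then-fall behaviour described in the plain-language summary can be located in a discharge series by finding its turning point. This is a toy sketch with synthetic numbers, not the model-based PW analysis used in the study:

```python
def peak_water_year(years, discharge):
    """Year of maximum annual discharge (the 'peak water' turning point)
    in a rise-then-fall glacier runoff series."""
    i = max(range(len(discharge)), key=discharge.__getitem__)
    return years[i]

# Synthetic decadal mean discharges for a watershed with retreating glaciers
years = list(range(2000, 2070, 10))
q = [1.00, 1.20, 1.35, 1.40, 1.30, 1.10, 0.90]
pw = peak_water_year(years, q)
```

Before `pw` the discharge trend is positive; after it, continued retreat implies a sustained decline.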
-
Evaluating the historical contribution of glacier volume loss to streamflow, based on volume changes reconstructed through the Little Ice Age (LIA), directly informs our understanding of glacier hydrology in the current epoch of rapid glacier loss, which has disquieting implications for water resources in the Cordillera Blanca in the Peruvian Andes. However, accurate prediction of future glacial meltwater availability for the developing regional Andean society requires more extensive quantitative estimates of long-term glacial meltwater derived from reconstructed glacial volumes. Modeling the LIA paleoglaciers through the mid-19th century (the most recent extensive period of mountain glacier expansion, which culminated around 1850 AD) in different catchments of the Cordillera Blanca allows us to reconstruct glacier volume and its change from likely combinations of climatic control variables and time. We computed the rate and magnitude of centennial-scale glacier volume changes for glacier surfaces between the LIA and the modern era, as defined by the 2011 Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model Version 2 (GDEM V2) and 2008 Light Detection and Ranging (LiDAR) data. The model simulation showed good agreement with the observed geomorphic data, and the volume and surface area (V-S) scaling remained within the 25% error range in the reconstructed simulation. We also employed a recently demonstrated approach (Baraër, M. et al.) to calculate the meltwater contribution to glacierized catchment runoff. The results revealed multiple peaks of both mean annual and dry season discharge that have not been shown in previous research on the same mountain range.
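Volume-area (V-S) scaling of the kind validated above is usually written as a power law V = c·S^γ. The sketch below uses commonly cited Bahr-type coefficients for valley glaciers (c ≈ 0.034, γ ≈ 1.375 in km units); these are illustrative defaults, not the parameters fitted in the study:

```python
def glacier_volume_km3(area_km2, c=0.034, gamma=1.375):
    """Power-law volume-area scaling V = c * S**gamma.

    Defaults are commonly cited values for valley glaciers (km units),
    not coefficients calibrated in this study.
    """
    return c * area_km2 ** gamma

# Illustrative centennial volume change for one catchment
v_lia = glacier_volume_km3(15.0)   # reconstructed LIA extent
v_now = glacier_volume_km3(10.0)   # modern extent
loss_fraction = 1.0 - v_now / v_lia
```

Because γ > 1, a one-third loss of area implies a volume loss well above one third, which is why area change alone understates the meltwater contribution.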
-
The rapidly melting glaciers of Peru are posing new risks to regional dry season water supplies, and this is evident in the Cordillera Blanca, the mountain range with the world's largest concentration of tropical glaciers. Permanent ice loss is causing reductions in dry season streamflow, which is coupled with shifting demands and control over water access and entitlements in the region. A full evaluation of hydrologic inputs is required to inform future water management in the relative absence of glaciers. Over the last decade, new studies have shown groundwater to be a significant component of the regional water budget during the dry season, and it cannot be ignored when accounting for water quality and quantity downstream of the Cordillera Blanca's alpine catchments. Distinctive common features of the Cordillera Blanca's proglacial catchments are sediment‐filled valleys that were once under proglacial lake conditions. The combination of lake sediments with other alpine depositional features results in storage and interseasonal release of groundwater that comprises up to 80% of the valley's streamflow during the driest months of the year. We summarize the emerging understanding of hydrogeologic processes in proglacial headwater systems of the region's principal river, the Rio Santa, and make suggestions for future research that will more clearly characterize the spatial distribution of stored groundwater within the mountain range. As glaciers continue to recede, differences in aquifer thickness and groundwater residence time between alpine catchments in the region will increasingly control dry season water availability at the local and basin scale. This article is categorized under: Science of Water > Hydrological Processes; Science of Water > Water and Environmental Change; Engineering Water > Planning Water.
-
Abstract Debris-covered glaciers are an important component of the mountain cryosphere and influence the hydrological contribution of glacierized basins to downstream rivers. This study examines the potential to estimate debris thickness, a critical variable for calculating sub-debris melt, using ground-based thermal infrared radiometry (TIR) images. Over four days in August 2019, a ground-based, time-lapse TIR digital imaging radiometer recorded sequential thermal imagery of a debris-covered region of Peyto Glacier, Canadian Rockies, in conjunction with 44 manual excavations of debris thickness ranging from 10 to 110 cm and concurrent meteorological observations. Using the correlation between measured debris thickness and TIR surface temperature as a basis, the effectiveness of linear and exponential regression models for estimating debris thickness from surface temperature was explored. Optimal model performance (R² of 0.7, RMSE of 10.3 cm) was obtained with a linear model applied to measurements taken on clear nights just before sunrise, but strong model performance was also obtained under complete cloud cover during daytime or nighttime with an exponential model. This work presents insights into the use of surface temperature and TIR observations to estimate debris thickness and to assess the state of debris-covered glacial ice and its potential hydrological contribution.
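The linear thickness-temperature model described above can be sketched with an ordinary least-squares fit. The numbers below are hypothetical pre-sunrise readings, not the Peyto Glacier data:

```python
def fit_linear(x, y):
    """Ordinary least-squares fit of y = a * x + b (pure Python)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical pre-sunrise surface temperatures (deg C) vs. debris depth (cm)
temps = [-2.0, -1.0, 0.0, 1.0, 2.0]
depth = [15.0, 30.0, 45.0, 60.0, 75.0]
a, b = fit_linear(temps, depth)
estimate_cm = a * 0.5 + b  # predicted thickness for a 0.5 deg C pixel
```

An exponential model of the kind used for cloudy conditions can be fitted the same way by regressing log(depth) on temperature.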
-
Abstract Temporal variations in concentrations of pathogenic microorganisms in surface waters are well known to be influenced by hydrometeorological events. Methods that account for microbial peaks when quantifying drinking water treatment requirements are therefore needed. Here, we applied a novel method for data collection and model validation that explicitly accounts for weather events (rainfall, snowmelt) when pathogen concentrations are estimated in source water. Online in situ β-D-glucuronidase activity measurements were used to trigger sequential grab sampling of source water to quantify Cryptosporidium and Giardia concentrations during rainfall and snowmelt events at an urban and an agricultural drinking water treatment plant in Quebec, Canada. We then evaluated whether mixed Poisson distributions fitted to monthly sampling data (n = 30 samples) could accurately predict daily mean concentrations during these events. We found that the gamma distribution underestimated the high Cryptosporidium and Giardia concentrations measured with routine or event-based monitoring, whereas the log-normal distribution accurately predicted these high concentrations. Selecting a log-normal distribution in preference to a gamma distribution increased the annual mean concentration by less than 0.1-log but increased the upper bound of the 95% credibility interval on the annual mean by about 0.5-log. Considering parametric uncertainty in an exposure assessment is therefore essential to account for microbial peaks in risk assessment.
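The heavier upper tail of the log-normal distribution, which the study found necessary to capture event-driven peaks, can be illustrated with a moment fit on log-transformed counts. The concentrations below are invented for illustration, not the monitoring data:

```python
from math import exp, log
from statistics import NormalDist, mean, stdev

def lognormal_p95(concentrations):
    """95th percentile of a log-normal distribution fitted to the data
    by the method of moments on log-transformed values."""
    logs = [log(c) for c in concentrations]
    mu, sigma = mean(logs), stdev(logs)
    return exp(mu + sigma * NormalDist().inv_cdf(0.95))

# Hypothetical oocyst counts per 100 L, including one event-driven peak
counts = [1.0, 2.0, 1.5, 3.0, 2.5, 40.0]
p95 = lognormal_p95(counts)  # far above the routine-monitoring values
```

A gamma fit to the same data would place its 95th percentile much closer to the bulk of the routine values, which is the underestimation the study reports.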
-
Recently, in situ YSI EXO2 phycocyanin fluorescence probes have been widely deployed to determine cyanobacterial abundance in drinking water sources, yet few studies have evaluated the effects of natural organic matter (NOM) and ambient water temperature on the probe readings. In this study, Suwannee River NOM was added to laboratory-cultivated cyanobacterial species to test the performance of the phycocyanin probe. The impact of temperature on phycocyanin fluorescence was evaluated by monitoring the laboratory-cultivated cyanobacterial species and extracted phycocyanin pigment. Additionally, in situ phycocyanin fluorescence of field samples from the water intake of a drinking water treatment plant (DWTP) in 2018 was compared with laboratory taxonomic analyses of grab samples. We found that: (1) the presence of Suwannee River NOM decreases cell-bound cyanobacterial phycocyanin readings; (2) increasing ambient water temperature reduces dissolved and cell-bound cyanobacterial phycocyanin readings; (3) field phycocyanin probe readings correlated significantly with total cyanobacterial biovolume (R = 0.73, p < 0.1), and the relationship depends on the biovolume of the dominant cyanobacterial species; and (4) phycocyanin probe readings have a strong positive correlation with natural light intensity. Probe users should therefore be fully aware of these sources of interference when interpreting results and should use the other physico-chemical parameter data generated simultaneously by the instrument to improve the probe's measurements.
-
Health-related concerns about the cyanobacteria-laden sludge of drinking water treatment plants (DWTPs) have been raised in recent years. Microscopic taxonomy, shotgun metagenomic sequencing, and microcystin (MC) measurement were applied to study the fate of cyanobacteria and cyanotoxins after controlled sludge storage (stagnation) in the dark for 7 to 38 days in a full-scale drinking water treatment plant. On four of eight sampling dates, cyanobacterial cell growth during sludge stagnation was observed through total taxonomic cell counts. The highest observed cell growth was 96% after 16 days of stagnation. Cell growth was dominated by potential MC producers such as Microcystis, Aphanocapsa, Chroococcus, and Dolichospermum. Shotgun metagenomic sequencing revealed that stagnation stress shifts the cyanobacterial communities from the stress-sensitive order Nostocales (e.g., Dolichospermum) towards less compromised orders and potential MC producers such as Chroococcales (e.g., Microcystis) and Synechococcales (e.g., Synechococcus). The relative increase of cyanotoxin producers presents a health challenge when the supernatant of the stored sludge is recycled to the head of the DWTP or discharged into the source water. These findings emphasize the importance of a strategy for managing cyanobacteria-laden sludge and suggest that practical approaches should be adopted to control the health and environmental impacts of cyanobacteria and cyanotoxins in sludge.
-
The excessive proliferation of cyanobacteria in surface waters is a widespread problem worldwide, leading to the contamination of drinking water sources. Short- and long-term solutions for managing cyanobacterial blooms are needed for drinking water supplies. The goal of this research was to investigate cyanobacterial community composition using shotgun metagenomics in a short-term in situ mesocosm experiment in two lakes following coagulation with ferric sulfate (Fe2(SO4)3) as an option for source water treatment. Among the nutrient parameters, dissolved nitrogen was related to Microcystis in both Missisquoi Bay and Petit Lac St. François, while the presence of Synechococcus was related to total nitrogen, dissolved nitrogen, dissolved organic carbon, and dissolved phosphorus. Shotgun metagenomic sequencing showed that Dolichospermum and Microcystis were the dominant genera in all of the mesocosms at the beginning of the sampling period in Missisquoi Bay and Petit Lac St. François, respectively. Potentially toxigenic genera such as Microcystis were correlated with intracellular microcystin concentrations. A principal component analysis showed a change in cyanobacterial composition at the genus level in the mesocosms after two days, which varied across the studied sites and sampling times. Cyanobacterial community richness and diversity did not change significantly after coagulation with Fe2(SO4)3 in the mesocosms at either site. The use of Fe2(SO4)3 for onsite source water treatment should therefore take into account its impact on cyanobacterial community structure as well as the reduction of toxin concentrations.
-
Freshwater bodies and, consequently, the sources of drinking water treatment plants (DWTPs) are increasingly facing toxic cyanobacterial blooms. Even though conventional treatment processes, including coagulation, flocculation, sedimentation, and filtration, can control cyanobacteria and cell-bound cyanotoxins, these processes may encounter challenges such as inefficient removal of dissolved metabolites and cyanobacterial cell breakthrough. Furthermore, conventional treatment may lead to the accumulation of cyanobacterial cells and cyanotoxins in sludge. Pre-oxidation can enhance coagulation efficiency, as it provides the first barrier against cyanobacteria and cyanotoxins and decreases cell accumulation in DWTP sludge. This critical review aims to: (i) evaluate the state of the science of cyanobacteria and cyanotoxin management throughout DWTPs, as well as their associated sludge, and (ii) develop a decision framework to manage cyanobacteria and cyanotoxins in DWTPs and sludge. The review found that pre-oxidation studies based on laboratory cultures may not represent pre-oxidation efficacy during real blooms, and that applying a common exposure unit, CT (residual concentration × contact time), provides a proper understanding of cyanobacteria pre-oxidation efficiency. Recently reported challenges concerning cyanobacterial survival and growth in sludge, alongside cell lysis and cyanotoxin release, have raised health and technical concerns with regard to sludge storage and the recycling of sludge supernatant to the head of DWTPs. According to the review, oxidation is not a feasible option for handling cyanobacteria-laden sludge because of its low cell and cyanotoxin removal efficacy. Based on the reviewed literature, a decision framework is proposed to manage cyanobacteria and cyanotoxins and their associated sludge in DWTPs.
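The CT exposure unit mentioned above is the time integral of the oxidant residual. For a residual decaying with first-order kinetics, CT has a closed form; the dose and rate constant below are hypothetical, not values from the reviewed studies:

```python
from math import exp

def ct_first_order(c0_mg_l, k_per_min, t_min):
    """CT (mg*min/L) for a first-order-decaying oxidant residual
    C(t) = c0 * exp(-k * t), so CT = c0 * (1 - exp(-k * t)) / k."""
    return c0_mg_l * (1.0 - exp(-k_per_min * t_min)) / k_per_min

ct = ct_first_order(5.0, 0.1, 30.0)  # e.g. 5 mg/L dose, 30 min contact
```

Because the residual decays, this CT is smaller than the naive product of initial dose and contact time (5 × 30 = 150 mg·min/L), which is why residual-based CT is the fairer basis for comparing pre-oxidation studies.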
-
The impact of oxidation on the mitigation of cyanobacteria and cyanotoxins in drinking water treatment sludge was investigated at the laboratory and treatment plant scales. Two common oxidants, KMnO4 (5 and 10 mg/L) and H2O2 (10 and 20 mg/L), were applied under controlled steady-state conditions. Non-oxidized and oxidized sludge was then left to stagnate in the dark for 7 to 38 days. Controlled laboratory trials showed that KMnO4 and H2O2 decreased cell counts by up to 62% and 77%, respectively. The maximum reduction in total microcystin (MC) levels achieved after oxidation was 41% and 98% using 20 mg/L H2O2 and 10 mg/L KMnO4, respectively. Stagnation caused cell growth of up to 2.6-fold in 8 out of 22 oxidized samples. MC-producing orders such as Chroococcales and Synechococcales were persistent, while Nostocales was sensitive to the combined oxidation and stagnation stresses. In parallel, two on-site shock oxidation treatments were performed in the DWTP's sludge holding tank using 10 mg/L KMnO4. On-site shock oxidation decreased taxonomic cell counts by up to 43% within 24 h. Stagnation preceded by on-site shock oxidation increased total cell counts by up to 55% compared with oxidation alone. The increase in cell counts and mcyD gene copy numbers during stagnation revealed the impact of oxidation/stagnation on cyanobacterial cell growth. These findings show the limitations of sludge oxidation as a strategy for managing cyanobacteria and cyanotoxins in sludge and suggest that alternative approaches to preventing the accumulation of cyanobacteria in sludge should be considered.
-
Conventional processes (coagulation, flocculation, sedimentation, and filtration) are widely used in drinking water treatment plants and are considered a good treatment strategy for eliminating cyanobacterial cells and cell-bound cyanotoxins. The diversity of cyanobacteria was investigated using taxonomic cell counts and shotgun metagenomics over two seasons in a drinking water treatment plant before, during, and after a bloom. Changes in community structure over time at the phylum, genus, and species levels were monitored in samples retrieved from raw water (RW), sludge in the holding tank (ST), and sludge supernatant (SST). Aphanothece clathrata brevis, Microcystis aeruginosa, Dolichospermum spiroides, and Chroococcus minimus were the predominant species detected in RW by taxonomic cell counts. Shotgun metagenomics revealed that Proteobacteria was the predominant phylum in RW before and after the cyanobacterial bloom. Taxonomic cell counts and shotgun metagenomics showed that the Dolichospermum bloom occurred inside the plant. Cyanobacteria and Bacteroidetes were the major bacterial phyla during the bloom. Shotgun metagenomics also showed that Synechococcus, Microcystis, and Dolichospermum were the predominant cyanobacterial genera detected in the samples. Conventional treatment removed more than 92% of the cyanobacterial cells but led to cell accumulation in the sludge of up to 31 times the concentration in the RW inflow. Coagulation/sedimentation selectively removed more than 96% of Microcystis and Dolichospermum. The cyanobacterial community in the sludge diverged from that in the raw water during sludge storage (1–13 days). This variation was due to the selective removal by coagulation/sedimentation as well as the accumulation of captured cells over the storage period. However, predicting the cyanobacterial community composition in the SST remained a challenge.
Among the nutrient parameters, orthophosphate availability was related to the community profile in RW samples, whereas communities in ST were influenced by total nitrogen, Kjeldahl nitrogen (N-Kjeldahl), total and particulate phosphorus, and total organic carbon (TOC). No trend was observed in the impact of nutrients on SST communities. This study profiled new health-related, environmental, and technical challenges for the production of drinking water arising from the complex fate of cyanobacteria in cyanobacteria-laden sludge and supernatant.
-
Waterborne pathogens are heterogeneously distributed across various spatiotemporal scales in water resources, and representative sampling is therefore crucial for accurate risk assessment. Since regulatory monitoring of microbiological water quality is usually conducted at fixed time intervals, it can miss short-term fecal contamination episodes and underestimate the underlying microbial risks. In the present paper, we developed a new automated sampling methodology based on near real-time measurement of a biochemical indicator of fecal pollution. Online monitoring of β-D-glucuronidase (GLUC) activity was used to trigger an automated sampler during fecal contamination events in a drinking water supply and at an urban beach. Significant increases in protozoan parasites, microbial source tracking markers, and E. coli were measured during short-term (<24 h) fecal pollution episodes, emphasizing the intermittent nature of their occurrence in water. Synchronous triggering of the automated sampler with online GLUC activity measurements further revealed a tight association between the biochemical indicator and culturable E. coli. The proposed event sampling methodology is versatile, and in addition to the two triggering modes validated here, others can be designed based on specific needs and local settings. In support of regulatory monitoring schemes, it should ultimately help gather crucial data on waterborne pathogens more efficiently during episodic fecal pollution events.
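The GLUC-triggered sampling logic can be sketched as a simple threshold rule with debouncing. The threshold, series, and consecutive-reading rule below are hypothetical; the paper's actual triggering modes are site-specific:

```python
def should_trigger(gluc_series, threshold, n_consecutive=2):
    """Fire the autosampler once GLUC activity exceeds a site-specific
    threshold for n consecutive readings (ignores single-point noise)."""
    run = 0
    for value in gluc_series:
        run = run + 1 if value > threshold else 0
        if run >= n_consecutive:
            return True
    return False

# Hypothetical GLUC activity readings (arbitrary units)
event = should_trigger([1.2, 1.5, 9.4, 10.1, 3.0], threshold=8.0)
```

Requiring consecutive exceedances trades a short detection delay for robustness against sensor spikes; other modes (e.g. rate-of-change triggers) can be slotted into the same loop.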
-
Personalization of services is increasingly popular in the health and social services network. Rather than choosing from existing service offerings, the user co-constructs services according to his or her needs. Yet despite advances in this field, few service personalization approaches exist to meet the needs of communities. In this text, we aim to contribute to the field of service personalization by enriching it with a community perspective. Toward this objective, we review various approaches used in public health that take into account the specific needs of communities. We illustrate these approaches through local interventions to show how they contribute to the development of a community perspective to be superimposed on the current model of service personalization. We argue that the health promotion function of public health makes it possible to connect individuals to their community of belonging within a single model of service personalization. We believe that this integrated model of service personalization will, on the one hand, enable the coproduction of individual and community services and, on the other, foster closer ties between actors in public health and social services around a common project of developing communities that produce well-being.
-
The temperate mixedwood forests of eastern North America have been managed by partial cutting for several decades. To ensure that regeneration contributes to replacing the commercial-size stems removed by partial cutting, forest managers need to anticipate how saplings (i.e., regenerating trees with a diameter at breast height >1.0 cm) develop in terms of number and diameter. Using up to 20 years of monitoring data from three study sites, we developed a transition matrix model to predict the future number of saplings and their diameter distribution for mixed yellow birch (Betula alleghaniensis Britton) – conifer stands. Our results show that partial cutting allowed yellow birch, red maple (Acer rubrum L.), red spruce (Picea rubens Sarg.), and balsam fir (Abies balsamea (L.) Mill.) saplings to reach merchantable size faster and in greater numbers than in untreated control plots. We also found that fewer hardwood saplings (yellow birch and red maple) than softwood saplings (red spruce and balsam fir) were required to produce 1 m²·ha⁻¹ of merchantable basal area after 20–40 years. Finally, our model provides a tool for forest managers to predict sapling development in mixed hardwood and softwood stands over a full cutting cycle.
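A transition matrix model of the kind developed above projects the stem-count vector one step forward as n(t+1) = T·n(t) plus recruitment. The matrix and counts below are invented for illustration, not the values fitted at the three study sites:

```python
def project(stage_counts, transition, recruits=None):
    """One step of a stage-structured transition matrix model.

    transition[i][j] is the per-step probability that a stem in diameter
    class j ends up in class i; column sums below 1 imply mortality.
    """
    n = len(stage_counts)
    out = [sum(transition[i][j] * stage_counts[j] for j in range(n))
           for i in range(n)]
    if recruits is not None:
        out = [o + r for o, r in zip(out, recruits)]
    return out

# Hypothetical 3 diameter classes (stems/ha), one 5-year step
T = [[0.6, 0.0, 0.0],   # stay in the small class
     [0.3, 0.7, 0.0],   # grow into / stay in the mid class
     [0.0, 0.2, 0.9]]   # grow into / stay in the merchantable class
n_next = project([1000.0, 400.0, 100.0], T, recruits=[150.0, 0.0, 0.0])
```

Iterating `project` over successive steps yields the diameter distribution across a full cutting cycle, which is how such a model supports the 20-40-year predictions described above.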
-
Abstract In northern hardwood forests, tree markers select the trees to be harvested during logging operations using classification systems that assign harvest priorities based on the presence of a wide range of individual defects. According to the most recent advances in our understanding of the impact of defects on both tree vigour (the risk of mortality or decline in growth) and quality (the potential for recovering valuable sawlogs), tree markers should adopt a simpler classification system that considers fewer defects than the current operational practice, and they should prioritize the removal of trees with crown dieback. Since the probability of developing defects and dying increases substantially with tree diameter, tree markers should also favour the removal of larger trees that have maintained their quality. However, these recommendations were developed based on tree-level analyses. To provide further validation at the stand scale, we compared stand improvement and value recovery under three tree marking regimes: a new, simplified regime based on the recommendations above, and two regimes used in the province of Quebec, Canada. To do so, we conducted tree marking simulations and value recovery assessments in 14 managed stands distributed across the northern hardwood range of Quebec. Our results confirmed that the simplified tree marking regime not only facilitated stand improvement by removing a greater proportion of low-vigour trees, but also recovered significantly more value (17% on average) at the stand scale. By prioritizing the removal of trees with crown dieback, the simplified regime was superior at salvaging the current value of low-vigour trees before they die or decline in quality. Based on our results, we propose simplified and empirically-validated tree marking guidelines for northern hardwood forests.