-
Abstract. The ongoing warming of cold regions is affecting hydrological processes and causing deep changes, such as a ubiquitous increase in winter river discharges. The drivers of this increase are not yet fully identified, mainly because of the lack of observations and field measurements in cold and remote environments. In order to provide new insights into the sources generating winter runoff, the present study explores the possibility of extracting information from icings that form over the winter and are often still present in early summer. Primary source detection was performed using time-lapse camera images of icings found in both proglacial fields and upper alpine meadows in June 2016 in two subarctic glacierized catchments in the upper part of the Duke watershed in the St. Elias Mountains, Yukon. As images alone are not sufficient to entirely cover a large and hydrologically complex area, we explore the possibility of compensating for that limitation by using four supplementary methods based on natural tracers: (a) stable water isotopes, (b) water ionic content, (c) dissolved organic carbon, and (d) cryogenic precipitates. The interpretation of the combined results shows a complex hydrological system in which multiple sources contribute to icing growth over the studied winter. Glaciers of all sizes, directly or through the aquifer, represent the major parent water source for icing formation in the studied proglacial areas. Groundwater-fed hillslope tributaries, possibly connected to suprapermafrost layers, make up the other detectable sources in icing remnants. If similar results are confirmed in other cold regions, they would together support a multi-causal hypothesis for the general increase in winter discharge in glacierized catchments. More generally, this study shows the potential of using icing formations as a new, barely explored source of information on winter hydrological processes in cold regions that can help overcome the paucity of observations in these regions.
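The abstract does not state the mixing equations behind the tracer-based source attribution, but stable-isotope source identification is commonly framed as a two-component mixing (hydrograph-separation) problem. The sketch below is a minimal illustration under that assumption; the end-member and sample delta values are hypothetical, not measurements from the study.

# Minimal two-component isotopic mixing sketch (illustrative only; all delta values are hypothetical).
# Fraction of icing water attributable to source A (e.g., glacier melt) versus source B
# (e.g., groundwater), from delta-18O of the two end members and of the mixed sample.

def two_component_mixing(delta_sample, delta_source_a, delta_source_b):
    """Return the fraction of the sample contributed by source A."""
    return (delta_sample - delta_source_b) / (delta_source_a - delta_source_b)

# Hypothetical delta-18O values (permil, VSMOW)
f_glacier = two_component_mixing(delta_sample=-21.5,
                                 delta_source_a=-23.0,   # glacier-melt end member
                                 delta_source_b=-19.0)   # groundwater end member
print(f"Glacier-melt fraction of the sample: {f_glacier:.2f}")  # ~0.62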
-
Abstract: This study aims to isolate and quantify the role of shrinking glaciers in recent hydrological changes in eight watersheds in the southwestern Yukon (Canada) by using an original dual approach that consists of (i) identification of each watershed's hydrological regime, followed by a trend analysis of discharge time series, and (ii) a model-based peak water (PW) analysis using glacier cover change measurements. Distinguishing between hydrological regimes is a necessary addition to commonly used trend attribution methods, as the lake runoff regime shares common characteristics with the glacier regime. Results show a link between shrinking glaciers and hydrological changes in the region, but the link is complex, and glacier retreat does not explain all the observed changes. Model outputs show that the two watersheds with a glacierized area exceeding 30% and one watershed with 2.9% glacierized area have not reached PW, whereas a 9.2% glacierized watershed and another watershed with 2.1% glacierized area have already passed it. These results suggest that glacierized area alone cannot explain short-term changes related to a watershed's current position relative to PW, and that the rate of glacier retreat must be considered. By contrast, the actual rate of glacier retreat does not influence long-term changes, such as the magnitude of PW and of the consequent drop in discharge. Once glaciers have retreated to a point close to extinction, declines in summer discharge of 10% to 70%, proportional to the current glacier cover, are anticipated in watersheds that are currently more than 9% glacierized.

Plain Language Summary: In this study, we aim to understand how shrinking glacier cover affects river discharges. Under continuous retreat, glaciers produce an initial increase in runoff as they lose mass. The discharge then reaches a turning point, a plateau called peak water, and subsequently declines as the volume of glacial ice continues to decrease. When analyzing eight watersheds with different glacier covers in the southwestern Yukon, we found that the two watersheds that are more than 30% covered by glaciers have not yet reached this plateau, and their discharge will therefore continue to increase. Several watersheds with smaller glacierized portions have passed peak water, which means that their discharge will now continue to decrease. We were also able to estimate the magnitudes of these changes in discharge. We show that the two watersheds with more than 30% glacierized area can still experience a 1.5- to 2-fold increase in discharge, and that watersheds currently more than 9% glacierized are predicted to show noticeable changes after peak water, with the possibility of discharge decreasing by a factor of 3 to 5 by the time glaciers have retreated to the point where their hydrological influence at the watershed scale becomes insignificant.

Key Points: (1) A noticeable acceleration of glacier retreat has occurred in southwestern Yukon since 1999, with measured consequences for the regional hydrology. (2) Various hydrological changes have been detected at the study watersheds; glacier retreat explains many but not all of those changes. (3) Long-term hydrological changes are glacier cover dependent, while decadal-scale changes are driven by the glacier retreat rate.
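The peak water behaviour described above (runoff rises while glacier mass loss accelerates, peaks, then declines as the ice reservoir shrinks) can be illustrated with a toy water-balance loop. This is not the study's model; every number below is a hypothetical placeholder chosen only to make the rise-peak-decline shape appear.

# Toy illustration of the "peak water" concept (not the authors' model; all numbers hypothetical).
# Glacier runoff is taken as glacier area times a specific melt rate that rises under warming;
# total discharge first increases, peaks, then declines as the glacier area shrinks away.

glacier_area = 100.0          # km^2, hypothetical initial glacier cover
melt_rate = 1.0               # m w.e. per year, hypothetical initial specific melt
non_glacier_runoff = 50.0     # 10^6 m^3 per year, hypothetical fixed contribution

peak, peak_year = 0.0, 0
for year in range(1, 151):
    melt_rate *= 1.01                          # warming: melt rate rises 1 % per year
    glacier_runoff = glacier_area * melt_rate  # 10^6 m^3 (1 m w.e. over 1 km^2 = 10^6 m^3)
    total = non_glacier_runoff + glacier_runoff
    if total > peak:
        peak, peak_year = total, year
    glacier_area = max(0.0, glacier_area - 0.8)  # steady areal retreat, km^2 per year

print(f"Peak water reached in year {peak_year}: {peak:.0f} x 10^6 m^3/yr")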
-
Evaluating the historical contribution of glacier ice volume loss to streamflow, based on volume changes reconstructed back to the Little Ice Age (LIA), bears directly on our understanding of glacier hydrology in the current epoch of rapid glacier loss, which has disquieting implications for water resources in the Cordillera Blanca in the Peruvian Andes. Accurate prediction of future glacial meltwater availability for the developing regional Andean society, however, requires more extensive quantitative estimates of long-term glacial meltwater derived from reconstructed glacier volumes. Modeling the LIA paleoglaciers through the mid-19th century (the most extensive recent period of mountain glacier expansion having occurred around 1850 AD) in different catchments of the Cordillera Blanca allows us to reconstruct glacier volume and its change from likely combinations of climatic control variables and time. We computed the rate and magnitude of centennial-scale glacier volume changes between the LIA and modern glacier surfaces, the latter defined by 2011 Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model Version 2 (GDEM V2) and 2008 Light Detection and Ranging (LiDAR) data. The model simulation showed good agreement with the observed geomorphic data, and the volume-surface area (V-S) scaling remained within a 25% error range in the reconstructed simulation. We also employed a recently demonstrated approach (Baraër, M. et al.) to calculate the meltwater contribution to glacierized catchment runoff. The results revealed multiple peaks in both mean annual and dry season discharge that have not been shown in previous research on the same mountain range.
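The abstract refers to volume-surface area (V-S) scaling without giving the relation or coefficients used. A widely cited form is the power law V = c * S^gamma; the sketch below uses parameter values commonly quoted for valley glaciers (c near 0.034 and gamma near 1.375, with V in km^3 and S in km^2) purely for illustration, and the study's actual coefficients may differ.

# Glacier volume-area (V-S) scaling sketch (illustrative; the study's coefficients may differ).
# V = c * S**gamma, with V in km^3 and S in km^2; c ~ 0.034 and gamma ~ 1.375 are values
# commonly quoted for valley glaciers in the scaling literature.

def glacier_volume_km3(area_km2, c=0.034, gamma=1.375):
    return c * area_km2 ** gamma

for area in (1.0, 5.0, 20.0):
    print(f"S = {area:5.1f} km^2  ->  V ~ {glacier_volume_km3(area):.3f} km^3")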
-
The rapidly melting glaciers of Peru are posing new risks to regional dry season water supplies, and this is evident in the Cordillera Blanca, the mountain range with the world's largest concentration of tropical glaciers. Permanent ice loss is causing reductions to dry season streamflow, which is coupled with shifting demands and control over water access and entitlements in the region. A full evaluation of hydrologic inputs is required to inform future water management in the relative absence of glaciers. Over the last decade, new studies have shown groundwater to be a significant component of the regional water budget during the dry season, and it cannot be ignored when accounting for water quality and quantity downstream of the Cordillera Blanca's alpine catchments. Distinctive common features of the Cordillera Blanca's proglacial catchments are sediment-filled valleys that were once under proglacial lake conditions. The combination of lake sediments with other alpine depositional features results in storage and interseasonal release of groundwater that comprises up to 80% of the valley's streamflow during the driest months of the year. We summarize the emerging understanding of hydrogeologic processes in proglacial headwater systems of the region's principal river, the Rio Santa, and make suggestions for future research that will more clearly characterize the spatial distribution of stored groundwater within the mountain range. As glaciers continue to recede, differences in aquifer thickness and groundwater residence time between alpine catchments in the region will increasingly control dry season water availability at the local and basin scale. This article is categorized under: Science of Water > Hydrological Processes; Science of Water > Water and Environmental Change; Engineering Water > Planning Water.
-
Abstract Debris-covered glaciers are an important component of the mountain cryosphere and influence the hydrological contribution of glacierized basins to downstream rivers. This study examines the potential to estimate debris thickness, a critical variable for calculating sub-debris melt, using ground-based thermal infrared radiometry (TIR) images. Over four days in August 2019, a ground-based, time-lapse TIR digital imaging radiometer recorded sequential thermal imagery of a debris-covered region of Peyto Glacier, Canadian Rockies, in conjunction with 44 manual excavations of debris thickness ranging from 10 to 110 cm and concurrent meteorological observations. Using the correlation between measured debris thickness and TIR surface temperature as a basis, the effectiveness of linear and exponential regression models for estimating debris thickness from surface temperature was explored. Optimal model performance (R² of 0.7, RMSE of 10.3 cm) was obtained with a linear model applied to measurements taken on clear nights just before sunrise, but strong model performance was also obtained under complete cloud cover during daytime or nighttime with an exponential model. This work presents insights into the use of surface temperature and TIR observations to estimate debris thickness and to assess the state of debris-covered glacial ice and its potential hydrological contribution.
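The abstract names linear and exponential regressions of debris thickness on TIR surface temperature but does not report their coefficients or data. The sketch below shows how both model forms could be fitted to paired excavation and temperature measurements; the data values, initial guesses, and fitted coefficients are hypothetical placeholders, not the study's results.

# Sketch of fitting linear and exponential debris-thickness models to surface temperature
# (illustrative only; the measured data and fitted coefficients are not given in the abstract).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

# Hypothetical paired observations: pre-sunrise surface temperature (deg C) vs debris depth (cm)
temp_c = np.array([1.2, 2.0, 2.8, 3.5, 4.1, 4.9, 5.6, 6.3])
thickness_cm = np.array([12., 20., 31., 40., 55., 70., 88., 105.])

# Linear model: h = a*T + b
lin = linregress(temp_c, thickness_cm)

# Exponential model: h = A*exp(k*T)
expo = lambda T, A, k: A * np.exp(k * T)
(A, k), _ = curve_fit(expo, temp_c, thickness_cm, p0=(10.0, 0.3))

print(f"linear:      h = {lin.slope:.1f}*T + {lin.intercept:.1f}  (R^2 = {lin.rvalue**2:.2f})")
print(f"exponential: h = {A:.1f}*exp({k:.2f}*T)")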
-
Abstract
Background: Posttraumatic stress disorder (PTSD) has been hailed by some as the emblematic mental disorder of the COVID-19 pandemic, assuming that PTSD's life-threat criterion was met de facto. More plausible outcomes like adjustment disorder (AD) have been overlooked.
Methods: An online cross-sectional survey was launched in the initial stage of the pandemic using a convenience sample of 5 913 adults to compare the prevalence of COVID-related probable PTSD versus probable AD. The abridged Impact of Event Scale – Revised (IES-6) assessed the severity of trauma- and stressor-related symptoms over the previous week. Demographic and pandemic-related data (e.g., receiving a formal diagnosis of COVID-19, job loss, loss of a loved one, confinement, material hardship) were collected. A Classification and Regression Tree analysis was conducted to uncover the pandemic experiences leading to clinical 'caseness'. Caseness was defined by a score > 9 on the IES-6 symptom measure and further characterized as PTSD or AD depending on whether the Peritraumatic Distress Inventory's life-threat item was endorsed or not.
Results: The participants were predominantly Caucasian (72.8%), women (79.2%), with a university degree (85%), and a mean age of 42.22 (SD = 15.24) years; 3 647 participants (61.7%; 95% CI [60.4, 63.0]) met the threshold for caseness. However, when perceived life-threat was accounted for, only 6.7% (95% CI [6.1, 7.4]) were classified as PTSD cases and 55% (95% CI [53.7, 56.2]) as AD cases. Among the AD cases, three distinct profiles emerged, marked by the following: (i) a worst personal pandemic experience eliciting intense fear, helplessness or horror (in the absence, however, of any life-threat), (ii) a pandemic experience eliciting sadness/grief, and (iii) worrying intensely about the safety of significant others.
Conclusions: Studies considering the life-threat criterion as met de facto during the pandemic are confusing PTSD for AD on most counts. This misconception is obscuring the various AD-related idioms of distress that have emerged during the pandemic and the actual treatment needs.
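The case definition above is an explicit two-step rule: a respondent is a case if the IES-6 score exceeds 9, and cases are labelled PTSD or AD depending on whether the Peritraumatic Distress Inventory life-threat item is endorsed. A direct transcription of that rule is sketched below; the function and variable names are ours, not the study's, and the example scores are hypothetical.

# Direct transcription of the caseness rule stated in the abstract (names and examples are ours).
# A respondent is a "case" if IES-6 > 9; cases are labelled probable PTSD if the Peritraumatic
# Distress Inventory life-threat item is endorsed, otherwise probable adjustment disorder (AD).

def classify_respondent(ies6_score: float, life_threat_endorsed: bool) -> str:
    if ies6_score <= 9:
        return "non-case"
    return "probable PTSD" if life_threat_endorsed else "probable AD"

print(classify_respondent(14, life_threat_endorsed=False))  # probable AD
print(classify_respondent(14, life_threat_endorsed=True))   # probable PTSD
print(classify_respondent(6, life_threat_endorsed=True))    # non-case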
-
Abstract Temporal variations in concentrations of pathogenic microorganisms in surface waters are well known to be influenced by hydrometeorological events. Reasonable methods for accounting for microbial peaks in the quantification of drinking water treatment requirements need to be addressed. Here, we applied a novel method for data collection and model validation to explicitly account for weather events (rainfall, snowmelt) when concentrations of pathogens are estimated in source water. Online in situ β-d-glucuronidase activity measurements were used to trigger sequential grab sampling of source water to quantify Cryptosporidium and Giardia concentrations during rainfall and snowmelt events at an urban and an agricultural drinking water treatment plant in Quebec, Canada. We then evaluated whether mixed Poisson distributions fitted to monthly sampling data (n = 30 samples) could accurately predict daily mean concentrations during these events. We found that using the gamma distribution underestimated the high Cryptosporidium and Giardia concentrations measured with routine or event-based monitoring, whereas the log-normal distribution accurately predicted these high concentrations. Selecting a log-normal distribution in preference to a gamma distribution increased the annual mean concentration by less than 0.1-log but increased the upper bound of the 95% credibility interval on the annual mean by about 0.5-log. Considering parametric uncertainty in an exposure assessment is therefore essential to account for microbial peaks in risk assessment.
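A minimal sketch of the distribution comparison described above: fit gamma and log-normal distributions to a set of monthly (oo)cyst concentrations and compare the upper tail each implies. The concentration values are hypothetical, and the sketch uses plain maximum-likelihood fits to concentrations rather than the study's hierarchical mixed Poisson formulation.

# Sketch of comparing gamma vs log-normal fits to pathogen concentration data
# (hypothetical values; the study used mixed Poisson models, which this simplifies).
import numpy as np
from scipy import stats

conc = np.array([0.1, 0.2, 0.2, 0.4, 0.5, 0.8, 1.0, 1.5, 2.0, 6.0])  # oocysts/L, hypothetical

# Maximum-likelihood fits with the location parameter fixed at zero
a, _, scale_g = stats.gamma.fit(conc, floc=0)
s, _, scale_ln = stats.lognorm.fit(conc, floc=0)

for name, dist, args in (("gamma", stats.gamma, (a, 0, scale_g)),
                         ("log-normal", stats.lognorm, (s, 0, scale_ln))):
    p99 = dist.ppf(0.99, *args)
    print(f"{name:10s}: mean ~ {dist.mean(*args):.2f} oocysts/L, 99th percentile ~ {p99:.1f} oocysts/L")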
-
Cyanobacterial blooms are a growing concern around the world; bloom events produce toxins as well as taste and odor issues that degrade drinking water quality. A feasible approach for small treatment plants fed by sources contaminated with cyanobacteria is vacuum UV (VUV), a promising advanced oxidation process for treating water impacted by cyanobacterial blooms, with potential applicability in small and remote communities because of its simplicity. In this work, water samples from three Canadian lakes periodically affected by cyanobacteria were used to assess the impact of natural and algal organic matter (NOM/AOM) on treatment with VUV. NOM and AOM were characterized before and after VUV treatment by size exclusion chromatography (SEC) and fluorescence excitation-emission matrix (FEEM) spectroscopy. FEEM spectra were analyzed with the parallel factor analysis (PARAFAC) tool, which yielded seven principal components describing the whole dataset. Disinfection by-product (DBP) formation after VUV treatment was analyzed and the trihalomethane (THM) yield was calculated; THM yield increased by 15-20% after VUV treatment. Relating DBP formation to the NOM/AOM fractions from SEC, we found that humic substances are the fraction most responsible for the increase in DBP formation, with a yield at least 3 times higher than the other fractions (biopolymers, building blocks, low molecular weight acids, and neutrals).
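The abstract reports THM yields without defining the normalization; a common convention is THM formed per unit of dissolved organic carbon (µg THM per mg C). The arithmetic below illustrates that convention with hypothetical numbers, not the paper's measurements, and merely reproduces a change of the same order as the reported 15-20% increase.

# Illustrative THM-yield arithmetic (hypothetical numbers; assumes the common convention of
# normalizing THM formed in a formation-potential test by the sample's dissolved organic carbon).
thm_ug_per_L_raw, doc_mg_per_L_raw = 80.0, 4.0   # before VUV, hypothetical
thm_ug_per_L_vuv, doc_mg_per_L_vuv = 88.0, 3.8   # after VUV, hypothetical

yield_raw = thm_ug_per_L_raw / doc_mg_per_L_raw   # 20.0 ug THM per mg C
yield_vuv = thm_ug_per_L_vuv / doc_mg_per_L_vuv   # ~23.2 ug THM per mg C
print(f"THM yield change after VUV: {100 * (yield_vuv / yield_raw - 1):.0f}%")  # ~16%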
-
In situ phycocyanin (PC) probes have been deployed as a cost-effective and efficient way to monitor cyanobacterial (CB) abundance in drinking water sources and to identify periods of potential risk at drinking water treatment plants (DWTPs). Monitoring CB removal efficacy in near real time by sequentially using a single probe for multiple streams across the treatment plant provides a more useful assessment of the risk of CB breakthrough in treated water. Removal efficacies were measured in three DWTPs using integrated mass fluxes estimated from PC readings and grab-sample total CB biovolume estimates in raw, clarified, filtered, and treated water. Selective removal of CB species during the treatment processes was also evaluated. In addition, relationships between physicochemical parameters (turbidity, pH, dissolved oxygen, conductivity, chlorophyll-a, and temperature) and PC across the treatment processes were investigated. Finally, a framework that uses in situ multi-stream PC monitoring, gathering data across the treatment chain, is proposed to manage the risk of CB cell breakthrough in treated water. Two-hour moving average PC metrics are proposed to provide short-term alerts in raw water and to establish periods of treatment vulnerability or dysfunction, while daily mean PC values can be used to estimate total and process-specific log removals to justify treatment adjustments. Benefits and limitations of the tested in situ probes for the application of alert levels are discussed, and key knowledge gaps for future research and guidance are identified.
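The two metrics named above (a 2-hour moving average of raw-water PC readings for short-term alerts, and daily means used to compute log removals across the treatment train) can be computed in a few lines. The sketch below assumes a regular 10-minute reading interval and a synthetic PC series; it is not the authors' implementation, and the removal fraction is hypothetical.

# Sketch of the two metrics named in the abstract: a 2-hour moving average of raw-water
# phycocyanin (PC) readings and a daily-mean total log removal across the plant
# (assumes a regular 10-minute reading interval and synthetic data; not the authors' code).
import numpy as np
import pandas as pd

idx = pd.date_range("2024-08-01", periods=288, freq="10min")            # 2 days of readings
raw_pc = pd.Series(np.random.default_rng(0).gamma(2.0, 5.0, 288), index=idx)
treated_pc = raw_pc * 0.01                                               # hypothetical 99 % removal

alert_metric = raw_pc.rolling("2h").mean()                               # short-term alert metric
daily_log_removal = np.log10(raw_pc.resample("D").mean() /
                             treated_pc.resample("D").mean())             # total log removal per day

print(f"Max 2-h moving average PC: {alert_metric.max():.1f}")
print(daily_log_removal.round(2).to_dict())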