The Chaudière River, located south of Québec City, is prone to flooding caused by ice jams. Floods have been recorded from 1896 to the present day despite the commissioning, in 1967, of an ice control structure (ICS) 3 kilometres upstream of the town of Saint-Georges-de-Beauce, intended to reduce ice-related flooding in the most at-risk reach of the Chaudière River. Ice-jam floods therefore remain a recurring problem that regularly affects 8 towns along the 90-kilometre reach downstream of the ICS. As part of a government flood-hazard program initiated by the ministère des Affaires municipales et de l'Habitation (MAMH), Université Laval was mandated to assess ice-affected flood levels on the Chaudière River. Ice-jam modelling combined with historical ice-jam observations is used to determine ice-jam flood levels. The adopted approach consists of controlling a river hydraulic simulation model, specifically the HEC-RAS module, with an external Python script to generate a Monte Carlo (MOCA) distribution of ice-jam events along the river reach under study. Mechanical parameters such as the friction angle, porosity and critical shear stress velocities are also randomly assigned by the script within a bounded range. The physical and hydrological parameters assigned to each event are drawn at random according to probabilities estimated from historical observations, namely the discharge computed at the ICS, the ice-jam location, the ice-jam length and the freezing degree-days (ice thickness).
Flood levels for the 2-, 20-, 100- and 350-year return periods are then determined using the empirical Gringorten statistical equation, followed by a modulation to account for external factors not considered by MOCA. These ice-affected flood levels are compared with open-water levels determined by the classical method. The project shows that the computed ice-affected water levels exceed open-water levels for the towns upstream of Saint-Joseph-de-Beauce. Combining ice-affected and open-water levels using the FEMA equation shows that the probability of reaching a given elevation threshold lowers the return period and consequently increases flood probabilities. This thesis is the first scientific work to present a complete validation of the hydrotechnical approach using in situ values of discharge, cumulative freezing degree-days (DJGC), and ice-jam location and length to determine ice-jam flood levels. Flood levels computed with the MOCA method are compared with historical data in the study reach of the Chaudière River. The present study highlights the limitations of, and the conditions required for, the use of this method. This research project also shows for the first time that the hydrotechnical approach can produce ice-affected water level frequency curves suitable for regulatory purposes in Quebec.
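The Gringorten plotting position mentioned above can be sketched in a few lines; a minimal illustration assuming a series of annual maximum water levels ranked from largest to smallest (the sample values below are hypothetical, not results from the study):

```python
# Gringorten plotting position: empirical exceedance probability
# P = (m - 0.44) / (n + 0.12) for rank m (largest event m = 1),
# giving return period T = 1 / P. Sample data are hypothetical.

def gringorten_return_periods(levels):
    """Return (level, return period) pairs, largest event first."""
    n = len(levels)
    ranked = sorted(levels, reverse=True)
    out = []
    for m, level in enumerate(ranked, start=1):
        p_exceed = (m - 0.44) / (n + 0.12)  # Gringorten exceedance probability
        out.append((level, 1.0 / p_exceed))
    return out

# Hypothetical annual maximum ice-affected water levels (m)
levels = [152.1, 151.4, 153.0, 150.8, 151.9]
for level, T in gringorten_return_periods(levels):
    print(f"{level:.1f} m -> T = {T:.1f} yr")
```

With only a few years of record, the largest empirical return period stays small; the study's 350-year level therefore requires the fitted statistical model rather than the raw plotting positions.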
-
Extreme flood events continue to be among the most threatening natural disasters around the world due to their pronounced social, environmental and economic impacts. Changes in the magnitude and frequency of floods have been documented in recent years, and a changing climate is expected to continue affecting their occurrence. Understanding the impacts of climate change through hydroclimatic simulations has therefore become essential to prepare adaptation strategies for the future. However, confidence in flood projections remains low due to the considerable uncertainties associated with their simulation and the complexity of the local features influencing these events. The main objective of this doctoral thesis is thus to improve our understanding of the modelling uncertainties associated with the generation of flood projections and to evaluate strategies to reduce these uncertainties and increase our confidence in flood simulations. To address this objective, the project aimed at (1) quantifying the uncertainty contributions of the different elements involved in the modelling chain used to produce flood projections and (2) evaluating the effects of different strategies to reduce the uncertainties associated with climate and hydrological models in regions with diverse hydroclimatic conditions. A total of 96 basins located in Quebec (basins dominated by snow-related processes) and Mexico (basins dominated by rain-related processes), covering a wide range of climatic and hydrological regimes, were included in the study. The first stage consisted of decomposing the uncertainty contributions of four main uncertainty sources involved in the generation of flood projections: (1) climate models, (2) post-processing methods, (3) hydrological models, and (4) the probability distributions used in flood frequency analyses.
A variance decomposition method was used to quantify and rank the influence of each uncertainty source on floods over the two studied regions and by season. The results showed that the uncertainty contributions of each source vary across regions and seasons. In regions and seasons dominated by rain, climate models were the main uncertainty source, while in those dominated by snowmelt, hydrological models were the main contributor. These findings not only show the dangers of relying on single climate and hydrological models, but also underline the importance of regional uncertainty analyses. The second stage of this research project focused on evaluating strategies to reduce the uncertainties arising from hydrological models in flood projections. This stage included two steps: (1) analyzing the reliability of hydrological model calibration under a changing climate and (2) evaluating the effects of weighting hydrological simulations on flood projections. To address the first step, different calibration strategies were tested and evaluated using five conceptual lumped hydrological models under contrasting climate conditions, with dataset lengths varying from 2 to 21 years. The results revealed that the climatic conditions of the calibration data have a larger impact on hydrological model performance than the length of the climate time series. Moreover, changes in precipitation generally showed greater impacts than changes in temperature across all the studied basins. These results suggest that shorter calibration and validation periods that are more representative of possible changes in climatic conditions could be more appropriate for climate change impact studies. Following these findings, the effects of different weighting strategies based on the robustness of hydrological models (in contrasting climatic conditions) were assessed on the flood projections of the studied basins.
Weighting the five hydrological models based on their robustness showed some improvements over the traditional equal-weighting approach, particularly under warmer and drier conditions. Moreover, the results showed that the difference between these approaches was more pronounced for flood projections, as contrasting flood magnitudes and climate change signals were observed between the two approaches. Additional analyses performed over four selected basins using a semi-distributed, more physically based hydrological model suggested that this type of model might add value when simulating low flows, as well as high flows on small basins (of about 500 km²). These results highlight once again the importance of working with ensembles of hydrological models and present the potential impacts of weighting hydrological models in climate change impact studies. The final stage of this study focused on evaluating the impacts of weighting climate simulations on flood projections. The different weighting strategies tested showed that weighting climate simulations can improve the representation of the mean hydrograph compared to the traditional model “democracy” approach. This improvement was mainly observed with a weighting approach proposed in this thesis that evaluates the skill of the simulated seasonal streamflow against observations. The results also revealed that weighting climate simulations based on their performance can: (1) affect flood magnitudes, (2) affect the climate change signals, and (3) reduce the uncertainty spread of the resulting flood projections. These effects were particularly clear over rain-dominated basins, where climate modelling uncertainty plays the main role. These findings emphasize the need to reconsider the traditional climate model democracy approach, especially when studying processes with higher levels of climatic uncertainty. Finally, the implications of the obtained results were discussed.
This section puts the main findings into perspective and identifies different ways forward to keep improving our understanding of climate change impacts in hydrology and increasing our confidence in the flood projections that are essential to guide adaptation strategies for the future.
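The variance decomposition used to rank uncertainty sources can be illustrated with a toy two-way example; this is a sketch under the assumption of an ANOVA-style split into main effects and an interaction term, with hypothetical flood quantiles (not values from the thesis):

```python
import numpy as np

# Toy two-way variance decomposition: a flood quantile simulated with
# every combination of climate model (rows) and hydrological model
# (columns). Values are hypothetical, not from the thesis.
q = np.array([[110., 118., 150.],
              [95., 103., 135.],
              [130., 138., 170.]])

var_total = q.var()
# Main effects: variance of the row/column means around the grand mean
var_climate = q.mean(axis=1).var()   # spread attributable to climate models
var_hydro = q.mean(axis=0).var()     # spread attributable to hydrological models
var_interaction = var_total - var_climate - var_hydro

frac_climate = var_climate / var_total
frac_hydro = var_hydro / var_total
```

For a balanced design, the total (population) variance splits exactly into the two main effects plus the interaction; the fractions rank the sources, which is the kind of ranking reported per region and season above.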
-
Globally, the number of people at risk from flooding has been increasing since 2000, with populations in the Global South being the most vulnerable. Millions of households are displaced by disasters every year. In 2009, the city of Ouagadougou in Burkina Faso experienced its most disastrous flood ever recorded. In response, the government designed a permanent relocation plan in Yagma, a village located outside the city of Ouagadougou. The relocation plan disrupted the livelihoods of the households affected by the flood, leading many of them to return and rebuild their houses in flood-prone areas. This thesis contributes to a body of literature analyzing the legacy of postcolonialism in the flood vulnerability of the poorer communities of Ouagadougou. Using a political ecology framework, it attempts to understand how the government of Burkina Faso and flood victims understand land and belongings, and how that understanding shaped the relocation program. Interviews with flood victims and government officials revealed that two contrasting views are at work: on the one hand, a perspective based on technical calculations and a neo-colonialist vision of development; on the other, a grounded perspective based on relationships to the land and to each other.
-
While for most rivers ice-cover breakup proceeds from upstream to downstream, breakup in estuaries proceeds from downstream to upstream. Tidal waves travelling up the mouth of an estuary fracture the ice cover into pieces during the rising tide and then carry those pieces out of the estuary during the falling tide. The hydrometeorological conditions affecting this process remain poorly documented and poorly understood. To that end, a two-year field campaign was set up to monitor the spring breakup in four Quebec estuaries. The results showed that thermal weakening of the ice is a necessary precursor to its breakup. Other factors, such as tidal range and discharge, are also important parameters for predicting the type of breakup that will be observed. In parallel, a numerical simulation (HEC-RAS) of a section of the Sainte-Anne River, Quebec, was carried out with the addition of an inflatable-crest dam (at Chute-Panet). This dam would artificially reproduce tidal cycles. The hydraulic criteria show that adding this type of dam would facilitate the breakup and evacuation of the ice cover before it causes flooding in the town of Saint-Raymond. Empirical criteria and the theory of beams on elastic foundations were also used to refine our understanding of breakup in ice-covered rivers. Finally, the third part of this study involves building a physical model of the river in a hydraulics laboratory. Its purpose is to overcome the limitations of the numerical models and to study different dam operation scenarios.
-
In tropical environments such as Brazil, the frequency of rainfall-induced landslides is particularly high because of the rugged terrain, heavy rainfall, increasing urbanization, and the orographic effect of mountain ranges. Since such landslides repeatedly interfere with human activities and infrastructure, improved knowledge of the spatial and temporal prediction of the phenomenon is of interest for risk management. This study analyzes empirical rainfall thresholds, aiming to establish local- and regional-scale correlations between rainfall and the triggering of landslides in Angra dos Reis, in the State of Rio de Janeiro. A statistical analysis combining quantile regression and binary logistic regression was performed on 1640 and 526 landslides triggered by daily rainfall over a 6-year period in the municipality and the urban center of Angra dos Reis, respectively, in order to establish probabilistic rainfall duration thresholds and assess the role of antecedent rainfall. The results show that the frequency of landslides is highly correlated with rainfall events and, surprisingly, that dry-season thresholds are lower than wet-season thresholds. Slope aspect also seems to play an important role, as demonstrated by the different thresholds between the southern and northern regions. Finally, the results presented in this study provide new insight into the spatial and temporal dynamics of landslides and the rainfall conditions leading to their activation in this tropical and mountainous environment.
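The study fits its probabilistic thresholds with quantile and logistic regression; as a much simpler stand-in, a percentile over landslide-triggering daily rainfall conveys the idea of a non-exceedance threshold (the event values below are hypothetical, and this replaces, not reproduces, the study's regression approach):

```python
import numpy as np

# Percentile-based sketch of a rainfall threshold: the daily rainfall
# below which only 5% of landslide-triggering events occurred.
# Hypothetical values, not the Angra dos Reis inventory.
trigger_rain_mm = np.array([35., 52., 60., 48., 90., 120., 75., 66., 44., 58.])

# 5th percentile of triggering rainfall = a 5% exceedance-style threshold
threshold_5pct = np.percentile(trigger_rain_mm, 5)
```

Separate dry- and wet-season samples would yield the seasonal thresholds compared in the abstract.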
-
Resilience has become a cornerstone for risk management and disaster reduction. However, it has evolved extensively, both etymologically and conceptually, over time and across scientific disciplines. The concept has been (re)shaped by the evolution of research and practice efforts. Long considered the opposite of vulnerability, resilience was first defined as the ability to resist, bounce back, cope with, and recover quickly from the impacts of hazards. To avoid a possible return to conditions of vulnerability and exposure to hazards, the notions of post-disaster development, transformation, and adaptation (build back better) and of anticipation, innovation, and proactivity (bounce forward) were then integrated. Today, resilience is characterized by a multitude of components and several classifications. We present a selection of 25 components used to define resilience, and an interesting linkage emerges between these components and the dimensions of risk management (prevention, preparedness, response, and recovery), offering a perspective to strengthen resilience through the development of capacities. Despite its potential, resilience is subject to challenges regarding its operationalization, effectiveness, measurement, credibility, equity, and even its nature. Nevertheless, it offers applicability and opportunities for local communities as well as an interdisciplinary look at global challenges.
-
Seasonal snowpack deeply influences the distribution of meltwater among watercourses and groundwater. During rain-on-snow (ROS) events, the structure and properties of the different snow and ice layers dictate the quantity and timing of water flowing out of the snowpack, increasing the risk of flooding and ice jams. With ongoing climate change, a better understanding of the processes and internal properties influencing snowpack outflows is needed to predict the hydrological consequences of winter melting episodes and increases in the frequency of ROS events. This study develops a multi-method approach to monitor the key snowpack properties in a non-mountainous environment in a repeated and non-destructive way. Snowpack evolution during the winter of 2020–2021 was evaluated using drone-based ground-penetrating radar (GPR) coupled with photogrammetry surveys, conducted at the Ste-Marthe experimental watershed in Quebec, Canada. Drone-based surveys were performed over a 200 m² area with a flat and a sloped section. In addition, time domain reflectometry (TDR) measurements were used to follow water flow through the snowpack and identify drivers of the changes in snowpack conditions observed in the drone-based surveys. The experimental watershed is equipped with state-of-the-art automatic weather stations that, together with weekly snow pit measurements over the ablation period, served as a reference for the multi-method monitoring approach. Drone surveys conducted on a weekly basis were used to generate georeferenced snow depth, density, snow water equivalent and bulk liquid water content maps. Despite some limitations, the results show that the combination of drone-based GPR, photogrammetric surveys and TDR is very promising for assessing the spatiotemporal evolution of the key hydrological characteristics of the snowpack.
For instance, the tested method made it possible to measure marked differences in snowpack behaviour between the first and second weeks of the ablation period. A ROS event that occurred during the first week did not generate significant changes in snowpack density, liquid water content or water equivalent, while another that happened in the second week of ablation generated changes in all three variables. After the second week of ablation, differences in density, liquid water content (LWC) and snow water equivalent (SWE) between the flat and sloped sections of the study area were detected by the drone-based GPR measurements. Comparison between different events was made possible by the contact-free nature of the drone-based measurements.
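The SWE maps described above combine the photogrammetric snow depth and the GPR-derived bulk density; the conversion itself is a one-liner (the values below are illustrative, not survey data):

```python
# Snow water equivalent (SWE) from depth and bulk density:
# SWE [mm of water] = depth [m] * density [kg/m3] / water density [kg/m3] * 1000.
# Illustrative values, not measurements from the Ste-Marthe surveys.

RHO_WATER = 1000.0  # kg/m3

def swe_mm(depth_m, density_kg_m3):
    """SWE in millimetres of water for a snow column."""
    return depth_m * density_kg_m3 / RHO_WATER * 1000.0

# 0.8 m of snow at a bulk density of 350 kg/m3 -> 280 mm of water
print(swe_mm(0.8, 350.0))
```

Applied cell by cell to the depth and density rasters, this yields the georeferenced SWE maps mentioned in the abstract.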
-
Background: Posttraumatic stress disorder (PTSD) has been hailed by some as the emblematic mental disorder of the COVID-19 pandemic, assuming that PTSD's life-threat criterion was met de facto. More plausible outcomes like adjustment disorder (AD) have been overlooked. Methods: An online cross-sectional survey was launched in the initial stage of the pandemic using a convenience sample of 5 913 adults to compare the prevalence of COVID-related probable PTSD versus probable AD. The abridged Impact of Event Scale – Revised (IES-6) assessed the severity of trauma- and stressor-related symptoms over the previous week. Demographic and pandemic-related data (e.g., receiving a formal diagnosis of COVID-19, job loss, loss of a loved one, confinement, material hardship) were collected. A Classification and Regression Tree analysis was conducted to uncover the pandemic experiences leading to clinical 'caseness'. Caseness was defined by a score > 9 on the IES-6 symptom measure and further characterized as PTSD or AD depending on whether the Peritraumatic Distress Inventory's life-threat item was endorsed or not. Results: The participants were predominantly Caucasian (72.8%), women (79.2%), with a university degree (85%), and a mean age of 42.22 (SD = 15.24) years; 3 647 participants (61.7%; 95% CI [60.4, 63.0]) met the threshold for caseness. However, when perceived life-threat was accounted for, only 6.7% (95% CI [6.1, 7.4]) were classified as PTSD cases, and 55% (95% CI [53.7, 56.2]) as AD cases. Among the AD cases, three distinct profiles emerged, marked by the following: (i) a worst personal pandemic experience eliciting intense fear, helplessness or horror (in the absence, however, of any life-threat), (ii) a pandemic experience eliciting sadness/grief, and (iii) worrying intensely about the safety of significant others. Conclusions: Studies considering the life-threat criterion as met de facto during the pandemic are confusing PTSD for AD on most counts. This misconception obscures the various AD-related idioms of distress that have emerged during the pandemic, as well as the actual treatment needs.
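The caseness rule described above reduces to a two-step decision; a minimal sketch, with the function name our own:

```python
# Caseness rule from the abstract: IES-6 score > 9 defines a clinical
# case; among cases, endorsing the PDI life-threat item points to
# probable PTSD, otherwise probable adjustment disorder (AD).

def classify(ies6_score, life_threat_endorsed):
    """Classify a respondent per the abstract's decision rule."""
    if ies6_score <= 9:
        return "non-case"
    return "probable PTSD" if life_threat_endorsed else "probable AD"
```

Applying this rule across the sample is what yields the 6.7% PTSD versus 55% AD split reported in the results.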
-
Recently, in situ YSI EXO2 phycocyanin fluorescence probes have been widely deployed as a means to determine cyanobacterial abundance in drinking water sources, yet few studies have evaluated the effects of natural organic matter (NOM) and ambient water temperature on the probe readings. In this study, Suwannee River NOM was added to laboratory-cultivated cyanobacterial species to test the performance of the phycocyanin probe. The impact of temperature on phycocyanin fluorescence was evaluated by monitoring the laboratory-cultivated cyanobacterial species and extracted phycocyanin pigment. Additionally, in situ phycocyanin fluorescence of field samples from the water intake of a drinking water treatment plant (DWTP) in 2018 was compared with laboratory taxonomic analyses of grab samples. We found that: (1) the presence of Suwannee River NOM decreases cell-bound cyanobacterial phycocyanin readings; (2) increasing ambient water temperature reduces dissolved and cell-bound cyanobacterial phycocyanin readings; (3) field phycocyanin probe readings correlated significantly with total cyanobacterial biovolume (R = 0.73, p < 0.1), and the relationship depends on the biovolume of the dominant cyanobacterial species; (4) phycocyanin probe readings have a strong positive correlation with natural light intensity; and (5) probe users should be fully aware of these sources of interference when interpreting the results and should use the other physico-chemical parameter data generated simultaneously by the instrument to improve the probe's measurements.
-
Health-related concerns about the cyanobacteria-laden sludge of drinking water treatment plants (DWTPs) have been raised in the past few years. Microscopic taxonomy, shotgun metagenomic sequencing, and microcystin (MC) measurement were applied to study the fate of cyanobacteria and cyanotoxins after controlled sludge storage (stagnation) in the dark for 7 to 38 days in a full-scale drinking water treatment plant. On four of eight dates, cyanobacterial cell growth was observed by total taxonomic cell counts during sludge stagnation. The highest observed cell growth was 96% after 16 days of stagnation. Cell growth was dominated by potential MC producers such as Microcystis, Aphanocapsa, Chroococcus, and Dolichospermum. Shotgun metagenomic sequencing revealed that stagnation stress shifts the cyanobacterial communities from the stress-sensitive Nostocales order (e.g., Dolichospermum) towards less compromised orders and potential MC producers such as Chroococcales (e.g., Microcystis) and Synechococcales (e.g., Synechococcus). The relative increase of cyanotoxin producers presents a health challenge when the supernatant of the stored sludge is recycled to the head of the DWTP or discharged into the source. These findings emphasize the importance of a strategy to manage cyanobacteria-laden sludge and suggest that practical approaches should be adopted to control the health and environmental impacts of cyanobacteria and cyanotoxins in sludge.
-
The excessive proliferation of cyanobacteria in surface waters is a widespread problem worldwide, leading to the contamination of drinking water sources. Short- and long-term solutions for managing cyanobacterial blooms are needed for drinking water supplies. The goal of this research was to investigate cyanobacterial community composition using shotgun metagenomics in a short-term, in situ mesocosm experiment in two lakes following coagulation with ferric sulfate (Fe2(SO4)3) as an option for source water treatment. Among the nutrient parameters, dissolved nitrogen was related to Microcystis in both Missisquoi Bay and Petit Lac St. François, while the presence of Synechococcus was related to total nitrogen, dissolved nitrogen, dissolved organic carbon, and dissolved phosphorus. Results from the shotgun metagenomic sequencing showed that Dolichospermum and Microcystis were the dominant genera in all of the mesocosms at the beginning of the sampling period in Missisquoi Bay and Petit Lac St. François, respectively. Potentially toxigenic genera such as Microcystis were correlated with intracellular microcystin concentrations. A principal component analysis showed a change in cyanobacterial composition at the genus level in the mesocosms after two days, which varied across the studied sites and sampling times. Cyanobacterial community richness and diversity did not change significantly after coagulation with Fe2(SO4)3 in the mesocosms at either site. The use of Fe2(SO4)3 for onsite source water treatment should consider its impact on cyanobacterial community structure and the reduction of toxin concentrations.
-
Freshwater bodies and, consequently, the sources of drinking water treatment plants (DWTPs) are increasingly facing toxic cyanobacterial blooms. Even though conventional treatment processes including coagulation, flocculation, sedimentation, and filtration can control cyanobacteria and cell-bound cyanotoxins, these processes may encounter challenges such as inefficient removal of dissolved metabolites and cyanobacterial cell breakthrough. Furthermore, conventional treatment processes may lead to the accumulation of cyanobacterial cells and cyanotoxins in sludge. Pre-oxidation can enhance coagulation efficiency, as it provides the first barrier against cyanobacteria and cyanotoxins and decreases cell accumulation in DWTP sludge. This critical review aims to: (i) evaluate the state of the science of cyanobacteria and cyanotoxin management throughout DWTPs, as well as their associated sludge, and (ii) develop a decision framework to manage cyanobacteria and cyanotoxins in DWTPs and sludge. The review identified that lab-culture-based pre-oxidation studies may not represent real bloom pre-oxidation efficacy. Moreover, the application of a common exposure unit, CT (residual concentration × contact time), provides a proper understanding of cyanobacteria pre-oxidation efficiency. Recently reported challenges concerning cyanobacterial survival and growth in sludge, alongside cell lysis and cyanotoxin release, have raised health and technical concerns with regard to sludge storage and the recycling of sludge supernatant to the head of DWTPs. According to the review, oxidation has not been identified as a feasible option to handle cyanobacteria-laden sludge due to low cell and cyanotoxin removal efficacy. Based on the reviewed literature, a decision framework is proposed to manage cyanobacteria and cyanotoxins and their associated sludge in DWTPs.
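The CT exposure unit mentioned above multiplies residual oxidant concentration by contact time; for a residual decaying by first-order kinetics, CT is the time integral of C(t) = C0·e^(−kt). A sketch with illustrative parameter values (not data from the review):

```python
import math

# CT exposure for an exponentially decaying oxidant residual:
# C(t) = c0 * exp(-k*t), so CT = integral of C(t) from 0 to t
#      = c0/k * (1 - exp(-k*t)), in mg.min/L.
# Parameter values below are illustrative only.

def ct_first_order(c0_mg_L, k_per_min, t_min):
    """CT in mg.min/L for a first-order decaying residual."""
    return c0_mg_L / k_per_min * (1.0 - math.exp(-k_per_min * t_min))

# 2 mg/L initial residual decaying at k = 0.05 /min over 30 min
ct = ct_first_order(2.0, 0.05, 30.0)  # less than the no-decay product 2 * 30
```

Accounting for decay matters because the simple product of initial concentration and contact time overstates the true exposure.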
-
The impact of oxidation on the mitigation of cyanobacteria and cyanotoxins in drinking water treatment sludge was investigated at the laboratory and treatment plant scales. Two common oxidants, KMnO4 (5 and 10 mg/L) and H2O2 (10 and 20 mg/L), were applied under controlled steady-state conditions. Non-oxidized and oxidized sludge was left to stagnate in the dark for 7 to 38 days. Controlled laboratory trials showed that KMnO4 and H2O2 decreased cell counts by up to 62% and 77%, respectively. The maximum total microcystin (MC) level reduction achieved after oxidation was 41% and 98% using 20 mg/L H2O2 and 10 mg/L KMnO4, respectively. Stagnation caused cell growth of up to 2.6-fold in 8 out of 22 oxidized samples. MC-producing orders such as Chroococcales and Synechococcales were persistent, while Nostocales was sensitive to the combined oxidation and stagnation stresses. In parallel, two on-site shock oxidation treatments were performed in the DWTP's sludge holding tank using 10 mg/L KMnO4. On-site shock oxidation decreased taxonomic cell counts by up to 43% within 24 h. Stagnation preceded by on-site shock oxidation could increase total cell counts by up to 55% compared with oxidation alone. The increase in cell counts and mcyD gene copy numbers during stagnation revealed the impact of oxidation/stagnation on cyanobacterial cell growth. These findings show the limitations of sludge oxidation as a strategy to manage cyanobacteria and cyanotoxins in sludge and suggest that alternative approaches should be considered to prevent the accumulation of cyanobacteria in sludge and to mitigate their impacts.
-
The temperate mixedwood forests of eastern North America have been managed by partial cutting for several decades. To ensure that regeneration contributes to replacing the commercial-size stems that are removed by partial cutting, forest managers need to anticipate how saplings (i.e., regenerating trees with a diameter at breast height >1.0 cm) develop in terms of number and diameter. Using up to 20 years of monitoring data from three study sites, we developed a transition matrix model to predict the future number of saplings and their diameter distribution for mixed yellow birch (Betula alleghaniensis Britton) – conifer stands. Our results show that partial cutting allowed yellow birch, red maple (Acer rubrum L.), red spruce (Picea rubens Sarg.), and balsam fir (Abies balsamea (L.) Mill.) saplings to reach merchantable size faster and in greater numbers than in untreated control plots. We also found that fewer hardwood saplings (yellow birch and red maple) than softwood saplings (red spruce and balsam fir) were required to produce 1 m²·ha⁻¹ of merchantable basal area after 20–40 years. Finally, our model provides a tool for forest managers to predict sapling development in mixed hardwood and softwood stands over a full cutting cycle.
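A transition matrix model of the kind described projects the sapling diameter distribution forward by repeated matrix multiplication; a toy three-class sketch with hypothetical transition probabilities (not the fitted values from the study):

```python
import numpy as np

# Toy stage-structured transition model: saplings in three diameter
# classes either stay, grow into the next class, or die over one time
# step. All probabilities are hypothetical.

# Columns: from-class; rows: to-class. Column sums < 1 encode mortality.
M = np.array([
    [0.6, 0.0, 0.0],   # stay small
    [0.3, 0.7, 0.0],   # small -> medium, stay medium
    [0.0, 0.2, 0.9],   # medium -> merchantable, stay merchantable
])

n0 = np.array([1000.0, 400.0, 50.0])    # stems/ha per diameter class
n1 = M @ n0                              # distribution after one step
n5 = np.linalg.matrix_power(M, 5) @ n0   # distribution after five steps
```

Iterating the matrix over a full cutting cycle gives the predicted number of stems reaching merchantable size, which is the quantity the managers in the abstract need.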
-
Snow is the dominant form of precipitation and the main cryospheric feature of the High Arctic (HA), covering its land, sea, lake and river ice surfaces for a large part of the year. The snow cover in the HA is involved in climate feedbacks that influence the global climate system, and it greatly impacts the hydrology and ecosystems of the coldest biomes of the Northern Hemisphere. The ongoing global warming trend and its polar amplification are threatening the long-term stability of the snow cover in the HA. This study presents an extensive review of the literature on observed and projected snow cover conditions in the High Arctic region. Several key snow cover metrics were reviewed, including snowfall, snow cover duration (SCD), snow cover extent (SCE), snow depth (SD), and snow water equivalent (SWE), since 1930, based on in situ, remote sensing and simulation results. Changes in snow metrics were reviewed and outlined from the continental to the local scale. The reviewed snow metrics displayed different sensitivities to past and projected changes in precipitation and air temperature. Despite the overall increase in snowfall, both observed in historical data and projected into the future, some snow cover metrics displayed consistent decreasing trends, with SCE and SCD showing the most widespread and steady decreases over the last century in the HA, particularly in the spring and summer seasons. However, snow depth and, in some regions, SWE have mostly increased; nevertheless, both SD and SWE are projected to decrease by 2030. By the end of the century, the extent of Arctic spring snow cover is projected to be considerably smaller than today (by 10–35%). Model simulations project higher winter snowfall, higher or lower maximum snow depth depending on the region, and a shortened snow season by the end of the century.
The spatial patterns of snow metric trends for both historical and projected climates exhibit noticeable asymmetry among the different HA sectors, with the largest observed and anticipated changes occurring over the Canadian HA.
-
Reduced snow storage has been associated with lower river low flows in mountainous catchments, exacerbating summer hydrological droughts. However, the impacts of changing snow storage on summer low flows in low-elevation, snow-affected catchments have not yet been investigated. To address this knowledge gap, the dominant hydroclimate predictors of summer low flows were first identified through correlation analysis in 12 tributary catchments of the St. Lawrence River in the Canadian province of Quebec. The correlation results show that summer low flow is most sensitive to summer rainfall, while maximum snow water equivalent (SWE) is the dominant winter preconditioning factor of low flows, particularly at the end of summer. The multivariate sensitivity of summer low flow to hydroclimate predictors was then quantified by multilevel regression analysis, also considering the effect of catchment biophysical attributes. Accumulated rainfall since snow cover disappearance was found to be the prime control on summer low flow, as expected for the humid climate of Quebec. Maximum SWE had a secondary but significant positive influence on low flow, sometimes of the same order as the negative effect of evapotranspiration losses. As a whole, our results show that in these low-elevation catchments, thicker winter snowpacks that last longer and melt more slowly in the spring are conducive to higher low flows in the following summer. More rugged and forested catchments with coarser soils were found to have higher summer low flows than flatter agricultural catchments with compacted clayey soils. This emphasizes the role of soils and geology in infiltration, aquifer recharge and the related river baseflow in summer. Further climate warming and snowpack depletion could reduce future summer low flows, exacerbating hydrological droughts and impacting ecosystem integrity and ecological services.
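The first step described above, correlating summer low flow with candidate hydroclimate predictors, can be sketched as follows; the series are hypothetical, not data from the 12 catchments:

```python
import numpy as np

# Correlation screening of summer low flow against two candidate
# predictors: summer rainfall and maximum SWE. Hypothetical series.
low_flow = np.array([4.2, 3.1, 5.0, 2.8, 4.6, 3.5])       # m3/s
summer_rain = np.array([210., 150., 260., 130., 240., 170.])  # mm
max_swe = np.array([180., 140., 200., 120., 210., 160.])      # mm

# Pearson correlation coefficients between low flow and each predictor
r_rain = np.corrcoef(low_flow, summer_rain)[0, 1]
r_swe = np.corrcoef(low_flow, max_swe)[0, 1]
```

Ranking predictors by such coefficients (per catchment, and by period within the summer) is what identifies rainfall as the primary control and maximum SWE as the winter preconditioning factor.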