Your search
Results: 21 resources
-
This paper presents a new framework for floodplain inundation modeling in an ungauged basin using unmanned aerial vehicle (UAV) imagery. The method is based on the integrated analysis of high-resolution ortho-images and elevation data produced by structure-from-motion (SfM) photogrammetry. To this end, Flood-Level Marks (FLMs) were created from high-resolution UAV ortho-images and compared to the flood inundation areas simulated with the HEC-RAS hydraulic model. The flood quantiles for the 25-, 50-, 100-, and 200-year return periods were then estimated from synthetic hydrographs using the Natural Resources Conservation Service (NRCS) method. The proposed method was applied to UAV image data collected from the Khosban village, in Taleghan County, Iran, in the ungauged sub-basin of the Khosban River. The study area extends along one kilometre of the river in the middle of the village. The results showed that the flood inundation areas modeled by HEC-RAS were 33%, 19%, and 8% smaller than those estimated from the UAV FLMs for the 25-, 50-, and 100-year return periods, respectively. For the 200-year return period, the model overestimated the inundation area by more than 6% compared to the UAV FLMs. The maximum flood depth in our four proposed hydraulic model scenarios varied between 2.33 and 2.83 metres. These analyses showed that this UAV-imagery-based method is well suited to improving hydraulic modeling of seasonal inundation in ungauged rivers, thus providing reliable support for flood mitigation strategies.
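The NRCS synthetic-hydrograph step can be illustrated with the standard SCS triangular unit-hydrograph relations; a minimal sketch (the catchment numbers below are hypothetical, and the abstract does not state which NRCS variant was used):

```python
def nrcs_peak_discharge(area_km2, runoff_mm, tc_hr, duration_hr):
    """Peak discharge (m3/s) of the NRCS/SCS triangular synthetic
    hydrograph: time to peak Tp = D/2 + 0.6*Tc, and peak flow
    Qp = 0.208 * A * Q / Tp (SI form of the 484 conversion factor)."""
    tp_hr = duration_hr / 2.0 + 0.6 * tc_hr
    return 0.208 * area_km2 * runoff_mm / tp_hr

# Hypothetical 10 km2 sub-basin, 25 mm of effective rainfall,
# 1.5 h time of concentration, 0.5 h rainfall block:
qp = nrcs_peak_discharge(10.0, 25.0, 1.5, 0.5)
```

With these inputs the peak works out to roughly 45 m³/s; in the paper's workflow, each HEC-RAS scenario would be driven by a hydrograph scaled to the corresponding flood quantile.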
-
Geohazards associated with the dynamics of the liquid and solid water of the Earth’s hydrosphere, such as floods and glacial processes, may pose significant risks to populations, activities and properties [...]
-
Abstract A modified hybrid terrain-following vertical coordinate has recently been implemented within the Global Environmental Multiscale atmospheric model that introduces separately controlled height-dependent progressive decaying of the small- and large-scale orography contributions on the vertical coordinate surfaces. The new vertical coordinate allows for a faster decay of the finescale orography imprints on the coordinate surfaces with increasing height while relaxing the compression of the lowest model levels over complex terrain. A number of tests carried out—including experiments involving Environment and Climate Change Canada’s operational regional and global deterministic prediction systems—demonstrate that the new vertical coordinate effectively eliminates terrain-induced spurious generation and amplification of upper-air vertical motion and kinetic energy without increasing the computational cost. Results also show potential improvements in precipitation over complex terrain.
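The height-dependent, scale-separated decay can be pictured with a toy coordinate in which the orography is split into large- and small-scale parts, each decaying with its own exponent; this is an illustrative sketch, not the GEM formulation (the functional form, exponents, and values are assumptions):

```python
def coordinate_height(eta, z_top, zs_large, zs_small, r_large=1.5, r_small=6.0):
    """Toy terrain-following height of a coordinate surface.

    eta: terrain-following level in (0, 1], 1 at the surface, 0 at z_top.
    Each orography imprint decays as eta**r; a larger exponent for the
    small-scale part makes its imprint vanish faster with height, while
    the large-scale part relaxes more slowly.
    """
    flat = (1.0 - eta) * z_top  # height of the surface with no terrain
    return flat + eta**r_large * zs_large + eta**r_small * zs_small

# At the surface (eta = 1) the full orography is felt; at mid-levels the
# small-scale imprint is already much weaker than the large-scale one.
h_mid = coordinate_height(0.5, 30000.0, 800.0, 200.0)
```

The design intent mirrors the abstract: finescale terrain imprints leave the coordinate surfaces quickly with height, suppressing spurious upper-air vertical motion, while the lowest levels are not over-compressed.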
-
This dataset contains key characteristics about the data described in the Data Descriptor "A comprehensive, multisource database for hydrometeorological modeling of 14,425 North American watersheds". Contents: 1. a human-readable metadata summary table in CSV format; 2. a machine-readable metadata file in JSON format.
-
This dataset contains key characteristics about the data described in the Data Descriptor "A global database of Holocene paleotemperature records". Contents: 1. a human-readable metadata summary table in CSV format; 2. a machine-readable metadata file in JSON format.
-
This study details the enhancement and calibration of the Arctic implementation of the HYdrological Predictions for the Environment (HYPE) hydrological model established for the BaySys group of projects to produce freshwater discharge scenarios for the Hudson Bay Drainage Basin (HBDB). The challenge in producing estimates of freshwater discharge for the HBDB is that it spans over a third of Canada’s continental landmass and is 40% ungauged. Scenarios for BaySys require the separation between human and climate interactions, specifically the separation of regulated river discharge from a natural, climate-driven response. We present three key improvements to the modelling system required to support the identification of natural from anthropogenic impacts: representation of prairie disconnected landscapes (i.e., non-contributing areas), a method to generalize lake storage-discharge parameters across large regions, and frozen soil modifications. Additionally, a unique approach to account for irregular hydrometric gauge density across the basins during model calibration is presented that avoids overfitting parameters to the densely gauged southern regions. We summarize our methodologies used to facilitate improved separation of human and climate driven impacts to streamflow within the basin and outline the baseline discharge simulations used for the BaySys group of projects. Challenges remain for modeling the most northern reaches of the basin, and in the lake-dominated watersheds. The techniques presented in this work, particularly the lake and flow signature clusters, may be applied to other high latitude, ungauged Arctic basins. Discharge simulations are subsequently used as input data for oceanographic, biogeochemical, and ecosystem studies across the HBDB.
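The lake storage–discharge generalization can be sketched as a power-law rating shared across a cluster of lakes; the functional form and parameter values below are illustrative assumptions, not HYPE's calibrated parameters:

```python
def lake_outflow(stage_m, threshold_m, rate_coeff, exponent):
    """Power-law lake rating: Q = k * (h - h0)**p above an outlet
    threshold h0, and zero below it. One (k, p) pair is shared by all
    lakes in a cluster instead of being fitted lake by lake."""
    head = stage_m - threshold_m
    return rate_coeff * head**exponent if head > 0 else 0.0

# Illustrative cluster parameters: k = 2.5, p = 1.8
q = lake_outflow(stage_m=3.4, threshold_m=3.0, rate_coeff=2.5, exponent=1.8)
```

Clustering lakes by flow signature and fitting one parameter pair per cluster keeps the parameter count manageable across thousands of ungauged lakes.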
-
Fluvial systems in southern Ontario are regularly affected by widespread early-spring flood events primarily caused by rain-on-snow events. Recent studies have shown an increase in winter floods in this region due to increasing winter temperature and precipitation. Streamflow simulations are associated with uncertainties arising mainly from the choice of greenhouse gas emission scenario, global climate model (GCM), or hydrological model. The internal variability of climate, defined as the chaotic variability of atmospheric circulation due to natural internal processes within the climate system, is another source of uncertainty to consider. Uncertainties from internal variability can be assessed using hydrological models fed by downscaled data of a global climate model large ensemble (GCM-LE), but GCM outputs are of too coarse a scale to be used directly in hydrological modeling. The Canadian Regional Climate Model Large Ensemble (CRCM5-LE), a 50-member ensemble downscaled from the Canadian Earth System Model version 2 Large Ensemble (CanESM2-LE), was developed to simulate local climate variability over northeastern North America under different future climate scenarios. In this study, CRCM5-LE temperature and precipitation projections under the RCP8.5 scenario were used as input to the Precipitation Runoff Modeling System (PRMS) to simulate streamflow at a near-future horizon (2026–2055) for four watersheds in southern Ontario. To investigate the role of the internal variability of climate in the modulation of streamflow, the 50 members were first grouped into classes of similar projected changes in January–February streamflow, temperature, and precipitation between 1961–1990 and 2026–2055. Then, the regional change in geopotential height (Z500) from CanESM2-LE was calculated for each class.
Model simulations showed an average January–February increase in streamflow of 18 % (±8.7) in Big Creek, 30.5 % (±10.8) in Grand River, 29.8 % (±10.4) in Thames River and 31.2 % (±13.3) in Credit River. A total of 14 % of all ensemble members projected positive Z500 anomalies in North America's eastern coast enhancing rain, snowmelt and streamflow volume in January–February. For these members the increase of streamflow is expected to be as high as 31.6 % (±8.1) in Big Creek, 48.3 % (±11.1) in Grand River, 47 % (±9.6) in Thames River and 53.7 % (±15) in Credit River. Conversely, 14 % of the ensemble projected negative Z500 anomalies in North America's eastern coast and were associated with a much lower increase in streamflow: 8.3 % (±7.8) in Big Creek, 18.8 % (±5.8) in Grand River, 17.8 % (±6.4) in Thames River and 18.6 % (±6.5) in Credit River. These results provide important information to researchers, managers, policymakers and society about the expected ranges of increase in winter streamflow in a highly populated region of Canada, and they will help to explain how the internal variability of climate is expected to modulate the future streamflow in this region.
-
Traditional hydrological models do not impose the energy-conservation constraint at the surface. When subjected to higher temperatures, they can overestimate evapotranspiration. The physically based land-surface model CLASS is coupled to the RAPID routing model, based on the Muskingum method, to form a hydrological model that is more robust in a context of global warming. CLASS-RAPID is implemented on the Haut-Montmorency watershed (47.4°N, 71.1°W). CLASS is calibrated and validated using hydrometeorological observations at the Forêt Montmorency; RAPID is optimized against discharge observations from the Direction d'expertise hydrique du Québec. Climate projections from the CanESM2, CNRM-CM5, GFDL-ESM2M and MPI-ESM models of the Coupled Model Intercomparison Project, under the RCP 4.5 and RCP 8.5 climate scenarios, are supplied as input to CLASS-RAPID to produce hydrological simulations for the future period 2041–2070. Climate projections from the same models for the 1981–2005 reference period are also used by CLASS-RAPID to generate a discharge sequence that can be compared with that of the future period. CLASS-RAPID achieves a Nash–Sutcliffe performance score of NSE = 0.66. The model faithfully reproduces the sequence of hydrological events but systematically underestimates flood peaks. The CLASS-RAPID simulations under climate-change conditions project that spring floods will occur several tens of days earlier in the 2041–2070 future period compared with the reference period. For the four models studied, the climate-change simulations predict an average decrease in summer low flows of 40% under the RCP 4.5 climate scenario and 50% under the RCP 8.5 climate scenario.
For the same climate scenarios, the Atlas hydroclimatique du Québec, which relies on traditional hydrological modeling, projects decreases in flows of 37% and 45%, respectively.
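The NSE = 0.66 score quoted above is the Nash–Sutcliffe efficiency; for reference, a minimal implementation (the flow series below are made up):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((o-s)^2) / sum((o-mean(o))^2).
    NSE = 1 is a perfect fit; NSE = 0 is no better than predicting the
    observed mean; NSE < 0 is worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Made-up daily flows (m3/s):
obs = [12.0, 35.0, 80.0, 55.0, 30.0]
sim = [10.0, 35.0, 65.0, 50.0, 33.0]
score = nash_sutcliffe(obs, sim)
```

Note how the underestimated peak (65 vs 80) dominates the error term, which is consistent with the abstract's remark that underestimating flood peaks degrades the score.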
-
Reliable long-term streamflow forecasts are essential in water resources management and play a key role in reservoir management and hydropower generation. Properly framing the uncertainty is the key issue in providing a reliable long-term streamflow forecast, and probabilistic forecasts have been used to this effect. In a probabilistic approach, each historical observation is taken as a possible realization of the future. Non-stationarity of hydrometeorological variables, due either to the internal variability of climate or to anthropogenic change, is another important problem for long-term streamflow forecasting, as it is becoming increasingly clear that past historical data may not adequately represent the current climate. There is therefore a need to develop flexible approaches that take non-stationarity into account in long-term streamflow forecasting. Resampling past historical time series is the main approach used for probabilistic long-term streamflow forecasts; however, non-stationarity is a key issue for resampling approaches. One possible alternative is to couple a stochastic weather generator to a hydrological model to generate long-term probabilistic streamflow forecasts. Weather generators can easily be modified to account for climatic trends and therefore have the potential to take non-stationarity into account. Before they can be modified in this way, however, it is first necessary to evaluate whether the modeling chain consisting of a stochastic weather generator and a hydrological model can generate probabilistic streamflow forecasts with a performance similar to that of more traditional resampling approaches. The first objective of this study is therefore to compare the performance of a stochastic weather generator against that of resampling historical meteorological time series in order to produce ensemble streamflow forecasts.
Results indicate that while there are differences between the two methods, they nevertheless perform largely similarly, showing that weather generators can be used as substitutes for resampling the historical past. Based on these results, two approaches for taking non-stationarity into account are proposed. Both are based on a climate-driven perturbation of the stochastic weather generator parameters. The first approach explores a simple perturbation method in which the entire length of the historical record is used to quantify internal variability, while a subset of recent years is used to characterize mean climatic values for precipitation and minimum and maximum temperatures. Results show that the approach systematically improves long-term streamflow forecast accuracy, and that results depend on the time window used to estimate current mean climatic values. The second approach conditions the parameters of the stochastic weather generator on large-scale climate indices. In this approach, the most important climate indices are identified by examining yearly correlations between a set of 40 indices and precipitation and temperature. A linear model is then constructed to identify precipitation and temperature anomalies, which are used to perturb the stochastic weather generator. Five different time windows are defined to determine the optimal linear model. Results show that temperatures are significantly correlated with large-scale climate indices, whereas precipitation is only weakly related to the same indices. The length of the time window has a considerable impact on the predictive ability of the linear models: precipitation models based on short time windows performed better than those based on longer windows, while the reverse was found for temperature models. Results show that the proposed method improves long-term streamflow forecasting, particularly around the spring flood.
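The first perturbation approach amounts to re-centring the generator on a recent-window mean while keeping full-record variability; a schematic sketch for a single normally distributed variable (the series, window length, and distributional choice are all assumptions for illustration):

```python
import random

random.seed(42)  # reproducible illustration

def perturbed_normal_params(full_record, recent_window):
    """Re-centred generator parameters: the mean comes from the recent
    window (current climate) while the standard deviation comes from the
    full record (internal variability)."""
    mean_recent = sum(recent_window) / len(recent_window)
    mean_full = sum(full_record) / len(full_record)
    var_full = sum((x - mean_full) ** 2 for x in full_record) / len(full_record)
    return mean_recent, var_full ** 0.5

# Hypothetical annual mean temperatures (degC) with a warming trend;
# the last 10 years serve as the "current climate" window.
record = [5.0 + 0.03 * i + random.gauss(0.0, 0.8) for i in range(40)]
mu, sigma = perturbed_normal_params(record, record[-10:])
```

As the study notes, the result is sensitive to the window length: a short window tracks the current climate closely but estimates its mean noisily.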
-
While impressive results have been achieved in the well-known fields where Deep Learning enabled breakthroughs, such as computer vision, its impact on older areas is still vastly unexplored. In Computational Fluid Dynamics, and especially in flood modeling, many phenomena are very high-dimensional, and predictions require numerical simulations, which, while robust and well tested, are computationally heavy and may not prove useful in the context of real-time prediction. This issue has led to various attempts at developing Reduced-Order Modeling techniques, both intrusive and non-intrusive. One recent relevant addition is the combination of Proper Orthogonal Decomposition with Deep Neural Networks (POD-NN). Yet, to our knowledge, little has been done to implement uncertainty-aware regression tools within the POD-NN framework. In this work, we compare novel methods addressing uncertainty quantification in Neural Networks, pushing forward the POD-NN concept with Deep Ensembles and Bayesian Neural Networks, which we first test on benchmark problems and then apply to a real-life application: flooding predictions in the Mille-Îles river in Laval, QC, Canada. Building a non-intrusive surrogate model that is able to know when it doesn't know remains an open research area as far as neural networks are concerned.
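The POD half of POD-NN is a truncated SVD of a snapshot matrix; a minimal numpy sketch with synthetic data (the regression network that maps inputs to the reduced coefficients, and any uncertainty treatment, are omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshot matrix: each column is one simulated flow field
# (200 spatial points x 50 snapshots, with an intrinsic rank of 3).
modes_true = rng.standard_normal((200, 3))
coeffs_true = rng.standard_normal((3, 50))
snapshots = modes_true @ coeffs_true

# POD: the left singular vectors are the spatial modes; keep the first r.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 3
pod_basis = U[:, :r]                      # (200, r) reduced basis

# Any snapshot is then represented by r coefficients; in POD-NN, a neural
# network learns the map from input parameters to these coefficients.
coeffs = pod_basis.T @ snapshots          # (r, 50) projection
reconstruction = pod_basis @ coeffs
```

Because the synthetic data are exactly rank 3, the rank-3 reconstruction recovers the snapshots to machine precision; on real flow fields, r is chosen from the singular-value decay and the reconstruction is approximate.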
-
ABSTRACT: Coastal flood events are increasing on the shores of the St. Lawrence River due to storms, rising sea levels and decreasing sea ice. To date, the province of Québec does not have a coastal flood mapping guideline. In this thesis, a coastal flood mapping approach is developed by integrating waves, water levels and beach morphology of the Estuary and Gulf of St. Lawrence (EGSL). Two types of cartographic approaches were compared: the empirical simulation that projects a static total level overland (observed water level + wave effect on the coast, known as wave runup), and the numerical model XBeach in surfbeat mode. These two approaches require a precise and updated topo-bathymetric surface of the beach. Through the development of a shore-based video monitoring network, we first evaluate the effectiveness of a video intertidal topography method against terrestrial LiDAR surveys, and improve its performance by integrating water levels near the beach as a proxy for beach contour elevation. This project enabled the creation of centimetre-scale topographic surfaces comparable to LiDAR and the extraction of morphological parameters, such as the beach slope, necessary for empirical runup models.
The ability of both mapping approaches to simulate the flood of December 6, 2010 in Bas-Saint-Laurent was then analyzed by comparing flooded areas. Spatial correspondence between simulations and the observed flood extent was evaluated. Despite the complexity of XBeach and a slight over-prediction by the empirical model (36%), the flooded areas obtained by the two approaches are similar and correctly predicted at a rate of 66–78%. In a third study, XBeach was also used in Chaleur Bay to assess the impact of an extreme event at the 2100 horizon on coastal flood hazards. The simulations show that the overland flow was generated by waves of relatively low amplitude at the coast (Hs < 1 m) and that, despite water depths close to 1.2 m, high current velocities occurred in the urbanized areas (U > 2 m/s). The analysis of the flood maps in Maria suggests that by 2100, the impact of sea level rise on coastal communities of the St. Lawrence could lead to larger flooded areas, with greater water depths and higher flow velocities, intensifying the hazard faced by the population. Although numerical simulations offer an understanding of the physical phenomena that cause coastal flooding, the interest of the static method lies in its convenience; its effectiveness, however, depends on the validity and applicability of the empirical runup models employed. Thus, the last part of the thesis deals with the parameterization of an empirical runup model adapted to the EGSL. Video-based observations of wave runup (and of its mean and high-frequency components, setup and swash, respectively) on 5 beaches were carried out over a broad spectrum of environmental parameters and coast types over a period of 3 years. Correlation analyses between coastal water levels (runup, setup, and swash), offshore wave characteristics, and beach slope were performed. The results show that the influence of the hydrodynamic parameters on runup, setup, and swash is similarly parameterized.
The role of the beach morphology in the setup is, however, parameterized by an inverse function of the slope, while the swash is a function of the square root of the slope. With an average error of 23 cm and a 2 cm bias, the proposed runup equation offers high potential for estimating total water levels over diverse fetch-limited coastal environments. This thesis shows that it is appropriate to use a static approach to identify the areas most vulnerable to coastal flooding, as long as the equation used is validated for the specific coastal environment. By combining this approach with numerical modeling in coastal hotspots with multiple issues at stake, it will be possible to introduce a first coastal flood zoning in the province of Québec. -- Keywords: Coastal flooding, Runup, Video-derived topography, Infragravity waves, XBeach.
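The reported slope dependencies can be written schematically as follows; the coefficients and the way setup and swash are combined below are hypothetical placeholders, not the thesis equation:

```python
import math

def runup_sketch(h0, l0, slope, c_setup=0.02, c_swash=0.6):
    """Schematic fetch-limited runup: the setup term scales with the
    inverse of the beach slope and the swash term with its square root,
    as reported in the thesis; c_setup, c_swash and the setup + swash/2
    combination are placeholders, not the fitted equation.

    h0: offshore significant wave height (m); l0: deep-water wavelength (m).
    """
    wave_scale = math.sqrt(h0 * l0)
    setup = c_setup * wave_scale / slope
    swash = c_swash * math.sqrt(slope) * wave_scale
    return setup + swash / 2.0

# Invented conditions: Hs = 0.8 m, L0 = 25 m, beach slope = 0.08
r2 = runup_sketch(h0=0.8, l0=25.0, slope=0.08)
```

The opposing slope dependencies mean a milder beach raises the setup contribution while lowering the swash contribution, which is why a single slope exponent cannot fit both components.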
-
Multi-cohort forest management in northern hardwood stands may well be the best way to successfully regenerate tree species of intermediate shade tolerance, such as yellow birch (Betula alleghaniensis Britt.). The creation of large enough gaps in the canopy favors increased light availability within the opening, while soil scarification provides suitable germination seedbeds. Evidence of these methods’ success nonetheless remains mostly the purview of experimental studies rather than operational tests. In Quebec, Canada, the multi-cohort methods promoted include group selection cutting and patch cutting. The present study tested their implementation at an operational scale and over a large territory in both hardwood-dominated and mixedwood stands. We assessed their efficacy in promoting natural regeneration of commercial hardwood trees, notably yellow birch and sugar maple (Acer saccharum Marsh.). We conducted regeneration surveys at 2, 5, 10, and 15 years after harvest. Overall, group selection and patch cuttings were successful in regenerating the target species. Yellow birch, for instance, showed a mean stocking around 60% and a mean sapling density around 3,400 stems ha⁻¹ after 15 years. We compared several variables for measuring regeneration in early years, and found that the relative abundance, the stocking based on one stem per sampling unit, and the mean maximum height were good predictors of the relative presence of yellow birch and sugar maple in 15-year-old canopy openings. Using smaller sampling units (6.25 m² rather than 25 m²) and waiting until year 5 may be more useful for making such predictions. In addition, there was an important turnover in vertical dominance in these openings. Non-commercial woody competitors were frequently dominant in early years but were often replaced by commercial hardwoods, notably yellow birch.
We propose certain thresholds for assessing the success of post-harvest regeneration and for evaluating the need for a cleaning treatment.
-
This study examines the hydrological sensitivity of an agroforested catchment to changes in temperature and precipitation. A physically based hydrological model was created using the Cold Regions Hydrological Modelling platform to simulate the hydrological processes over 23 years in the Acadie River Catchment in southern Quebec. The observed air temperature and precipitation were perturbed linearly based on existing climate change projections, with warming of up to 8 °C and an increase in total precipitation of up to 20%. The results show that warming causes a decrease in blowing snow transport and in sublimation losses from blowing snow, canopy-intercepted snowfall and the snowpack. Decreasing blowing snow transport leads to reduced spatial variability in peak snow water equivalent (SWE) and a more synchronized snow cover depletion across the catchment. A 20% increase in precipitation is not sufficient to counteract the decline in annual peak SWE caused by a 1 °C warming. On the other hand, peak spring streamflow increases by 7% and occurs 20 days earlier with a 1 °C warming and a 20% increase in precipitation. However, when warming exceeds 1.5 °C, the catchment becomes more rainfall dominated, and the peak flow and its timing follow the rainfall rather than the snowmelt regime. Results from this study can be used for sustainable farming development and planning in regions with hydroclimatic characteristics similar to the Acadie River Catchment, where climate change may have a significant impact on the dominating hydrological processes.
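The linear ("delta change") perturbation of the observed forcing can be sketched as follows, using one of the increments explored in the study (+1 °C warming, +20% precipitation); the short series is invented:

```python
def delta_change(temps_c, precips_mm, warming_c, precip_factor):
    """Apply a uniform additive temperature offset and a multiplicative
    precipitation scaling to the observed forcing series."""
    return ([t + warming_c for t in temps_c],
            [p * precip_factor for p in precips_mm])

# Invented 3-day forcing, perturbed by +1 degC and +20% precipitation:
t_new, p_new = delta_change([-5.0, 0.0, 10.0], [2.0, 0.0, 8.0], 1.0, 1.2)
```

Running the hydrological model over a grid of such (warming, precipitation) pairs is what yields sensitivity surfaces like the peak-SWE and peak-flow responses described above.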
-
Devastating floods occur regularly around the world. Recently, machine learning models have been used for flood susceptibility mapping. However, even when these algorithms are provided with adequate ground-truth training samples, they can fail to predict flood extents reliably. On the other hand, the height above nearest drainage (HAND) model can produce flood prediction maps with only limited accuracy. The objective of this research is to produce an accurate and dynamic flood modeling technique that maps flood extent as a function of water level by combining the HAND model and machine learning. In this paper, the HAND model was used to generate a preliminary flood map; the predictions of the HAND model were then used to produce pseudo training samples for a random forest (RF) model. To improve the RF training stage, five of the most effective flood conditioning factors are used, namely altitude, slope, aspect, distance from river, and land use/cover. In this approach, the RF model is trained to dynamically estimate the flood extent from the pseudo training points acquired from the HAND model. However, due to the limited accuracy of the HAND model, a random sample consensus (RANSAC) method was used to detect outliers. The accuracy of the proposed model for flood extent prediction was tested on different flood events in the city of Fredericton, NB, Canada in 2014, 2016, 2018, and 2019. Furthermore, to ensure that the proposed model can produce accurate flood maps in other areas as well, it was also tested on the 2019 flood in Gatineau, QC, Canada. Accuracy assessment metrics, such as overall accuracy, Cohen’s kappa coefficient, Matthews correlation coefficient, true positive rate (TPR), true negative rate (TNR), false positive rate (FPR) and false negative rate (FNR), were used to compare the predicted flood extent of the study areas to the extent estimated by the HAND model and the extent imaged by the Sentinel-2 and Landsat satellites.
The results confirm that the proposed model can improve the flood extent prediction of the HAND model without using any ground truth training data.
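The pseudo-labelling step can be sketched without the RF/RANSAC machinery: cells whose height above nearest drainage falls below the water stage become positive training samples (the grid and stage below are invented):

```python
def hand_pseudo_labels(hand_m, stage_m):
    """Label each cell from the HAND model: 1 (flooded) where the height
    above nearest drainage is at or below the water level, else 0. These
    labels seed the random-forest training set; in the paper's pipeline,
    outliers would then be screened with RANSAC before fitting."""
    return [[1 if h <= stage_m else 0 for h in row] for row in hand_m]

# Invented 3x4 HAND grid (metres) and a 1.5 m water stage:
hand = [[0.2, 0.9, 2.5, 4.0],
        [0.1, 1.4, 1.6, 3.2],
        [0.0, 0.5, 2.1, 5.0]]
labels = hand_pseudo_labels(hand, 1.5)
```

Because the stage is a parameter, regenerating labels for a new water level is cheap, which is what makes the combined model dynamic.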
-
Large-scale flood risk assessment is essential in supporting national and global policies, emergency operations and land-use management. The present study proposes a cost-efficient method for the large-scale mapping of direct economic flood damage in data-scarce environments. The proposed framework consists of three main stages: (i) deriving a water depth map through a geomorphic method based on a supervised linear binary classification; (ii) generating an exposure land-use map developed from multi-spectral Landsat 8 satellite images using a machine-learning classification algorithm; and (iii) performing a flood damage assessment using a GIS tool, based on the vulnerability (depth–damage) curves method. The proposed integrated method was applied over the entire country of Romania (including minor order basins) for a 100-year return time at 30-m resolution. The results showed how the description of flood risk may especially benefit from the ability of the proposed cost-efficient model to carry out large-scale analyses in data-scarce environments. This approach may help in performing and updating risk assessments and management, taking into account the temporal and spatial changes in hazard, exposure, and vulnerability.
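Stage (iii) rests on depth–damage (vulnerability) curves; a minimal piecewise-linear version (the curve points and asset value are illustrative, not the study's calibrated curves):

```python
def damage_fraction(depth_m, curve):
    """Piecewise-linear depth-damage curve: `curve` is a list of
    (depth, damage-fraction) points sorted by depth; interpolate
    linearly between points and clamp at both ends."""
    if depth_m <= curve[0][0]:
        return curve[0][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if depth_m <= d1:
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)
    return curve[-1][1]

# Illustrative residential curve and exposure value:
curve = [(0.0, 0.0), (0.5, 0.2), (1.0, 0.4), (2.0, 0.7), (4.0, 1.0)]
damage = 150_000 * damage_fraction(1.5, curve)   # asset value x fraction
```

In the GIS tool, this lookup is applied cell by cell: the geomorphic water depth map supplies `depth_m`, the land-use class selects the curve, and the exposure layer supplies the value at risk.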
-
Abstract The present study analyses the impacts of past and future climate change on extreme weather events for southern parts of Canada from 1981 to 2100. A set of precipitation- and temperature-based indices were computed using the downscaled Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model ensemble projections at 8 km resolution over the 21st century for two representative concentration pathway (RCP) scenarios: RCP4.5 and RCP8.5. The results show that this region is expected to experience stronger warming and a larger increase in precipitation extremes in the future. Generally, projected changes in minimum temperature will be greater than changes in maximum temperature, as shown by the respective indices. A decrease in frost days and an increase in warm nights are expected. By 2100 there will be no cool nights or cool days. Daily minimum and maximum temperatures will increase by 12 and 7°C, respectively, under the RCP8.5 scenario, when compared with the reference period 1981–2000. The strongest warming in minimum temperature, and the largest decrease in cool nights and days, will occur in Ontario and Quebec close to the Great Lakes and Hudson Bay. The strongest warming in maximum temperature will occur in the southern parts of Alberta and Saskatchewan. Annual total precipitation is expected to increase by about 16% and the occurrence of heavy precipitation events by five days. The largest increase in annual total precipitation will occur in the northern parts of Ontario and Quebec and in western British Columbia.