Hydrological time series often present nonstationarities such as trends, shifts, or oscillations due to anthropogenic effects and hydroclimatological variations, including global climate change. For water managers, it is crucial to recognize and characterize the nonstationarities in hydrological records. These nonstationarities must be appropriately modeled and stochastically simulated according to the characteristics of the observed records in order to evaluate the adequacy of flood risk mitigation measures and future water resources management strategies. Therefore, in the current study, three approaches were suggested to address stochastically nonstationary behaviors, especially in the long-term variability of hydrological variables: an overall trend, a shifting mean, or a long-term oscillation. To represent these options, the autoregressive model with an overall trend, the shifting mean level (SML) model, and empirical mode decomposition with nonstationary oscillation resampling (EMD-NSOR) were applied to the hydrological series of the net basin supply in the Lake Champlain-Richelieu River basin, which the International Joint Commission has recently studied and where significant flood damage occurred from long, sustained high flows. The detailed results indicate that the EMD-NSOR model can be an appropriate option, since it reproduces long-term dependence statistics and generates manageable scenarios, while the SML model does not properly reproduce the observed long-term dependence, which is critical for simulating sustained flood events. The trend model projects excessive future flood risk but no drought risk. The overall results conclude that nonstationarities in hydrological series should be carefully handled in stochastic simulation models to appropriately manage future water-related risks.
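As a minimal sketch of the first of the three options discussed above, the snippet below simulates an AR(1) process with an overall linear trend. This is an illustrative toy, not the paper's calibrated model; all parameter values (`phi`, `sigma`, `slope`) are assumptions chosen for demonstration.

```python
import numpy as np

def simulate_ar1_with_trend(n, phi=0.6, sigma=1.0, slope=0.05, seed=0):
    """Simulate an AR(1) series plus a deterministic linear trend.

    Parameters are illustrative assumptions, not estimates from any basin.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        # stationary AR(1) component
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    trend = slope * np.arange(n)  # overall (nonstationary) trend component
    return x + trend

series = simulate_ar1_with_trend(200)
```

With a positive `slope`, the second half of the simulated record has a visibly higher mean than the first half, which is the kind of nonstationary behavior the trend model targets.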
-
Abstract Flood quantile estimation at sites with little or no data is important for the adequate planning and management of water resources. Regional Hydrological Frequency Analysis (RFA) deals with the estimation of hydrological variables at ungauged sites. Random Forest (RF) is an ensemble learning technique which uses multiple Classification and Regression Trees (CART) for classification, regression, and other tasks. The RF technique is gaining popularity in a number of fields because of its powerful non-linear and non-parametric nature. In the present study, we investigate the use of Random Forest Regression (RFR) in the estimation step of RFA, based on a case study with data collected from 151 hydrometric stations in the province of Quebec, Canada. RFR is applied to the whole dataset and to homogeneous regions of stations delineated by canonical correlation analysis (CCA). Using the out-of-bag error feature of RF, the optimal number of trees for the dataset is determined. The results of the CCA-based RFR model (CCA-RFR) are compared to results obtained with a number of other linear and non-linear RFA models. CCA-RFR leads to the best performance in terms of root mean squared error. The use of CCA to delineate neighborhoods considerably improves the performance of RFR. RFR is found to be simple to apply and more efficient than more complex models such as Artificial Neural Network-based models.
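The out-of-bag (OOB) idea mentioned above can be illustrated without a full forest: each bootstrap sample leaves roughly a third of the observations "out of bag", and those observations yield an honest error estimate with no separate test set. In this sketch the base learner is simply the bootstrap-sample mean (an assumption made to keep the example short), not a regression tree, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)
y = rng.normal(10.0, 2.0, size=100)  # synthetic targets, e.g. log flood quantiles
n, n_boot = len(y), 500

preds = np.zeros(n)
counts = np.zeros(n)
for _ in range(n_boot):
    idx = rng.integers(0, n, size=n)       # bootstrap sample with replacement
    oob = np.setdiff1d(np.arange(n), idx)  # observations not drawn this round
    preds[oob] += y[idx].mean()            # predict OOB points with in-bag mean
    counts[oob] += 1

oob_pred = preds / counts                  # average OOB prediction per point
oob_rmse = np.sqrt(np.mean((y - oob_pred) ** 2))
```

In a real RF, monitoring how `oob_rmse` stabilizes as trees are added is what allows choosing the optimal number of trees.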
-
Hydrological systems are naturally complex and nonlinear. A large number of variables, many of which are not yet well considered in regional frequency analysis (RFA), have a significant impact on hydrological dynamics and consequently on flood quantile estimates. Despite the increasing number of statistical tools used to estimate flood quantiles at ungauged sites, little attention has been dedicated to the development of new regional estimation (RE) models accounting for both nonlinear links and interactions between hydrological and physio-meteorological variables. The aim of this paper is to simultaneously take into account nonlinearity and interactions between variables by introducing the multivariate adaptive regression splines (MARS) approach in RFA. The predictive performance of MARS is compared with that of one of the most robust RE models: the generalized additive model (GAM). Both approaches are applied to two datasets covering 151 hydrometric stations in the province of Quebec (Canada): a standard dataset (STA) containing commonly used variables, and an extended dataset (EXTD) combining STA with additional variables dealing with drainage network characteristics. Results indicate that RE models using MARS with the EXTD slightly outperform RE models using GAM. Thus, MARS seems to allow for a better representation of the hydrological process and an increased predictive power in RFA.
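The building block of MARS is the piecewise-linear "hinge" function, max(0, x - t) and max(0, t - x). The sketch below fixes a single knot and fits the coefficients by least squares; real MARS searches knots and interaction terms adaptively, so the knot value and the synthetic data here are purely illustrative assumptions.

```python
import numpy as np

def hinge_basis(x, knot):
    """Design matrix with an intercept and a reflected pair of hinge functions."""
    return np.column_stack([
        np.ones_like(x),
        np.maximum(0.0, x - knot),
        np.maximum(0.0, knot - x),
    ])

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 200))
y_true = np.where(x < 5, 2.0 * x, 10.0 - 0.5 * (x - 5))  # kink at x = 5
y = y_true + rng.normal(0, 0.2, size=x.size)

B = hinge_basis(x, knot=5.0)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)  # least-squares coefficients
fit = B @ coef
rmse = np.sqrt(np.mean((fit - y_true) ** 2))
```

Because the hinge pair can represent any continuous piecewise-linear function with one kink, the fitted curve recovers the underlying relationship closely despite the noise.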
-
Abstract This study investigates possible trends and teleconnections in temperature extremes in New South Wales (NSW), Australia. Daily maximum and minimum temperature data covering the period 1971–2021 at 26 stations located in NSW were used. Three indices, focusing on daily maximum temperature (TX90), daily minimum temperature (TN90), and average daily temperature in terms of the Excessive Heat Factor (EHF), were investigated to identify the occurrence of heatwaves (HWs). The study considered HWs of different durations (1, 5, and 10 days) in relation to intensity, frequency, duration, and first-occurrence parameters. Finally, the influences of three global climate drivers, namely the El Niño/Southern Oscillation (ENSO), the Southern Annular Mode (SAM), and the Indian Ocean Dipole (IOD), on the associated heatwave attributes were investigated for extended austral summers. In this study, an increasing trend in both hot days and nights was observed at most of the selected stations within the study area. The increase was more pronounced in the last decade (2011–2021) of the investigated period. The number, duration, and frequency of heatwaves increased over time under the EHF criterion, whereas no particular trend was detected for the TX90 and TN90 indices. It was also evident that the first occurrence of HWs shifted towards the onset of the extended summer under the EHF criterion. The correlations between heatwave attributes and climate drivers showed that heatwaves over NSW were positively influenced by both the IOD and ENSO and negatively correlated with SAM. The findings of this study will be useful in formulating strategies for managing the impacts of extreme temperature events, such as bushfires, floods, and droughts, in the most at-risk regions within NSW.
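The EHF criterion referenced above is conventionally defined (following Nairn and Fawcett) as EHF = EHI_sig x max(1, EHI_accl), where EHI_sig compares the 3-day mean temperature with the climatological 95th percentile and EHI_accl compares it with the preceding 30 days; a heatwave day has EHF > 0. The sketch below implements that definition on a synthetic temperature series with an implanted heat spell; the series and threshold are illustrative assumptions.

```python
import numpy as np

def ehf(daily_mean_temp, t95):
    """Excessive Heat Factor per day; NaN where the 32-day history is incomplete."""
    t = np.asarray(daily_mean_temp, dtype=float)
    out = np.full(t.size, np.nan)
    for i in range(32, t.size):
        three_day = t[i - 2:i + 1].mean()                # 3-day mean ending today
        ehi_sig = three_day - t95                        # vs climatological 95th pct
        ehi_accl = three_day - t[i - 32:i - 2].mean()    # vs previous 30 days
        out[i] = ehi_sig * max(1.0, ehi_accl)
    return out

rng = np.random.default_rng(7)
temps = 22 + 3 * rng.standard_normal(120)   # synthetic daily mean temperatures
temps[100:105] += 12                        # implanted 5-day heat spell
t95 = np.percentile(temps[:90], 95)         # climatological 95th percentile
values = ehf(temps, t95)
heatwave_days = np.where(values > 0)[0]
```

The implanted spell produces strongly positive EHF values around days 100-104, which is how heatwave frequency and duration statistics are then compiled.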
-
Abstract Flow duration curves (FDC) are used to obtain daily streamflow series at ungauged sites. In this study, functional multiple regression (FMR) is proposed for FDC estimation. Its natural framework for dealing with curves allows the FDC to be obtained as a whole instead of as a limited number of single points. FMR assessment is performed through a case study in Quebec, Canada. FMR provides a better mean FDC estimation when evaluated over sites, by simultaneously considering all FDC quantiles in the assessment of each given site. However, traditional regression provides a better mean FDC estimation when evaluated over given FDC quantiles, by considering all sites in the assessment of each quantile separately. Mean daily streamflow estimation is similar, yet FMR provides an improved estimation for most sites. Furthermore, FMR represents a more suitable framework and provides a number of practical advantages, such as insight into descriptor influence on FDC quantiles. Hence, traditional regression may be preferred if only a few FDC quantiles are of interest, whereas FMR is more suitable if a large number of FDC quantiles is of interest, and therefore for estimating daily streamflows.
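For readers unfamiliar with the object being estimated: an empirical FDC is simply the daily flows sorted in decreasing order, each paired with its exceedance probability. The sketch below builds one with the Weibull plotting position on a synthetic flow series (the lognormal flows are an assumption for illustration).

```python
import numpy as np

def flow_duration_curve(flows):
    """Return (exceedance probabilities, flows sorted in decreasing order)."""
    q = np.sort(np.asarray(flows, dtype=float))[::-1]  # descending flows
    n = q.size
    exceedance = np.arange(1, n + 1) / (n + 1.0)       # Weibull plotting position
    return exceedance, q

rng = np.random.default_rng(3)
daily_flows = rng.lognormal(mean=3.0, sigma=0.8, size=365)
p, q = flow_duration_curve(daily_flows)

# e.g. Q50, the flow equalled or exceeded half the time
q50 = q[np.searchsorted(p, 0.5)]
```

Regression-based FDC estimation (traditional or functional) then predicts either a few of these quantiles or the whole curve from basin descriptors.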
-
Abstract In flood frequency analysis (FFA), the annual maximum (AM) model is widely adopted in practice due to its straightforward sampling process. However, the AM model has been criticized for its limited flexibility. FFA using the peaks-over-threshold (POT) model is an alternative that offers several theoretical advantages; however, this model is currently underemployed internationally. This study aims to bridge the current knowledge gap by conducting a scoping review covering several aspects of the POT approach, including model assumptions, independence criteria, threshold selection, parameter estimation, probability distribution, regionalization, and stationarity. We have reviewed previously published articles on the POT model to investigate: (a) possible reasons for the underemployment of the POT model in FFA; and (b) challenges in applying the POT model. It is highlighted that the POT model offers greater flexibility than the AM model due to the nature of its sampling process, and it is more capable of providing less biased estimates for frequent floods. The underemployment of the POT model in FFA is mainly due to the complexity of selecting a threshold (e.g., a physical threshold to satisfy independence criteria and a statistical threshold for the Generalized Pareto distribution, the most commonly applied distribution in POT modelling). It is also found that the uncertainty due to individual variables and the combined effects of the variables has not been well assessed in previous research, and there is a lack of established guidelines for applying the POT model in FFA.
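The sampling process and independence criteria discussed above can be sketched in a few lines: keep exceedances of a threshold, but merge exceedances closer together than a minimum separation into a single event, retaining the largest. The threshold quantile and 7-day separation below are assumptions for illustration; operational criteria (and the subsequent GPD fit) are more elaborate.

```python
import numpy as np

def pot_peaks(flows, threshold, min_separation=7):
    """Indices of threshold exceedances, declustered by a minimum gap in days."""
    flows = np.asarray(flows, dtype=float)
    above = np.where(flows > threshold)[0]
    peaks = []
    for i in above:
        if peaks and i - peaks[-1] < min_separation:
            # same event cluster: keep only the larger peak
            if flows[i] > flows[peaks[-1]]:
                peaks[-1] = i
        else:
            peaks.append(i)
    return np.array(peaks, dtype=int)

rng = np.random.default_rng(11)
flows = rng.gamma(shape=2.0, scale=50.0, size=3650)  # ten "years" of synthetic flow
threshold = np.quantile(flows, 0.98)
idx = pot_peaks(flows, threshold)
exceedances = flows[idx] - threshold  # the sample a GPD would be fitted to
```

Unlike AM sampling, this retains every independent large event in a year and discards none of the frequent-flood information, which is why POT estimates of frequent floods tend to be less biased.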
-
Abstract The estimation of sea levels corresponding to high return periods is crucial for coastal planning and for the design of coastal defenses. This paper deals with the use of historical observations, that is, events that occurred before the beginning of the systematic tide gauge recordings, to improve the estimation of design sea levels. Most of the recent publications dealing with statistical analyses applied to sea levels suggest that astronomical high tide levels and skew surges should be analyzed and modeled separately. Historical samples generally consist of observed record sea levels. Some extreme historical skew surges can easily remain unnoticed if they occur at low or moderate astronomical high tides and do not generate extreme sea levels. The exhaustiveness of historical skew surge series, which is an essential criterion for an unbiased statistical inference, can therefore not be guaranteed. This study proposes a model combining, in a single Bayesian inference procedure, information of two different natures for the calibration of the statistical distribution of skew surges: measured skew surges for the systematic period and extreme sea levels for the historical period. A data-based comparison of the proposed model with previously published approaches is presented, based on a large number of Monte Carlo simulations. The proposed model is applied to four locations on the French Atlantic and Channel coasts. Results indicate that the proposed model is more reliable and accurate than previously proposed methods that aim at the integration of historical records in coastal sea level or surge statistical analyses.
Plain Language Summary: Coastal facilities must be designed so as to be protected from extreme sea levels. Sea levels at high tide are the combination of astronomical high tides, which can be predicted, and skew surges. The estimation of the statistical distribution of skew surges is usually based on the skew surges measured by tide gauges and can be improved with the use of historical information, that is, observations that occurred before the beginning of the tide gauge recordings. Extreme skew surges combined with low or moderate astronomical high tides would not necessarily generate extreme sea levels, and consequently some extreme historical skew surges could be missed. The exhaustiveness of historical information is an essential criterion for an unbiased estimation, but it cannot be guaranteed in the case of historical skew surges. The present study proposes to combine skew surges for the recent period and extreme sea levels for the historical period. The proposed model is compared to previously published approaches and appears to be more reliable and accurate. It is applied to four case studies on the French Atlantic and Channel coasts.
Key Points: The exhaustiveness of historical sea record information is demonstrated based on French Atlantic coast data. A comparative analysis of approaches to integrate historical information is carried out. The efficiency of a new method for the combination of systematic skew surges and historical records is verified.
-
Quantile estimates are generally interpreted in association with the return period concept in practical engineering. To do so with the peaks-over-threshold (POT) approach, a combined Poisson-generalized Pareto distribution (referred to as the PD-GPD model) must be considered. In this article, we evaluate the incorporation of non-stationarity in the generalized Pareto distribution (GPD) and the Poisson distribution (PD) using, respectively, smoothing-based B-spline functions and the logarithmic link function. Two models are proposed: a stationary PD combined with a non-stationary GPD (referred to as PD0-GPD1) and a combined non-stationary PD and GPD (referred to as PD1-GPD1). The teleconnections between hydro-climatological variables and a number of large-scale climate patterns allow these climate indices to be used as covariates in the development of non-stationary extreme value models. The case study uses daily precipitation amount time series from southeastern Canada and two climatic covariates, the Arctic Oscillation (AO) and Pacific-North American (PNA) indices. A comparison of the PD0-GPD1 and PD1-GPD1 models showed that incorporating non-stationarity in both POT components, instead of solely in the GPD, has an effect on the estimated quantiles. The use of B-spline functions as link functions between the GPD parameters and the considered climatic covariates provided flexible non-stationary PD-GPD models. Indeed, linear and nonlinear conditional quantiles are observed at various stations in the case study, opening an interesting perspective for further research on the physical mechanisms behind these simple and complex interactions.
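Connecting the PD-GPD model to return periods can be made concrete with the standard POT return-level formula, q_T = u + (sigma/xi)((lambda T)^xi - 1), where lambda is the Poisson exceedance rate; in the non-stationary case lambda depends on a climate covariate through a log link. All parameter values below are assumptions chosen for illustration (and a plain log-linear link stands in for the paper's B-spline link), not estimates from the study.

```python
import math

def pot_return_level(u, sigma, xi, lam, T):
    """PD-GPD return level for return period T (years)."""
    if abs(xi) < 1e-9:                        # exponential/Gumbel limit as xi -> 0
        return u + sigma * math.log(lam * T)
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

def poisson_rate(z, b0=0.7, b1=0.4):
    """Non-stationary exceedance rate via a logarithmic link: lambda = exp(b0 + b1*z)."""
    return math.exp(b0 + b1 * z)

u, sigma, xi, T = 25.0, 8.0, 0.1, 100         # threshold (mm), GPD scale/shape, T years
q_neutral = pot_return_level(u, sigma, xi, poisson_rate(0.0), T)   # covariate z = 0
q_positive = pot_return_level(u, sigma, xi, poisson_rate(1.5), T)  # strong positive index
```

A positive covariate phase raises lambda and hence the 100-year quantile, which is exactly the kind of covariate effect the PD1-GPD1 model captures and the stationary-PD model cannot.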
-
Abstract This work explores the ability of two methodologies to downscale hydrological indices characterizing the low-flow regime of three salmon rivers in Eastern Canada: the Moisie, Romaine, and Ouelle. The selected indices describe four aspects of the low-flow regime of these rivers: amplitude, frequency, variability, and timing. The first methodology (direct downscaling) establishes a direct link between large-scale atmospheric variables (the predictors) and low-flow indices (the predictands). The second (indirect downscaling) involves downscaling precipitation and air temperature (local climate variables), which are introduced into a hydrological model to simulate flows. The synthetic flow time series are subsequently used to calculate the low-flow indices. The statistical models used for downscaling low-flow hydrological indices and local climate variables are Sparse Bayesian Learning and Multiple Linear Regression. The results showed that direct downscaling using Sparse Bayesian Learning surpassed the other approaches.
-
Generalized Additive Models (GAMs) are introduced in this study for the regional estimation of low-flow characteristics at ungauged basins and compared to other approaches commonly used for this purpose. GAMs provide more flexibility in the shape of the relationships between the response and explanatory variables in comparison to classical models such as multiple linear regression (MLR). Homogeneous regions are defined here using the methods of hierarchical cluster analysis, canonical correlation analysis, and region of influence. GAMs and MLR are then used within the delineated regions and also for the whole study area. In addition, a spatial interpolation method is also tested. The different models are applied for the regional estimation of summer and winter low-flow quantiles at stations in Quebec, Canada. Results show that for a given regional delineation method, GAMs provide improved performance compared to MLR.
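The MLR baseline that GAMs are compared against is typically fitted in log space, which corresponds to a power-law relation between quantiles and basin descriptors (Q = c A^b1 P^b2). The sketch below fits such a model by least squares on synthetic data; the descriptor set, coefficients, and the "ungauged" basin are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 151                                    # e.g. number of gauged stations
area = rng.uniform(50, 5000, n)            # drainage area (km^2), synthetic
precip = rng.uniform(700, 1400, n)         # mean annual precipitation (mm), synthetic
# synthetic log-quantiles following an assumed power law plus noise
logq = 0.5 + 0.8 * np.log(area) + 0.6 * np.log(precip) + rng.normal(0, 0.1, n)

# log-space multiple linear regression: log Q = b0 + b1*log(A) + b2*log(P)
X = np.column_stack([np.ones(n), np.log(area), np.log(precip)])
beta, *_ = np.linalg.lstsq(X, logq, rcond=None)

# regional estimate at a hypothetical ungauged basin
x_new = np.array([1.0, np.log(800.0), np.log(1000.0)])
q_hat = np.exp(x_new @ beta)
```

A GAM generalizes this by replacing each linear term with a smooth function of the descriptor, which is where the extra flexibility reported in the study comes from.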
-
Abstract Change point detection methods play an important role in many hydrological and hydraulic studies of river basins. These methods are very useful for characterizing changes in hydrological regimes and can, therefore, lead to a better understanding of changes in extreme flow behavior. Flood events are generally described by a finite number of characteristics that may not capture all the information available in a discharge time series. The aim of the current work is to present a new approach to detect changes in flood events based on a functional data analysis framework. The functional approach allows the whole information contained in the discharge time series of flood events to be taken into account. The presented methodology is illustrated on a flood analysis case study from the province of Quebec, Canada. Results obtained using the proposed approach are consistent with those obtained using a traditional change point method, and demonstrate the capability of the functional framework to simultaneously consider several flood features, thereby presenting a comprehensive way to better exploit the information contained in a discharge time series.
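As a point of reference for the "traditional change point method" mentioned above, the sketch below locates a mean shift with a simple CUSUM statistic: the candidate change point maximizes the absolute cumulative sum of deviations from the overall mean. This is a generic illustration on a synthetic series with an implanted shift, not the specific test used in the study.

```python
import numpy as np

def cusum_change_point(x):
    """Return (index of most likely change point, CUSUM series)."""
    x = np.asarray(x, dtype=float)
    s = np.cumsum(x - x.mean())        # CUSUM of deviations from the overall mean
    k = int(np.argmax(np.abs(s)))      # split maximizing the cumulative deviation
    return k, s

rng = np.random.default_rng(9)
series = np.concatenate([
    rng.normal(100, 10, 60),           # pre-change flow regime (synthetic)
    rng.normal(130, 10, 40),           # post-change flow regime (synthetic)
])
k, s = cusum_change_point(series)      # k should fall near index 59
```

The functional approach generalizes this idea from a single scalar series to whole flood hydrographs, so that peak, volume, and timing changes are tested jointly.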