Abstract Landslides involving sensitive clays are recurrent events in the world's northern regions and are especially notorious in eastern Canada. Two critical factors separate sensitive clay landslides from traditional slope stability analysis: the highly brittle behavior in undrained conditions (strain-softening) characteristic of progressive or retrogressive failures, and the large deformations associated with them. Conventional limit equilibrium analysis has numerous shortcomings in incorporating these characteristics when assessing landslides in sensitive clays. This paper presents an extensive literature review of the failure mechanics of landslides in sensitive clays and of the existing constitutive models and numerical tools for analyzing the stability and post-failure behavior of such slopes. The advantages and shortcomings of the different techniques for incorporating strain-softening and large deformations in the numerical modeling of sensitive clay landslides are assessed. The review shows that elastoviscoplastic soil models with non-linear strain-softening laws and rate effects are well suited to represent the material behavior of sensitive clays. Although several numerical models have been proposed to analyze post-failure runout, work specifically addressing sensitive clay landslides remains scarce, creating an urgent need to apply and further develop advanced numerical tools for better understanding and predicting these catastrophic events.
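Where such non-linear softening laws are implemented numerically, a common closed form is an exponential decay of undrained shear strength with accumulated plastic shear strain (e.g., the form of Einav and Randolph, 2005). A minimal sketch of that law follows; all parameter values are illustrative, not taken from the review.

```python
import numpy as np

def softened_strength(xi, su_peak, su_rem, xi_95):
    """Exponential strain-softening law: undrained shear strength decays
    from its peak to the remoulded value as accumulated plastic shear
    strain xi grows; xi_95 is the strain at 95 % of full softening."""
    return su_rem + (su_peak - su_rem) * np.exp(-3.0 * xi / xi_95)

# Illustrative values: sensitivity St = su_peak / su_rem = 20
su_peak, su_rem, xi_95 = 50.0, 2.5, 10.0       # kPa, kPa, strain units
for xi in (0.0, 5.0, 10.0, 30.0):
    su = softened_strength(xi, su_peak, su_rem, xi_95)
    print(f"xi = {xi:5.1f}  ->  su = {su:6.2f} kPa")
```

Rate effects are typically layered on top of such a law by scaling the strength with the plastic strain rate, which this sketch omits for brevity.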
-
Abstract. Hybrid hydroclimatic forecasting systems employ data-driven (statistical or machine learning) methods to harness and integrate a broad variety of predictions from dynamical, physics-based models – such as numerical weather prediction, climate, land, hydrology, and Earth system models – into a final prediction product. They are recognized as a promising way of enhancing the prediction skill of meteorological and hydroclimatic variables and events, including rainfall, temperature, streamflow, floods, droughts, tropical cyclones, or atmospheric rivers. Hybrid forecasting methods are now receiving growing attention due to advances in weather and climate prediction systems at subseasonal to decadal scales, a better appreciation of the strengths of AI, and expanding access to computational resources and methods. Such systems are attractive because they may avoid the need to run a computationally expensive offline land model, can minimize the effect of biases that exist within dynamical outputs, benefit from the strengths of machine learning, and can learn from large datasets, while combining different sources of predictability with varying time horizons. Here we review recent developments in hybrid hydroclimatic forecasting and outline key challenges and opportunities for further research. These include obtaining physically explainable results, assimilating human influences from novel data sources, integrating new ensemble techniques to improve predictive skill, creating seamless prediction schemes that merge short to long lead times, incorporating initial land surface and ocean/ice conditions, acknowledging spatial variability in landscape and atmospheric forcing, and increasing the operational uptake of hybrid prediction schemes.
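As a concrete illustration of the hybrid idea, the sketch below post-processes the output of a dynamical model with a data-driven learner that also sees an initial land-surface state. Everything here is a synthetic stand-in: the variables, data, and choice of learner are assumptions for illustration, not any specific system from the review.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)

# Synthetic stand-ins: a dynamical (physics-based) precipitation forecast,
# an initial soil-moisture state, and the observed streamflow response.
n = 500
dyn_precip = rng.gamma(2.0, 5.0, n)            # raw NWP-style forecast
soil_moist = rng.uniform(0.1, 0.9, n)          # initial land-surface state
obs_flow = 0.6 * dyn_precip * soil_moist + rng.normal(0.0, 1.0, n)

# Hybrid step: a data-driven model maps the dynamical output plus the
# land-surface state onto the hydroclimatic target, correcting systematic
# biases that the raw dynamical forecast alone would carry.
X = np.column_stack([dyn_precip, soil_moist])
model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(X[:400], obs_flow[:400])             # train on the hindcast period

pred = model.predict(X[400:])                  # hybrid forecast, holdout
rmse = np.sqrt(np.mean((pred - obs_flow[400:]) ** 2))
print(f"holdout RMSE of the hybrid forecast: {rmse:.2f}")
```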
-
For the past few decades, remote sensing has been a valuable tool for deriving global information on snow water equivalent (SWE), where products derived from space-borne passive microwave radiometers are favoured as they respond to snow depth, an important component of SWE. GlobSnow, a novel SWE product, has increased the accuracy of global-scale SWE estimates by combining remotely sensed radiometric data with other physiographic characteristics, such as snow depth quantified at climatic stations. However, research has demonstrated that passive microwave algorithms tend to underestimate SWE for deep snowpacks. Approaches have been proposed to correct for this underestimation; however, they are computationally intensive and complex to implement at the watershed scale. In this study, SWEmax information from the near-real-time 5 km GlobSnow product, provided by Copernicus and the European Space Agency (ESA), and from the 25 km resolution GlobSnow product was corrected using a simple bias correction approach for watershed-scale applications. This method, referred to as the Watershed Scale Correction (WSC) approach, estimates the bias based on the direct runoff that occurs during the spring melt season. Direct runoff is estimated, on the one hand, from SWEmax information as the main input, with infiltration also considered; on the other hand, an independent estimate of direct runoff is derived from gauged stations. The discrepancy between these two estimates yields the bias correction factor. This approach is advantageous because it exploits data that commonly exist, i.e., flows at gauged stations and remotely sensed/reanalysis data such as snow cover and precipitation. The WSC approach was applied to watersheds located in Eastern Canada. The average bias decreased from 33.5% with the existing GlobSnow product to 18% with the corrected product, using the recommended recursive filter coefficient β of 0.925 for baseflow separation. These results show the usefulness of integrating direct runoff in the bias correction of the existing GlobSnow product at the watershed scale. In addition, the recursive filter approach offers potential benefits for baseflow separation in watersheds with limited in situ SWE measurements, further reducing overall uncertainties and bias. The WSC approach should be appealing for poorly monitored watersheds where SWE estimates are critical for hydropower production and where snowmelt can cause serious flood-related damage.
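The β = 0.925 mentioned above refers to a one-parameter recursive digital baseflow filter; the classic Lyne-Hollick form is sketched below, together with how a bias factor could be formed from the two direct-runoff estimates. The hydrograph and the SWE-derived runoff value are hypothetical, and the WSC paper's exact formulation may differ.

```python
import numpy as np

def quickflow_lyne_hollick(q, beta=0.925):
    """One forward pass of the Lyne-Hollick recursive digital filter:
    separates direct (quick) runoff from total streamflow q, with the
    quick-flow component constrained to stay within [0, q]."""
    f = np.zeros_like(q)
    for k in range(1, len(q)):
        f[k] = beta * f[k - 1] + 0.5 * (1.0 + beta) * (q[k] - q[k - 1])
        f[k] = min(max(f[k], 0.0), q[k])       # keep quick flow physical
    return f

# Hypothetical daily spring-melt hydrograph (m^3/s)
q = np.array([5, 6, 9, 20, 45, 60, 52, 38, 25, 16, 11, 8, 6, 5], float)
direct_gauged = quickflow_lyne_hollick(q).sum()    # gauge-based estimate

# Suppose the SWEmax-derived direct runoff (snowmelt minus infiltration)
# exceeds the gauge-based estimate; the ratio of the two gives the bias
# correction factor applied to the GlobSnow SWEmax product.
direct_from_swe = 1.3 * direct_gauged              # hypothetical value
bias_factor = direct_gauged / direct_from_swe
print(f"watershed-scale bias correction factor: {bias_factor:.2f}")
```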
-
Abstract The estimation of sea levels corresponding to high return periods is crucial for coastal planning and for the design of coastal defenses. This paper deals with the use of historical observations, that is, events that occurred before the beginning of systematic tide gauge recordings, to improve the estimation of design sea levels. Most recent publications dealing with statistical analyses applied to sea levels suggest that astronomical high tide levels and skew surges should be analyzed and modeled separately. Historical samples generally consist of observed record sea levels. Some extreme historical skew surges can easily remain unnoticed if they occur at low or moderate astronomical high tides and do not generate extreme sea levels. The exhaustiveness of historical skew surge series, which is an essential criterion for an unbiased statistical inference, can therefore not be guaranteed. This study proposes a model combining, in a single Bayesian inference procedure, information of two different natures for the calibration of the statistical distribution of skew surges: measured skew surges for the systematic period and extreme sea levels for the historical period. A data-based comparison of the proposed model with previously published approaches is presented, based on a large number of Monte Carlo simulations. The proposed model is applied to four locations on the French Atlantic and Channel coasts. Results indicate that the proposed model is more reliable and accurate than previously proposed methods that aim at the integration of historical records in coastal sea level or surge statistical analyses.

Plain Language Summary Coastal facilities must be designed so as to be protected from extreme sea levels. Sea levels at high tide are the combination of astronomical high tides, which can be predicted, and skew surges. The estimation of the statistical distribution of skew surges is usually based on the skew surges measured by tide gauges and can be improved with the use of historical information, i.e., observations that occurred before the beginning of the tide gauge recordings. Extreme skew surges combined with low or moderate astronomical high tides would not necessarily generate extreme sea levels, and consequently some extreme historical skew surges could be missed. The exhaustiveness of historical information is an essential criterion for an unbiased estimation, but it cannot be guaranteed in the case of historical skew surges. The present study proposes to combine skew surges for the recent period and extreme sea levels for the historical period. The proposed model is compared to previously published approaches and appears to be more reliable and accurate. The proposed model is applied to four case studies on the French Atlantic and Channel coasts.

Key Points
- The exhaustiveness of historical sea record information is demonstrated based on French Atlantic coast data
- A comparative analysis of approaches to integrate historical information is carried out
- The efficiency of a new method for the combination of systematic skew surges and historical records is verified
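A minimal sketch of how systematic and historical information can enter one inference is given below, using a Poisson-GPD model for skew surge exceedances and a censored term for k historical exceedances of a perception threshold. The paper performs full Bayesian inference and treats tides and surges separately; this sketch simplifies to a posterior mode under flat priors, and all numbers are fabricated.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genpareto

# Systematic period: skew surges (m) measured by the tide gauge that
# exceed the analysis threshold u. All numbers are fabricated.
u = 0.6
sys_exceed = np.array([0.65, 0.70, 0.72, 0.81, 0.95, 1.10]) - u
n_sys_years = 30.0
rate = len(sys_exceed) / n_sys_years          # Poisson exceedance rate (1/yr)

# Historical period: h years during which k skew surges are known to have
# exceeded x_hist (a perception threshold), and none otherwise.
h_years, k_hist, x_hist = 150.0, 2, 1.2

def neg_log_lik(theta):
    xi, log_sigma = theta
    sigma = np.exp(log_sigma)
    # Exact GPD density for the systematic exceedances
    ll = genpareto.logpdf(sys_exceed, xi, scale=sigma).sum()
    # Censored historical term: Poisson log-probability (up to a constant)
    # of exactly k exceedances of x_hist over h years
    sf_hist = genpareto.sf(x_hist - u, xi, scale=sigma)
    ll += k_hist * np.log(rate * h_years * sf_hist) - rate * h_years * sf_hist
    return -ll

res = minimize(neg_log_lik, x0=[0.1, np.log(0.2)], method="Nelder-Mead")
xi_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"combined-information GPD fit: shape={xi_hat:.3f}, scale={sigma_hat:.3f} m")
```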
-
The magnitudes of dissolved organic carbon (DOC) exports from boreal peatlands to streams through lateral subsurface flow vary during the ice-free season. Peatland water table depth and the alternation of low and high flow in peat-draining streams are thought to drive this DOC export variability. However, calculating the specific DOC exports from a peatland can be challenging considering the multiple potential DOC sources within a catchment. A calculation approach based on the hydrological connectivity between the peat and the stream could help to solve this issue; this is the approach used in the present research. This study took place from June 2018 to October 2019 in a boreal catchment in northeastern Canada, 76.7 % of which is covered by ombrotrophic peatland. The objectives were to (1) establish relationships between DOC exports from a headwater stream and the peatland hydrology; (2) quantify, at the catchment scale, the amount of DOC laterally exported to the draining stream; and (3) define the patterns of DOC mobilization during high-river-flow events. At the peatland headwater stream outlet, DOC concentrations were monitored at a high frequency (hourly) using a fluorescent dissolved organic matter (fDOM) sensor, a proxy for DOC concentration. Hydrological variables, such as stream outlet discharge and peatland water table depth (WTD), were continuously monitored at hourly intervals for 2 years. Our results highlight the direct and delayed control that subsurface flow from peat to stream exerts on DOC exports. Rain events raised the peatland WTD, which increased the hydrological connectivity between the peatland and the stream. This led to increased stream discharge (Q) and a delayed DOC concentration increase, typical of lateral subsurface flow. The magnitude of the WTD increase played a crucial role in determining the quantity of DOC exported. Based on the observations that the peatland is the most important contributor to DOC exports at the catchment scale and that other DOC sources were negligible during high-flow periods, we propose a new approach to estimate the specific DOC exports attributable to the peatland by distinguishing between the contributing areas used for the calculation during high-flow and low-flow periods. In 2018–2019, 92.6 % of the DOC was exported during flood events, which accounted for 59.1 % of the period. In 2019–2020, 93.8 % of the DOC was exported during flood events, which represented 44.1 % of the period. Our analysis of individual flood events revealed three types of events and DOC mobilization patterns. The first type is characterized by high rainfall leading to a large WTD increase that favours the connection between the peatland and the stream, resulting in high DOC exports. The second type is characterized by a large WTD increase following a previous event that had depleted the DOC available for transfer to the stream, leading to low DOC exports. The third type corresponds to low-rainfall events in which the WTD increase is insufficient to reconnect the peatland and the stream, leading to low DOC exports. Our results suggest that DOC exports are sensitive to hydroclimatic conditions; flood events, changes in rainfall regime, ice-free season duration, and porewater temperature may affect the exported DOC and, consequently, partially offset the net carbon sequestration potential of peatlands.
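The export calculation itself reduces to integrating concentration times discharge and normalizing by a contributing area that switches between flood and low-flow periods. Below is a sketch under assumed data; the hydrograph, flood criterion, and areas are hypothetical, not the study's values.

```python
import numpy as np

# Hourly series (illustrative): discharge (L/s) and DOC concentration (mg/L)
q = np.array([20., 22., 60., 140., 180., 150., 90., 50., 30., 25.])
doc = np.array([4.0, 4.1, 5.5, 8.0, 9.5, 9.0, 7.0, 5.5, 4.8, 4.5])
flood = q > 50.0                       # hypothetical flood-event criterion

# Instantaneous DOC flux integrated over each 1 h step:
# (mg/L) * (L/s) * 3600 s / 1000 = grams per hour
flux_g = doc * q * 3600.0 / 1000.0

# Specific export: during floods the peatland dominates the DOC signal, so
# normalize by the peatland area; at low flow use the whole catchment.
area_peat_ha, area_catch_ha = 76.7, 100.0      # illustrative (76.7 % cover)
spec = np.where(flood, flux_g / area_peat_ha, flux_g / area_catch_ha)
print(f"specific DOC export: {spec.sum():.1f} g/ha")
print(f"share exported during floods: {100 * spec[flood].sum() / spec.sum():.1f} %")
```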
-
As Earth's atmospheric temperatures and human populations increase, more people are becoming vulnerable to natural and human-induced disasters. This is particularly true in Central America, where the growing human population is experiencing climate extremes (droughts and floods), and the region is susceptible to geological hazards, such as earthquakes and volcanic eruptions, and environmental deterioration in many forms (soil erosion, lake eutrophication, heavy metal contamination, etc.). Instrumental and historical data from the region are insufficient to understand and document past hazards, a necessary first step for mitigating future risks. Long, continuous, well-resolved geological records can, however, provide a window into past climate and environmental changes that can be used to better predict future conditions in the region. The Lake Izabal Basin (LIB), in eastern Guatemala, contains the longest known continental records of tectonics, climate, and environmental change in the northern Neotropics. The basin is a pull-apart depression that developed along the North American and Caribbean plate boundary ∼ 12 Myr ago and contains > 4 km of sediment. The sedimentological archive in the LIB records the interplay among several Earth System processes. Consequently, exploration of sediments in the basin can provide key information concerning: (1) tectonic deformation and earthquake history along the plate boundary; (2) the timing and causes of volcanism from the Central American Volcanic Arc; and (3) hydroclimatic, ecologic, and geomicrobiological responses to different climate and environmental states. To evaluate the LIB as a potential site for scientific drilling, 65 scientists from 13 countries and 33 institutions met in Antigua, Guatemala, in August 2022 under the auspices of the International Continental Scientific Drilling Program (ICDP) and the US National Science Foundation (NSF). Several working groups developed scientific questions and overarching hypotheses that could be addressed by drilling the LIB and identified optimal coring sites and instrumentation needed to achieve the project goals. The group also discussed logistical challenges and outreach opportunities. The project is not only an outstanding opportunity to improve our scientific understanding of seismotectonic, volcanic, paleoclimatic, paleoecologic, and paleobiologic processes that operate in the tropics of Central America, but it is also an opportunity to improve understanding of multiple geological hazards and communicate that knowledge to help increase the resilience of at-risk Central American communities.
-
Airborne LiDAR scanning is a promising approach for providing high-resolution products appropriate for different applications, such as flood management. However, the vertical accuracy of airborne LiDAR point clouds is not constant and varies in space. Better knowledge of this accuracy would assist decision makers in more accurately estimating the damage caused by floods. Data producers often report a total error estimate obtained by comparison with a ground truth. However, the reliability of such an approach depends on various factors, including the sample size and the accessibility, distribution, and diversity of the ground truth, which comes at a cost and is somewhat unfeasible at larger scales. The main objective of this article is therefore to propose a method that provides a local estimation of error without any third-party datasets. In this regard, we take advantage of geostatistical ordinary kriging as an alternative accuracy estimator. Because assuming constant variation across space is not tenable, we propose a non-stationary ordinary kriging model that yields a local estimation of elevation accuracy. The proposed method is compared with global ordinary kriging and a ground truth, and the results indicate that our method provides more reliable error values. These errors are lower in urban and semi-urban areas, especially in farmland and residential areas, but larger in forests, due to the lower density of points and the larger terrain variations.
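A minimal sketch of the underlying mechanism: the ordinary kriging variance at a target location, computed from a variogram of nearby LiDAR ground points, serves as the local error estimate, and non-stationarity enters by letting the variogram parameters vary between land covers. The variogram values below are illustrative assumptions, not the article's fitted parameters.

```python
import numpy as np

def spherical_gamma(h, nugget, sill, a):
    """Spherical variogram model with range a; gamma(0) = 0 by convention."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, sill, np.where(h == 0, 0.0, g))

def ok_variance(pts, target, nugget, sill, a):
    """Ordinary kriging variance at `target` from the points in `pts`;
    this variance is used as the local estimate of elevation error."""
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))               # augmented OK system
    A[:n, :n] = spherical_gamma(d, nugget, sill, a)
    A[n, n] = 0.0
    g0 = spherical_gamma(np.linalg.norm(pts - target, axis=1), nugget, sill, a)
    b = np.append(g0, 1.0)
    w = np.linalg.solve(A, b)                 # weights + Lagrange multiplier
    return float(w @ b)                       # sigma_OK^2

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 100.0, (30, 2))        # LiDAR ground points (m)
target = np.array([50.0, 50.0])
# Non-stationarity: variogram parameters are refit per neighborhood, e.g.
# a noisier, shorter-range structure in forest than in open terrain.
v_open = ok_variance(pts, target, nugget=0.01, sill=0.05, a=40.0)
v_forest = ok_variance(pts, target, nugget=0.05, sill=0.30, a=25.0)
print(f"local error std (m): open={v_open ** 0.5:.3f}, forest={v_forest ** 0.5:.3f}")
```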
-
In cold regions, ice jams frequently result in severe flooding due to a rapid rise in water levels upstream of the jam. Sudden floods resulting from ice jams threaten human safety and cause damage to properties and infrastructure. Hence, ice-jam prediction tools can give an early warning to increase response time and minimize the possible damages. However, ice-jam prediction has always been a challenge, as no analytical method is available for this purpose. Nonetheless, ice jams form when certain hydro-meteorological conditions occur a few hours to a few days before the event. Ice-jam prediction can therefore be addressed as a binary multivariate time-series classification problem. Deep learning techniques have been widely used for time-series classification in many fields, such as finance, engineering, weather forecasting, and medicine. In this research, we successfully applied convolutional neural networks (CNN), long short-term memory (LSTM) networks, and combined convolutional–long short-term memory (CNN-LSTM) networks to predict the formation of ice jams in 150 rivers in the province of Quebec (Canada). We also employed machine learning methods, including support vector machines (SVM), k-nearest neighbors classifiers (KNN), decision trees, and multilayer perceptrons (MLP), for this purpose. The hydro-meteorological variables (e.g., temperature, precipitation, and snow depth) along with the corresponding jam or no-jam events are used as model inputs. Ten percent of the data were set aside for testing, and 100 reshuffling and splitting iterations were applied to the remaining data, with 80 % used for training and 20 % for validation. The deep learning models outperformed the machine learning models. The results show that the CNN-LSTM model yields the best results in validation and testing, with F1 scores of 0.82 and 0.92, respectively. This demonstrates that CNN and LSTM models are complementary, and that a combination of both further improves classification.
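A minimal sketch of such a CNN-LSTM binary classifier is given below, here in Keras on synthetic stand-in windows; the window length, layer sizes, and label rule are assumptions for illustration, not the study's configuration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Synthetic stand-in: 14-day windows of 3 hydro-meteorological variables
# (e.g., temperature, precipitation, snow depth), labelled jam / no-jam.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 14, 3)).astype("float32")
y = (X[:, -3:, 0].mean(axis=1) > 0.3).astype("float32")   # toy label rule

# CNN-LSTM: 1-D convolutions extract short-range patterns in the series,
# the LSTM models their temporal evolution, and a sigmoid head outputs
# the binary jam probability.
model = models.Sequential([
    layers.Input(shape=(14, 3)),
    layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
model.fit(X, y, validation_split=0.2, epochs=5, batch_size=32, verbose=0)
loss, auc = model.evaluate(X, y, verbose=0)
print(f"in-sample AUC of the toy CNN-LSTM: {auc:.2f}")
```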
-
Abstract This study investigates possible trends and teleconnections in temperature extremes in New South Wales (NSW), Australia. Daily maximum and minimum temperature data covering the period 1971–2021 at 26 stations located in NSW were used. Three indices, based on daily maximum temperature (TX90), daily minimum temperature (TN90), and average daily temperature in terms of the Excessive Heat Factor (EHF), were investigated to identify the occurrence of heatwaves (HWs). The study considered HWs of different durations (1, 5, and 10 days) in relation to their intensity, frequency, duration, and first-occurrence parameters. Finally, the influences of three global climate drivers, namely the El Niño/Southern Oscillation (ENSO), the Southern Annular Mode (SAM), and the Indian Ocean Dipole (IOD), on the heatwave attributes were investigated for extended austral summers. An increasing trend in both hot days and hot nights was observed for most of the selected stations within the study area. The increase was more pronounced in the last decade (2011–2021) of the investigated period. The number, duration, and frequency of heatwaves increased over time under the EHF criterion, whereas no particular trend was detected for TX90 and TN90. It was also evident that, under the EHF criterion, the first occurrence of all HWs shifted towards the onset of the extended summer. The correlations between heatwave attributes and climate drivers showed that heatwaves over NSW were positively influenced by both the IOD and ENSO and negatively correlated with SAM. The findings of this study will be useful in formulating strategies for managing the impacts of extreme temperature events, such as bushfires, floods, and droughts, in the most at-risk regions within NSW.
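For reference, the EHF combines a significance index (3-day mean temperature versus the climatological 95th percentile) with an acclimatisation index (3-day mean versus the preceding 30 days). A sketch with synthetic temperatures follows; the data and the in-sample percentile are stand-ins for a proper climatological baseline.

```python
import numpy as np

def ehf(tmean, t95):
    """Excessive Heat Factor: EHF = EHI_sig * max(1, EHI_accl), where
    EHI_sig = (3-day mean T) - (climatological 95th percentile) and
    EHI_accl = (3-day mean T) - (mean T of the preceding 30 days)."""
    out = np.full(len(tmean), np.nan)
    for i in range(32, len(tmean)):
        t3 = tmean[i - 2:i + 1].mean()         # current 3-day mean
        t30 = tmean[i - 32:i - 2].mean()       # preceding 30-day mean
        ehi_sig = t3 - t95
        ehi_accl = t3 - t30
        out[i] = ehi_sig * max(1.0, ehi_accl)  # degC^2; heatwave day if > 0
    return out

# Illustrative daily mean temperatures with a synthetic hot spell
rng = np.random.default_rng(7)
tmean = 22.0 + 3.0 * rng.standard_normal(120)
tmean[100:106] += 9.0
vals = ehf(tmean, t95=np.percentile(tmean, 95))
print("heatwave days (EHF > 0):", np.where(vals > 0)[0])
```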
-
Hydrological time series often present nonstationarities such as trends, shifts, or oscillations due to anthropogenic effects and hydroclimatological variations, including global climate change. For water managers, it is crucial to recognize and define the nonstationarities in hydrological records. These nonstationarities must be appropriately modeled and stochastically simulated according to the characteristics of the observed records in order to evaluate the adequacy of flood risk mitigation measures and future water resources management strategies. In the current study, therefore, three approaches were suggested to address stochastically nonstationary behaviors, especially in the long-term variability of hydrological variables: as an overall trend, as a shifting mean, or as a long-term oscillation. To represent these options, an autoregressive model with an overall trend, a shifting mean level (SML) model, and empirical mode decomposition with nonstationary oscillation resampling (EMD-NSOR) were applied to the hydrological series of net basin supply in the Lake Champlain–Richelieu River basin, which the International Joint Commission recently studied after significant flood damage was caused by long, consistent high flows. The detailed results indicate that the EMD-NSOR model can be an appropriate option, as it reproduces the long-term dependence statistics that are critical for simulating realistic flood events and generates manageable scenarios, while the SML model does not properly reproduce the observed long-term dependence. The trend model projects too much future flood risk but no drought risk. Overall, the results indicate that nonstationarities in hydrological series should be carefully handled in stochastic simulation models in order to appropriately manage future water-related risks.
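A shifting-mean-level generator can be pictured as an AR(1) deviation riding on a mean that jumps between random levels. The sketch below is a generic illustration with assumed parameters, not the SML calibration used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_sml_ar1(n_years, phi=0.3, sigma=1.0, mu0=10.0,
                     shift_scale=2.0, mean_stay=15):
    """Shifting-mean-level sketch: an AR(1) deviation rides on a mean
    that jumps to a new random level after geometrically distributed
    sojourn times, mimicking long-lasting hydrologic regimes."""
    x = np.empty(n_years)
    mu, resid = mu0, 0.0
    stay = rng.geometric(1.0 / mean_stay)      # years until the next shift
    for t in range(n_years):
        if stay == 0:
            mu = mu0 + rng.normal(0.0, shift_scale)   # new mean level
            stay = rng.geometric(1.0 / mean_stay)
        resid = phi * resid + rng.normal(0.0, sigma)  # AR(1) deviation
        x[t] = mu + resid
        stay -= 1
    return x

# Generate net-basin-supply-like scenarios and check how well long-term
# dependence (here, lag-10 autocorrelation) is reproduced across them.
sims = [simulate_sml_ar1(100) for _ in range(200)]
ac = [np.corrcoef(s[:-10], s[10:])[0, 1] for s in sims]
print(f"mean lag-10 autocorrelation across scenarios: {np.mean(ac):.2f}")
```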