Your search
Results: 13 resources
-
Machine learning (ML) algorithms have emerged as competent tools for identifying areas that are susceptible to flooding. The primary variables considered in most of these works include terrain models, lithology, river networks and land use. While several recent studies include average annual rainfall and/or temperature, other meteorological information such as snow accumulation and short-term intense rain events that may influence the hydrology of the area under investigation has not been considered. Notably, in Canada, most inland flooding occurs during the freshet, due to the melting of an accumulated snowpack coupled with heavy rainfall. Therefore, in this study several climate variables along with various hydro-geomorphological (HG) variables were tested to determine the impact of their inclusion. Three tests were run on five study areas across Canada: HG variables only, HG with annual average temperature and precipitation added (HG-PT), and HG with six further meteorological datasets added (HG-8M). In HG-PT, both precipitation and temperature were selected as important in every study area, while in HG-8M a minimum of three meteorological datasets were considered important in each study area. Notably, as the meteorological variables were added, many of the initial HG variables were dropped from the selection set. The accuracy, F1 score, true skill statistic and Area Under the Curve (AUC) were marginally improved when the meteorological data were added to a parallel random forest algorithm (parRF). When the model is applied to new data, the estimated accuracy of the prediction is higher in HG-8M, indicating that the inclusion of relevant, local meteorological datasets improves the result.
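The feature-selection step described above, ranking HG and meteorological variables by importance, can be sketched with scikit-learn's RandomForestClassifier. This is a minimal illustration on synthetic data; the study itself used a parallel random forest (parRF), and all variable names here are hypothetical stand-ins.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Illustrative feature sets: hydro-geomorphological (HG) variables alone,
# versus HG plus meteorological variables (hypothetical names).
hg_features = ["elevation", "slope", "dist_to_river", "land_use"]
met_features = ["annual_precip", "annual_temp", "snow_accum"]

n = 500
X = rng.normal(size=(n, len(hg_features) + len(met_features)))
# Synthetic label: flooding driven by low elevation and high precipitation.
y = ((X[:, 0] < 0) & (X[:, 4] > 0)).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank variables by impurity-based importance, as in a feature-selection step.
names = hg_features + met_features
ranked = sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1])
for name, imp in ranked:
    print(f"{name}: {imp:.3f}")
```

On this toy label, the two variables that actually drive flooding rise to the top of the ranking, mirroring how precipitation and temperature were selected as important in the HG-PT runs.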
-
Canada's RADARSAT missions improve the potential to study past flood events; however, existing tools to derive flood depths from this remote-sensing data do not correct for errors, leading to poor estimates. To provide more accurate gridded depth estimates of historical flooding, a new tool is proposed that integrates Height Above Nearest Drainage and Cost Allocation algorithms. This tool is tested against two trusted, hydraulically derived, gridded depths of recent floods in Canada. This validation shows the proposed tool outperforms existing tools and can provide more accurate estimates from minimal data without the need for complex physics-based models or expert judgement. With improvements in remote-sensing data, the tool proposed here can provide flood researchers and emergency managers with accurate depths in near-real time.
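The core Height Above Nearest Drainage depth estimate can be sketched in a few lines: a cell is inundated where the water level above the drainage network exceeds the cell's HAND value, and its depth is the difference. The arrays below are toy values, and the sketch omits the Cost Allocation correction that the proposed tool adds.

```python
import numpy as np

# Height Above Nearest Drainage (HAND) per cell, in metres (toy values).
hand = np.array([[0.0, 0.5, 1.2],
                 [0.3, 0.9, 2.0],
                 [0.1, 1.5, 3.0]])

# Water level above the drainage network, e.g. from a gauge or SAR extent.
water_level = 1.0

# A cell is inundated where the water level exceeds its HAND value;
# the estimated depth is the difference, clipped at zero.
depth = np.clip(water_level - hand, 0.0, None)
flooded = depth > 0
print(depth)
print(int(flooded.sum()), "flooded cells")
```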
-
Among the most prevalent natural hazards, flooding has been threatening human lives and properties. Robust flood simulation is required for effective response and prevention. Machine learning is widely used in flood modeling due to its high performance and scalability. Nonetheless, data pre-processing of heterogeneous sources can be cumbersome, and traditional data processing and modeling have been limited to a single resolution. This study employed an Icosahedral Snyder Equal Area Aperture 3 Hexagonal Discrete Global Grid System (ISEA3H DGGS) as a scalable, standard spatial framework for computation, integration, and analysis of multi-source geospatial data. We managed to incorporate external machine learning algorithms with a DGGS-based data framework, and project future flood risks under multiple climate change scenarios for southern New Brunswick, Canada. A total of 32 explanatory factors, including topographical, hydrological, geomorphic, meteorological, and anthropogenic factors, were investigated. Results showed that low elevation and proximity to permanent waterbodies were primary factors in flooding events, and that rising spring temperatures can increase flood risk. Flooding extent was predicted to occupy 135–203% of the 2019 flood area, one of the most recent major flooding events, by the year 2100. Our results assisted in understanding the potential impact of climate change on flood risk, and indicated the feasibility of DGGS as a standard data fabric for integrating heterogeneous data and incorporating it into multi-scale data mining.
-
Recent research has extended conventional hydrological algorithms onto hexagonal grids and noted that hydrological modeling on a hexagonal mesh grid outperformed that on a rectangular grid. Among hydrological products, flow routing grids are the basis of many other hydrological simulations, such as flow accumulation, watershed delineation, and stream networks. However, most of the previous research adopted the D6 algorithm, which is analogous to the D8 algorithm on a rectangular grid, to produce flow routing. This paper explored four further methods for generating flow directions in a hexagonal grid, based on four algorithms for slope aspect computation. We also developed and visualized hexagonal-grid-based hydrological operations, including flow accumulation, watershed delineation, and hydrological indices computation. Experiments were carried out across multiple grid resolutions and over terrain of varying roughness. The results showed that flow direction can vary among the different approaches, and that the impact of such variation propagates to flow accumulation, watershed delineation, and hydrological indices production, as reflected by cell-wise comparison and visualization. This research is practical for hydrological analysis in hexagonal, hierarchical grids, such as Discrete Global Grid Systems, and the developed operations can be used in real-world flood modeling.
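The baseline D6 algorithm mentioned above can be sketched as steepest descent over a hexagonal cell's six neighbours, the direct analogue of D8 on a square grid. The snippet below uses axial coordinates and a tiny synthetic surface; it is an illustration of the idea, not the paper's implementation.

```python
# D6 flow direction on a hexagonal grid (axial coordinates): each cell
# drains to the neighbour offering the largest elevation drop, analogous
# to D8 on a rectangular grid. Toy elevations; illustrative only.
HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def d6_flow_direction(elev, cell):
    """Return the neighbour that `cell` drains to, or None if it is a pit."""
    q, r = cell
    best, best_drop = None, 0.0
    for dq, dr in HEX_DIRS:
        nb = (q + dq, r + dr)
        if nb in elev:
            drop = elev[cell] - elev[nb]
            if drop > best_drop:
                best, best_drop = nb, drop
    return best

# A tiny synthetic surface sloping toward the origin, with slight roughness.
elev = {(q, r): abs(q) + abs(r) + 0.1 * ((q * r) % 3)
        for q in range(-2, 3) for r in range(-2, 3)}
print(d6_flow_direction(elev, (2, 0)))
```

Flow accumulation and watershed delineation then follow by walking these per-cell directions downstream, which is why a change in the direction method propagates to all derived products.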
-
Synthetic Aperture Radar (SAR) imagery is a vital tool for flood mapping due to its capability to acquire images day and night in almost any weather and to penetrate cloud cover. In rural areas, SAR backscatter intensity can be used to detect flooded areas accurately; however, the complexity of urban structures makes flood mapping in urban areas a challenging task. In this study, we examine the synergistic use of SAR simulated reflectivity maps and Polarimetric and Interferometric SAR (PolInSAR) features in the improvement of flood mapping in urban environments. We propose a machine learning model employing simulated and PolInSAR features derived from TerraSAR-X images along with five auxiliary features, namely elevation, slope, aspect, distance from the river, and land-use/land-cover, that are well known to contribute to flood mapping. A total of 2450 data points have been used to build and evaluate the model over four different areas with different vegetation and urban density. The results indicated that by using PolInSAR and SAR simulated reflectivity maps together with the five auxiliary features, a classification overall accuracy of 93.1% in urban areas was obtained, representing a 9.6% improvement over using the five auxiliary features alone.
-
With the record-breaking floods experienced in Canada's capital region in 2017 and 2019, there is an urgent need to update and harmonize existing flood hazard maps and fill in the spatial gaps between them to improve flood mitigation strategies. To achieve this goal, we aim to develop a novel approach using machine learning classification (i.e., random forest). We used existing fragmented flood hazard maps along the Ottawa River to train a random forest classification model using a range of flood conditioning factors. We then applied this classification across the Capital Region to fill in the spatial gaps between existing flood hazard maps and generate a harmonized high-resolution (1 m) 100-year flood susceptibility map. When validated against recently produced 100-year flood hazard maps across the capital region, we find that this random forest classification approach yields a highly accurate flood susceptibility map. We argue that the machine learning classification approach is a promising technique to fill in the spatial gaps between existing flood hazard maps and create harmonized high-resolution flood susceptibility maps across flood-vulnerable areas. However, caution must be taken in selecting suitable flood conditioning factors and in extrapolating the classification to areas with characteristics similar to the training sites. The resulting harmonized and spatially continuous flood susceptibility map has wide-reaching relevance for flood mitigation planning in the capital region. The machine learning approach and flood classification optimization method developed in this study are also a first step toward Natural Resources Canada's aim of creating a spatially continuous flood susceptibility map across the Ottawa River watershed. Our modeling approach is transferable to harmonize flood maps and fill in spatial gaps in other regions of the world and will help mitigate flood disasters by providing accurate flood data for urban planning.
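The gap-filling step, training a classifier on points sampled from existing hazard maps and then scoring unmapped cells, can be sketched with predict_proba to obtain a continuous susceptibility surface. The data and factor names below are synthetic placeholders, not the authors' inputs.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Training points sampled from existing flood hazard maps:
# conditioning factors (hypothetical: elevation, slope, distance to river)
# and a binary label (inside / outside the 100-year flood zone).
X_train = rng.normal(size=(300, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 2] < 0).astype(int)

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# "Gap" cells between mapped reaches: predict a susceptibility score
# (probability of the flood class) to form a continuous map.
X_gap = rng.normal(size=(1000, 3))
susceptibility = rf.predict_proba(X_gap)[:, 1]
print(susceptibility.min(), susceptibility.max())
```

Thresholding or binning these probabilities yields the harmonized susceptibility classes; the caution noted in the abstract applies here too, since the scores are only meaningful where the gap cells resemble the training sites.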
-
A timely and cost-effective method of creating inundation maps could assist first responders in allocating resources and personnel in the event of a flood or in preparation for a future disaster. The Height Above Nearest Drainage (HAND) model could be implemented in an on-the-fly flood mapping application for a Canada-wide service. The HAND model requires water level (m) data inputs, while many sources of hydrological data in Canada only provide discharge (m³/s) data. Synthetic rating curves (SRCs), created using river geometry/characteristics and Manning's formula, could be utilized to provide an approximate water level given a discharge input. A challenge in creating SRCs is representing how multiple different land covers slow flow due to texture and bulky features (e.g., smooth asphalt versus a rocky river channel); this relates to the roughness coefficient (n). In our study, two methods of representing multiple n values were tested (a weighted method and a minimum–median method) and compared to a fixed-n method. A custom ArcGIS tool, Canadian Estimator of Ratings Curves using HAND and Discharge (CERC-HAND-D), was developed to create SRCs using all three methods. Control data were sourced from gauge stations across Canada in the form of rating curves. Results indicate that in areas with medium to medium–high river gradients (S > 0.002 m/m) or with river reaches under 5 km, the CERC-HAND-D tool creates more accurate SRCs (NRMSE = 3.7–8.8%, Percent Bias = −7.8% to 9.4%), with the minimum–median method being the preferred n method.
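A synthetic rating curve pairs stage with discharge through Manning's formula, Q = (1/n) · A · R^(2/3) · S^(1/2). The sketch below builds one for a simple rectangular channel; the geometry and roughness values are illustrative and this is not the CERC-HAND-D implementation, which derives channel characteristics from HAND.

```python
def manning_discharge(depth, width, slope, n):
    """Discharge (m^3/s) for a rectangular channel via Manning's formula."""
    area = width * depth                      # flow area A (m^2)
    wetted_perimeter = width + 2.0 * depth    # bed plus two banks (m)
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Build a synthetic rating curve: water level -> discharge.
width, slope, n = 20.0, 0.003, 0.035   # illustrative channel values
rating_curve = [(d, manning_discharge(d, width, slope, n))
                for d in (0.5, 1.0, 1.5, 2.0)]
for depth, q in rating_curve:
    print(f"stage {depth:.1f} m -> {q:.1f} m^3/s")
```

Inverting this curve (interpolating stage from an observed discharge) is what lets a discharge-only gauge feed a water-level-driven model like HAND; the weighted and minimum–median methods differ only in how a single effective n is chosen from multiple land covers.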
-
Devastating floods occur regularly around the world. Recently, machine learning models have been used for flood susceptibility mapping. However, even when these algorithms are provided with adequate ground truth training samples, they can fail to predict flood extents reliably. On the other hand, the height above nearest drainage (HAND) model can produce flood prediction maps with limited accuracy. The objective of this research is to produce an accurate and dynamic flood modeling technique that generates flood maps as a function of water level by combining the HAND model and machine learning. In this paper, the HAND model was used to generate a preliminary flood map; then, the predictions of the HAND model were used to produce pseudo training samples for a random forest (RF) model. To improve the RF training stage, five of the most effective flood mapping conditioning factors were used, namely altitude, slope, aspect, distance from river and land use/cover. In this approach, the RF model is trained to dynamically estimate the flood extent from the pseudo training points acquired from the HAND model. However, due to the limited accuracy of the HAND model, a random sample consensus (RANSAC) method was used to detect outliers. The accuracy of the proposed model for flood extent prediction was tested on different flood events in the city of Fredericton, NB, Canada in 2014, 2016, 2018 and 2019. Furthermore, to ensure that the proposed model can produce accurate flood maps in other areas as well, it was also tested on the 2019 flood in Gatineau, QC, Canada. Accuracy assessment metrics, such as overall accuracy, Cohen's kappa coefficient, Matthews correlation coefficient, true positive rate (TPR), true negative rate (TNR), false positive rate (FPR) and false negative rate (FNR), were used to compare the predicted flood extent of the study areas to the extent estimated by the HAND model and the extent imaged by Sentinel-2 and Landsat satellites. The results confirm that the proposed model can improve the flood extent prediction of the HAND model without using any ground truth training data.
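The pseudo-labelling idea at the core of this approach, taking the HAND model's preliminary flood map as training labels for a random forest, can be sketched as follows. The rasters and conditioning-factor names are synthetic placeholders, and the RANSAC outlier-filtering step from the paper is omitted for brevity.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Synthetic conditioning-factor rasters flattened to (cells, features):
# hypothetical altitude, slope, and distance-from-river layers.
n_cells = 2000
features = rng.normal(size=(n_cells, 3))

# Preliminary HAND flood mask: low altitude and near the river -> flooded.
hand_mask = ((features[:, 0] < 0) & (features[:, 2] < 0)).astype(int)

# Sample pseudo training points from the HAND output (no ground truth used).
idx = rng.choice(n_cells, size=500, replace=False)
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(features[idx], hand_mask[idx])

# The trained model can then re-predict the flood extent over all cells.
pred = rf.predict(features)
agreement = (pred == hand_mask).mean()
print(f"agreement with HAND mask: {agreement:.2f}")
```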
-
The Canada Centre for Mapping and Earth Observation (CCMEO) uses Radarsat Constellation Mission (RCM) data for near-real-time flood mapping. One of the many advantages of SAR sensors is that they are less affected by cloud cover and atmospheric conditions than optical sensors. RCM has been used operationally since 2020 and employs three satellites, enabling lower revisit times and increased imagery coverage. The team responsible for the production of flood maps in the context of emergency response is able to produce maps within four hours of data acquisition. Although the results from their automated system are good, it has some limitations, requiring manual intervention to correct the data before publication. The main limitations are in urban and vegetated areas. Work started in 2021 to make use of deep learning algorithms, namely convolutional neural networks (CNNs), to improve the performance of the automated production of flood inundation maps. The training dataset makes use of the former maps created by the emergency response team and comprises over 80 SAR images and corresponding digital elevation models (DEMs) in multiple locations in Canada. The training and test images were split into smaller tiles of 256 x 256 pixels, for a total of 22,469 training tiles and 6,821 test tiles. The current implementation uses a U-Net architecture from the NRCan geo-deep-learning pipeline (https://github.com/NRCan/geo-deep-learning). To measure the performance of the model, the intersection over union (IoU) metric is used. The model can achieve 83% IoU for extracting water and flood from background areas over the test tiles. Next steps include increasing the number of different geographical contexts in the training set, towards the integration of the model into production.
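The intersection over union metric used to score the segmentation can be computed directly from binary masks, as in this small sketch (toy arrays; not the geo-deep-learning pipeline's own implementation).

```python
import numpy as np

def iou(pred, truth):
    """Intersection over union of two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return np.logical_and(pred, truth).sum() / union

# Toy 4x4 water masks: model prediction vs. reference label.
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
truth = np.array([[1, 1, 1, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
print(f"IoU = {iou(pred, truth):.2f}")
```

Here the masks overlap on 4 pixels out of 5 in their union, giving IoU = 0.80; an 83% IoU over all test tiles corresponds to this ratio aggregated across the dataset.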