Your search
Results: 58 resources
-
Coastal areas are particularly vulnerable to flooding from heavy rainfall, sea storm surge, or a combination of the two. Recent studies project higher intensity and frequency of heavy rains, and progressive sea level rise continuing over the next decades. Pre-emptive and optimal flood defence policies that adaptively address climate change are needed. However, future climate projections carry significant uncertainty due to multiple factors: (a) future CO2 emission scenarios; (b) uncertainties in climate modelling; (c) discount factor changes due to market fluctuations; (d) uncertain migration and population growth dynamics. Here, a methodology is proposed to identify the optimal design and timing of flood defence structures in which uncertainties in 21st century climate projections are explicitly treated probabilistically. A multi-objective optimization model is developed to minimize both the cost of the flood defence infrastructure system and the hydraulic flood risk expressed as Expected Annual Damage (EAD). The decision variables of the multi-objective optimization problem are the size of the defence system and the timing of its implementation. The model accounts for the joint probability density functions of extreme rainfall, storm surge and sea level rise, as well as the damages, which are determined dynamically by the state of the defence system considering the probability and consequences of system failure, using a water depth-damage curve related to land use (Corine Land Cover); water depths due to flooding are calculated with a hydraulic model. A non-dominated sorting genetic algorithm (NSGA-II) is used to solve the multi-objective optimization problem. A case study is presented for the Pontina Plain (Lazio, Italy), a coastal region, originally a swamp reclaimed about a hundred years ago, that is rich in urban centers and farms.
A set of optimal adaptation policies, quantifying the size and timing of flood defence constructions for different climate scenarios and lying on the Pareto front obtained with NSGA-II, is identified for this case study to mitigate the risk of flooding and to aid decision makers.
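The trade-off that NSGA-II explores rests on Pareto dominance between the two objectives (infrastructure cost and EAD, both minimized). A minimal sketch of extracting the non-dominated front from candidate designs, with purely illustrative numbers rather than the study's data:

```python
def pareto_front(solutions):
    """Return the non-dominated (cost, EAD) pairs; both objectives are
    minimized, so b dominates a if b <= a in both and b differs from a."""
    front = []
    for i, a in enumerate(solutions):
        dominated = any(
            b[0] <= a[0] and b[1] <= a[1] and b != a
            for j, b in enumerate(solutions) if j != i
        )
        if not dominated:
            front.append(a)
    return front

# Hypothetical candidate defence designs as (cost, EAD) pairs
candidates = [(10.0, 5.0), (8.0, 7.0), (12.0, 4.0), (9.0, 6.0), (11.0, 6.5)]
front = pareto_front(candidates)  # (11.0, 6.5) is dominated and dropped
```

A full NSGA-II additionally sorts the population into ranked fronts and applies crowding-distance selection across generations; the sketch only shows the dominance test at its core.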
-
In recent years, understanding and improving the perception of flood risk has become an important aspect of flood risk management and flood risk reduction policies. The aim of this study was to explore perceptions of flood risk in the Petite Nation River watershed, located in southern Quebec, Canada. A survey was conducted with 130 residents living on a floodplain in this river watershed, which had been affected by floods in the spring of 2017. Participants were asked about different aspects related to flood risk, such as flood hazard experience, the physical changes occurring in the environment, climate change, information accessibility, flood risk governance, adaptation measures, and finally the perception of losses. An analysis of these factors provided perspectives for improving flood risk communication and increasing public awareness of flood risk. The results indicated that the analyzed aspects are potentially important in terms of risk perception and showed that flood risk perceptions varied for each aspect analyzed. In general, information regarding flood risk management is available and generally understandable, and the level of confidence in most authorities was good. However, the experiences of flood risk and the consequences of climate change on floods were not clear among the respondents. Regarding adaptation measures, the majority of participants tended to consider non-structural adaptation measures as more relevant than structural ones. Moreover, the long-term consequences of flooding on property values were of greatest concern. These results provide a snapshot of citizens' risk perceptions and their opinions on topics that are directly related to such risks.
-
Abstract
A warmer climate impacts streamflows, and these changes need to be quantified to assess future risk and vulnerability and to implement efficient adaptation measures. The climate simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5), which have been the basis of most such assessments over the past decade, are being gradually superseded by the more recent Coupled Model Intercomparison Project Phase 6 (CMIP6). Our study portrays the added value of the CMIP6 ensemble over CMIP5 in a first North America-wide comparison using 3,107 catchments. Results show a reduced spread of the CMIP6 ensemble compared to the CMIP5 ensemble for temperature and precipitation projections. In terms of flow indicators, the CMIP6-driven hydrological projections result in a smaller spread of future mean and high flow values, except for mountainous areas. Overall, we assess that the CMIP6 ensemble provides a narrower band of uncertainty of future climate projections, bringing more confidence for hydrological impact studies.
Plain Language Summary
Greenhouse gas emissions are causing the climate to warm significantly, which in turn impacts flows in rivers worldwide. To adapt to these changes, it is essential to quantify them and assess future risk and vulnerability. Climate models are the primary tools used to achieve this. The main data set that provides scientists with state-of-the-art climate model simulations is known as the Coupled Model Intercomparison Project (CMIP). The fifth phase of that project (CMIP5) has been used over the past decade in multiple hydrological studies to assess the impacts of climate change on streamflow. The more recent sixth phase (CMIP6) has started to generate projections, which brings the following question: is it necessary to update the hydrological impact studies performed using CMIP5 with the new CMIP6 models? To answer this question, a comparison between CMIP5 and CMIP6 using 3,107 catchments over North America was conducted. Results show that there is less spread in temperature and precipitation projections for CMIP6. This translates into a smaller spread of future mean, high and low flow values, except for mountainous areas. Overall, we assess that using the CMIP6 data set would provide a narrower range of future climate projections, leading to more confident hydrological impact studies.
Key Points
- A comparison of hydrological impacts using CMIP5 and CMIP6 ensembles is performed over 3,107 catchments in North America
- The CMIP6 ensembles provide a narrower band of uncertainty for hydrological indicators in the future
- It is recommended to update hydrological impact studies performed using CMIP5 with the CMIP6 ensemble
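The ensemble-spread comparison described here reduces, in its simplest form, to comparing a dispersion statistic across the members of each ensemble. A toy sketch with made-up warming values (illustrative only, not CMIP data):

```python
import statistics

def ensemble_spread(members):
    """Sample standard deviation across ensemble members;
    a smaller value means a narrower band of uncertainty."""
    return statistics.stdev(members)

# Made-up end-of-century warming projections (degrees C), one per model
cmip5_like = [2.1, 3.4, 2.8, 4.0, 1.9, 3.7]
cmip6_like = [2.6, 3.1, 2.9, 3.4, 2.7, 3.2]
```

In practice the spread would be computed per catchment and per indicator (mean flow, high flow, low flow), but the comparison logic is the same.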
-
Whether disasters influence adaptation actions in cities is contested. Yet, the extant knowledge base primarily consists of single or small-N case studies, so there is no global overview of the evidence on disaster impacts and adaptation. Here, we use regression analysis to explore the effects of disaster frequency and severity on four adaptation action types in 549 cities. In countries with greater adaptive capacity, economic losses increase city-level actions targeting recently experienced disaster event types, as well as actions to strengthen general disaster preparedness. An increase in disaster frequency reduces actions targeting hazard types other than those that recently occurred, while human losses have few effects. Comparisons between cities across levels of adaptive capacity indicate a wealth effect. More affluent countries incur greater economic damages from disasters, but also have higher governance capacity, creating both incentives and opportunities for adaptation measures. While disaster frequency and severity had a limited impact on adaptation actions overall, results are sensitive to which disaster impacts, adaptation action types, and adaptive capacities are considered.
-
Abstract
The performance of adaptation measures depends on their robustness against various possible futures, with varying climate change impacts. Such impacts are driven by climatic as well as non-climatic drivers. Risk dynamics are therefore important, as the avoided risk determines the benefits of adaptation actions. It is argued that integrating information on changing exposure and vulnerability is needed to make projections of future climate risk more realistic. In addition, many impact and vulnerability studies have used a top-down rather than a bottom-up approach. Whether adaptation action is feasible is determined by technical and physical possibilities on the ground, as well as local capacities, governance and preferences. These determine the hard and soft limits of adaptation. Therefore, it is argued that risk metric outputs alone are not sufficient to predict adaptation outcomes, or to predict where adaptation is feasible; they must be placed in the local context. Several of the current climate risk products would thus fall short of their promise to inform adaptation decision-making on the ground. Some steps are proposed to improve adaptation modelling in order to better incorporate these aspects.
-
In cold regions, ice jams frequently result in severe flooding due to a rapid rise in water levels upstream of the jam. Sudden floods resulting from ice jams threaten human safety and damage property and infrastructure. Hence, ice-jam prediction tools can give an early warning to increase response time and minimize possible damages. However, ice-jam prediction has always been a challenge, as no analytical method is available for this purpose. Nonetheless, ice jams form when certain hydro-meteorological conditions occur a few hours to a few days before the event. Ice-jam prediction can therefore be addressed as a binary multivariate time-series classification problem. Deep learning techniques have been widely used for time-series classification in many fields such as finance, engineering, weather forecasting, and medicine. In this research, we successfully applied convolutional neural networks (CNN), long short-term memory (LSTM), and combined convolutional-long short-term memory (CNN-LSTM) networks to predict the formation of ice jams in 150 rivers in the province of Quebec (Canada). We also employed machine learning methods including the support vector machine (SVM), k-nearest neighbors classifier (KNN), decision tree, and multilayer perceptron (MLP) for this purpose. The hydro-meteorological variables (e.g., temperature, precipitation, and snow depth) along with the corresponding jam or no-jam events are used as model inputs. Ten percent of the data were set aside for testing, and 100 reshuffling and splitting iterations were applied to the remainder, with 80% for training and 20% for validation. The developed deep learning models outperformed the machine learning models. The results show that the CNN-LSTM model yields the best results in validation and testing, with F1 scores of 0.82 and 0.92, respectively.
This demonstrates that CNN and LSTM models are complementary, and a combination of both further improves classification.
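The F1 scores used to compare the models combine precision and recall on the positive (jam) class. A minimal sketch of that metric, with illustrative labels rather than the study's data:

```python
def f1_score(y_true, y_pred):
    """F1 for the positive class (1 = ice jam): harmonic mean
    of precision and recall."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# 1 = jam event, 0 = no jam; this toy case has 3 TP, 1 FP, 1 FN
score = f1_score([1, 0, 1, 1, 0, 0, 1, 0],
                 [1, 0, 1, 0, 0, 1, 1, 0])
```

F1 is well suited here because jam events are rare, so plain accuracy would reward a model that always predicts "no jam".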
-
Extreme water temperatures influence many physical, chemical, and biological properties of rivers. Accurate prediction of water temperature is important for environmental impact assessment. In this context, various models have been used to estimate water temperatures at different spatial and temporal scales, ranging from simple linear methods to sophisticated nonlinear models. However, this key variable has not been treated in a probabilistic (or frequentist) context. It is therefore essential to estimate extreme thermal events using local frequency analysis (LFA). An LFA aims to predict the frequency and amplitude of these events at a given gauged location. When estimating thermal extremes, it is crucial to account for the shape of the frequency distribution considered. The first part of the thesis focuses on selecting the most appropriate probability distribution for river water temperatures. The Akaike information criterion (AIC) and the Bayesian information criterion (BIC) are used to evaluate the goodness of fit of the statistical distributions. Suitable candidate distributions are also validated using the L-moment ratio diagram (MRD) approach. The results indicate that the Weibull distribution (W2) provides the best fit for extreme series from high-altitude stations, while the normal distribution (N) is most appropriate for series from stations located in low-altitude regions. This corresponds to the first article. The spatial coverage of river water temperature data is limited in many regions of the world. For this reason, a regional frequency analysis (RFA), which allows the estimation of river temperature extremes at ungauged or poorly monitored sites, is necessary. 
In general, RFA involves two main steps: the delineation of homogeneous regions (DHR), which identifies similar sites, and regional estimation (RE), which transfers information from the sites identified in the first step to the target site. The thermal index (TI) model is therefore introduced in the context of RFA to estimate the extremes of the thermal regime. This method is analogous to the index flood (IF) model widely used in hydrology. The TI model incorporates the homogeneity of the appropriate frequency distribution for each region, which provides greater flexibility. In this study, the TI model is compared with multiple linear regression (MLR); the results indicate that the TI model yields the best performance (Article 2). Then, the nonlinear canonical correlation analysis (NLCCA) approach is integrated into the DHR step, as presented in Chapter 4 of this manuscript (Article 3). It allows the complexity of thermal phenomena to be considered in the DHR step. A comparative study is then conducted to identify more promising combinations (DHR-RE) leading to better estimation. Linear, semi-linear, and nonlinear combinations are considered in the two steps of the RFA procedure. The results show that the best overall performance is obtained with the nonlinear combination of NLCCA and the generalized additive model (GAM). Finally, nonparametric models such as random forest (RF), extreme gradient boosting (XGBoost), and multivariate adaptive regression splines (MARS) are introduced in the context of RFA to estimate thermal quantiles and compare them with quantiles estimated using the semi-parametric GAM model. These models are combined with linear and nonlinear approaches in the DHR step, namely CCA and NLCCA, to determine their predictive potential. The results indicate that NLCCA+GAM performs best, followed by CCA+MARS. This corresponds to Article 4.
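Ranking candidate distributions with AIC and BIC amounts to penalized log-likelihood comparison. A minimal sketch for a normal (N) fit with maximum-likelihood parameters (the W2 fit proceeds analogously; the temperature sample below is made up):

```python
import math

def normal_aic_bic(sample):
    """AIC and BIC for a normal fit (k = 2 parameters: mean and
    variance); lower values indicate a better penalized fit."""
    n = len(sample)
    mu = sum(sample) / n
    var = sum((x - mu) ** 2 for x in sample) / n  # MLE variance
    # At the MLE, the log-likelihood simplifies to -n/2 * (log(2*pi*var) + 1)
    loglik = -0.5 * n * (math.log(2 * math.pi * var) + 1)
    k = 2
    return 2 * k - 2 * loglik, k * math.log(n) - 2 * loglik

# Made-up annual maxima of river water temperature (degrees C)
tmax = [24.1, 25.3, 23.8, 26.0, 24.7, 25.5, 23.9, 25.1, 24.4, 25.8]
aic, bic = normal_aic_bic(tmax)
```

Since the BIC penalty k*log(n) exceeds the AIC penalty 2k once n is larger than about 7, BIC selects more parsimonious distributions on longer series.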
-
The canalized southernmost section of Wonscotonach (the Don River) in Tkarón:to (Toronto), also known as The Narrows, is a highly disturbed urban natural landscape. Following the 1886 Don Improvement Project, the Keating Channel, and today the Port Lands Revitalization and Flood Protection Project, these Lands have been harnessed and developed through settler colonization to tame and control the once-winding river. This research-creation, in the form of a curated online exhibition and written thesis, presents a critical (re)reading of the notion of improvement, becoming allied to the pre-colonial landscape and the knowledge it carried. This exhibition and thesis develop the concept of the meander, inspired by the non-linear trajectory of the pre-canalized Don River, as a model for the curatorial. The curatorial process of 'improvement becomes a wall, and the river meanders still' began before the global COVID-19 pandemic and, subsequently, was derailed in March 2020. The exhibition's final form was unknowable throughout much of the curatorial process. Thus, following the meander as a research-creation technique, the curatorial process, exhibitionary structure, and content had to adapt through lingering uncertainty. This thesis, contributing to the theoretical and practical knowledge of research-creation, looks to intersections with the curatorial following the theoretical underpinnings of Erin Manning and Brian Massumi, Natalie Loveless, and Stefanie Springgay and Sarah E. Truman. As a project untethered from institutional timelines and normative requirements to 'know a project in advance,' as well as the conventions of a physical exhibition, this research-creation manifested through process-led, creative and exploratory techniques (such as walking and drawing) and the slowed pace allowed by the COVID-19 pandemic's reframing of time.
This research-creation exhibition and written thesis develop a responsive and resilient curatorial process deeply indebted to Land-based knowledge.
-
Given the emergent need to update flood inundation maps and to minimize the associated costs (data collection and human resources), simplified alternatives to classical hydrodynamic modelling are being developed. One of the simplified methods built to fulfill this need is the terrain-based Height Above the Nearest Drainage (HAND) method, which relies solely on a digital elevation model (DEM) and a river network. This approach was implemented in PHYSITEL, a specialized GIS for distributed hydrological models. Thus, for a given water height in several river reaches, HAND can provide a first-order delineation of the inundated areas along the river network of a watershed. In addition, coupling the information provided by HAND with the Manning equation allows the construction of a synthetic rating curve for any homogeneous river reach where bathymetric data are not available. 
Since this synthetic rating curve approach had previously been validated in part on large watersheds, this study tested it on two small watersheds: the 133 km² La Raquette River watershed and the 552 km² Saint Charles River watershed. Three gauging stations in each basin provided the basic data needed for the calibration process. The effectiveness and adaptability of the approach were assessed as a function of available data, computational time, and accuracy measured using the bias and root mean square error (RMSE). The uncertainties and sensitivities of the approach were analyzed in terms of spatial resolution and the lack of bathymetric data. In addition, innovative analyses were performed in applying the HAND-synthetic rating curve approach: first, a global sensitivity analysis to inform the calibration process, and then a Froude number-based criterion to verify that the assumptions underlying the application of the Manning equation hold on each river reach of a watershed. Using high-resolution DEMs (<5 m/pixel), synthetic rating curves were obtained with biases below ±20% relative to in-situ rating curves. Furthermore, a selection criterion retaining curves within a ±5% bias of the observed rating curve yielded synthetic rating curves with normalized mean squared errors between 0.03 and 0.62. The approach was thus deemed appropriate for deriving synthetic rating curves and, consequently, for supporting the delineation of flood risk areas in small watersheds, all the while accounting for the uncertainties associated with applying a low-complexity model.
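Coupling HAND water depths with the Manning equation, as described, yields a discharge for each depth and hence a synthetic rating curve. A minimal sketch for a rectangular reach; the channel width, slope, and roughness below are hypothetical, not values from the study:

```python
import math

def manning_discharge(h, width, slope, n):
    """Manning equation Q = (1/n) * A * R^(2/3) * sqrt(S) for a
    rectangular cross-section of water depth h (m) and width (m)."""
    area = width * h
    radius = area / (width + 2 * h)  # hydraulic radius A / wetted perimeter
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * math.sqrt(slope)

def froude_number(h, width, slope, n):
    """Fr = v / sqrt(g*h); the rating-curve approach assumes
    subcritical flow (Fr < 1) on the reach."""
    v = manning_discharge(h, width, slope, n) / (width * h)
    return v / math.sqrt(9.81 * h)

# Hypothetical reach: 20 m wide, slope 0.001, Manning n = 0.035
curve = [(h, manning_discharge(h, 20.0, 0.001, 0.035))
         for h in (0.5, 1.0, 1.5, 2.0)]
```

Pairing each depth with its discharge gives the synthetic rating curve; the Froude check mirrors the criterion the study applies to validate the Manning assumptions reach by reach.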
-
Extreme flood events continue to be one of the most threatening natural disasters around the world due to their pronounced social, environmental and economic impacts. Changes in the magnitude and frequency of floods have been documented in recent years, and it is expected that a changing climate will continue to affect their occurrence. Therefore, understanding the impacts of climate change through hydroclimatic simulations has become essential to prepare adaptation strategies for the future. However, confidence in flood projections is still low due to the considerable uncertainties associated with their simulation and the complexity of the local features influencing these events. The main objective of this doctoral thesis is thus to improve our understanding of the modelling uncertainties associated with the generation of flood projections, as well as to evaluate strategies to reduce these uncertainties and increase our confidence in flood simulations. To address this objective, the project aimed at (1) quantifying the uncertainty contributions of the different elements involved in the modelling chain used to produce flood projections and (2) evaluating the effects of different strategies to reduce the uncertainties associated with climate and hydrological models in regions with diverse hydroclimatic conditions. A total of 96 basins located in Quebec (basins dominated by snow-related processes) and Mexico (basins dominated by rain-related processes), covering a wide range of climatic and hydrological regimes, were included in the study. The first stage consisted of decomposing the uncertainty contributions of four main uncertainty sources involved in the generation of flood projections: (1) climate models, (2) post-processing methods, (3) hydrological models, and (4) probability distributions used in flood frequency analyses.
A variance decomposition method made it possible to quantify and rank the influence of each uncertainty source on floods over the two regions studied and by season. The results showed that the uncertainty contributions of each source vary across regions and seasons. Regions and seasons dominated by rain showed climate models as the main uncertainty source, while those dominated by snowmelt showed hydrological models as the main uncertainty contributor. These findings not only show the dangers of relying on single climate and hydrological models, but also underline the importance of regional uncertainty analyses. The second stage of this research project focused on evaluating strategies to reduce the uncertainties arising from hydrological models in flood projections. This stage included two steps: (1) the analysis of the reliability of hydrological model calibration under a changing climate and (2) the evaluation of the effects of weighting hydrological simulations on flood projections. To address the first part, different calibration strategies were tested and evaluated using five conceptual lumped hydrological models under contrasting climate conditions, with dataset lengths varying from 2 to 21 years. The results revealed that the climatic conditions of the calibration data have a larger impact on hydrological model performance than the lengths of the climate time series. Moreover, changes in precipitation generally showed greater impacts than changes in temperature across all the different basins. These results suggest that shorter calibration and validation periods that are more representative of possible changes in climatic conditions could be more appropriate for climate change impact studies. Following these findings, the effects of different weighting strategies based on the robustness of hydrological models (in contrasting climatic conditions) were assessed on flood projections for the different studied basins.
Weighting the five hydrological models based on their robustness showed some improvements over the traditional equal-weighting approach, particularly under warmer and drier conditions. Moreover, the results showed that the difference between these approaches was more pronounced for flood projections, as contrasting flood magnitudes and climate change signals were observed between the two approaches. Additional analyses performed over four selected basins using a semi-distributed and more physically based hydrological model suggested that this type of model might have added value when simulating low flows, and high flows on small basins (of about 500 km²). These results highlight once again the importance of working with ensembles of hydrological models and illustrate the potential impacts of weighting hydrological models in climate change impact studies. The final stage of this study focused on evaluating the impacts of weighting climate simulations on flood projections. The different weighting strategies tested showed that weighting climate simulations can improve the mean hydrograph representation compared to the traditional model "democracy" approach. This improvement was mainly observed with a weighting approach proposed in this thesis that evaluates the skill of the seasonal simulated streamflow against observations. The results also revealed that weighting climate simulations based on their performance can: (1) affect the flood magnitudes, (2) affect the climate change signals, and (3) reduce the uncertainty spread of the resulting flood projections. These effects were particularly clear over rain-dominated basins, where climate modelling uncertainty plays the main role. These findings emphasize the need to reconsider the traditional climate model democracy approach, especially when studying processes with higher levels of climatic uncertainty. Finally, the implications of the obtained results were discussed.
This section puts the main findings into perspective and identifies different ways forward to keep improving the understanding of climate change impacts in hydrology and to increase our confidence in flood projections, which are essential to guide adaptation strategies for the future.
-
Globally, the number of people at risk from flooding has been increasing since 2000, with populations in the Global South being the most vulnerable. Millions of households are displaced by disasters every year. In 2009, the city of Ouagadougou in Burkina Faso experienced its most disastrous flood ever recorded. In response, the government designed a permanent relocation plan in Yagma, a village located outside the city of Ouagadougou. The relocation plan disrupted the livelihoods of the households affected by the flood, leading many of them to return and rebuild their houses in flood-prone areas. This thesis contributes to a body of literature analyzing the legacy of postcolonialism in the flood vulnerability of poorer communities in Ouagadougou. Using a political ecology frame, the thesis attempts to understand how the government of Burkina Faso and flood victims understand land and belongings, and how that understanding shaped the relocation program. Interviews with flood victims and government officials revealed that contrasting views are at work: a perspective based on technical calculations and a neo-colonialist vision of development, on the one hand, and a grounded perspective based on relationships to the land and to each other, on the other.
-
Abstract Resilience has become a cornerstone of risk management and disaster reduction. However, it has evolved extensively, both etymologically and conceptually, over time and across scientific disciplines. The concept has been (re)shaped by the evolution of research and practice efforts. Long considered the opposite of vulnerability, resilience was first defined as the ability to resist, bounce back, cope with, and recover quickly from the impacts of hazards. To avoid a possible return to conditions of vulnerability and exposure to hazards, the notions of post-disaster development, transformation, and adaptation (build back better) and of anticipation, innovation, and proactivity (bounce forward) were then integrated. Today, resilience is characterized by a multitude of components and several classifications. We present a selection of 25 components used to define resilience, and an interesting linkage emerges between these components and the dimensions of risk management (prevention, preparedness, response, and recovery), offering a perspective for strengthening resilience through the development of capacities. Despite its potential, resilience faces challenges regarding its operationalization, effectiveness, measurement, credibility, equity, and even its nature. Nevertheless, it offers applicability and opportunities for local communities as well as an interdisciplinary look at global challenges.
-
Adapting to some level of climate change has become unavoidable. However, there is surprisingly limited systematic knowledge about whether and how adaptation policies have diffused and could diffuse in the future. Most existing adaptation studies do not explicitly examine policy diffusion, which is a form of interdependent policy-making among jurisdictions at the same or across different levels of governance. To address this gap, we offer a new interpretation and assessment of the extensive adaptation policy literature through a policy diffusion perspective; we pay specific attention to diffusion drivers and barriers, motivations, mechanisms, outputs, and outcomes. We assess the extent to which four motivations and related mechanisms of policy diffusion—interests (linked with learning and competition), rights and duties (tied to coercion), ideology, and recognition (both connected with emulation)—are conceptually and empirically associated with adaptation. We also engage with adaptation policy characteristics, contextual conditions (e.g., problem severity) and different channels of adaptation policy diffusion (e.g., transnational networks). We demonstrate that adaptation policy diffusion can be associated with different mechanisms, yet many of them remain remarkably understudied. So are the effects of adaptation policy diffusion in terms of changes in vulnerability and resilience. We thus identify manifold avenues for future research, and provide insights for practitioners who may hope to leverage diffusion mechanisms to enhance their adaptation efforts. This article is categorized under: Policy and Governance > Multilevel and Transnational Climate Change Governance Vulnerability and Adaptation to Climate Change > Institutions for Adaptation
-
Background Given the important role that municipalities must play in adapting to climate change, it is more essential than ever to measure their progress in this area. However, measuring municipalities’ adaptation progress presents its share of difficulties, especially when it comes to comparing (on similar dimensions and over time) the situation of different municipal entities and linking adaptation impacts to local actions. Longitudinal studies with recurring indicators could capture changes occurring over time, but the development of such indicators requires great emphasis on methodological and psychometric aspects, such as measurement validity. Therefore, this study aimed to develop and validate an index of adaptation to heatwaves and flooding at the level of municipal urbanists and urban planners. Methods A sample of 139 officers working in urbanism and urban planning for municipal entities in the province of Quebec (Canada) completed an online questionnaire. Developed based on a literature review and consultation of representatives from the municipal sector, the questionnaire measured whether the respondent’s municipal entity did or did not adopt the behaviors recommended in the scientific and gray literature to adapt to heatwaves and flooding. Results Results of the various psychometric analyses (indicator reliability analysis, first-order confirmatory factor analysis, concurrent validity analysis, and nomological validity assessment) confirmed the validity of the index developed to measure progress in climate change adaptation at the municipal level. The first dimension of the index corresponds to preliminary measures that inform and prepare stakeholders for action (i.e., groundwork adaptation initiatives), whereas the second refers to measures that aim to concretely reduce vulnerability to climate change and to improve the adaptive capacity or the resilience of human and natural systems (i.e., adaptation actions). 
Conclusion The results of a series of psychometric analyses showed that the index has good validity and could properly measure the adoption of actions to prepare for adaptation as well as adaptation actions per se. Municipal and government officials can therefore consider using it to monitor and evaluate adaptation efforts at the municipal level.
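The indicator reliability analysis mentioned in the abstract above is commonly operationalized with Cronbach's alpha, which checks how consistently a set of questionnaire items measures one construct. A minimal sketch on hypothetical yes/no adoption responses follows; the actual questionnaire items and survey data are not reproduced here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 0/1 responses: did the municipal entity adopt each of five
# recommended adaptation behaviors? (6 respondents, illustrative only)
responses = np.array([
    [1, 1, 1, 0, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1],
])
print(round(cronbach_alpha(responses), 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency; a validated index would pair this with the factor-analytic checks the abstract describes.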
-
Abstract Current flood risk mapping, relying on historical observations, fails to account for increasing threat under climate change. Incorporating recent developments in inundation modelling, here we show a 26.4% (24.1–29.1%) increase in US flood risk by 2050 due to climate change alone under RCP4.5. Our national depiction of comprehensive and high-resolution flood risk estimates in the United States indicates current average annual losses of US$32.1 billion (US$30.5–33.8 billion) in 2020’s climate, which are borne disproportionately by poorer communities with a proportionally larger White population. The future increase in risk will disproportionately impact Black communities, while remaining concentrated on the Atlantic and Gulf coasts. Furthermore, projected population change (SSP2) could cause flood risk increases that outweigh the impact of climate change fourfold. These results make clear the need for adaptation to flood and emergent climate risks in the United States, with mitigation required to prevent the acceleration of these risks.
-
Empirical evidence points out that urban form adaptation to climate-induced flooding events—through interventions in land uses and town plans (i.e., street networks, building footprints, and urban blocks)—might exacerbate vulnerabilities and exposures, engendering risk inequalities and climate injustice. We develop a multicriteria model that draws on distributive justice's interconnections with the risk drivers of social vulnerabilities, flood hazard exposures, and the adaptive capacity of urban form (through land uses and town plans). The model assesses “who” is unequally at risk from flooding events and hence should be prioritized in adaptation responses; “where” the high-risk priority areas are located; and “how” urban form adaptive interventions can advance climate justice in the priority areas. We test the model in Toronto, Ontario, Canada, where there are indications of increased rainfall events and disparities in social vulnerabilities. Our methodology started with surveying Toronto-based flooding experts, who assigned weights to the risk drivers based on their importance. Using ArcGIS, we then mapped and overlaid the risk drivers' values across all the neighborhoods in the city according to the experts' assigned weights. Accordingly, we identified four high-risk tower communities with old infrastructure and vulnerable populations as the priority neighborhoods for adaptation interventions within the urban form. These four neighborhoods are typical of inner-city tower blocks built in the 20th century across North America, Europe, and Asia based on modern architectural ideas. Considering the lifespan of these blocks, this study calls for future studies to investigate how these types of neighborhoods can be adapted to climate change to advance climate justice.
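The expert-weighted overlay described above reduces, per spatial unit, to a weighted sum of normalized risk-driver scores, from which priority areas are ranked. A minimal sketch with hypothetical neighborhoods, driver scores, and weights follows; the actual Toronto data and expert-assigned weights are not reproduced.

```python
import numpy as np

# Hypothetical normalized (0-1) risk-driver scores for four neighborhoods.
# Columns: social vulnerability, flood-hazard exposure, urban-form
# adaptive-capacity deficit. All values are illustrative.
drivers = np.array([
    [0.9, 0.8, 0.7],   # neighborhood A (tower community)
    [0.4, 0.9, 0.3],   # neighborhood B
    [0.2, 0.3, 0.5],   # neighborhood C
    [0.8, 0.6, 0.9],   # neighborhood D (tower community)
])
weights = np.array([0.4, 0.35, 0.25])  # assumed expert-assigned importance

# Weighted-overlay risk score: what a GIS weighted-sum overlay computes
# per cell, here per neighborhood.
risk = drivers @ weights
priority = np.argsort(risk)[::-1]  # neighborhoods ranked high risk first

print(risk.round(3), priority)
```

In the study itself this calculation runs cell-by-cell in ArcGIS over the whole city; the sketch only shows why socially vulnerable, highly exposed neighborhoods surface at the top of the priority list.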
-
Abstract Risk management has reduced vulnerability to floods and droughts globally [1,2], yet their impacts are still increasing [3]. An improved understanding of the causes of changing impacts is therefore needed, but has been hampered by a lack of empirical data [4,5]. On the basis of a global dataset of 45 pairs of events that occurred within the same area, we show that risk management generally reduces the impacts of floods and droughts but faces difficulties in reducing the impacts of unprecedented events of a magnitude not previously experienced. If the second event was much more hazardous than the first, its impact was almost always higher. This is because management was not designed to deal with such extreme events: for example, they exceeded the design levels of levees and reservoirs. In two success stories, the impact of the second, more hazardous, event was lower, as a result of improved risk management governance and high investment in integrated management. The observed difficulty of managing unprecedented events is alarming, given that more extreme hydrological events are projected owing to climate change [3].
-
The normative dimensions of flood harm in flood risk management (FRM) have become salient in a milieu of extreme flood events. This article discusses two types of flood harm: risk harm and outcome harm. Whilst risk harm suggests that risk imposition by structural FRM measures is a type of harm that can increase vulnerability and diminish well-being, outcome harm is manifested in deliberate flooding used to protect certain privileged communities at the expense of harming other, less privileged ones. Risk-imposing parties are required to seek consent for imposing new risks. In contrast, outcome harm as deliberate flooding is far more pernicious and should only be exercised in extreme situations with ample provisions for restitution and recovery. The aim of this article is to foreground and examine these under-explored notions of flood harm in the FRM discourse and, in tandem, to expand the normative dimensions of FRM in a milieu where difficult ethical choices abound.