Your search
Results: 77 resources
-
This paper presents a new framework for floodplain inundation modeling in an ungauged basin using unmanned aerial vehicle (UAV) imagery. The method is based on the integrated analysis of high-resolution ortho-images and elevation data produced by structure-from-motion (SfM) technology. To this end, Flood-Level Marks (FLMs) were created from high-resolution UAV ortho-images and compared to the flood inundation areas simulated with the HEC-RAS hydraulic model. Flood quantiles for the 25-, 50-, 100-, and 200-year return periods were then estimated from synthetic hydrographs using the Natural Resources Conservation Service (NRCS) method. The proposed method was applied to UAV image data collected from the Khosban village, in Taleghan County, Iran, in the ungauged sub-basin of the Khosban River. The study area covers one kilometre of the river in the middle of the village. The results showed that the flood inundation areas modeled by HEC-RAS were 33%, 19%, and 8% smaller than those estimated from the UAV FLMs for the 25-, 50-, and 100-year return periods, respectively. For the 200-year return period, the model overestimated the inundated area by more than 6% compared with the UAV FLMs. The maximum flood depth in our four proposed hydraulic-model scenarios varied between 2.33 and 2.83 metres. These analyses showed that this UAV-imagery-based method is well suited to improving hydraulic modeling of seasonal inundation in ungauged rivers, thus providing reliable support to flood mitigation strategies.
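The area comparison above reduces to a signed relative difference against the FLM reference. A minimal sketch, using hypothetical polygon areas chosen only so that the sign convention and the reported percentages come out as in the abstract (the paper itself reports only the relative differences):

```python
def relative_difference(modeled_area, flm_area):
    """Relative difference of the HEC-RAS modeled area vs. the UAV FLM area.

    Negative values mean the hydraulic model under-predicts the observed
    flood extent; positive values mean over-prediction.
    """
    return (modeled_area - flm_area) / flm_area

# Hypothetical areas in square metres, for illustration only.
flm = {25: 12000.0, 50: 13500.0, 100: 15000.0, 200: 16000.0}
modeled = {25: 8040.0, 50: 10935.0, 100: 13800.0, 200: 17000.0}

for T in sorted(flm):
    d = relative_difference(modeled[T], flm[T])
    print(f"T = {T:3d} yr: {d:+.0%}")
```

With these illustrative inputs the loop prints the -33%, -19%, -8% and +6% pattern described in the abstract.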
-
Geohazards associated with the dynamics of the liquid and solid water of the Earth’s hydrosphere, such as floods and glacial processes, may pose significant risks to populations, activities and properties [...]
-
This study is focused on social cognitive variables that motivate information seeking and information sharing related to Hurricane Harvey. Survey results from a nationally representative sample and a quota sample of Houston residents indicate that climate change beliefs and issue salience are consistent predictors of risk perception, which increases individuals’ negative emotions and information insufficiency. This need for information subsequently motivates information seeking and information sharing. Informational subjective norms are significantly related to seeking and sharing in the Houston sample, whereas perceived information gathering capacity and trust in media are significant predictors in the national sample.
-
The paper identifies a role for public relations in disaster management by analysing disaster and communication managers' understanding of community resilience and their use of communication in the context of two different cultural environments. The research comprised 51 in-depth qualitative interviews with disaster managers in Sri Lanka and New Zealand, which were thematically analysed using the software programme NVivo 10. The study identified cultural differences between Sri Lanka and New Zealand that affect how managers communicate in natural disaster situations. The findings indicated that public relations' understanding of communities' cultures, and its communication, networking and lobbying skills, could further enhance the effectiveness of efforts to build community resilience to disasters. Nations are complex multicultural realities; the findings cannot be generalized to make claims about how natural disasters are managed in different national contexts. The paper identifies the unrealized potential of public relations' expertise in communication, community relations, networking and lobbying to contribute to building community resilience to natural disasters. By supporting efforts to build community resilience to disasters, public relations practitioners can contribute to social well-being in times of catastrophic natural disasters. The paper adds an innovative perspective to the public relations crisis literature by identifying the potential contribution of public relations concepts and practices to building community resilience to natural disasters. It demonstrates how sociocultural differences may affect disaster communication strategies.
-
River restoration practice frequently employs conservative designs that create and maintain a prescribed, static morphology. Such approaches ignore an emerging understanding of resilient river systems, which typically adjust their morphology in response to hydrologic, vegetative and sediment-supply changes. As such, using increased dynamism as a restoration design objective will arguably yield more diverse and productive habitats, better-managed expectations, and more self-sustaining outcomes. Here, we answer the following question: does restoring lateral migration in a channelised river that was once a wandering gravel-bed river result in more diverse in-channel geomorphology? We acquired pre- and post-restoration topographic surveys on a segment of the Allt Lorgy, Scotland, to quantify morphodynamics and systematically map geomorphic units using the Geomorphic Unit Tool (GUT) software. GUT implements topographic definitions to discriminate between a taxonomy of fluvial landforms developed from an extension of the River Styles framework, using a 3-tiered hierarchy: (1) differentiation based on stage or elevation relative to the channel; (2) classification of form based on shape (mound, bowl, trough, saddle, plane, wall); and (3) mapping of geomorphic units based on attributes (e.g., position and orientation). Results showed that restoration increased geomorphic unit diversity, with the Shannon Diversity Index increasing from 1.40 pre-restoration (2012) to 2.04 (2014) and 2.05 (2016) after restoration. Channel widening, due to bank erosion, caused the areal coverage of in-channel geomorphic units to increase 23% after restoration and a further 6% in the two years following restoration. Once bank protection was removed, the resulting bank erosion yielded a local supply of sediment that enabled the formation and maintenance of lateral and point bars, riffles and diagonal bar complexes, while instream wood created structurally forced pools and riffles.
The methodology used systematically quantifies how geomorphic unit diversity increases when a river is given back its freedom space. The framework allows for testing restoration design hypotheses in post-project appraisal.
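The diversity metric used in this study is the standard Shannon index computed over the areal proportions of the mapped geomorphic unit types. A minimal sketch, with hypothetical unit areas (not the Allt Lorgy data):

```python
import math

def shannon_diversity(areas):
    """Shannon Diversity Index H' = -sum(p_i * ln(p_i)), where p_i is the
    areal proportion of each geomorphic unit type."""
    total = sum(areas.values())
    return -sum((a / total) * math.log(a / total)
                for a in areas.values() if a > 0)

# Hypothetical areas (m^2) per geomorphic unit type, illustration only:
# a widened, more dynamic channel spreads area over more unit types.
pre = {"run": 800.0, "riffle": 150.0, "pool": 50.0}
post = {"run": 400.0, "riffle": 200.0, "pool": 150.0,
        "lateral bar": 150.0, "point bar": 100.0}

print(f"pre-restoration  H' = {shannon_diversity(pre):.2f}")
print(f"post-restoration H' = {shannon_diversity(post):.2f}")
```

Both more unit types and a more even spread of area among them raise H', which is why the index is a convenient single number for post-project appraisal.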
-
Watershed management efforts in agriculturally dominated landscapes of North America face nearly two centuries of laws and policies that encouraged habitat destruction. Although streams and wetlands in these landscapes are actively being restored using designs that incorporate science and engineering, watershed drainage laws can constrain action or impact passively restored or naturalized habitat. In general, drainage laws require removal of any riparian vegetation or wood deemed to obstruct flow in streams regulated as drains. We use a case study from Indiana (USA) to introduce the shortcomings of drainage laws for allowing large wood, which is an important habitat feature, to remain in stream ecosystems. Removals of large wood from monitored stream reaches in a regulated drain were associated with subsequent declines in fish biomass. Such legal activities represent an important environmental management problem that exists under drainage laws which apply to streams over a widespread geographic region of North America. Recent litigation in Wisconsin (USA) suggests that if state legislatures fail to update these antiquated laws, the courts may act in favour of science-based management of drains. The statutes and regulations that govern agricultural drainage warrant careful consideration if streams within drainage districts are to be managed to improve ecological function.
-
Abstract Within the Copernicus Climate Change Service (C3S), ECMWF is producing the ERA5 reanalysis which, once completed, will embody a detailed record of the global atmosphere, land surface and ocean waves from 1950 onwards. This new reanalysis replaces the ERA‐Interim reanalysis (spanning 1979 onwards) which was started in 2006. ERA5 is based on the Integrated Forecasting System (IFS) Cy41r2 which was operational in 2016. ERA5 thus benefits from a decade of developments in model physics, core dynamics and data assimilation. In addition to a significantly enhanced horizontal resolution of 31 km, compared to 80 km for ERA‐Interim, ERA5 has hourly output throughout, and an uncertainty estimate from an ensemble (3‐hourly at half the horizontal resolution). This paper describes the general set‐up of ERA5, as well as a basic evaluation of characteristics and performance, with a focus on the dataset from 1979 onwards which is currently publicly available. Re‐forecasts from ERA5 analyses show a gain of up to one day in skill with respect to ERA‐Interim. Comparison with radiosonde and PILOT data prior to assimilation shows an improved fit for temperature, wind and humidity in the troposphere, but not the stratosphere. A comparison with independent buoy data shows a much improved fit for ocean wave height. The uncertainty estimate reflects the evolution of the observing systems used in ERA5. The enhanced temporal and spatial resolution allows for a detailed evolution of weather systems. For precipitation, global‐mean correlation with monthly‐mean GPCP data is increased from 67% to 77%. In general, low‐frequency variability is found to be well represented and from 10 hPa downwards general patterns of anomalies in temperature match those from the ERA‐Interim, MERRA‐2 and JRA‐55 reanalyses.
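The 67% to 77% precipitation figure quoted above is a correlation of monthly means against GPCP; the underlying metric is an ordinary Pearson correlation. A minimal sketch on synthetic monthly anomaly series (illustrative numbers, not ERA5 or GPCP data; real use would be on area-weighted global means from the two datasets):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic monthly-mean precipitation anomalies (mm/day), one year.
gpcp = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, 0.4, -0.3, 0.1, 0.0, -0.2, 0.3]
era = [0.12, -0.15, 0.25, 0.02, -0.08, 0.18, 0.35, -0.25, 0.05, 0.02, -0.18, 0.28]

print(f"r = {pearson_r(gpcp, era):.2f}")
```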
-
This UNDRR report, published to mark the International Day for Disaster Risk Reduction on October 13, 2020, confirms how extreme weather events have come to dominate the disaster landscape in the 21st century.
-
According to our survey about climate risk perceptions, institutional investors believe climate risks have financial implications for their portfolio firms and that these risks, particularly regulatory risks, already have begun to materialize. Many of the investors, especially the long-term, larger, and ESG-oriented ones, consider risk management and engagement, rather than divestment, to be the better approach for addressing climate risks. Although surveyed investors believe that some equity valuations do not fully reflect climate risks, their perceived overvaluations are not large.
-
Abstract As losses from extreme weather events grow, many governments are looking to privatize the financing and incentivization of climate adaptation through insurance markets. In a pure market approach to insurance for extreme weather events, individuals become responsible for ensuring they are adequately covered for risks to their own properties, and governments no longer contribute funds to post-disaster recovery. Theoretically, insurance premiums signal the level of risk faced by each household, and incentivize homeowners to invest in adaptive action, such as retrofitting, or drainage work, to reduce premiums. Where risk is considered too high by insurance markets, housing is devalued, in theory leading to retreat from risky areas. In this review article, we evaluate the suitability of private insurance as a mechanism for climate adaptation at a household and community level. We find a mismatch between social understandings of responsibility for climate risks, and the technocratic, market-based home insurance products offered by private insurance markets. We suggest that by constructing increasingly individualized, technical, and calculative evaluations of risk, market-based models of insurance for extreme weather events erode the solidaristic and collective discourses and practices that support adaptive behavior.
-
RÉSUMÉ: Coastal flooding events are increasing along the shores of the St. Lawrence River because of storms, sea-level rise and declining sea ice. To date, Québec has no coastal flood zoning. In this thesis, a coastal flood mapping approach is developed by integrating the waves, water levels and beach morphology of the Estuary and Gulf of St. Lawrence (EGSL). Two types of mapping approach were compared: an empirical simulation that projects a static total water level onto the land (observed water level + the effect of waves at the coast, the wave runup), and the XBeach numerical model in surfbeat mode. Both approaches require a precise, up-to-date topo-bathymetric surface of the beach. Through the development of a video-based beach monitoring network, we first evaluate the effectiveness of a video intertidal topography method against terrestrial LiDAR surveys, and improve its performance by integrating nearshore water levels into the waterline elevation module. This work produced topographic surfaces of centimetre-scale accuracy comparable to LiDAR, from which morphological parameters required by empirical water-level models, such as the beach slope, were extracted. The ability of the two mapping approaches to simulate the flood of December 6, 2010 in Bas-Saint-Laurent was then analyzed by comparing the flooded areas, and the spatial correspondence between simulations and flood observations was evaluated. Despite the complexity of the XBeach model and a slight over-prediction by the empirical model (36%), the flooded areas obtained by the two approaches are similar and correctly predicted at 66-78%.
In a third study, XBeach was also used in the Chaleur Bay to assess the impact of an extreme event at the 2100 horizon on the coastal flood hazard. The simulations show that the coastal overflows were generated by waves of relatively low amplitude at the coast (Hs < 1 m) and that, despite water depths of about 1.2 m, high current velocities occurred in urbanized areas (U > 2 m/s). The analysis of the flood mapping at Maria suggests that by 2100, the impact of sea-level rise on the riverside communities of the St. Lawrence could produce more extensive flooding with greater water depths and higher current velocities, which could intensify the hazard faced by the population. Although numerical simulations make it possible to understand how the physical phenomena generate flooding, the appeal of the static method lies in its speed of application; its effectiveness, however, depends on the validity and applicability of the empirical runup models used. The final part of the thesis therefore deals with the parameterization of an empirical runup model adapted to the EGSL. Video observation of runup (and of its mean and high-frequency components, the setup and the swash) on 5 beaches covers a broad spectrum of environmental parameters and coast types over a 3-year period. Correlation analyses between water levels at the coast, offshore wave characteristics and beach slope were performed. The results show that the influence of the hydrodynamic parameters on runup, setup and swash is parameterized similarly. The role of beach morphology in the setup is, moreover, parameterized by an inverse function of the slope, while the swash is a function of the square root of the slope.
With a mean error of 23 cm and a bias of 2 cm, the proposed runup equation offers strong potential for estimating total water levels in diverse fetch-limited coastal environments. The results of the thesis show that it is appropriate to use a static approach to identify the areas most vulnerable to coastal flooding, provided that the equation used has been validated for the type of environment in question. By combining this approach with numerical modelling in areas with a high concentration of assets at stake, it will be possible to establish a first coastal flood zoning in Québec. -- Keywords (French): Cartographie de la submersion, Runup, Topographie par vidéo, Vagues infragravitaires, XBeach. -- ABSTRACT: Coastal flood events are increasing on the shores of the St. Lawrence River due to storms, rising sea levels and decreasing sea ice. To date, the province of Québec does not have a coastal flood mapping guideline. In this thesis, a coastal flood mapping approach is developed by integrating waves, water levels and beach morphology of the Estuary and Gulf of St. Lawrence (EGSL). Two types of cartographic approaches were compared: the empirical simulation that projects a static total level overland (observed water level + wave effect on the coast, known as wave runup), and the numerical model XBeach in surfbeat mode. These two approaches require a precise and updated topo-bathymetric surface of the beach. Through the development of a shore-based video monitoring network, we first evaluate the effectiveness of a video intertidal topography method against terrestrial LiDAR surveys, and improve its performance by integrating water levels near the beach as a proxy for beach contour elevation. This project enabled the creation of centimeter-scale topographic surfaces comparable to LiDAR and the extraction of morphological parameters, such as the beach slope, necessary for empirical runup models.
The ability of both mapping approaches to simulate the flood of December 6, 2010 in Bas-Saint-Laurent was analyzed by comparing flooded areas. Spatial correspondence between simulations and the observed flood extent was evaluated. Despite the complexity of XBeach and a slight over-prediction by the empirical model (36%), the flooded areas obtained by the two approaches are similar and correctly predicted at 66-78%. In a third study, XBeach was also used in the Chaleur Bay to assess the impact of an extreme event for the 2100 horizon on coastal flood hazards. The simulations show that the overland flow was generated by waves of relatively low amplitude at the coast (Hs < 1 m) and that despite water depths close to 1.2 m, high current velocities occurred in the urbanized areas (U > 2 m/s). The analysis of the flood maps in Maria suggests that by 2100, the impact of sea level rise on coastal communities in the St. Lawrence could lead to larger flooded areas, with greater water depths and higher flow velocities, intensifying the risk to the population. Although numerical simulations offer an understanding of the physical phenomena that cause coastal flooding, the interest of the static method lies in its convenience; its effectiveness, however, depends on the validity of the empirical runup models employed. Thus, the last part of the thesis deals with the parameterization of an empirical runup model in the EGSL. Video-based observations of wave runup (and of its mean and high-frequency components, setup and swash, respectively) on 5 beaches were carried out across a broad spectrum of environmental parameters and coast types over a period of 3 years. Correlation analyses between coastal water levels (runup, setup, and swash), offshore wave characteristics and beach slope were performed. The results show that the influence of the hydrodynamic parameters on wave runup, setup, and swash is similarly parameterized.
The role of beach morphology in the setup is, however, parameterized by an inverse function of the slope, while the swash is a function of the square root of the slope. With an average error of 23 cm and a 2 cm bias, the proposed runup equation offers high potential for estimating total water levels over diverse fetch-limited coastal environments. This thesis shows that it is appropriate to use a static approach to identify the areas most vulnerable to coastal flooding, as long as the equation used is validated for the specific coastal environment. By combining this approach with numerical modeling in coastal hotspots with multiple assets at stake, it will be possible to introduce a first coastal flood zoning in the province of Québec. -- Keywords (English): Coastal flooding, Runup, Video-derived topography, Infragravity waves, XBeach.
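The functional forms described in this abstract (setup varying inversely with beach slope, swash with its square root) can be sketched as a generic total-water-level estimator. The coefficients c1 and c2 below are purely hypothetical placeholders, not the values fitted in the thesis; only the slope dependence follows the abstract:

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def total_water_level(hs, tp, slope, c1=0.005, c2=0.3):
    """Illustrative runup-style estimator. hs is the offshore significant
    wave height (m), tp the peak period (s), slope the beach slope
    (tan beta). c1 and c2 are hypothetical coefficients."""
    l0 = G * tp ** 2 / (2.0 * math.pi)       # deep-water wavelength (m)
    setup = c1 * math.sqrt(hs * l0) / slope  # inverse-slope dependence
    swash = c2 * math.sqrt(slope * hs * l0)  # sqrt-of-slope dependence
    return setup + swash / 2.0               # mean level + half swash excursion

print(f"TWL = {total_water_level(1.0, 8.0, 0.05):.2f} m")
```

Note the opposing slope dependencies: on gentler beaches the setup term grows while the swash term shrinks, which is why a single fitted equation must be validated on the environment type before being used for static flood mapping.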
-
This thesis aims to improve our understanding of the hedonic model and its application to real-estate data in order to study the impact of a flood-related environmental event/externality on the value of residential properties. Since real-estate data are distributed in space and time, temporal and spatial "corrections" are required in the econometric modelling process. The research builds on the hedonic price equation. The empirical analysis also uses a spatio-temporal difference-in-differences (STDID) estimator to study the effect of a flood that occurred in 1998 on house prices in the city of Laval, Canada, between 1995 and 2007. The results suggest that using flood-zone information to assess the impact of flooding on residential values is not necessarily an appropriate approach. The findings suggest that the great heterogeneity of results reported in the literature is probably not unrelated to how flood-affected residences are defined. This means that empirical research on the effects of floods on property values actually measures the value associated with the perception of flood risk, not the real effect of the flood. The results suggest that future applications in the literature should pay particular attention to how flood zones are defined and how the residences actually affected are identified.
-
Scour is a hydraulic risk threatening the stability of bridges in fluvial and coastal areas; developing permanent, real-time monitoring techniques is therefore crucial. Recent advances in strain measurement using fiber optic sensors open new opportunities for scour monitoring. In this study, innovative optical frequency domain reflectometry (OFDR) was used to evaluate the effect of scour by performing distributed strain measurements along a rod under static lateral loads. An analytical analysis based on the Winkler soil model was carefully established and used to evaluate the accuracy of the fiber optic sensors and to help interpret the measurement results. Dynamic tests were also performed, and the results of the static and dynamic tests were compared using an equivalent cantilever model.
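For context on the Winkler analysis mentioned above: a laterally loaded rod on a Winkler foundation has a characteristic length 1/beta = (4EI/k)^(1/4), which grows as scour weakens the effective soil support, spreading bending strain deeper along the rod. A minimal sketch with hypothetical rod and soil properties (not the paper's experimental values):

```python
import math

def winkler_characteristic_length(e_mod, inertia, k_soil):
    """Characteristic length 1/beta = (4*E*I / k)**0.25 of a beam on a
    Winkler elastic foundation; k_soil is the subgrade reaction modulus
    (N/m^2)."""
    return (4.0 * e_mod * inertia / k_soil) ** 0.25

# Hypothetical steel rod: E = 200 GPa, circular section of diameter 20 mm.
E = 200e9
d = 0.02
I = math.pi * d ** 4 / 64.0

# Scour removes soil support; a weaker effective foundation (lower k)
# lengthens the characteristic length.
for k in (5e6, 2e6):
    print(f"k = {k:.0e} N/m^2 -> 1/beta = "
          f"{winkler_characteristic_length(E, I, k):.2f} m")
```

This shift in the strain distribution along the embedded length is the kind of signature a distributed fiber optic measurement can detect.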
-
Abstract Several studies have reported on the factor structure of posttraumatic stress disorder (PTSD) using confirmatory factor analysis (CFA). The results show models with different numbers of factors, high correlations between factors, and symptoms that belong to different factors in different models without affecting the fit indices. These patterns suggest considerable item cross-loading, overlap between factors, or even the presence of a general factor that explains the items' common variance. The aim is to provide new evidence on the factor structure of PTSD using CFA and exploratory structural equation modeling (ESEM). In a sample of 1,372 undergraduate students, we tested six different models using CFA and two models using ESEM and bifactor ESEM. Trauma exposure and past-month PTSD symptoms were assessed with the Life Events Checklist for DSM-5 (LEC-5) and the PTSD Checklist for DSM-5 (PCL-5). All six tested CFA models showed good fit indices (RMSEA = .051–.056, CFI = .969–.977, TLI = .965–.970), with high correlations between factors (M = .77, SD = .09 to M = .80, SD = .09). The ESEM models showed good fit indices (RMSEA = .027–.036, CFI = .991–.996, TLI = .985–.992). These models confirmed the presence of cross-loadings on several items, as well as loadings on a general factor that explained 76.3% of the common variance. The results showed that most items do not meet the assumption of dimensional exclusivity, showing the need to expand analysis strategies for studying the symptomatic organization of PTSD.
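The RMSEA values cited follow the usual definition based on the model chi-square, degrees of freedom, and sample size. A minimal sketch (the chi-square and df below are illustrative numbers, not the study's actual test statistics):

```python
import math

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative values: a model chi-square of 860 on 164 degrees of
# freedom with n = 1372 gives an RMSEA near the top of the range the
# abstract reports for the CFA models.
print(f"RMSEA = {rmsea(860.0, 164.0, 1372):.3f}")
```

The max(chi2 - df, 0) term is why a model whose chi-square falls below its degrees of freedom reports RMSEA = 0.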
-
Purpose: Few people living in informal settlements in the Global South spontaneously claim that they are "resilient" or "adapting" to disaster risk or climate change. Certainly, they often overcome multiple challenges, including natural hazards exacerbated by climate change. Yet their actions are increasingly examined through the framework of resilience, a notion developed in the North and increasingly adopted in the South. To what extent do these initiatives correspond to the concepts that scholars and authorities place under the resilience framework? Design/methodology/approach: Three longitudinal case studies in Yumbo, Salgar and San Andrés (Colombia) serve to investigate narratives of disaster risks and responses to them. Methods include narrative analysis of policy and project documents, presentations, five workshops, six focus groups and 24 interviews. Findings: The discourse adopted by most international scholars and local authorities differs greatly from that used by citizens to explain risk, and masks the politics involved in disaster reduction and the search for social justice. Moreover, narratives of social change, aspirations and social status are increasingly masked in disaster risk explanations. Tensions are also concealed, including those regarding the winners and losers of interventions and the responsibilities for disaster risk reduction. Originality/value: Our findings confirm previous results showing that the resilience framework contributes to "depoliticizing" the analysis of risk and serves to mask and dilute the responsibility of political and economic elites in disaster risk creation. They also show that resilience fails to explain the type of socioeconomic change required to reduce vulnerabilities in Latin America.