Your search
Results: 55 resources
-
Abstract. Climate change impact studies require a reference climatological dataset providing a baseline period to assess future changes and post-process climate model biases. High-resolution gridded precipitation and temperature datasets interpolated from weather stations are available in regions of high-density networks of weather stations, as is the case in most parts of Europe and the United States. In many of the world's regions, however, the low density of observational networks renders gauge-based datasets highly uncertain. Satellite, reanalysis and merged product datasets have been used to overcome this deficiency. However, it is not known how much uncertainty the choice of a reference dataset may bring to impact studies. To tackle this issue, this study compares nine precipitation and two temperature datasets over 1145 African catchments to evaluate the dataset uncertainty contribution to the results of climate change studies. These deterministic datasets all cover a common 30-year period needed to define the reference period climate. The precipitation datasets include two gauge-only products (GPCC and CPC Unified), two satellite products (CHIRPS and PERSIANN-CDR) corrected using ground-based observations, four reanalysis products (JRA55, NCEP-CFSR, ERA-I and ERA5) and one merged gauge, satellite, and reanalysis product (MSWEP). The temperature datasets include one gauge-only (CPC Unified) product and one reanalysis (ERA5) product. All combinations of these precipitation and temperature datasets were used to assess changes in future streamflows. To assess dataset uncertainty against that of other sources of uncertainty, the climate change impact study used a top-down hydroclimatic modeling chain using 10 CMIP5 (fifth Coupled Model Intercomparison Project) general circulation models (GCMs) under RCP8.5 and two lumped hydrological models (HMETS and GR4J) to generate future streamflows over the 2071–2100 period.
Variance decomposition was performed to compare how much the different uncertainty sources contribute to the total uncertainty. Results show that all precipitation and temperature datasets provide good streamflow simulations over the reference period, but four precipitation datasets outperformed the others for most catchments: in order, MSWEP, CHIRPS, PERSIANN-CDR and ERA5. For the present study, the two-member ensemble of temperature datasets provided negligible levels of uncertainty. However, the ensemble of nine precipitation datasets provided uncertainty that was equal to or larger than that related to GCMs for most of the streamflow metrics and over most of the catchments. A selection of the four best-performing reference datasets (credibility ensemble) significantly reduced the uncertainty attributed to precipitation for most metrics, but precipitation still remained the main source of uncertainty for some streamflow metrics. The choice of a reference dataset can therefore be critical to climate change impact studies, as apparently small differences between datasets over a common reference period can propagate to generate large amounts of uncertainty in future streamflows.
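The variance decomposition step described above can be sketched with a toy additive ensemble. All names, factor effects, and numbers below are hypothetical illustrations, not values from the study; the real analysis decomposes simulated streamflow metrics across the full dataset × GCM × hydrological-model chain.

```python
from itertools import product
from statistics import pvariance, mean

# Hypothetical additive effects (in % change of a streamflow metric)
# for each member of each uncertainty source -- toy numbers only.
datasets = {"MSWEP": 0.0, "CHIRPS": 2.0, "ERA5": 4.0}
gcms = {"GCM-A": -5.0, "GCM-B": 5.0}
hmodels = {"HMETS": 1.0, "GR4J": -1.0}

# Full factorial of the modeling chain: change = sum of the three effects.
runs = {(d, g, h): datasets[d] + gcms[g] + hmodels[h]
        for d, g, h in product(datasets, gcms, hmodels)}

def main_effect_variance(factor_index, levels):
    """Variance of the factor's marginal means (ANOVA-style main effect)."""
    marginal = [mean(v for k, v in runs.items() if k[factor_index] == lev)
                for lev in levels]
    return pvariance(marginal)

total = pvariance(runs.values())
for name, idx, levels in [("datasets", 0, datasets),
                          ("GCMs", 1, gcms),
                          ("hydro models", 2, hmodels)]:
    share = main_effect_variance(idx, levels) / total
    print(f"{name}: {share:.0%} of total variance")
```

With these toy effects the GCM ensemble dominates; the study's point is that with a nine-member precipitation ensemble, the dataset share can rival or exceed the GCM share.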
-
Abstract. In freshwater ecosystems, several studies have shown a strong linear relationship between total mercury (THg) or methylmercury (MeHg) and dissolved organic carbon (DOC) concentrations. Variations in this linear relationship have been reported, but the magnitude and causes of this variation are not well known. The objective of this study was to conduct a meta-analysis to quantify and understand the global variation of this mercury (Hg)–DOC association. This meta-analysis included 54 studies in lentic and lotic ecosystems for a total of 85 THg–DOC and 59 MeHg–DOC relationships. There was an increase in Hg with DOC concentrations in water, with a global average slope of 0.25 (confidence interval (CI): 0.20–0.35) ng/mg for THg and 0.029 (CI: 0.014–0.044) ng/mg for MeHg. Relationships were stronger for (1) North American studies, (2) natural environments compared to those disturbed by anthropogenic activities, (3) spatial studies compared to temporal studies, (4) filtered samples (THg only), and (5) the aromatic fraction of DOC compared to the bulk DOC. Coupling with DOC was stronger for THg than for MeHg. Ecosystem type (lentic vs. lotic), geographical coordinates, and publication year did not influence the strength of relationships. Overall, we show that there is a strong but variable coupling between carbon and mercury cycles in freshwater ecosystems globally and that this link is modulated regionally by geographic location, temporal scale, and human activity, with implications for understanding these rapidly changing biogeochemical processes in response to global change.
Plain Language Summary. In lakes and rivers, organic carbon is known to be a transporter of mercury, a toxic metal. However, depending on the chemistry of waterbodies, carbon can carry different amounts of mercury. This work compiled results of 54 scientific studies around the world looking at the correlation between mercury and organic carbon. We looked at the conditions that make this relationship vary. We found that relationships were almost always positive and that the type of carbon influenced the amount of mercury that was carried. The strength of those relationships was higher in natural ecosystems compared to those with human influence and in North American ecosystems compared to European and Asian ones. This work is important to understand the mechanism behind the association between mercury and carbon in different environments and how carbon can be used to explain variations in mercury, especially in a changing climate under human pressure.
Key Points. Mercury and dissolved organic matter coupling is stronger in spatial studies, in North America, in natural systems, and in filtered samples. Correlations are stronger with the aromatic fraction than the bulk dissolved organic carbon, and stronger for total than methyl mercury. Ecosystem type (lentic vs. lotic), geographical coordinates, and publication year had no effect on the strength of relationships.
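The pooled Hg–DOC slopes reported above can be illustrated with a minimal fixed-effect inverse-variance pooling sketch. The per-study slopes and standard errors below are toy values, and the paper's actual meta-analytic model may differ (e.g., random-effects); this only shows the pooling arithmetic.

```python
from math import sqrt

# Hypothetical per-study THg-DOC slopes (ng/mg) and their standard errors.
studies = [(0.21, 0.05), (0.30, 0.08), (0.24, 0.04), (0.27, 0.06)]

# Fixed-effect inverse-variance pooling: weight each slope by 1/SE^2.
weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * b for (b, _), w in zip(studies, weights)) / sum(weights)
pooled_se = sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled slope = {pooled:.3f} ng/mg, "
      f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

The pooled standard error is always smaller than any single study's, which is why a 54-study synthesis can tighten the global slope estimate.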
-
Soil moisture is often considered a direct way of quantifying agricultural drought, since it is a measure of the availability of water to support crop growth. Measurements of soil moisture at regional scales have traditionally been sparse, but advances in land surface modelling and the development of satellite technology to indirectly measure surface soil moisture have led to the emergence of a number of national and global soil moisture data sets that can provide insight into the dynamics of agricultural drought. Droughts are often defined relative to normal conditions for a given time and place; as a result, data sets used to quantify drought need a representative baseline of conditions in order to accurately establish a normal. This presents a challenge when working with Earth observation data sets, which often have very short baselines for a single instrument. This study assessed three soil moisture data sets: a surface satellite soil moisture data set from the Soil Moisture and Ocean Salinity (SMOS) mission, operating since 2010; a blended surface satellite soil moisture data set from the European Space Agency Climate Change Initiative (ESA-CCI), which has a long history; and a surface and root zone soil moisture data set from the Canadian Meteorological Centre's (CMC) Regional Deterministic Prediction System (RDPS). An iterative chi-squared statistical routine was used to evaluate each data set's sensitivity to canola yields in Saskatchewan, Canada. The surface soil moisture from all three data sets showed a similar temporal trend related to crop yields, with a negative impact on canola yields when soil moisture exceeded a threshold in May and June. The strength and timing of this relationship varied with the accuracy and statistical properties of the data set, with the SMOS data set showing the strongest relationship (peak χ² = 170 for Day of Year 145), followed by the ESA-CCI (peak χ² = 89 on Day of Year 129) and then the RDPS (peak χ² = 65 on Day of Year 129).
Using short baseline soil moisture data sets can produce consistent results compared to using a longer data set, but the characteristics of the years used for the baseline are important. Soil moisture baselines of 18–20 years or more are needed to reliably estimate the relationship between high soil moisture and high yielding years. For the relationship between low soil moisture and low yielding years, a shorter baseline can be used, with reliable results obtained when 10–15 years of data are available, but with reasonably consistent results obtained with as few as 7 years of data. This suggests that the negative impacts of drought on agriculture may be reliably estimated with a relatively short baseline of data.
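The iterative chi-squared routine described above can be sketched as a threshold sweep over a 2x2 contingency table relating soil-moisture exceedance to low-yield years. The table counts, data values, and function names below are hypothetical; the study's routine also iterates over days of year.

```python
def chi_squared_2x2(a, b, c, d):
    """Pearson chi-squared for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return n * (a * d - b * c) ** 2 / den if den else 0.0

# Toy table: rows = soil moisture above/below a candidate May threshold,
# columns = low / normal canola yield in that year.
x2 = chi_squared_2x2(12, 3, 5, 20)

def best_threshold(moisture, low_yield, thresholds):
    """Sweep candidate thresholds and keep the one maximizing chi-squared,
    mimicking the iterative search for the most yield-sensitive cutoff."""
    def table(thr):
        a = sum(1 for m, y in zip(moisture, low_yield) if m > thr and y)
        b = sum(1 for m, y in zip(moisture, low_yield) if m > thr and not y)
        c = sum(1 for m, y in zip(moisture, low_yield) if m <= thr and y)
        d = sum(1 for m, y in zip(moisture, low_yield) if m <= thr and not y)
        return a, b, c, d
    return max(thresholds, key=lambda t: chi_squared_2x2(*table(t)))
```

A peak χ² on a given day of year, as reported for SMOS (χ² = 170, Day of Year 145), marks the date and threshold at which soil moisture is most strongly associated with yield outcomes.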
-
Empirical evidence indicates that urban form adaptation to climate-induced flooding events—through interventions in land uses and town plans (i.e., street networks, building footprints, and urban blocks)—might exacerbate vulnerabilities and exposures, engendering risk inequalities and climate injustice. We develop a multicriteria model that draws on distributive justice's interconnections with the risk drivers of social vulnerabilities, flood hazard exposures, and the adaptive capacity of urban form (through land uses and town plans). The model assesses "who" is unequally at risk from flooding events and hence should be prioritized in adaptation responses; "where" the high-risk priority areas are located; and "how" urban form adaptive interventions can advance climate justice in the priority areas. We test the model in Toronto, Ontario, Canada, where there are indications of increased rainfall events and disparities in social vulnerabilities. Our methodology started with surveying Toronto-based flooding experts, who assigned weights to the risk drivers based on their importance. Using ArcGIS, we then mapped and overlaid the risk drivers' values in all the neighborhoods across the city based on the experts' assigned weights. Accordingly, we identified four high-risk tower communities with old infrastructure and vulnerable populations as the priority neighborhoods for adaptation interventions within the urban form. These four neighborhoods are typical of inner-city tower blocks built in the 20th century across North America, Europe, and Asia based on modern architectural ideas. Considering the lifespan of these blocks, this study calls for future studies to investigate how these types of neighborhoods can be adapted to climate change to advance climate justice.
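The expert-weighted overlay at the core of the methodology can be sketched as a weighted linear combination of normalized risk drivers per neighborhood. The driver names, weights, and scores below are hypothetical placeholders for the expert-elicited values used in the study.

```python
# Hypothetical expert weights for the risk drivers (sum to 1).
weights = {"social_vulnerability": 0.40,
           "flood_exposure": 0.35,
           "adaptive_capacity_gap": 0.25}

# Normalized (0-1) driver values per neighborhood -- toy data.
neighborhoods = {
    "Tower Community A": {"social_vulnerability": 0.9,
                          "flood_exposure": 0.8,
                          "adaptive_capacity_gap": 0.7},
    "Low-rise B": {"social_vulnerability": 0.3,
                   "flood_exposure": 0.6,
                   "adaptive_capacity_gap": 0.2},
}

def risk_score(drivers):
    """Weighted linear combination of normalized risk drivers."""
    return sum(weights[k] * drivers[k] for k in weights)

# Rank neighborhoods, highest combined risk first, to flag priority areas.
ranked = sorted(neighborhoods,
                key=lambda n: risk_score(neighborhoods[n]), reverse=True)
print(ranked)
```

In the study, the same logic is applied spatially in ArcGIS across all city neighborhoods, which is how the four priority tower communities emerge.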
-
Extratropical cyclone (EC) characteristics depend on a combination of large-scale factors and regional processes. However, the latter are considered to be poorly represented in global climate models (GCMs), partly because their resolution is too coarse. This paper describes a framework that uses the possibilities offered by regional climate models (RCMs) to gain insight into storm activity during winter over North America (NA). The recent past climate period (1981–2005) is considered to assess EC activity over NA using the NCEP regional reanalysis (NARR) as a reference, along with the European reanalysis ERA-Interim (ERAI) and two CMIP5 GCMs used to drive the Canadian Regional Climate Model—version 5 (CRCM5) and the corresponding regional-scale simulations. While ERAI and GCM simulations show basic agreement with NARR in terms of climatological storm track patterns, detailed bias analyses show that, on the one hand, ERAI presents statistically significant positive biases in terms of EC genesis, and therefore occurrence, while capturing their intensity fairly well. On the other hand, GCMs present large negative intensity biases over the overall NA domain, particularly over the NA east coast. In addition, storm occurrence over the northwestern topographic regions is highly overestimated. When the CRCM5 is driven by ERAI, no significant skill deterioration arises and, more importantly, all storm characteristics near areas with marked relief and over regions with large water masses are significantly improved with respect to ERAI. Conversely, in GCM-driven simulations, the added value contributed by CRCM5 is less prominent and systematic, except over western NA areas with high topography and over the western Atlantic coastlines where the most frequent and intense ECs are located.
Despite this significant added value in seasonal-mean characteristics, a caveat is raised regarding the RCM's ability to handle storm temporal 'seriality', a measure of storm temporal variability at a given location. In fact, the driving models leave significant footprints on the RCM's skill in reproducing the intra-seasonal pattern of storm activity.
-
Large-scale flood risk assessment is essential in supporting national and global policies, emergency operations and land-use management. The present study proposes a cost-efficient method for the large-scale mapping of direct economic flood damage in data-scarce environments. The proposed framework consists of three main stages: (i) deriving a water depth map through a geomorphic method based on a supervised linear binary classification; (ii) generating an exposure land-use map developed from multi-spectral Landsat 8 satellite images using a machine-learning classification algorithm; and (iii) performing a flood damage assessment using a GIS tool, based on the vulnerability (depth–damage) curves method. The proposed integrated method was applied over the entire country of Romania (including minor order basins) for a 100-year return period at 30-m resolution. The results showed how the description of flood risk may especially benefit from the ability of the proposed cost-efficient model to carry out large-scale analyses in data-scarce environments. This approach may help in performing and updating risk assessments and management, taking into account temporal and spatial changes in hazard, exposure, and vulnerability.
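Stage (iii) above, the depth–damage (vulnerability) curve assessment, can be sketched as piecewise-linear interpolation of a damage fraction applied to the exposed value of each raster cell. The curve points, land-use unit values, and function names below are hypothetical toy inputs, not the curves used in the study.

```python
# Hypothetical depth-damage curve: (water depth in m, damage fraction).
DEPTH_DAMAGE = [(0.0, 0.0), (0.5, 0.25), (1.0, 0.4), (2.0, 0.6), (6.0, 1.0)]

def damage_fraction(depth_m):
    """Piecewise-linear interpolation of the vulnerability curve."""
    if depth_m <= DEPTH_DAMAGE[0][0]:
        return 0.0
    for (d0, f0), (d1, f1) in zip(DEPTH_DAMAGE, DEPTH_DAMAGE[1:]):
        if depth_m <= d1:
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)
    return 1.0  # depths beyond the last curve point: total damage

# Exposed value per land-use class -- toy EUR/m2 figures.
UNIT_VALUE = {"residential": 600.0, "industrial": 900.0, "agriculture": 10.0}

def cell_damage(depth_m, land_use, cell_area_m2=900.0):
    """Direct damage for one 30 m x 30 m raster cell:
    damage fraction x unit value of the land-use class x cell area."""
    return damage_fraction(depth_m) * UNIT_VALUE[land_use] * cell_area_m2
```

Summing `cell_damage` over all flooded cells of the water-depth and land-use maps yields the country-scale direct-damage estimate the framework targets.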
-
Abstract. Floods are the most common and threatening natural risk for many countries in the world. Flood risk mapping is therefore of great importance for managing socio-economic and environmental impacts. Several researchers have proposed low-complexity and cost-effective flood mapping solutions that are useful in data-scarce environments or at large scale. Among these approaches, a line of recent research focuses on hydrogeomorphic methods that, using digital elevation models (DEMs), exploit the causal link between past flood events and the hydraulic geometry of floodplains. This study aims to compare the use of freely available DEMs to support an advanced hydrogeomorphic method, the Geomorphic Flood Index (GFI), to map flood-prone areas of the Basento River basin (Italy). The five selected DEMs are obtained from different sources and are characterized by different resolutions, spatial coverage, acquisition processes, processing and validation. They include: (i) HydroSHEDS v.1.1 (resolution 3 arc-seconds), hydrologically conditioned, derived primarily from SRTM (NASA) and characterized by global coverage; (ii) ASTER GDEM v.3, with a resolution of around 30 m (source: METI and NASA) and global coverage; (iii) EU-DEM v.1.1 (resolution 1 arc-second), Pan-European, combining SRTM and ASTER GDEM, customized for consistency with EU-Hydro and screened to remove artefacts (source: Copernicus Land Monitoring Service); (iv) TinItaly DEM v.1.1 (resolution 10 m cell-size grid), produced and distributed by INGV with coverage of the entire Italian territory; and (v) a high-resolution Laser Scanner DEM (5 m cell-size grid), produced on the basis of ground and model keypoints and available as part of the RSDI geoportal of the Basilicata Region, with coverage at the regional administrative level.
The effects of the DEMs on the performance of the GFI calibration on the main reach of the Basento River, and its validation on one of its mountain tributaries (Gallitello Creek), were evaluated with widely accepted statistical metrics, i.e., the Area Under the Receiver Operating Characteristic (ROC) curve (AUC), Accuracy, Sensitivity and Specificity. Results confirmed the merits of the GFI in flood mapping using simple watershed characteristics, showing high performance (AUC above 0.9 in all simulations) and low dependency on changes in the adopted DEMs and standard flood maps (derived from 1D and 2D hydraulic models for three return periods). The EU-DEM was identified as the most suitable data source for supporting GFI mapping, with an AUC > 0.97 in the calibration phase for the main river reach. This may be due in part to its resolution, appropriate for hydrological applications, but also to its customized pre-processing, which supported an optimal description of the river network morphology. Indeed, EU-DEM obtained the highest performance (e.g., Accuracy around 98%) even in the validation phase, where better results were expected from the high-resolution DEM (due to the very small size of the Gallitello Creek cross-sections). For the other DEMs, the GFI generally showed an increase in metric performance when, in the calibration phase, it neglected the floodplains of the river delta, where the standard flood map is produced using a 2D hydraulic model. However, if the DEMs were hydrologically conditioned with a relatively simple algorithm that forced the stream flow into the main river network, the GFI could be applied to the whole Basento watershed, including the delta, with similar performance.
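The AUC evaluation used above treats the GFI as a classifier score against a standard flood map. A minimal sketch via the rank (Mann–Whitney) formulation follows; the GFI values and flooded/non-flooded labels are toy data, not cells from the Basento analysis.

```python
def auc(scores, labels):
    """AUC as the probability that a randomly chosen flooded cell
    scores higher than a non-flooded one (ties count as half)."""
    pos = [s for s, l in zip(scores, labels) if l]
    neg = [s for s, l in zip(scores, labels) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy GFI values per cell and standard-flood-map labels (1 = flooded).
gfi = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]
flooded = [1, 1, 0, 1, 0, 0]
print(f"AUC = {auc(gfi, flooded):.3f}")
```

An AUC of 1.0 means the index perfectly separates flooded from non-flooded cells; the reported values above 0.9 (and > 0.97 for EU-DEM) indicate near-perfect ranking.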
-
A full 3D numerical model is used to study tidal asymmetry, estuarine circulation, and saline intrusion in the Gironde estuary. The model is calibrated and verified using data measured during two field surveys in the Gironde estuary. Harmonic analysis of the numerical results is proposed to understand how the superposition of the M2, M4 and M6 components generates a complex estuarine circulation and salinity intrusion in the Gironde estuary. The numerical results show that the M6 component plays a role as important as that of M4 in modifying the nature of tidal asymmetry, especially in the Gironde upper estuary. In such cases, using the phase lag between M2 and M4 while neglecting M6 to predict the nature of tidal asymmetry can produce errors. The effect of asymmetrical tides on saline intrusion and residual circulation is specifically discussed here.
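The harmonic superposition discussed above can be sketched as a sum of the M2 constituent and its overtides M4 and M6, together with the classic M2–M4 relative phase used as an asymmetry indicator. Amplitudes, phases, and function names are hypothetical; sign conventions for flood/ebb dominance vary between authors.

```python
from math import cos, pi

OMEGA_M2 = 2 * pi / 12.42  # M2 angular frequency, rad/hour

def eta(t, a2, p2, a4=0.0, p4=0.0, a6=0.0, p6=0.0):
    """Water level (m) at time t (hours) as the superposition of M2 and
    its overtides M4 (twice the M2 frequency) and M6 (three times)."""
    return (a2 * cos(OMEGA_M2 * t - p2)
            + a4 * cos(2 * OMEGA_M2 * t - p4)
            + a6 * cos(3 * OMEGA_M2 * t - p6))

def m2_m4_relative_phase(p2, p4):
    """Classic asymmetry indicator 2*phi_M2 - phi_M4 (mod 2*pi).
    The abstract's point is that relying on this indicator alone,
    ignoring M6, can misdiagnose asymmetry in the upper estuary."""
    return (2 * p2 - p4) % (2 * pi)
```

Comparing `eta` curves with and without the M6 term at upper-estuary amplitudes illustrates why the M2–M4 phase lag alone can misclassify the asymmetry there.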
-
Abstract. Although hydraulic infrastructure such as levees remains important for flood risk management in the USA, France, and Quebec (Canada), there is increasing emphasis on nonstructural measures, such as regulatory flood maps, to reduce exposure and vulnerability, for example, by preventing people from building in high-hazard areas. One key concept related to areas protected by levees is that of “residual risk”, that is, the risk from floods greater than the design standard of the levees (levee overtopping) and from levee breach. In this article, we review the legislative framework for regulatory flood maps in the USA, France, and Quebec (Canada) and compare how residual risk behind protective structures is taken into account (or not) in regulatory flood maps. We find substantial differences in how the USA, France and Canada manage residual risk behind levees. While in France the area behind levees is part of the regulatory flood-prone area, and land use restrictions, building codes, emergency measures and risk communication are mandatory, in the USA the area behind levees is only shown as part of the regulatory flood-prone area if the levee is not accredited. In Quebec, regulatory flood maps generally follow the French approach, with a few exceptions.
-
Global warming is expected to affect both the frequency and severity of extreme weather events, though projections of the response of these events to climate warming remain highly uncertain. The range of changes reported in the climate modelling literature is very large, sometimes leading to contradictory results for a given extreme weather event. Much of this uncertainty stems from the incomplete understanding of the physics of extreme weather processes, the lack of representation of mesoscale processes in coarse-resolution climate models, and the effect of natural climate variability at multi-decadal time scales. However, some of the spread in results originates simply from the variety of scenarios for future climate change used to drive climate model simulations, which hampers the ability to make generalizations about predicted changes in extreme weather events. In this study, we present a meta-analysis of the literature on projected future extreme weather events in order to quantify expected changes in weather extremes as a function of a common metric of global mean temperature increase. We find that many extreme weather events are likely to be significantly affected by global warming. In particular, our analysis indicates that the overall frequency of global tropical cyclones could decrease with global warming, but that the intensity of these storms, as well as the frequency of the most intense cyclones, could increase, particularly in the northwestern Pacific basin. We also found increases in the intensity of South Asian monsoonal rainfall, the frequency of global heavy precipitation events, the number of North American severe thunderstorm days, North American drought conditions, and European heatwaves with rising global mean temperatures. In addition, the periodicity of the El Niño–Southern Oscillation may decrease, which could, in itself, influence extreme weather frequency in many areas of the climate system.
-
Abstract. During the last decade, most European countries have produced hazard maps of natural hazards, but little is known about how to communicate these maps most efficiently to the public. In October 2011, Zurich's local authorities informed owners of buildings located in the urban flood hazard zone about potential flood damage, the probability of flood events and protection measures. The campaign was based on the assumptions that informing citizens increases their risk awareness and that citizens who are aware of risks are more likely to undertake actions to protect themselves and their property. This study is intended as a contribution to a better understanding of the factors that influence flood risk preparedness, with a special focus on the effects of such a one-way risk communication strategy. We conducted a standardized mail survey of 1500 property owners in the hazard zones in Zurich (main survey response rate: 34%). The questionnaire included items to measure respondents' risk awareness, risk preparedness, flood experience, information-seeking behaviour, knowledge about flood risk, evaluation of the information material, risk acceptance, attachment to the property and trust in local authorities. Data about the type of property and socio-demographic variables were also collected. Multivariate data analysis revealed that the average level of risk awareness and preparedness was low, but the results confirmed that the campaign had a statistically significant effect on the level of preparedness. The main factors influencing the intention to prepare for a flood were the extent to which respondents evaluated the information material positively, as well as their risk awareness. Respondents who had never taken any previous interest in floods were less likely to read the material. For future campaigns, we therefore recommend repeated communication that is tailored to the information needs of the target population.
-
Abstract. In response to the EU Floods Directive (2007/60/EC), flood hazard maps are currently being produced all over Europe, reflecting a wider shift in focus from "flood protection" to "risk management", for which not only public authorities but also populations at risk are seen as responsible. By providing a visual image of the foreseen consequences of flooding, flood hazard maps can enhance people's knowledge about flood risk, making them more capable of an adequate response. Current literature, however, questions the maps' awareness-raising capacity, arguing that their content and design are rarely adjusted to laypeople's needs. This paper aims to complement this perspective with a focus on risk communication by studying how these tools are disseminated and marketed to the public in the first place. Communication theory suggests that simply making hazard maps publicly available is unlikely to lead to attitudinal or behavioral effects, since these typically require two-way communication and material or symbolic incentives. Consequently, it is relevant to investigate whether and how local risk managers, who are well positioned to interact with the local population, make use of flood hazard maps for risk communication purposes. A qualitative case study of this issue in the German state of Baden-Württemberg suggests that many municipalities lack a clear strategy for using this new information tool for hazard and risk communication. Four barriers in this regard are identified: perceived disinterest or sufficient awareness on the part of the population at risk; unwillingness to cause worry or distress; lack of skills and resources; and insufficient support. These barriers are important to address – in research as well as in practice – since it is only if flood hazard maps are used to enhance local knowledge resources that they can be expected to contribute to social capacity building.