Your search
Results: 875 resources
-
Abstract Lightning climate change projections show large uncertainties caused by limited empirical knowledge and by strong assumptions inherent to coarse-grid climate modeling. This study addresses the latter issue by implementing the lightning potential index (LPI) parameterization in a fine-grid convection-permitting regional climate model (CPM). This setup takes advantage of the explicit representation of deep convection in CPMs and allows for process-oriented LPI inputs such as the vertical velocity within convective cells and the coexistence of microphysical hydrometeor types, which are known to contribute to charge separation mechanisms. The LPI output is compared to output from a simpler flash rate parameterization, namely the CAPE × PREC parameterization, applied in a non-CPM on a coarser grid. The LPI implementation in the regional climate model COSMO-CLM successfully reproduces the observed lightning climatology, including its latitudinal gradient, its daily and hourly probability distributions, and its diurnal and annual cycles. Moreover, the simulated temperature dependence of lightning reflects the observed dependence. The LPI outperforms the CAPE × PREC parameterization in all applied diagnostics. Based on this satisfactory evaluation, we applied the LPI to a climate change projection under the RCP8.5 scenario. For the domain under investigation, centered over Germany, the LPI projects a decrease of 4.8% in flash rate by the end of the century, as opposed to an increase of 17.4% projected by the CAPE × PREC parameterization. The future decrease in LPI occurs mostly during summer afternoons and is related to (i) a change in convection occurrence and (ii) changes in the microphysical mixing. The two parameterizations differ because convection occurrence differs between the CPM and the non-CPM, and because changes in the microphysical mixing are only represented in the LPI lightning parameterization.
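The CAPE × PREC proxy mentioned above is, in its simplest form, the product of convective available potential energy and surface precipitation rate, scaled to match an observed flash-rate climatology. The snippet below is a minimal, hypothetical sketch of such a proxy; the scaling constant `k`, the variable names, and the clipping of negative values are assumptions for illustration, not values or choices taken from the study.

```python
import numpy as np

def cape_prec_flash_rate(cape, prec, k=1.0e-2):
    """Hypothetical CAPE x PREC lightning proxy.

    cape : array of convective available potential energy (J/kg)
    prec : array of precipitation rate (mm/h)
    k    : assumed scaling constant, to be calibrated against an observed
           flash-rate climatology (not taken from the paper)

    Returns a gridded flash-rate proxy (arbitrary units until calibrated).
    """
    # The proxy is proportional to the product of the two fields; grid cells
    # with zero CAPE or zero precipitation produce no lightning.
    return k * np.maximum(cape, 0.0) * np.maximum(prec, 0.0)

# Example: a single grid cell with moderate instability and rain
cape = np.array([[1500.0]])   # J/kg
prec = np.array([[4.0]])      # mm/h
print(cape_prec_flash_rate(cape, prec))  # flash-rate proxy, arbitrary units
```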
-
Abstract Large-scale flood risk analyses are fundamental to many applications requiring national or international overviews of flood risk. While large-scale climate patterns such as teleconnections and climate change become important at this scale, it remains a challenge to represent the local hydrological cycle over various watersheds in a manner that is physically consistent with climate. As a result, global models tend to suffer from a lack of the available scenarios and flexibility that are key for planners, relief organizations, regulators, and the financial services industry to analyze the socioeconomic, demographic, and climatic factors affecting exposure. Here we introduce a data-driven, global, fast, flexible, and climate-consistent flood risk modeling framework for applications that do not necessarily require high-resolution flood mapping. We use statistical and machine learning methods to examine the relationship between historical flood occurrence and impact from the Dartmouth Flood Observatory (1985–2017) and climatic, watershed, and socioeconomic factors for 4,734 HydroSHEDS watersheds globally. Using bias-corrected output from the NCAR CESM Large Ensemble (1980–2020) and the fitted statistical relationships, we simulate 1 million years of events worldwide along with the population displaced in each event. We discuss potential applications of the model and present global flood hazard and risk maps. The main value of this global flood model lies in its ability to quickly simulate realistic flood events at a resolution that is useful for large-scale socioeconomic and financial planning, yet we expect it to be useful to climate and natural hazard scientists interested in the socioeconomic impacts of climate.
Plain Language Summary Flooding is among the deadliest and most damaging natural disasters. To protect against large-scale flood risk, stakeholders need to understand how floods can occur and what their potential impacts are. Stakeholders rely on global flood models to provide them with plausible flood scenarios around the world. For a flood model to operate at the global scale, climate effects must be represented in addition to hydrological ones to capture how rivers can overflow throughout the world each year. Global flood models often lack the flexibility and variety of scenarios required by many stakeholders because they are computationally demanding. We introduce a rapid and flexible global flood model, designed for applications where detailed local flood impacts are not required, that can generate hundreds of thousands of scenarios everywhere in the world in a matter of minutes. The model is based on a historical flood database covering 1985 to 2017, represented using an algorithm that learns from the data. With this model, the output from a global climate model is used to simulate a large sample of floods for risk analyses that are coherent with the global climate. Maps of the annual average number of floods and of the number of displaced people illustrate the model's results.
Key Points We present a global flood model built using machine learning methods fitted with historical flood occurrences and impacts. Forced with a climate model, the global flood model is fast, flexible, and consistent with global climate. We provide global flood hazard (occurrence) and risk (population displaced) maps over 4,734 watersheds.
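The abstract above describes fitting statistical and machine learning relationships between historical flood occurrence and per-watershed covariates, then simulating events from climate-model output. The sketch below shows one plausible way to set up such an occurrence model with a Poisson regression; the covariate choices, the synthetic data, and the use of scikit-learn are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)

# Hypothetical training table: one row per watershed-year with climatic,
# watershed, and socioeconomic covariates, and the observed flood count.
n = 5000
X = np.column_stack([
    rng.normal(size=n),   # e.g., standardized seasonal precipitation anomaly
    rng.normal(size=n),   # e.g., standardized soil moisture / snowmelt proxy
    rng.normal(size=n),   # e.g., log drainage area or population density
])
true_rate = np.exp(0.4 * X[:, 0] + 0.2 * X[:, 1] - 0.1 * X[:, 2] - 1.0)
y = rng.poisson(true_rate)  # observed flood counts (stand-in for DFO records)

# Fit the occurrence (frequency) component of the risk framework.
model = PoissonRegressor(alpha=1e-3).fit(X, y)

# Simulate many years of events by drawing Poisson counts from the rates
# predicted for climate-model-derived covariates.
X_future = rng.normal(size=(1000, 3))          # stand-in for CESM-LE output
event_counts = rng.poisson(model.predict(X_future))
print(event_counts[:10])
```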
-
Abstract The structure and “metabolism” (movement and conversion of goods and energy) of urban areas has led cities to be identified as “super‐organisms”, placed between ecosystems and the biosphere in the hierarchy of living systems. Yet most such analogies are weak and render the super‐organism model ineffective for the sustainable development of cities. Via a cluster analysis of 15 traits shared across the hierarchy of living systems, we found that industrialized cities are more similar to eukaryotic cells than to multicellular organisms, with enclosed systems, such as factories and greenhouses, paralleling organelles in eukaryotic cells. We further developed a “super‐cell” industrialized city model: a “eukarcity” with the citynucleus (urban area) as the regulating centre, organaras (enclosed systems, which provide the majority of goods and services) as the functional components, and cityplasm (natural ecosystems and farmlands) as the matrix. This model may improve the vitality and sustainability of cities through planning and management.
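The cluster analysis referenced above groups systems (cells, organisms, ecosystems, cities, enclosed systems) by the presence or absence of shared traits. The sketch below illustrates that kind of analysis with hierarchical clustering on a small, made-up binary trait matrix; the trait profiles, system names, distance metric, and linkage choice are assumptions for illustration only, not the study's data or settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical presence/absence matrix: rows are systems, columns are traits
# (e.g., bounded envelope, internal compartments, imports energy, ...).
systems = ["eukaryotic cell", "multicellular organism", "ecosystem",
           "industrialized city", "factory/greenhouse"]
traits = np.array([
    [1, 1, 1, 0, 1],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [1, 1, 1, 0, 1],   # made-up profile close to the eukaryotic cell row
    [1, 0, 1, 0, 1],
])

# Jaccard distance on binary traits, then average-linkage clustering.
dist = pdist(traits, metric="jaccard")
tree = linkage(dist, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")

for name, lab in zip(systems, labels):
    print(f"{name}: cluster {lab}")
```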
-
Abstract A fundamental issue when evaluating the simulation of precipitation is the difficulty of quantifying specific sources of errors and recognizing compensation of errors. We assess how well a large ensemble of high‐resolution simulations represents the precipitation associated with strong cyclones. We propose a framework to break down precipitation errors according to different dynamical (vertical velocity) and thermodynamical (vertically integrated water vapor) regimes and the frequency and intensity of precipitation. This approach approximates the error in the total precipitation of each regime as the sum of three terms describing errors in the large‐scale environmental conditions, the frequency of precipitation, and its intensity. We show that simulations produce precipitation too often, that its intensity is too weak, that errors are larger for weak than for strong dynamical forcing, and that biases in the vertically integrated water vapor can be large. Using this error breakdown, we define four new error metrics that differ in the degree to which they include the compensation of errors. We show that convection‐permitting simulations consistently improve the simulation of precipitation compared to coarser‐resolution simulations using parameterized convection, and that these improvements are revealed by our new approach but not by traditional metrics, which can be affected by compensating errors. These results suggest that convection‐permitting models are more likely to produce better results for the right reasons. We conclude that the novel decomposition and error metrics presented in this study provide a useful framework that offers physical insight into the sources of errors and a reliable quantification of errors.
Plain Language Summary Simulations of complex physical processes always entail various sources of errors. These errors can be of different sign and can consequently cancel each other out when using traditional performance metrics such as the bias. We present a formal framework that approximates precipitation errors using three terms describing different aspects of the rainfall field, including the large‐scale environmental conditions and the frequency and intensity of rainfall. We apply the methodology to a large ensemble of high‐resolution simulations representing the precipitation associated with strong cyclones in eastern Australia. We show that simulations produce precipitation too often and with an intensity that is too weak, leading to strong error compensation. We further define new error metrics that explicitly quantify the degree of error compensation when simulating precipitation. We show that convection‐permitting simulations consistently improve performance compared to coarser‐resolution simulations using parameterized convection, and that these improvements are only revealed when using the new error metrics and are not apparent in traditional metrics (e.g., bias).
Key Points Multiple high‐resolution simulations produce precipitation too often with underestimated intensity, leading to strong error compensation. Errors in precipitation are quantified using novel metrics that prevent error compensation, showing value compared with traditional metrics. Convection‐permitting simulations improve the representation of precipitation compared to simulations using parameterized convection.
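The decomposition described above approximates the total precipitation error of each regime as a sum of terms tied to environmental conditions, precipitation frequency, and precipitation intensity. One plausible first-order form, assuming the regime total factors as occurrence times frequency times conditional intensity, is sketched below; the exact formulation used in the study may differ.

```latex
% Sketch of a first-order error decomposition, assuming the total
% precipitation of regime r factors as P_r = N_r \, f_r \, I_r, where
% N_r is the number of occurrences of the regime (environmental term),
% f_r the precipitation frequency, and I_r the conditional intensity.
\Delta P_r \;\approx\;
\underbrace{\Delta N_r \, \bar{f}_r \, \bar{I}_r}_{\text{environment}}
\;+\;
\underbrace{\bar{N}_r \, \Delta f_r \, \bar{I}_r}_{\text{frequency}}
\;+\;
\underbrace{\bar{N}_r \, \bar{f}_r \, \Delta I_r}_{\text{intensity}}
```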
-
Abstract In sub-Saharan Africa (SSA), precipitation is an important driver of agricultural production. In Uganda, maize production is essentially rain-fed. However, owing to changes in climate, projected maize yield targets have often not been met, as actual observed maize yields frequently fall below simulated (projected) yields. This outcome has often been attributed to parallel gaps in precipitation. This study aims to identify maize yield and precipitation gaps in Uganda for the period 1998–2017. Historical observed maize yield time series (hg/ha/year) for 1998–2017 were collected from FAOSTAT, and observed maize growing-season precipitation data for the same period were collected from the World Bank Group climate portal. The projected maize yield and projected growing-season precipitation were simulated using a simple linear regression approach. The observed maize yield and growing-season precipitation were then compared with the simulated values to establish the gaps. The results show three key periods of maize yield gaps (period one: 1998; period two: 2004–2007; period three: 2015–2017) with parallel precipitation gaps. However, within the series (1998–2017), the years 2008–2009 showed no yield gaps, yet precipitation gaps were observed. This implies that precipitation is not the only driver of maize yields in Uganda; indeed, this is supported by a low correlation between precipitation gaps and maize yield gaps of about 6.3%. For a better understanding of cropping systems in SSA, other potential drivers of maize yield gaps in Uganda, such as soils, farm inputs, crop pests and diseases, high-yielding varieties, literacy, and poverty levels, should be considered.
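The gap analysis described above fits a linear trend to each observed series, treats the fitted values as the expected (projected) yield or precipitation, and defines the gap as the shortfall of the observation below the trend. The sketch below illustrates that workflow with made-up numbers; the series values, the definition of the gap as the negative residual, and the use of numpy are assumptions for illustration, not the study's data or code.

```python
import numpy as np

years = np.arange(1998, 2018)

# Made-up observed series (stand-ins for FAOSTAT yields and World Bank
# growing-season precipitation; not the actual data).
yield_obs = 18000 + 120 * (years - 1998) + np.random.default_rng(1).normal(0, 800, years.size)
precip_obs = 900 + 2 * (years - 1998) + np.random.default_rng(2).normal(0, 60, years.size)

def trend_and_gap(y, t):
    """Fit a simple linear trend and return (fitted, gap).

    The gap is the amount by which the observation falls below the trend
    line (zero when the observation meets or exceeds the trend).
    """
    slope, intercept = np.polyfit(t, y, 1)
    fitted = slope * t + intercept
    gap = np.maximum(fitted - y, 0.0)
    return fitted, gap

_, yield_gap = trend_and_gap(yield_obs, years)
_, precip_gap = trend_and_gap(precip_obs, years)

# Correlation between the two gap series (the paper reports roughly 6.3%).
r = np.corrcoef(yield_gap, precip_gap)[0, 1]
print(f"correlation between gaps: {r:.3f}")
```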
-
Abstract The collection efficiency of a typical precipitation gauge–shield configuration decreases with increasing wind speed, with high scatter for a given wind speed. This scatter arises in part from variability in the characteristics of falling snow and in atmospheric turbulence. This study uses weighing gauge data collected at the Marshall Field Site near Boulder, Colorado, during the WMO Solid Precipitation Intercomparison Experiment (SPICE). Particle diameter and fall speed data from a laser disdrometer were used to show that the scatter in the collection efficiency can be reduced by considering the fall speed of solid precipitation particles. The collection efficiency was divided into two classes depending on the mean particle fall speed measured during precipitation events, with slower-falling particles associated with a lower collection efficiency. A new transfer function (i.e., the relationship between collection efficiency and other meteorological variables, such as wind speed or air temperature) that includes the fall speed of the hydrometeors was developed. The root-mean-square error of precipitation adjusted with the new transfer function, evaluated against a weighing gauge placed in a double fence intercomparison reference, was lower than that obtained with previously developed transfer functions that consider only wind speed and air temperature. This shows that the fall speed of solid precipitation measured with a laser disdrometer accounts for a large part of the observed scatter in weighing-gauge collection efficiency.
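A transfer function of this kind maps wind speed (and here, hydrometeor fall speed) to a collection efficiency, which is then used to adjust the gauge measurement upward for undercatch. The sketch below uses a generic exponential form with made-up coefficients; the functional form and coefficient values are assumptions for illustration, not those fitted in the study.

```python
import numpy as np

def collection_efficiency(wind_speed, fall_speed, a=0.17, b=0.30):
    """Hypothetical transfer function returning a collection efficiency (0-1).

    wind_speed : gauge-height wind speed (m/s)
    fall_speed : mean hydrometeor fall speed (m/s) from a disdrometer
    a, b       : made-up coefficients; slower-falling particles (small
                 fall_speed) yield a lower efficiency at a given wind speed.
    """
    ce = np.exp(-a * wind_speed / np.maximum(fall_speed, 0.1) ** b)
    return np.clip(ce, 0.0, 1.0)

def adjust_precipitation(measured, wind_speed, fall_speed):
    """Undercatch adjustment: divide the measurement by the efficiency."""
    return measured / collection_efficiency(wind_speed, fall_speed)

# Example: 2.0 mm measured during an event with 5 m/s wind and slow-falling snow
print(adjust_precipitation(2.0, wind_speed=5.0, fall_speed=1.0))
```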
-
Abstract Approximately 10 years ago, convection‐permitting regional climate models (CPRCMs) emerged as a promising, computationally affordable tool to produce fine-resolution (1–4 km), decade-long climate simulations with explicitly resolved deep convection. This explicit representation is expected to reduce the climate projection uncertainty related to the deep convection parameterizations found in most climate models. A recent surge in CPRCM decadal simulations over larger domains, sometimes covering continents, has led to important insights into CPRCM advantages and limitations. Furthermore, new gridded observational datasets with fine spatial and temporal resolutions (~1 km; ~1 h) have provided additional knowledge through evaluations of the added value of CPRCMs. With improved coordination within ongoing international initiatives, the production of ensembles of CPRCM simulations is expected to provide more robust climate projections and a better identification of their associated uncertainties. This review paper presents an overview of the methodology used to produce CPRCM simulations and the latest research on the related added value in current and future climates. Impact studies that are already taking advantage of these new CPRCM simulations are highlighted. The review ends by proposing next steps to continue exploiting the full potential of CPRCMs. This article is categorized under: Climate Models and Modeling > Earth System Models
-
Abstract In spring 2011, an unprecedented flood hit the complex eastern United States (U.S.)–Canada transboundary Lake Champlain–Richelieu River (LCRR) Basin, destroying properties and negatively impacting agriculture and fish habitats. The damages, covered by the Governments of Canada and the U.S., were estimated at C$90M. This natural disaster motivated the study of mitigation measures to prevent such disasters from recurring. When evaluating flood risk, long-term climate change should be taken into account so that the mitigation measures adopted remain relevant in the future. To assess the impacts of climate change on flood risk in the LCRR basin, three bias-corrected, multi-resolution ensembles of climate projections for two greenhouse gas concentration scenarios were used to force a state-of-the-art, high-resolution, distributed hydrological model. The analysis of the hydrological simulations indicates that the 20-year return period flood (corresponding to a medium flood) should decrease by 8% to 35% by the end of the 21st century (2070–2099) under the high-emission representative concentration pathway (RCP) 8.5 scenario. The reduction in flood risk is explained by the decrease in snow accumulation and the increase in evapotranspiration expected with the future warming of the region. Nevertheless, owing to the large inter-annual climate variability, short-term flood probabilities should remain similar to those experienced in the recent past.
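The 20-year return period flood cited above is the flow exceeded on average once every 20 years, typically estimated by fitting an extreme-value distribution to annual maximum flows from the hydrological simulations. The sketch below shows that generic step using scipy's generalized extreme value (GEV) fit on synthetic annual maxima; the data and the choice of the GEV family are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Synthetic annual maximum flows (m^3/s) standing in for a simulated series.
annual_max = genextreme.rvs(c=-0.1, loc=900, scale=150, size=30, random_state=rng)

# Fit a GEV distribution to the annual maxima.
shape, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
T = 20
return_level_20yr = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
print(f"20-year return level: {return_level_20yr:.0f} m^3/s")
```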