Complete bibliography: 167 resources
-
Abstract. Starphotometry, the night-time counterpart of sunphotometry, has not yet achieved the commonly sought observational error level of 1 %: a spectral optical depth (OD) error level of 0.01. In order to address this issue, we investigate a large variety of systematic (absolute) uncertainty sources. The bright-star catalogue of extraterrestrial references is noted as a major source of errors with an attendant recommendation that its accuracy, particularly its spectral photometric variability, be significantly improved. The small field of view (FOV) employed in starphotometry ensures that it, unlike sun- or moonphotometry, is only weakly dependent on the intrinsic and artificial OD reduction induced by scattering into the FOV by optically thin clouds. A FOV of 45 arcsec (arcseconds) was found to be the best trade-off for minimizing such forward-scattering errors concurrently with flux loss through vignetting. The importance of monitoring the sky background and using interpolation techniques to avoid spikes and to compensate for measurement delay was underscored. A set of 20 channels was identified to mitigate contamination errors associated with stellar and terrestrial atmospheric gas absorptions, as well as aurora and airglow emissions. We also note that observations made with starphotometers similar to our High Arctic instrument should be made at high angular elevations (i.e. at air masses less than 5). We noted the significant effects of snow crystal deposition on the starphotometer optics, how pseudo OD increases associated with this type of contamination could be detected, and how proactive techniques could be employed to avoid their occurrence in the first place. If all of these recommendations are followed, one may aspire to achieve component errors that are well below 0.01: in the process, one may attain a total 0.01 OD target error.
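As context for the 0.01 OD error target, here is a minimal sketch of how a starphotometer-style optical depth is typically retrieved from the Beer-Lambert law written in magnitude form. The star magnitudes, air mass, and the size of the catalogue error below are illustrative assumptions, not values from the paper, whose actual calibration and corrections are considerably more involved.

```python
import numpy as np

def optical_depth(m_obs, m_top, air_mass):
    """Total-column optical depth from the Beer-Lambert law in magnitude form.

    m_obs    : observed stellar magnitude through the atmosphere
    m_top    : catalogued extraterrestrial (top-of-atmosphere) magnitude
    air_mass : relative optical air mass along the line of sight
    A magnitude difference of 1.0857 (= 2.5 * log10(e)) corresponds to
    an optical depth of 1 at unit air mass.
    """
    return (m_obs - m_top) / (2.5 * np.log10(np.e) * air_mass)

# Illustrative numbers only: a 0.01 mag error in the catalogue reference
# maps into an OD error of roughly 0.01 / (1.0857 * air_mass).
tau = optical_depth(m_obs=2.135, m_top=2.090, air_mass=2.0)
d_tau = 0.01 / (2.5 * np.log10(np.e) * 2.0)
print(f"tau = {tau:.3f}, OD error from a 0.01 mag catalogue error: {d_tau:.4f}")
```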
-
The objective of this study was to estimate economic losses associated with milk performance detriments under different climate scenarios. A dataset containing milk records of Holstein cows and daily temperature–humidity indexes compiled over 6 yr in two climatically contrasting dairy regions of Quebec [eastern (EQ) and southwestern Quebec (SWQ)] was used to develop equations for modeling milk performance. Milk performance, including milk, fat, protein, and lactose yields of dairy herds (kg·d⁻¹), was then projected considering six plausible climate scenarios during a climatic reference period (REF: 1971–2000) and two future periods (FUT1: 2020–2049; FUT2: 2050–2079). Economic losses were assessed by comparing future to reference milk prices based on components. Results indicated that fat and protein yields could decline in the future, resulting in economic losses ranging from $5.34 to $7.07 CAD·hL⁻¹ in EQ and from $5.03 to $6.99 CAD·hL⁻¹ in SWQ, depending on the amplitude of future temperature and humidity changes and on whether milk quota or cow numbers are adjusted to avoid under-quota production. The projected increase in the occurrence and duration of heat stress episodes under climate change could result in substantial financial harm for producers, reinforcing the need to implement heat-abatement strategies on dairy farms.
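For reference, a minimal sketch of one widely used temperature–humidity index (THI) formulation for dairy heat stress. The abstract does not specify which THI variant was used, so this particular formula and the threshold mentioned in the comment are assumptions for illustration only.

```python
def thi(temp_c, rel_hum_pct):
    """One common THI formulation for dairy cattle (NRC 1971 style):
    THI = (1.8*T + 32) - (0.55 - 0.0055*RH) * (1.8*T - 26),
    with T in degrees Celsius and RH in percent."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rel_hum_pct) * (1.8 * temp_c - 26)

# Illustrative only: a warm, humid summer day.
print(round(thi(28.0, 70.0), 1))   # ~78, above the ~68-72 range often cited as heat-stress onset
```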
-
Abstract Large-scale flood risk analyses are fundamental to many applications requiring national or international overviews of flood risk. While large-scale climate patterns such as teleconnections and climate change become important at this scale, it remains a challenge to represent the local hydrological cycle over various watersheds in a manner that is physically consistent with climate. As a result, global models tend to suffer from a lack of available scenarios and flexibility that are key for planners, relief organizations, regulators, and the financial services industry to analyze the socioeconomic, demographic, and climatic factors affecting exposure. Here we introduce a data-driven, global, fast, flexible, and climate-consistent flood risk modeling framework for applications that do not necessarily require high-resolution flood mapping. We use statistical and machine learning methods to examine the relationship between historical flood occurrence and impact from the Dartmouth Flood Observatory (1985–2017) and climatic, watershed, and socioeconomic factors for 4,734 HydroSHEDS watersheds globally. Using bias-corrected output from the NCAR CESM Large Ensemble (1980–2020) and the fitted statistical relationships, we simulate 1 million years of events worldwide along with the population displaced in each event. We discuss potential applications of the model and present global flood hazard and risk maps. The main value of this global flood model lies in its ability to quickly simulate realistic flood events at a resolution that is useful for large-scale socioeconomic and financial planning, yet we expect it to be useful to climate and natural hazard scientists interested in the socioeconomic impacts of climate.
Plain Language Summary. Flooding is among the deadliest and most damaging natural disasters. To protect against large-scale flood risk, stakeholders need to understand how floods can occur and what their potential impacts are. Stakeholders rely on global flood models to provide them with plausible flood scenarios around the world. For a flood model to operate at the global scale, climate effects must be represented in addition to hydrological ones to capture how rivers can overflow throughout the world each year. Global flood models often lack the flexibility and variety of scenarios required by many stakeholders because they are computationally demanding. Designed for applications where detailed local flood impacts are not required, our rapid and flexible global flood model can generate hundreds of thousands of scenarios anywhere in the world in a matter of minutes. The model is based on a historical flood database from 1985 to 2017, represented using an algorithm that learns from the data. With this model, the output from a global climate model is used to simulate a large sample of floods for risk analyses that are coherent with the global climate. Maps of the annual average number of floods and the number of displaced people illustrate the model's results.
Key Points. We present a global flood model built using machine learning methods fitted with historical flood occurrences and impacts. Forced with a climate model, the global flood model is fast, flexible, and consistent with global climate. We provide global flood hazard (occurrence) and risk (population displaced) maps over 4,734 watersheds.
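A minimal sketch of the kind of statistical relationship the abstract describes: relating annual flood counts in a watershed to climatic and socioeconomic predictors, which can then be driven with climate-model output. The gradient-boosting / Poisson-loss choice, feature names, and synthetic data frame below are assumptions for illustration, not the paper's actual pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical training table: one row per watershed-year, with predictors
# and the number of flood events reported that year.
rng = np.random.default_rng(0)
n = 5000
X = pd.DataFrame({
    "precip_annual_max": rng.gamma(2.0, 40.0, n),
    "soil_moisture": rng.uniform(0.1, 0.5, n),
    "basin_area_km2": rng.lognormal(8, 1, n),
    "population": rng.lognormal(11, 1.5, n),
})
y = rng.poisson(0.1 + 0.004 * X["precip_annual_max"])  # synthetic flood counts

# Poisson-loss gradient boosting approximates the occurrence-rate relationship.
model = HistGradientBoostingRegressor(loss="poisson", max_iter=300)
print(cross_val_score(model, X, y, cv=5, scoring="neg_mean_poisson_deviance").mean())

# Once fitted on historical data, the same model could be driven with
# bias-corrected climate-model output to simulate large synthetic event sets.
model.fit(X, y)
```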
-
Abstract A fundamental issue when evaluating the simulation of precipitation is the difficulty of quantifying specific sources of errors and recognizing the compensation of errors. We assess how well a large ensemble of high-resolution simulations represents the precipitation associated with strong cyclones. We propose a framework to break down precipitation errors according to different dynamical (vertical velocity) and thermodynamical (vertically integrated water vapor) regimes and the frequency and intensity of precipitation. This approach approximates the error in the total precipitation of each regime as the sum of three terms describing errors in the large-scale environmental conditions, the frequency of precipitation, and its intensity. We show that simulations produce precipitation too often, that its intensity is too weak, that errors are larger for weak than for strong dynamical forcing, and that biases in the vertically integrated water vapor can be large. Using this error breakdown, we define four new error metrics differing in the degree to which they allow the compensation of errors. We show that convection-permitting simulations consistently improve the simulation of precipitation compared to coarser-resolution simulations using parameterized convection, and that these improvements are revealed by our new approach but not by traditional metrics, which can be affected by compensating errors. These results suggest that convection-permitting models are more likely to produce better results for the right reasons. We conclude that the novel decomposition and error metrics presented in this study form a useful framework that provides physical insight into the sources of errors and a reliable quantification of errors.
Plain Language Summary. The simulation of complex physical processes always entails various sources of errors. These errors can be of different sign and can consequently cancel each other out when using traditional performance metrics such as the bias. We present a formal framework that approximates precipitation through three terms describing different aspects of the rainfall field: the large-scale environmental conditions and the frequency and intensity of rainfall. We apply the methodology to a large ensemble of high-resolution simulations representing the precipitation associated with strong cyclones in eastern Australia. We show that simulations produce precipitation too often, with an intensity that is too weak, leading to strong compensation. We further define new error metrics that explicitly quantify the degree of error compensation when simulating precipitation. We show that convection-permitting simulations consistently improve the performance compared to coarser-resolution simulations using parameterized convection, and that these improvements are only revealed by the new error metrics and are not apparent in traditional metrics (e.g., bias).
Key Points. Multiple high-resolution simulations produce precipitation too often and with underestimated intensity, leading to strong error compensation. Errors in precipitation are quantified using novel metrics that prevent error compensation, showing value compared with traditional metrics. Convection-permitting simulations outperform simulations using parameterized convection in representing precipitation.
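A minimal sketch of a regime-based decomposition in the spirit of the framework described above: total precipitation in each (vertical velocity, integrated water vapor) regime approximated as occurrence × frequency × intensity, with the error split into environment, frequency, and intensity terms to first order. The bin edges, column names, and wet-day threshold are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np
import pandas as pd

def regime_terms(df, w_edges, iwv_edges, wet_threshold=0.1):
    """Per-regime occurrence count (N), precipitation frequency (f) and intensity (I).

    df has columns 'w' (vertical velocity), 'iwv' (integrated water vapor) and
    'pr' (precipitation rate); regimes are 2-D bins of (w, iwv)."""
    regime = pd.cut(df["w"], w_edges).astype(str) + " | " + pd.cut(df["iwv"], iwv_edges).astype(str)
    grouped = df.groupby(regime)
    N = grouped.size()
    f = grouped["pr"].apply(lambda p: (p > wet_threshold).mean())
    I = grouped["pr"].apply(lambda p: p[p > wet_threshold].mean()).fillna(0.0)
    return N, f, I

def error_terms(model_df, obs_df, w_edges, iwv_edges):
    """First-order split of the total-precipitation error in each regime:
    dP ~= dN*f_o*I_o (environment) + N_o*df*I_o (frequency) + N_o*f_o*dI (intensity).
    Regimes absent from one dataset appear as NaN rows in this simple sketch."""
    Nm, fm, Im = regime_terms(model_df, w_edges, iwv_edges)
    No, fo, Io = regime_terms(obs_df, w_edges, iwv_edges)
    env = (Nm - No) * fo * Io
    freq = No * (fm - fo) * Io
    inten = No * fo * (Im - Io)
    return pd.DataFrame({"environment": env, "frequency": freq, "intensity": inten})
```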
-
Abstract Timothy (Phleum pratense L.) is expected to be more affected by climate change than other forage grasses. Therefore, alternatives to timothy, such as tall fescue [Schedonorus arundinaceus (Schreb.) Dumort.], meadow fescue [S. pratensis (Huds.) P. Beauv.], or meadow bromegrass (Bromus biebersteinii Roem. & Schult.), should be explored. Our objective was to simulate and compare the yield and nutritive value of four alfalfa (Medicago sativa L.)–grass mixtures and annual crops grown on two virtual dairy farms representative of eastern Canada under future climate conditions. The Integrated Farm System Model (IFSM) was used for these projections under the reference (1971–2000), near-future (2020–2049), and distant-future (2050–2079) climates for two climatically contrasting agricultural areas in eastern Canada (eastern Quebec; southwestern Quebec). In both future periods, annual forage dry matter (DM) yields of the four alfalfa–grass mixtures are projected to increase because of additional harvests, with greater DM yield increases projected in the colder area than in the warmer area. In both areas, the highest yield increase is projected for the alfalfa–tall fescue mixture and the lowest for the alfalfa–timothy mixture. The nutritive value of all mixtures should increase due to a greater proportion of alfalfa. In both areas, yields of silage and grain corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] are projected to increase, but not those of wheat (Triticum aestivum L.) and barley (Hordeum vulgare L.). Tall fescue, meadow bromegrass, and meadow fescue are adequate alternatives to timothy grown in association with alfalfa under future climate conditions.
Core Ideas. Forage yields of alfalfa–grass mixtures are projected to increase due to additional harvests. The mixture with tall fescue is projected to increase the most and the one with timothy the least. Tall fescue, meadow fescue, and meadow bromegrass are valuable alternatives to timothy. Nutritive value is projected to increase due to more alfalfa in the mixture. Corn and soybean grain yields are projected to increase but not those of wheat and barley.
-
Abstract Atmospheric blockings are generally associated with large-scale high-pressure systems that interrupt the west-to-east atmospheric flow in mid and high latitudes. Blockings cause several days of quasi-stationary weather conditions and can therefore result in monthly or seasonal climate anomalies and extreme weather events in the affected regions. In this paper, the long-term coupled CERA-20C reanalysis data from 1901 to 2010 are used to evaluate the links between blocking events over the North Atlantic north of 35° N and atmospheric and oceanic modes of climate variability on decadal time scales. This study indicates more frequent and longer-lasting blocking events than previous studies using other reanalysis products. A strong relationship was found between North Atlantic blocking events and the North Atlantic Oscillation (NAO), Atlantic Multidecadal Oscillation (AMO) and Baffin Island–West Atlantic (BWA) indices in fall, winter and spring. More blocking events occur during negative phases of the NAO index and positive phases of the BWA mode. In some situations, the BWA patterns provide clearer links with North Atlantic blocking occurrence than the NAO alone. The correlation between synchronous occurrences of the AMO and blocking is generally weak, although it increases for lags of about 6–10 years. Convergent cross mapping (CCM) furthermore demonstrates a significant two-way causal effect between blocking occurrences and the NAO and BWA indices. Finally, while we find no significant trends in blocking frequencies over the last 110 years in the Northern Hemisphere, these events have become longer lasting in summer and fall, and more intense in spring, in the North Atlantic.
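A minimal sketch of the lagged-correlation analysis implied by the 6–10 year AMO lead time. The annual blocking counts and AMO index here are synthetic placeholders, and simple Pearson correlation is used for illustration; the study's causal analysis additionally relies on convergent cross mapping, which is not reproduced here.

```python
import numpy as np
from scipy import stats

def lagged_correlation(blocking, amo, max_lag=12):
    """Pearson correlation between annual blocking counts and the AMO index,
    with the AMO leading blocking by 'lag' years (lag >= 0)."""
    out = {}
    for lag in range(max_lag + 1):
        x = amo[: len(amo) - lag] if lag else amo
        y = blocking[lag:]
        r, p = stats.pearsonr(x, y)
        out[lag] = (r, p)
    return out

# Illustrative synthetic series spanning 110 years (as in the 1901-2010 study period).
rng = np.random.default_rng(1)
amo = np.cumsum(rng.normal(0, 0.1, 110))                       # slow, red-noise-like index
blocking = 10 + 3 * np.roll(amo, 8) + rng.normal(0, 1, 110)    # AMO leads by ~8 yr by construction
for lag, (r, p) in lagged_correlation(blocking, amo, 12).items():
    print(f"lag {lag:2d} yr: r = {r:+.2f} (p = {p:.3f})")
```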
-
Abstract Digital leaf physiognomy (DLP) is considered one of the most promising methods for estimating past climate. However, current models built using the DLP data set still lack precision, especially for mean annual precipitation (MAP). To improve predictive power, we developed five machine learning (ML) models for mean annual temperature (MAT) and MAP, respectively, and then compared the precision of these models, and of some of their averages, with that obtained from other models. The precision of all models was assessed using repeated stratified 10-fold cross-validation. For MAT, three combinations of models (R² = 0.77) presented moderate improvements in precision over the multiple linear regression (MLR) model (R² = 0.68). For logₑ(MAP), the average of the support vector machine (SVM) and boosting models improved the R² from 0.19 to 0.63 relative to the MLR model. For MAP, the R² of this model combination was 0.49, much better than that of the artificial neural network (ANN) model (R² = 0.21). Even the bagging model, which had the lowest R² (0.37) for logₑ(MAP), demonstrated better precision for MAP (R² = 0.27). Our paleoclimate estimates for nine fossil floras were also more accurate, being in better agreement with independent paleoclimate evidence. Our study confirms that our ML models and their averages can improve paleoclimate reconstructions, providing a better understanding of the relationship between climate and leaf physiognomy.
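A minimal sketch of the repeated 10-fold cross-validation with two-model averaging (SVM plus boosting) described above. The features, regressors, and hyperparameters are placeholders, and the stratification of folds (which for a continuous target would require binning) is omitted for simplicity, so this is not the authors' exact procedure.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import RepeatedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def averaged_cv_r2(X, y, n_splits=10, n_repeats=5, seed=0):
    """R^2 of the averaged SVM and boosting predictions under repeated k-fold CV."""
    svm = make_pipeline(StandardScaler(), SVR(C=10.0))
    gbr = GradientBoostingRegressor(random_state=seed)
    scores = []
    cv = RepeatedKFold(n_splits=n_splits, n_repeats=n_repeats, random_state=seed)
    for train, test in cv.split(X):
        svm.fit(X[train], y[train])
        gbr.fit(X[train], y[train])
        pred = 0.5 * (svm.predict(X[test]) + gbr.predict(X[test]))
        scores.append(r2_score(y[test], pred))
    return np.mean(scores)

# Illustrative synthetic leaf-physiognomy-like data: 30 traits, a log(MAP)-like target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=200)
print(f"mean CV R^2 of the SVM+boosting average: {averaged_cv_r2(X, y):.2f}")
```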
-
Abstract. In the Arctic, during polar night and early spring, ice clouds are separated into two leading types (TICs): (1) TIC1 clouds, characterized by a large concentration of very small crystals, and (2) TIC2 clouds, characterized by a low concentration of large ice crystals. Using a suitable parameterization of heterogeneous ice nucleation is essential for properly representing ice clouds in meteorological and climate models and subsequently understanding their interactions with aerosols and radiation. Here, we describe a new parameterization for ice crystal formation by heterogeneous nucleation in water-subsaturated conditions, coupled to aerosol chemistry in the Weather Research and Forecasting model coupled with chemistry (WRF-Chem). The parameterization is implemented in the Milbrandt and Yau (2005a, b) two-moment cloud microphysics scheme, and we assess how the WRF-Chem model responds to the run-time interaction between chemistry and the new parameterization. Well-documented reference cases provided us with in situ data from the spring 2008 Indirect and Semi-Direct Aerosol Campaign (ISDAC) over Alaska. Our analysis reveals that the new parameterization clearly improves the representation of the ice water content (IWC) in polluted or unpolluted air masses and shows the poor performance of the reference parameterization in representing ice clouds with low IWC. The new parameterization is able to represent TIC1 and TIC2 microphysical characteristics at the top of the clouds, where heterogeneous ice nucleation is most likely occurring, even with the known bias of simulated aerosols by WRF-Chem over the Arctic.
-
Abstract Compound events (CEs) are weather and climate events that result from multiple hazards or drivers with the potential to cause severe socio-economic impacts. Compared with isolated hazards, the multiple hazards/drivers associated with CEs can lead to higher economic losses and death tolls. Here, we provide the first analysis of multiple multivariate CEs potentially causing high-impact floods, droughts, and fires. Using observations and reanalysis data during 1980–2014, we analyse 27 hazard pairs and provide the first spatial estimates of their occurrences on the global scale. We identify hotspots of multivariate CEs including many socio-economically important regions such as North America, Russia and western Europe. We analyse the relative importance of different multivariate CEs in six continental regions to highlight CEs posing the highest risk. Our results provide initial guidance to assess the regional risk of CE events and an observationally-based dataset to aid evaluation of climate models for simulating multivariate CEs.
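A minimal sketch of how the co-occurrence of one hazard pair might be counted from gridded boolean hazard fields and compared against independence to flag hotspots. The array layout, hazard names, and same-month co-occurrence criterion are assumptions for illustration, not the study's definitions of the 27 hazard pairs.

```python
import numpy as np

def cooccurrence_count(hazard_a, hazard_b):
    """Number of time steps in which both hazards occur at each grid cell.

    hazard_a, hazard_b : boolean arrays of shape (time, lat, lon),
    e.g. monthly flags for 'extreme precipitation' and 'high wind'."""
    return np.logical_and(hazard_a, hazard_b).sum(axis=0)

# Illustrative synthetic flags over a 35-year monthly record (1980-2014).
rng = np.random.default_rng(0)
shape = (35 * 12, 45, 90)                      # (time, lat, lon) on a coarse grid
heavy_rain = rng.random(shape) < 0.05          # ~5 % of months flagged
high_wind = rng.random(shape) < 0.05
joint = cooccurrence_count(heavy_rain, high_wind)

# Compare with the count expected under independence (0.05 * 0.05 per month).
expected = shape[0] * 0.05 * 0.05
print("grid cells exceeding the independence expectation:", int((joint > expected).sum()))
```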
-
Abstract. Several sets of reference regions have been used in the literature for the regional synthesis of observed and modelled climate and climate change information. A popular example is the series of reference regions used in the Intergovernmental Panel on Climate Change (IPCC) Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Adaptation (SREX). The SREX regions were slightly modified for the Fifth Assessment Report of the IPCC and used for reporting subcontinental observed and projected changes over a reduced number (33) of climatologically consistent regions encompassing a representative number of grid boxes. These regions are intended to allow analysis of atmospheric data over broad land or ocean regions and have been used as the basis for several popular spatially aggregated datasets, such as the Seasonal Mean Temperature and Precipitation in IPCC Regions for CMIP5 dataset. We present an updated version of the reference regions for the analysis of new observed and simulated datasets (including CMIP6) which offer an opportunity for refinement due to the higher atmospheric model resolution. As a result, the number of land and ocean regions is increased to 46 and 15, respectively, better representing consistent regional climate features. The paper describes the rationale for the definition of the new regions and analyses their homogeneity. The regions are defined as polygons and are provided as coordinates and a shapefile together with companion R and Python notebooks to illustrate their use in practical problems (e.g. calculating regional averages). We also describe the generation of a new dataset with monthly temperature and precipitation, spatially aggregated in the new regions, currently for CMIP5 and CMIP6, to be extended to other datasets in the future (including observations). The use of these reference regions, dataset and code is illustrated through a worked example using scatter plots to offer guidance on the likely range of future climate change at the scale of the reference regions. The regions, datasets and code (R and Python notebooks) are freely available at the ATLAS GitHub repository: https://github.com/SantanderMetGroup/ATLAS (last access: 24 August 2020), https://doi.org/10.5281/zenodo.3998463 (Iturbide et al., 2020).
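A minimal sketch of the "calculating regional averages" use case mentioned above, using the regionmask package, whose AR6 region definitions correspond to these updated reference regions. The file name, variable name, and coordinate names are placeholders, and the cosine-of-latitude weighting is a simplifying assumption rather than the ATLAS notebooks' exact recipe.

```python
import numpy as np
import regionmask
import xarray as xr

# Placeholder dataset: any lat/lon gridded field, e.g. monthly near-surface temperature.
ds = xr.open_dataset("tas_example.nc")          # hypothetical file with 'lat'/'lon' coords
regions = regionmask.defined_regions.ar6.land   # the updated land reference regions

# Boolean mask with a 'region' dimension: True where a grid cell belongs to a region.
mask_3d = regions.mask_3D(ds)

# Simple cos(latitude) area weighting, then average within each region.
weights = np.cos(np.deg2rad(ds["lat"]))
regional_mean = ds["tas"].weighted(mask_3d * weights).mean(dim=("lat", "lon"))
print(regional_mean)
```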
-
Precipitation and temperature are among the major climatic variables used to characterize extreme weather events, which can have profound impacts on ecosystems and society. Accurate simulation of these variables at the local scale is essential to adapt urban systems and policies to future climatic changes. However, accurate simulation of these climatic variables is difficult due to possible interdependence and feedbacks among them. In this paper, the concept of copulas was used to model the seasonal interdependence between precipitation and temperature. Five copula functions were fitted to gridded (approximately 10 km × 10 km) climate data from 1960 to 2013 in southern Ontario, Canada. Theoretical and empirical copulas were then compared with each other to select the most appropriate copula family for this region. Results showed that none of the tested copulas consistently performed best over the entire region during all seasons. However, the Gumbel copula was the best performer during the winter season, and the Clayton copula performed best in the summer. More variability in the best-performing copula was found in the spring and fall seasons. By examining the likelihoods of concurrent extreme temperature and precipitation periods, including wet/cool in the winter and dry/hot in the summer, we found that ignoring the joint distribution and confounding impacts of precipitation and temperature leads to the underestimation of the occurrence probabilities of these two concurrent extreme modes. This underestimation can in turn lead to incorrect conclusions and flawed decisions regarding the severity of these extreme events.
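A minimal sketch of fitting Archimedean copulas to a precipitation–temperature pair via Kendall's tau (a common moment-style estimator) and evaluating a joint probability, to illustrate why ignoring the dependence underestimates concurrent extremes. The synthetic data, the tau-to-theta inversions (which assume positive dependence), and the 20th-percentile threshold are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from scipy import stats

def pseudo_obs(x):
    """Rank-based pseudo-observations in (0, 1)."""
    return stats.rankdata(x) / (len(x) + 1)

def gumbel_cdf(u, v, theta):
    """Gumbel copula C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def clayton_cdf(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

# Illustrative seasonal data: 54 winters of precipitation totals and mean temperatures.
rng = np.random.default_rng(0)
temp = rng.normal(-8, 3, 54)
precip = 200 + 5 * temp + rng.normal(0, 20, 54)   # some positive dependence, for illustration

tau, _ = stats.kendalltau(precip, temp)
theta_gumbel = 1.0 / (1.0 - tau)        # moment estimator: tau = 1 - 1/theta
theta_clayton = 2.0 * tau / (1.0 - tau) # moment estimator: tau = theta / (theta + 2)

# Joint probability that both variables fall below their 20th percentiles,
# versus 0.2 * 0.2 = 0.04 under an (incorrect) independence assumption.
p_joint = gumbel_cdf(0.2, 0.2, theta_gumbel)
print(f"tau = {tau:.2f}, Gumbel theta = {theta_gumbel:.2f}, Clayton theta = {theta_clayton:.2f}")
print(f"P(both below their 20th percentiles): {p_joint:.3f} vs 0.040 under independence")
```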
-
Abstract
Background. During spring 2019, many regions of Quebec (Canada) experienced severe floods. As many as 5,245 households were flooded and 7,452 persons were evacuated, causing extensive material and human damage. A large population-based study was therefore conducted to examine the medium-term effects of this natural disaster on health and well-being.
Methods. Six to eight months post-flood, households located in the flooded zones (in one of the 6 most severely affected Quebec regions) were randomly invited to participate in a telephone or web-based survey (response rate = 15.3%). Several psychological health outcomes were examined, including psychological distress (based on the 6-item Kessler Scale, score 0-24) and post-traumatic stress (based on the 15-item Impact of Event Scale, score 0-75). These outcomes were compared among 3 levels of exposure using a chi-square test: flooded (floodwater in ≥ 1 liveable room), disrupted (floodwater in non-liveable areas, loss of utilities, loss of access to services, or evacuation), and unaffected.
Results. Of the 3,437 participating households, 349 (10.2%) were flooded and 1,230 (35.8%) were disrupted (but not flooded) during the 2019 floods. A steep gradient was observed for moderate/severe symptoms of post-traumatic stress (score ≥ 26) according to the level of exposure to flooding (unaffected: 3.0%; disrupted: 14.6%; flooded: 44.1%; p < 0.0005). For psychological distress (score ≥ 7), the baseline level (i.e. unaffected group) was 7.3%, while it reached 15.0% and 38.4% in the disrupted and flooded groups, respectively (p < 0.0005).
Conclusions. This study is among the largest to examine the psychological impacts of flooding. The magnitude of the effects observed in flooded households is consistent with the literature and calls for stronger social and economic measures to support flood victims. Such support should help victims cope with the initial stress, but also alleviate the secondary stressors classically observed in post-flood settings.
Key messages. Psychological impacts of floods may persist for several months and may be observed in both flooded and disrupted people. Stronger social and economic measures are needed to better support flood victims, not only in the short term but also in the longer term.
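A minimal sketch of the kind of chi-square comparison described in the Methods, applied to a contingency table of exposure level versus post-traumatic stress category. The counts below are only approximate reconstructions from the percentages reported in the abstract, shown for illustration; they are not the study's data table.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported group sizes and percentages:
# rows = exposure level, columns = (no/mild symptoms, moderate/severe post-traumatic stress).
table = np.array([
    [1802, 56],    # unaffected (~3.0 % with score >= 26)
    [1050, 180],   # disrupted  (~14.6 %)
    [195, 154],    # flooded    (~44.1 %)
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```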
-
Aerosol–cloud interactions present a large source of uncertainty in atmospheric and climate models. One of the main challenges in simulating ice clouds is reproducing the right ice-nucleating particle (INP) concentration. In this study, we derive a parameterization for immersion freezing according to classical nucleation theory. Our objective was to constrain this parameterization with observations taken over the Canadian Arctic during the Amundsen summer 2014 and 2016 campaigns. We found a linear dependence of the contact angle on temperature. Using this approach, we were able to reproduce the scatter in INP concentrations within a factor of 5 of observed values, with a small negative bias. This parameterization would be easy to implement in climate and atmospheric models, but its representativeness must first be validated against other datasets.
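A minimal sketch of a textbook-style classical nucleation theory (CNT) immersion-freezing rate with a temperature-dependent contact angle, in the spirit of the parameterization described above. The linear θ(T) coefficients, interfacial energy, kinetic prefactor, and the simple volumetric free-energy approximation are all assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

K_B = 1.380649e-23      # Boltzmann constant [J/K]
T_MELT = 273.15         # melting temperature [K]
SIGMA_IW = 0.025        # assumed ice-water interfacial energy [J/m^2]
L_F = 3.34e5            # latent heat of fusion [J/kg]
RHO_ICE = 917.0         # ice density [kg/m^3]
PREFACTOR = 1.0e30      # assumed kinetic prefactor [m^-2 s^-1] (highly uncertain)

def contact_angle(temp_k, a=1.5, b=-0.01):
    """Assumed linear dependence of the contact angle (radians) on supercooling."""
    return a + b * (T_MELT - temp_k)

def f_compat(theta):
    """Geometric compatibility factor reducing the homogeneous energy barrier."""
    c = np.cos(theta)
    return (2.0 + c) * (1.0 - c) ** 2 / 4.0

def j_het(temp_k):
    """Heterogeneous nucleation rate per unit INP surface area [m^-2 s^-1]."""
    delta_t = T_MELT - temp_k
    dg_v = RHO_ICE * L_F * delta_t / T_MELT                      # volumetric free-energy gain [J/m^3]
    dg_hom = 16.0 * np.pi * SIGMA_IW ** 3 / (3.0 * dg_v ** 2)    # homogeneous germ barrier [J]
    dg_het = f_compat(contact_angle(temp_k)) * dg_hom            # reduced by the contact angle
    return PREFACTOR * np.exp(-dg_het / (K_B * temp_k))

def frozen_fraction(temp_k, surface_m2=1.0e-12, dt_s=10.0):
    """Fraction of droplets frozen after dt_s, each holding one INP of area surface_m2."""
    return 1.0 - np.exp(-j_het(temp_k) * surface_m2 * dt_s)

for t in (258.0, 253.0, 248.0):
    print(f"T = {t:.0f} K: frozen fraction = {frozen_fraction(t):.3e}")
```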
-
Following the numerous spring floods that have affected southern Quebec since 2011, the Government of Quebec announced a major overhaul of its disaster financial assistance program in April 2019. The program now includes a $100,000 lifetime coverage limit for successive floods, a measure unique in Canada. The objective of this article is to analyze the cost of successive floods and the financial impacts of this coverage limit for households.
-
Abstract We quantify the skill of Coupled Model Intercomparison Project Phase 5 (CMIP5) and CMIP6 models in representing daily temperature extremes. We find that CMIP models systematically exaggerate the magnitude of daily temperature anomalies for both cold and hot extremes. We assess the contribution to a daily temperature extreme from four terms: the long-term mean annual cycle, the diurnal cycle, synoptic variability, and seasonal variability, for both cold and hot extremes. These four terms are combined, and the overall performance of individual climate models is assessed. This identifies the models that simulate temperature extremes well and simulate them well for the right reasons. The new error metric shows that increases in horizontal resolution usually lead to better performance, particularly for the coarser-resolution models. The CMIP6 improvements relative to CMIP5 are systematic across most land regions and are only partially explained by the increase in horizontal resolution; other differences must therefore help explain the higher CMIP6 skill.
Key Points. CMIP5 and CMIP6 models exaggerate the magnitude of daily temperature anomalies for hot-day and cold-night extremes. Higher-resolution models improve the simulation of temperature extremes, largely due to better simulation of synoptic scales. CMIP6 outperforms CMIP5 in the simulation of temperature extremes beyond the benefits of higher resolution.
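A minimal sketch of splitting a daily temperature record into contributions analogous to the four terms named above (mean annual cycle, seasonal variability, synoptic variability, with the diurnal cycle summarized separately by the daily range). The smoothing window, column names, and the use of a running mean to separate seasonal from synoptic variability are simplifying assumptions, not the paper's exact definitions.

```python
import pandas as pd

def decompose_daily_temperature(tmean, tmax, tmin, seasonal_window=31):
    """Split daily temperature into mean annual cycle, seasonal and synoptic anomalies;
    the diurnal cycle is summarized by the daily range (tmax - tmin).

    tmean, tmax, tmin : pandas Series indexed by a daily DatetimeIndex."""
    doy = tmean.index.dayofyear
    annual_cycle = tmean.groupby(doy).transform("mean")      # climatological day-of-year mean
    anomaly = tmean - annual_cycle
    seasonal = anomaly.rolling(seasonal_window, center=True, min_periods=1).mean()
    synoptic = anomaly - seasonal                             # high-frequency residual
    return pd.DataFrame({
        "annual_cycle": annual_cycle,
        "seasonal": seasonal,
        "synoptic": synoptic,
        "diurnal_range": tmax - tmin,
    })

# Usage (hypothetical file with 'date', 'tmean', 'tmax', 'tmin' columns):
# df = pd.read_csv("station_daily.csv", parse_dates=["date"], index_col="date")
# parts = decompose_daily_temperature(df["tmean"], df["tmax"], df["tmin"])
# print(parts.loc[parts["synoptic"].idxmax()])   # contributions on the most anomalous day
```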