Your search
Results: 1,711 resources
-
This review article, invited by Atmosphere-Ocean, documents the contributions of Recherche en Prévision Numérique (RPN) to Numerical Weather Prediction (NWP). Structured as a historical review, it covers RPN's work on numerical methods, numerical modelling, data assimilation, and ensemble systems, with a look ahead to potential future systems. The review traces the evolution of these contributions, beginning with early NWP efforts and continuing through to environmental predictions with a broad range of applications. This synthesis is intended as a helpful reference, consolidating developments and generating broader interest for future work on NWP in Canada.
-
The moisture maximization approach to estimating the Probable Maximum Precipitation (PMP) relies on a simple technique to control the risk of overestimating PMP: the maximization ratio is capped by an upper bound. That upper bound depends on storm records and watershed characteristics, which are not readily available for many watersheds, and a robust scientific justification for limiting the maximization ratio is lacking. In this paper, a novel approach is proposed to estimate the maximization ratio without imposing an upper limit on it. The new approach, which uses regional climate model data, constructs annual maximum precipitable water time series from precipitable water values whose associated atmospheric variables are similar to those of the original event to be maximized. These time series are then used to estimate the 100-year return period precipitable water value required to calculate the maximization ratio. The approach was tested in three watersheds in the province of Quebec, Canada. The resulting maximization ratios were lower than the proposed upper bound for these watersheds, and the approach reduced PMP in these watersheds by 11% compared with the upper-bound approach.
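The central calculation in this approach, estimating the 100-year return period precipitable water from the constructed annual-maximum series and dividing it by the storm's precipitable water, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the choice of a Gumbel fit, the function names and the numerical values are assumptions.

```python
# Illustrative sketch (assumed, not the paper's implementation): fit an
# extreme-value distribution to the annual maximum precipitable water series
# and derive the maximization ratio from the 100-year return level.
import numpy as np
from scipy import stats

def maximization_ratio(pw_annual_max, pw_storm, return_period=100.0):
    """r = PW_RP / PW_storm, with PW_RP the return-period precipitable water (mm)."""
    loc, scale = stats.gumbel_r.fit(pw_annual_max)        # Gumbel fit is an assumption
    pw_rp = stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc=loc, scale=scale)
    return pw_rp / pw_storm

# Hypothetical numbers, for illustration only:
rng = np.random.default_rng(0)
pw_series = rng.gumbel(loc=55.0, scale=6.0, size=40)      # mm, synthetic annual maxima
r = maximization_ratio(pw_series, pw_storm=60.0)
maximized_precip = r * 80.0                                # maximized storm depth (mm)
```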
-
This dataset contains key characteristics about the data described in the Data Descriptor "A comprehensive, multisource database for hydrometeorological modeling of 14,425 North American watersheds". Contents: (1) a human-readable metadata summary table in CSV format; (2) a machine-readable metadata file in JSON format.
-
This dataset contains key characteristics about the data described in the Data Descriptor "A global database of Holocene paleotemperature records". Contents: (1) a human-readable metadata summary table in CSV format; (2) a machine-readable metadata file in JSON format.
-
Abstract. Measurements of the size and shape of frazil ice particles and flocs in saline water and of frazil ice flocs in freshwater are limited. This study consisted of a series of laboratory experiments producing frazil ice at salinities of 0 ‰, 15 ‰, 25 ‰ and 35 ‰ to address this lack of data. The experiments were conducted in a large tank in a cold room with bottom-mounted propellers to create turbulence. A high-resolution camera system was used to capture images of frazil ice particles and flocs passing through cross-polarizing lenses. The high-resolution images of the frazil ice were processed using a computer algorithm to differentiate particles from flocs and determine key properties including size, concentration and volume. The size and volume distributions of particles and flocs at all four salinities were found to fit log-normal distributions closely. The concentration, mean size, and standard deviation of flocs and particles were assessed at different times during the supercooling process to determine how these properties evolve with time. Comparisons were made to determine the effect of salinity on the properties of frazil ice particles and flocs. The overall mean size of frazil ice particles in saline water and freshwater was found to range between 0.45 and 0.52 mm, with particle sizes in freshwater ∼13 % larger than in saline water. However, qualitative observations showed that frazil ice particles in saline water tend to be more irregularly shaped. The overall mean size of flocs in freshwater was 2.57 mm compared to a mean size of 1.47 mm for flocs in saline water. The average growth rate of frazil particles was found to be 0.174, 0.070, 0.033, and 0.024 mm min⁻¹ and the average floc growth rate was 0.408, 0.118, 0.089, and 0.072 mm min⁻¹ for the 0 ‰, 15 ‰, 25 ‰, and 35 ‰ salinities, respectively. Estimates for the porosity of frazil ice flocs were made by equating the estimated volume of ice produced based on thermodynamic conditions to the estimated volume of ice determined from the digital images. The estimated porosities of frazil ice flocs were determined to be 0.86, 0.82, 0.8 and 0.75 for 0 ‰, 15 ‰, 25 ‰ and 35 ‰ saline water, respectively.
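The floc porosity estimate described at the end of this abstract, equating the thermodynamically produced ice volume with the ice volume inferred from the images, can be illustrated with a simple heat-balance sketch. The symbols, the well-mixed tank assumption and the use of the recovered supercooling are our assumptions, not the authors' procedure.

```python
# Hedged sketch of the porosity idea: the solid ice volume implied by the heat
# released during recovery from supercooling is compared with the apparent floc
# volume measured from the images; porosity is the fraction of the floc volume
# not filled by solid ice. Constants are for fresh water and are illustrative.
RHO_WATER = 1000.0      # kg m^-3 (roughly 1025 for seawater)
RHO_ICE = 917.0         # kg m^-3
CP_WATER = 4186.0       # J kg^-1 K^-1
LATENT_HEAT = 3.34e5    # J kg^-1

def floc_porosity(tank_volume_m3, supercooling_recovered_K, imaged_floc_volume_m3):
    """Porosity = 1 - (thermodynamic ice volume) / (imaged floc volume)."""
    heat_released = RHO_WATER * CP_WATER * tank_volume_m3 * supercooling_recovered_K
    ice_volume = heat_released / (LATENT_HEAT * RHO_ICE)
    return 1.0 - ice_volume / imaged_floc_volume_m3
```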
-
This study details the enhancement and calibration of the Arctic implementation of the HYdrological Predictions for the Environment (HYPE) hydrological model established for the BaySys group of projects to produce freshwater discharge scenarios for the Hudson Bay Drainage Basin (HBDB). The challenge in producing estimates of freshwater discharge for the HBDB is that it spans over a third of Canada's continental landmass and is 40% ungauged. Scenarios for BaySys require the separation of human impacts from climate-driven ones, specifically the separation of regulated river discharge from the natural, climate-driven response. We present three key improvements to the modelling system required to support the separation of natural from anthropogenic impacts: representation of prairie disconnected landscapes (i.e., non-contributing areas), a method to generalize lake storage-discharge parameters across large regions, and frozen soil modifications. Additionally, a unique approach to account for irregular hydrometric gauge density across the basins during model calibration is presented that avoids overfitting parameters to the densely gauged southern regions. We summarize the methodologies used to facilitate improved separation of human- and climate-driven impacts on streamflow within the basin and outline the baseline discharge simulations used for the BaySys group of projects. Challenges remain in modelling the northernmost reaches of the basin and the lake-dominated watersheds. The techniques presented in this work, particularly the lake and flow signature clusters, may be applied to other high-latitude, ungauged Arctic basins. Discharge simulations are subsequently used as input data for oceanographic, biogeochemical, and ecosystem studies across the HBDB.
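One simple way to express the idea of not letting densely gauged regions dominate a multi-gauge calibration, averaging a skill score within each region before averaging across regions, is sketched below. This is a generic illustration under our own assumptions (the regional grouping and the use of the Kling-Gupta efficiency), not the calibration scheme used in this study.

```python
# Hedged sketch (not the HYPE/BaySys implementation): average a skill score per
# region first, then across regions, so a region with many gauges counts the
# same as a sparsely gauged one in the calibration objective.
import numpy as np

def kge(sim, obs):
    """Kling-Gupta efficiency of simulated vs observed discharge (1 is perfect)."""
    sim, obs = np.asarray(sim, dtype=float), np.asarray(obs, dtype=float)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = np.std(sim) / np.std(obs)
    beta = np.mean(sim) / np.mean(obs)
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def region_balanced_objective(gauges_by_region):
    """gauges_by_region: {region: [(sim, obs), ...]}.
    Returns the mean of regional mean KGEs."""
    regional_scores = [
        np.mean([kge(sim, obs) for sim, obs in gauges])
        for gauges in gauges_by_region.values()
    ]
    return float(np.mean(regional_scores))
```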
-
In recent years, many developing countries have sought to implement more decentralized governmental systems. Despite efforts toward fiscal federalism, assessment of decentralization activity has been hampered by a lack of consistent cross-country measures of effectiveness. Since governments play a central role in the management of catastrophic events, disaster impact data provide an opportunity to evaluate whether government structure is important in limiting disaster losses. We use cross-country data over the 1970–2005 period to estimate the relationship between decentralization and disaster casualties, and find that countries with more decentralized governments experience fewer disaster-induced fatalities.
-
Rangecroft et al. present an important and interesting paper on the challenges of interdisciplinary research and fieldwork with participants in water resource management. The paper shows the challenges of interaction between their research areas and demonstrates the importance of how researchers interact with their selected study sites. My key points concern the use of different methodologies within and across the social and natural sciences, as well as the central challenge of who has the power to influence research directions. Research is not value-free and is strongly shaped by one's own training and knowledge, which needs to be acknowledged in research activities. Finally, an option might be to move beyond interdisciplinary constraints and to work within a stronger transdisciplinary framework. Water research very much needs to engage with non-academic people to understand the challenges and possible solutions.
-
A satisfactory performance of hydrological models under historical climate conditions is considered a prerequisite for any hydrological climate change impact study. Despite the significant interest in global hydrological modeling, few systematic evaluations of global hydrological models (gHMs) at the catchment scale have been carried out. This study investigates the performance of four gHMs driven by four global observation-based meteorological inputs at simulating weekly discharges over 198 large North American catchments for the 1971–2010 period. The 16 discharge simulations serve as the basis for evaluating gHM accuracy at the catchment scale within the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP2a). The discharges simulated by the four gHMs are compared against observations and against weekly discharges simulated by two regional hydrological models (rHMs) driven by a global meteorological dataset for the same period. We discuss the implications of both modeling approaches, as well as the influence of catchment characteristics and global meteorological forcing on model performance, using statistical criteria and visual hydrograph comparison for catchment-scale hydrological studies. Overall, the gHM discharge statistics exhibit poor agreement with observations at the catchment scale and manifest considerable bias and errors in seasonal flow simulations. We confirm that the gHM approach, as experimentally implemented through ISIMIP2a, must be used with caution for regional studies. We find the rHM approach to be more trustworthy and recommend using it for hydrological studies, especially if findings are intended to support operational decision-making.
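The abstract refers to evaluating the simulated weekly discharges against observations using statistical criteria without naming them; two criteria commonly used for this kind of catchment-scale comparison are sketched below as an illustration (the choice of Nash-Sutcliffe efficiency and percent bias is our assumption).

```python
# Hedged sketch: common catchment-scale skill criteria for comparing simulated
# and observed weekly discharge series; not necessarily the criteria used here.
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency (1 is perfect, <0 is worse than the observed mean)."""
    sim, obs = np.asarray(sim, dtype=float), np.asarray(obs, dtype=float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(sim, obs):
    """Percent bias of simulated flow volume relative to observed volume."""
    sim, obs = np.asarray(sim, dtype=float), np.asarray(obs, dtype=float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()
```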
-
Abstract Ensemble forecasting applied to the field of hydrology is now an established area of research embracing a broad spectrum of operational situations. This work catalogs the various pathways of ensemble streamflow forecasting based on an exhaustive review of more than 700 studies over the last 40 years. We focus on the state of the art in model-based (dynamical) ensemble forecasting approaches. Ensemble streamflow prediction systems are categorized into three leading classes: statistics-based streamflow prediction systems, climatology-based ensemble streamflow prediction systems, and numerical weather prediction-based hydrological ensemble prediction systems. For each ensemble approach, technical information, as well as details about its strengths and weaknesses, is provided based on a critical review of the studies listed. Through this literature review, the performance and uncertainty associated with ensemble forecasting systems are underlined from both operational and scientific viewpoints. Finally, the remaining key challenges and prospective future research directions are presented, notably hybrid dynamical-statistical learning approaches, which present new challenges to be overcome before ensemble streamflow forecasting systems can be successfully employed in the coming decades. Targeting students, researchers and practitioners, this review provides a detailed perspective on the major features of an increasingly important area of hydrological forecasting.
Key Points: (1) This work summarizes 40 years of research on the generation of streamflow forecasts, based on an exhaustive review of studies. (2) Ensemble prediction systems are categorized into three classes: statistics-based, climatology-based, and numerical weather prediction-based hydrological ensemble prediction systems. (3) For each ensemble forecasting system, thorough technical information is provided.
-
Summary Projected climate change effects on streamflow are investigated for the 2041–2070 horizon under the SRES A2 emissions scenario over two snowmelt-dominated catchments in Canada. A 16-member ensemble of SWAT hydrological model (HM) simulations is established per catchment, based on a comprehensive ensemble of Canadian Regional Climate Model (CRCM) simulations driven by two global climate models (GCMs): five realizations of the Canadian CGCM3 and three realizations of the German ECHAM5. This study evaluates, once model bias has been removed by statistical post-processing (SP), how the RCM-simulated climate changes differ from those of the parent GCMs, and how they affect the assessment of climate change-induced hydrological impacts at the catchment scale. The variability of streamflow caused by the use of different SP methods (mean-based versus distribution-based) within each statistical post-processing pathway of climate model outputs (bias correction versus perturbation) is also evaluated, as is the uncertainty due to natural climate variability. The simulations cover 1971–2000 for the reference period and 2041–2070 for the future period. For a set of criteria, results based on raw and statistically post-processed model outputs for the reference climate are compared with observations. This comparison demonstrates that SP is important not only for GCM outputs but also for CRCM outputs. SP leads to a high level of agreement between the CRCM and the driving GCMs in reproducing patterns of observed climate. The ensemble spread of the climate change signal on streamflow is large and varies with catchments and hydrological periods (winter/summer flows). The results for various hydrological indicators show that most of the uncertainty arises from natural climate variability, followed by the statistical post-processing. The uncertainty linked to the choice of statistical pathway is much larger than that associated with the choice of method in quantifying the hydrological impacts. We find that incorporating dynamical downscaling of global models through the CRCM as an intermediate step in the GCM–RCM–SP–HM model chain does not lead to considerable differences in the assessment of climate change impacts on streamflow for the study catchments.
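The two statistical post-processing pathways contrasted in this study, bias correction of model outputs versus perturbation of observations, can be illustrated with minimal sketches of a mean-based delta change and a distribution-based empirical quantile mapping. These are generic textbook variants written under our own assumptions, not necessarily the exact SP methods applied in the paper.

```python
# Hedged sketch: "perturbation" (delta change) applies the simulated change
# signal to observations, while "bias correction" (here, empirical quantile
# mapping) adjusts the model series toward the observed distribution.
import numpy as np

def delta_change_mean(obs_ref, mod_ref, mod_fut):
    """Mean-based perturbation: scale observations by the simulated mean change."""
    return np.asarray(obs_ref, dtype=float) * (np.mean(mod_fut) / np.mean(mod_ref))

def quantile_map(obs_ref, mod_ref, mod_fut):
    """Distribution-based bias correction: map future model values through the
    reference-period model quantiles onto the observed distribution."""
    mod_fut = np.asarray(mod_fut, dtype=float)
    probs = np.interp(mod_fut, np.sort(mod_ref), np.linspace(0.0, 1.0, len(mod_ref)))
    return np.quantile(obs_ref, probs)
```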
-
Satellite-based broad-scale (i.e., global and continental) human settlement data are essential for diverse applications spanning climate hazard mitigation, sustainable development monitoring, spatial epidemiology and demographic modeling. Many human settlement products report exceptional detection accuracies above 85%, but there is a substantial blind spot in that product validation typically focuses on large urban areas and excludes rural, small-scale settlements that are home to 3.4 billion people around the world. In this study, we make use of a data-rich sample of 30 refugee settlements in Uganda to assess the small-scale settlement detection by four human settlement products, namely, Geo-Referenced Infrastructure and Demographic Data for Development settlement extent data (GRID3-SE), Global Human Settlements Built-Up Sentinel-2 (GHS-BUILT-S2), High Resolution Settlement Layer (HRSL) and World Settlement Footprint (WSF). We measured each product’s areal coverage within refugee settlement boundaries, assessed detection of 317,416 building footprints and examined spatial agreement among products. For settlements established before 2016, products had low median probability of detection and F1-score of 0.26 and 0.24, respectively, a high median false alarm rate of 0.59 and tended to only agree in regions with the highest building density. Individually, GRID3-SE offered more than five-fold the coverage of other products, GHS-BUILT-S2 underestimated the building footprint area by a median 50% and HRSL slightly underestimated the footprint area by a median 7%, while WSF entirely overlooked 8 of the 30 study refugee settlements. The variable rates of coverage and detection partly result from GRID3-SE and HRSL being based on much higher resolution imagery, compared to GHS-BUILT-S2 and WSF. Earlier established settlements were generally better detected than recently established settlements, showing that the timing of satellite image acquisition with respect to refugee settlement establishment also influenced detection results. Nonetheless, settlements established in the 1960s and 1980s were inconsistently detected by settlement products. These findings show that human settlement products have far to go in capturing small-scale refugee settlements and would benefit from incorporating refugee settlements in training and validating human settlement detection approaches.
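The detection scores reported here (probability of detection, false alarm rate and F1-score) can be computed from simple counts of detected, missed and falsely detected building footprints, as sketched below; the definitions follow one common convention, and the footprint-matching rules used in the study are not reproduced.

```python
# Hedged sketch of the detection metrics named in the abstract. The false alarm
# rate is taken here as FP / (TP + FP), i.e. 1 - precision; other conventions exist.
def detection_metrics(true_pos, false_neg, false_pos):
    """true_pos: detected footprints; false_neg: missed footprints;
    false_pos: detections with no matching footprint."""
    pod = true_pos / (true_pos + false_neg)        # probability of detection (recall)
    far = false_pos / (true_pos + false_pos)       # false alarm rate (1 - precision)
    precision = 1.0 - far
    f1 = 2 * precision * pod / (precision + pod)
    return pod, far, f1

# Illustrative counts only: 60 of 100 footprints detected, 90 false detections.
pod, far, f1 = detection_metrics(true_pos=60, false_neg=40, false_pos=90)
```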
-
Abstract Centennial-to-millennial temperature records of the past provide a context for interpreting current and future changes in climate. Quaternary climates have been relatively well studied in north-east North America and the adjacent Atlantic Ocean over recent decades, and new research methods have been developed to improve reconstructions. We present newly inferred reconstructions of sea surface temperature for the north-western Atlantic region, together with a compilation of published temperature records. The database thus comprises a total of 86 records from both marine and terrestrial sites, including lakes, peatlands, ice and tree rings, each covering at least part of the Holocene. For each record, we present details on the seasons covered, the chronologies, and information on radiocarbon dates and analytical time steps. The 86 records contain a total of 154 reconstructions of temperature and temperature-related variables. The main proxies include pollen and dinocysts, and summer was the season for which the largest number of reconstructions was available. Many records covered most of the Holocene, but many dinocyst records did not extend to the surface due to sediment mixing, and dendroclimate records were limited to the last millennium. The database allows for the exploration of linkages between sea ice and climate and may be used in conjunction with other palaeoclimate and palaeoenvironmental records, such as wildfire records and peatland dynamics. This inventory may also aid the identification of gaps in the geographic distribution of past temperature records, thus guiding future research efforts.
-
Abstract Groundwater quality modelling plays an important role in water resources management decision-making processes. Accordingly, models must be developed to account for the uncertainty inherent in the modelling process, from the sample measurement stage through to the data interpretation stages. Artificial intelligence models, particularly fuzzy inference systems (FIS), have been shown to be effective in groundwater quality evaluation for complex aquifers. In the current study, fuzzy set theory is applied to groundwater-quality-related decision-making in an agricultural production context; the Mamdani, Sugeno, and Larsen fuzzy logic-based models (MFL, SFL, and LFL, respectively) are used to develop a series of new, generalized, rule-based fuzzy models for water quality evaluation using widely accepted irrigation indices and hydrological data from the Sarab Plain, Iran. Rather than drawing upon physicochemical groundwater quality parameters, the present research employs widely accepted agricultural indices (e.g., irrigation criteria) when developing the MFL, SFL and LFL groundwater quality models. These newly developed models generated significantly more consistent results than the United States Salinity Laboratory (USSL) diagram, addressed the inherent uncertainty in threshold data, and were effective in assessing groundwater quality for agricultural uses. The SFL model is recommended, as it outperforms both MFL and LFL in terms of accuracy when assessing groundwater quality using irrigation indices.
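A minimal Mamdani-type inference step of the kind the MFL model builds on can be sketched as follows, with two irrigation-related inputs and a single quality-score output. The membership functions, rule base and breakpoints are illustrative assumptions, not the rules developed for the Sarab Plain.

```python
# Hedged sketch of Mamdani fuzzy inference: fuzzify the inputs, fire two rules
# (AND as min, OR as max), clip the output sets, aggregate with max, and
# defuzzify with the centroid. All thresholds below are illustrative only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_quality(ec, sar):
    """ec: electrical conductivity (uS/cm); sar: sodium adsorption ratio.
    Returns a defuzzified irrigation-quality score in [0, 100] (higher is better)."""
    y = np.linspace(0.0, 100.0, 501)                  # output universe
    poor_set = tri(y, -50.0, 0.0, 60.0)               # "poor quality" output set
    good_set = tri(y, 40.0, 100.0, 150.0)             # "good quality" output set
    # Input membership degrees (breakpoints are illustrative)
    ec_low, ec_high = tri(ec, 0.0, 250.0, 750.0), tri(ec, 750.0, 2250.0, 5000.0)
    sar_low, sar_high = tri(sar, 0.0, 2.0, 10.0), tri(sar, 6.0, 18.0, 30.0)
    # Rules: IF ec low AND sar low THEN good; IF ec high OR sar high THEN poor
    fire_good = min(ec_low, sar_low)
    fire_poor = max(ec_high, sar_high)
    aggregated = np.maximum(np.minimum(good_set, fire_good),
                            np.minimum(poor_set, fire_poor))
    return float(np.sum(y * aggregated) / np.sum(aggregated))  # centroid defuzzification

score = mamdani_quality(ec=600.0, sar=3.0)            # illustrative evaluation
```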