Your search
Results: 21 resources
-
Watershed runoff is closely related to land use but this influence is difficult to quantify. This study focused on the Chaudière River watershed (Québec,...
-
In this article a preliminary analysis of the loss of life caused by Hurricane Katrina in the New Orleans metropolitan area is presented. The hurricane caused more than 1,100 fatalities in the state of Louisiana. A preliminary data set that gives information on the recovery locations and individual characteristics for 771 fatalities has been analyzed. One-third of the analyzed fatalities occurred outside the flooded areas or in hospitals and shelters in the flooded area. These fatalities were due to the adverse public health situation that developed after the floods. Two-thirds of the analyzed fatalities were most likely associated with the direct physical impacts of the flood and mostly caused by drowning. The majority of victims were elderly: nearly 60% of fatalities were over 65 years old. Similar to historical flood events, mortality rates were highest in areas near severe breaches and in areas with large water depths. An empirical relationship has been derived between water depth and mortality, and this has been compared with similar mortality functions proposed based on data for other flood events. The overall mortality among the exposed population for this event was approximately 1%, which is similar to findings for historical flood events. Although the presented results are preliminary, they give important insights into the determinants of loss of life and the relationship between mortality and flood characteristics.
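The abstract does not give the functional form of the depth–mortality relationship; in the flood-mortality literature a lognormal CDF of water depth is a common choice. A minimal sketch of fitting such a function, assuming that form and using entirely hypothetical data (not the Katrina data set):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical (depth, mortality) observations -- illustrative only.
depths = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])       # water depth, metres
mortality = np.array([0.001, 0.004, 0.012, 0.03, 0.06, 0.09])

def lognormal_mortality(h, mu, sigma):
    """Mortality fraction as a lognormal CDF of water depth h."""
    return norm.cdf((np.log(h) - mu) / sigma)

# Least-squares fit of the two lognormal parameters.
(mu_hat, sigma_hat), _ = curve_fit(lognormal_mortality, depths, mortality,
                                   p0=(2.0, 1.0))

# Mortality predicted by the fitted curve increases with depth.
pred = lognormal_mortality(np.array([1.0, 3.0, 5.0]), mu_hat, sigma_hat)
```

The fitted curve can then be compared against mortality functions derived for other flood events, as the paper does.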
-
This paper examines the challenges facing English flood risk management (FRM) policy and practice when considering fair decision-making processes and outcomes at a range of spatial scales. It is recognised that flooding is not fair per se: the inherent natural spatial inequality of flood frequency and extent, plus the legacy of differential system interventions, being the cause. But, drawing on the three social justice models – procedural equality, Rawls’ maximin rule and maximum utility – the authors examine the fairness principles currently employed in FRM decision-making. This is achieved, firstly, in relation to the distribution of taxpayers’ money for FRM at the national, regional and local levels and, secondly, for non-structural strategies – most notably those of insurance, flood warnings and awareness raising, land use control, home owner adaptation and emergency management. A case study of the Lower Thames catchment illustrates the challenges facing decision-makers in ‘real life’: how those strategies which appear to be most technically and economically effective fall far short of being fair from either a vulnerability or equality perspective. The paper concludes that if we are to manage flood risk somewhat more fairly then a move in the direction of government funding of nationally consistent non-structural strategies, in conjunction with lower investment decision thresholds for other local-level FRM options, appears to offer a greater contribution to equality and vulnerability-based social justice principles than the status quo.
-
A physiographical space‐based kriging method is proposed for regional flood frequency estimation. The methodology relies on the construction of a continuous physiographical space using physiographical and meteorological characteristics of gauging stations and the use of multivariate analysis techniques. Two multivariate analysis methods were tested: canonical correlation analysis (CCA) and principal components analysis. Ordinary kriging, a geostatistical technique, was then used to interpolate flow quantiles through the physiographical space. Data from 151 gauging stations across the southern part of the province of Quebec, Canada, were used to illustrate this approach. In order to evaluate the performance of the proposed method, two validation techniques, cross validation and split‐sample validation, were applied to estimate flood quantiles corresponding to the 10, 50, and 100 year return periods. Results of the proposed method were compared to those produced by a traditional regional estimation method using the canonical correlation analysis. The proposed method yielded satisfactory results. It allowed, for instance, estimation of the 10 year return period specific flow with a coefficient of determination of up to 0.78. However, this performance decreases as the quantile return period increases. Results also showed that the proposed method works better when the physiographical space is defined using canonical correlation analysis. It is shown that kriging in the CCA physiographical space yields results as precise as the traditional estimation method, with a fraction of the effort and the computation time.
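Ordinary kriging itself is compact enough to sketch. A minimal illustration, assuming an exponential variogram with made-up parameters and hypothetical 2-D coordinates standing in for the CCA physiographical space (the abstract does not specify the paper's actual variogram model):

```python
import numpy as np

def ordinary_kriging(coords, values, targets, sill=1.0, rng_param=2.0, nugget=1e-6):
    """Minimal ordinary kriging with an exponential variogram.

    coords  : (n, 2) gauged-station positions in the physiographical space
    values  : (n,)   flood quantiles at the gauged stations
    targets : (m, 2) positions of ungauged sites
    """
    def gamma(h):  # exponential variogram (parameters are illustrative)
        return nugget + sill * (1.0 - np.exp(-h / rng_param))

    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system: variogram matrix bordered by the unbiasedness constraint.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    preds = []
    for t in np.atleast_2d(targets):
        b = np.ones(n + 1)
        b[:n] = gamma(np.linalg.norm(coords - t, axis=1))
        w = np.linalg.solve(A, b)       # kriging weights + Lagrange multiplier
        preds.append(w[:n] @ values)
    return np.array(preds)

# Toy check: interpolating at a gauged station reproduces its quantile.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
q10 = np.array([10.0, 12.0, 9.0, 11.0])
est = ordinary_kriging(coords, q10, coords[:1])
```

In the paper's setting, `coords` would be the station projections onto the first CCA axes and `values` the regional flood quantiles being transferred to ungauged sites.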
-
The potential impact of future climate change on runoff generation processes in two southern British Columbia catchments was explored using the Canadian Centre for Climate Modelling and Analysis General Circulation Model (CGCMa1) to estimate future changes in precipitation, temperature and cloud cover while the U.B.C. Watershed Model was used to simulate discharges and quantify the separate runoff components, i.e. rainfall, snowmelt, glacier melt and groundwater. Changes, not only in precipitation and temperature but also in the spatial distribution of precipitation with elevation, cloud cover, glacier extension, altitude distribution of vegetation, vegetation biomass production and plant physiology were considered. The future climate of the catchments would be wetter and warmer than the present. In the maritime rain-fed catchment of the Upper Campbell, runoff from rainfall is the most significant source of flow for present and future climatic conditions in the autumn and winter whereas runoff from groundwater generates the flow in spring and summer, especially for the future climate scenario. The total runoff, under the future climatic conditions, would increase in the autumn and winter and decrease in spring and summer. In contrast, in the interior snow-covered Illecillewaet catchment, groundwater is the most significant runoff generation mechanism in the autumn and winter although, at present, significant flow is generated from snowmelt in spring and from glacier runoff in summer. In the future scenario, the contribution to flow from snowmelt would increase in winter and diminish in spring while the runoff from the glacier would remain unchanged; groundwater would then become the most significant source of runoff, which would peak earlier in the season. Keywords: climatic change, hydrological simulation, rainfall, snowmelt, runoff processes
-
The causes of peak flows in two climatically different mountainous-forested basins of British Columbia have been identified. The U.B.C. watershed model was used to identify the causes of peak flows, since this model separately calculates the runoff components, i.e. rainfall, snowmelt and glacier runoff. The results showed that the flood flows in the maritime basin of Upper Campbell are mainly generated by rainfall during the fall months and winter rain-on-snow events. Rainfall runoff constitutes the largest percentage of peak flow for all types of events. On the other hand, the flood flows in the inland basin of Illecillewaet are mainly produced by spring rain and snowmelt events, snowmelt events alone and summer events when runoff from the glacier melt contributes to peak discharge. However, snowmelt runoff is the dominant component of peak flows. Based on these findings, flood frequency analysis showed that considering the flow component frequency distributions marginally improves the probability distribution of flows in the two examined watersheds.
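The component-wise frequency analysis can be illustrated with the standard mixed-population argument: if the annual peak produced by each runoff mechanism is independent of the others, the CDF of the overall annual maximum is the product of the component CDFs. A sketch with hypothetical Gumbel components (the fitted distributions and parameters of the actual study are not given in the abstract):

```python
from scipy import stats

# Hypothetical component distributions of annual peak flow (m^3/s):
rain = stats.gumbel_r(loc=100.0, scale=30.0)   # rainfall-driven peaks
snow = stats.gumbel_r(loc=80.0, scale=20.0)    # snowmelt-driven peaks

def combined_cdf(x):
    """CDF of the annual maximum over independent components:
    P(max <= x) = P(rain peak <= x) * P(snow peak <= x)."""
    return rain.cdf(x) * snow.cdf(x)

def quantile(T, lo=0.0, hi=1000.0):
    """Flood quantile for return period T (years) by bisection on the CDF."""
    p = 1.0 - 1.0 / T
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if combined_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Fitting each component separately and combining them in this way is what "considering the flow component frequency distributions" amounts to, versus fitting a single distribution to the undifferentiated annual maxima.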
-
A four-dimensional variational data assimilation (4DVAR) scheme has recently been implemented in the medium-range weather forecast system of the Meteorological Service of Canada (MSC). The new scheme is composed of several additional and improved features as compared with the three-dimensional variational data assimilation (3DVAR): the first guess at the appropriate time from the full-resolution model trajectory is used to calculate the misfit to the observations; the tangent linear of the forecast model and its adjoint are employed to propagate the analysis increment and the gradient of the cost function over the 6-h assimilation window; a comprehensive set of simplified physical parameterizations is used during the final minimization process; and the number of frequently reported data, in particular satellite data, has substantially increased. The impact of these 4DVAR components on the forecast skill is reported in this article. This is achieved by comparing data assimilation configurations that range in complexity from the former 3DVAR to the implemented 4DVAR over a 1-month period. It is shown that the implementation of the tangent-linear model and its adjoint as well as the increased number of observations are the two features of the new 4DVAR that contribute the most to the forecast improvement. All the other components provide marginal though positive impact. 4DVAR does not improve the medium-range forecast of tropical storms in general and tends to amplify the too-early extratropical transition often observed in the MSC global forecast system with 3DVAR. It is shown that this recurrent problem is, however, more sensitive to the forecast model than to the data assimilation scheme employed in this system. Finally, the impact of using a shorter cutoff time for the reception of observations, such as the one used in the operational context for the 0000 and 1200 UTC forecasts, is more detrimental with 4DVAR. This result indicates that 4DVAR is more sensitive than 3DVAR to observations at the end of the assimilation window.
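The role of the tangent-linear model and its adjoint can be made concrete with a toy incremental 4DVAR cost function. A minimal sketch with a hypothetical linear model, identity observation operator, and made-up covariances; nothing here reproduces the MSC system:

```python
import numpy as np

# Toy incremental 4DVAR: linear dynamics M, identity observation operator,
# two observation times in the assimilation window. All values are hypothetical.
rng = np.random.default_rng(0)
n = 3
M = np.eye(n) + 0.1 * rng.standard_normal((n, n))    # tangent-linear model
B = np.eye(n)                                        # background-error covariance
R = 0.25 * np.eye(n)                                 # observation-error covariance
innov = [rng.standard_normal(n), rng.standard_normal(n)]  # innovations y_k - H(x_b)

def cost(dx):
    """J(dx) = 0.5 dx' B^-1 dx + 0.5 sum_k (M^k dx - d_k)' R^-1 (M^k dx - d_k)."""
    J = 0.5 * dx @ np.linalg.solve(B, dx)
    x = dx.copy()
    for d_k in innov:
        x = M @ x                      # propagate the increment forward in time
        r = x - d_k
        J += 0.5 * r @ np.linalg.solve(R, r)
    return J

def grad(dx):
    """Gradient of J computed with the adjoint model (M transposed),
    swept backward over the assimilation window."""
    traj, x = [], dx.copy()
    for d_k in innov:
        x = M @ x
        traj.append(x)
    adj = np.zeros(n)
    for x_k, d_k in zip(reversed(traj), reversed(innov)):
        adj = M.T @ (adj + np.linalg.solve(R, x_k - d_k))
    return np.linalg.solve(B, dx) + adj
```

In an operational system a conjugate-gradient or quasi-Newton solver would minimize `cost` using `grad`; the backward sweep in `grad` is exactly the adjoint propagation the abstract credits for much of the forecast improvement.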
-
On 15 March 2005, the Meteorological Service of Canada (MSC) implemented a four-dimensional variational data assimilation (4DVAR) system, which led to significant improvements in the quality of global forecasts. This paper describes the different elements of MSC’s 4DVAR assimilation system, discusses some issues encountered during the development, and reports on the overall results from the 4DVAR implementation tests. The 4DVAR system adopted an incremental approach with two outer iterations. The simplified model used in the minimization has a horizontal resolution of 170 km and its simplified physics includes vertical diffusion, surface drag, orographic blocking, stratiform condensation, and convection. One important element of the design is its modularity, which has permitted continued progress on the three-dimensional variational data assimilation (3DVAR) component (e.g., addition of new observation types) and the model (e.g., computational and numerical changes). This paper discusses some numerical problems that occur in the vicinity of the poles, where the semi-Lagrangian scheme becomes unstable when there is a simultaneous occurrence of converging meridians and strong wind gradients. These instabilities could be removed by filtering the winds in the zonal direction before they are used to estimate the upstream position in the semi-Lagrangian scheme. The results show improvements in all aspects of the forecasts over all regions. The impact is particularly significant in the Southern Hemisphere, where 4DVAR is able to extract more information from satellite data. In the Northern Hemisphere, 4DVAR accepts more asynoptic data, in particular from profilers and aircraft. The impact noted is also positive and the short-term forecasts are particularly improved over the west coast of North America. Finally, the dynamical consistency of the 4DVAR global analyses leads to a significant impact on regional forecasts. Experimentation has shown that regional forecasts initiated directly from a 4DVAR global analysis are improved with respect to the regional forecasts resulting from the regional 3DVAR analysis.
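The zonal filtering remedy described for the polar instability can be sketched as a low-pass filter applied to wind rows poleward of some critical latitude. Everything below (the critical latitude, the number of retained wavenumbers, the field shapes) is a hypothetical illustration, not the MSC implementation:

```python
import numpy as np

def zonal_filter(u, lat, lat_crit=70.0, keep=8):
    """Low-pass filter a wind field in the zonal (longitude) direction
    poleward of lat_crit, keeping only the lowest `keep` zonal wavenumbers.

    u   : (nlat, nlon) wind component on a regular lat-lon grid
    lat : (nlat,) latitudes in degrees
    """
    u = u.copy()
    for j, phi in enumerate(lat):
        if abs(phi) >= lat_crit:
            spec = np.fft.rfft(u[j])
            spec[keep:] = 0.0              # damp high zonal wavenumbers
            u[j] = np.fft.irfft(spec, n=u.shape[1])
    return u
```

Smoothing the winds this way before the upstream-position search removes the small-scale zonal gradients that, combined with converging meridians, destabilize the semi-Lagrangian scheme.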
-
Natural disasters have a statistically observable adverse impact on the macro-economy in the short-run and costlier events lead to more pronounced slowdowns in production. Yet, interestingly, developing countries, and smaller economies, face much larger output declines following a disaster of similar relative magnitude than do developed countries or bigger economies. A close study of the determinants of these adverse macroeconomic output costs reveals several interesting patterns. Countries with a higher literacy rate, better institutions, higher per capita income, higher degree of openness to trade, and higher levels of government spending are better able to withstand the initial disaster shock and prevent further spillovers into the macro-economy. These all suggest an increased ability to mobilize resources for reconstruction. Financial conditions also seem to be of importance; countries with more foreign exchange reserves, and higher levels of domestic credit, but with less-open capital accounts appear more robust and better able to endure natural disasters, with less adverse spillover into domestic production.
-
Hurricane Katrina pounded the Gulf South at the end of August 2005, devastating lives and raising questions about how race and class influence human, as well as institutional, responses to disaster. This study uses survey data collected from over 1200 Hurricane Katrina survivors to examine these influences on a wide array of responses, ranging from evacuation timing and emotional support to housing and employment situations and plans to return to pre-storm communities. Results reveal strong racial and class differences, indicating that neither of these dimensions can be reduced to the other when seeking to understand responses by survivors themselves. This intersection renders low-income black home owners from New Orleans those most in need of targeted assistance as residents work to put themselves and the region back together.
-
Global environmental change and sustainability science increasingly recognize the need to address the consequences of changes taking place in the structure and function of the biosphere. These changes raise questions such as: Who and what are vulnerable to the multiple environmental changes underway, and where? Research demonstrates that vulnerability is registered not by exposure to hazards (perturbations and stresses) alone but also resides in the sensitivity and resilience of the system experiencing such hazards. This recognition requires revisions and enlargements in the basic design of vulnerability assessments, including the capacity to treat coupled human–environment systems and those linkages within and without the systems that affect their vulnerability. A vulnerability framework for the assessment of coupled human–environment systems is presented.
-
This paper demonstrates the importance of disaggregating population data aggregated by census tracts or other units, for more realistic population distribution/location. A newly developed mapping method, the Cadastral-based Expert Dasymetric System (CEDS), calculates population in hyper-heterogeneous urban areas better than traditional mapping techniques. A case study estimating population potentially impacted by flood hazard in New York City compares the impacted population determined by CEDS with that derived by the centroid-containment method and filtered areal-weighting interpolation. Compared to CEDS, 37% and 72% fewer people are estimated to be at risk from floods city-wide using conventional areal weighting of census data and centroid-containment selection, respectively. Undercounting of impacted population could have serious implications for emergency management and disaster planning. Ethnic/racial populations are also spatially disaggregated to determine any environmental justice impacts associated with flood risk. Minorities are disproportionately undercounted using traditional methods. Underestimating more vulnerable sub-populations impairs preparedness and relief efforts.
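The two conventional baselines the paper compares against are simple enough to sketch. A hypothetical three-tract illustration (CEDS itself relies on cadastral data not reproduced here):

```python
def areal_weighting(tract_pop, flood_fraction):
    """Areal-weighting estimate of flood-exposed population: assumes people
    are spread uniformly within each tract, so the exposed share equals the
    fraction of the tract's area inside the flood zone."""
    return sum(pop * flood_fraction.get(tid, 0.0) for tid, pop in tract_pop.items())

def centroid_containment(tract_pop, centroid_in_zone):
    """Centroid-containment estimate: a tract's whole population counts
    if and only if its centroid falls inside the flood zone."""
    return sum(pop for tid, pop in tract_pop.items() if centroid_in_zone.get(tid, False))

# Hypothetical example: three tracts with made-up populations and flood overlap.
tract_pop = {"A": 1000, "B": 2000, "C": 500}
flood_fraction = {"A": 0.5, "B": 0.1, "C": 0.0}      # share of tract area flooded
centroid_in_zone = {"A": True, "B": False, "C": False}

exposed_aw = areal_weighting(tract_pop, flood_fraction)         # 700.0
exposed_cc = centroid_containment(tract_pop, centroid_in_zone)  # 1000
```

The gap between the two estimates on even this toy input shows why the choice of disaggregation method matters; a dasymetric method like CEDS replaces the uniform-density assumption with ancillary data on where people actually live.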