Your search
Results: 165 resources
-
Abstract As an in-depth profile control agent, water-soluble phenolic resin crosslinked polyacrylamide weak gel has been widely used in the middle and high water-cut stage of water-flooding reservoirs. In this study, the phenolic resin was synthesized by two-step alkali catalysis. Factors influencing the synthesis of the phenolic resin, including the molar ratio of phenol to formaldehyde, catalyst type, and reaction time, were investigated with hydroxymethyl and free aldehyde content as the criteria. When the phenol-to-formaldehyde molar ratio was 1:2 and NaOH was used as the catalyst at 80°C for 4 h, the phenolic resin had the highest hydroxymethyl content (49.37%) and the lowest free aldehyde content (2.95%). A weak gel was formed by the reaction of LT002 polyacrylamide with the phenolic resin. Taking gelation time and strength as criteria, the factors influencing the crosslinking properties, including hydroxymethyl content, crosslinker dosage, and polyacrylamide concentration, were investigated. Under the optimal formulation, property investigation shows that the hydroxymethyl groups in the phenolic resin crosslink with the amide groups in polyacrylamide, the gelation time is long (50–60 h), and the gel strength exceeds 5 × 10⁴ mPa·s, which is conducive to the plugging of deep oil layers. When the permeability was 5061 × 10⁻³ μm², the plugging rate was 72.73%.
-
The degradation of soil bonding, which can be described by the evolution of bond degradation variables, is essential in the constitutive modeling of cemented soils. A degradation variable of 0 indicates that the applied stress is sustained entirely by bonded particles, whereas a value of 1.0 indicates it is sustained entirely by unbonded grains. The discrete element method (DEM) was used for cemented soils to analyze the evolution of bond degradation and to evaluate the degradation variables at the contact scale. Numerical cemented soil samples with different bonding strengths were first prepared using an advanced contact model (CM). Constant stress ratio compression, one-dimensional compression, conventional triaxial tests (CTTs), and true triaxial tests (TTTs) were then implemented for the numerical samples. The numerical results were then used to investigate the evolution of the bond degradation variables BN and B0. In the triaxial tests, B0 evolves to approach or exceed BN due to shearing, which indicates that shearing increases the share of load borne by bonded contacts. Finally, an approximately stress-path-independent bond degradation variable Bσ was developed. The evolution of Bσ with the equivalent plastic strain can be effectively described by an exponential function and a hyperbolic function.
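The abstract does not give the fitted functional forms, but generic exponential and hyperbolic laws mapping equivalent plastic strain to a degradation variable between 0 (fully bonded) and 1 (fully degraded) can be sketched as follows. The function names and the parameters `a` and `b` are illustrative assumptions, not the paper's calibration:

```python
import math

def bond_degradation_exponential(eps_p, a):
    """Hypothetical exponential form: B grows from 0 (fully bonded)
    toward 1 (fully degraded) with equivalent plastic strain eps_p."""
    return 1.0 - math.exp(-a * eps_p)

def bond_degradation_hyperbolic(eps_p, b):
    """Hypothetical hyperbolic form with the same limits:
    B = 0 at eps_p = 0 and B -> 1 as eps_p -> infinity."""
    return eps_p / (b + eps_p)
```

Both forms satisfy the boundary behavior the abstract implies (no degradation at zero plastic strain, saturation at full degradation), which is why such one-parameter laws are convenient for fitting DEM results.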
-
The production of natural gas hydrates changes the cementation strength, porosity, and effective stress in the stratum, which may lead to engineering and geological disasters. Sand production is a phenomenon in which sand particles are carried out of the reservoir along with fluids during gas extraction, posing challenges to safe and sustainable production. This study explored the mechanism of fine particle migration in multiphase flow using a microscopic visualization test device, which can inject gas, liquid, and solid phases simultaneously and allows real-time observation. Experimental tests on fine particle migration under single- and two-phase fluid flow were carried out for different conditions, i.e., fine particle concentration, fine particle size, fluid flow rate, and gas–liquid ratio. The results show that in single-phase fluid flow, the original gas gradually dissolves in the liquid phase and finally remains in the test device as bubbles, which can change the pore structures and cause fine particles to accumulate at the gas–liquid interface. In two-phase flow with mixed gas–water fluids, there are two gas–liquid flow modes: mixed flow and separated flow. The interfacial tension at the gas–liquid interface can effectively migrate fine particles when gas and liquid flow alternately, and the sand production rate increases further as the gas–liquid ratio increases. In addition, changes in fine particle concentration, particle size, fluid flow rate, and gas–liquid ratio affect the migration of fine particles, leading to differences in the final sand production.
-
Extreme precipitation events play a crucial role in shaping the vulnerability of regions like Algeria to the impacts of climate change. To delve deeper into this critical aspect, this study investigates the changing patterns of extreme precipitation across five sub-regions of Algeria using data from 33 model simulations provided by the NASA Earth Exchange Global Daily Downscaled Climate Projections (NEX-GDDP-CMIP6). Our analysis reveals a projected decline in annual precipitation for four of these regions, contrasting with an expected increase in desert areas where annual precipitation levels remain low, typically not exceeding 120 mm. Furthermore, key precipitation indices such as maximum 1-day precipitation (Rx1day) and extremely wet-day precipitation (R99p) consistently show upward trends across all zones under both the SSP245 and SSP585 scenarios. However, the number of heavy precipitation days (R20mm) shows varied trends among zones, with relatively stable fluctuations. These findings provide valuable foresight into future precipitation patterns, offering essential insights for policymakers and stakeholders. By anticipating these changes, adaptive strategies can be devised to mitigate potential climate change impacts on crucial areas such as agriculture, flood management, water resources, and drought resilience.
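The indices named above (Rx1day, R99p, R20mm) follow standard ETCCDI definitions and can be computed directly from a daily precipitation series. The sketch below is illustrative, not the study's code; it assumes precipitation in mm/day and, for R99p, a separate base period used to fix the 99th percentile of wet days:

```python
import numpy as np

def rx1day(daily_pr):
    """Maximum 1-day precipitation (ETCCDI Rx1day), in mm."""
    return float(np.max(daily_pr))

def r20mm(daily_pr, threshold=20.0):
    """Number of heavy precipitation days (ETCCDI R20mm):
    days with precipitation at or above the threshold."""
    return int(np.sum(daily_pr >= threshold))

def r99p(daily_pr, base_period_pr):
    """Extremely wet-day precipitation (ETCCDI R99p): total precipitation
    on days exceeding the 99th percentile of wet days (>= 1 mm)
    in the reference period."""
    wet_days = base_period_pr[base_period_pr >= 1.0]
    p99 = np.percentile(wet_days, 99)
    return float(np.sum(daily_pr[daily_pr > p99]))
```

In a trend study like this one, each index would be evaluated per year and per grid cell, and the trend fitted over the resulting annual series.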
-
Abstract The Canadian Precipitation Analysis (CaPA) system provides near-real-time precipitation analyses over Canada by combining observations with short-term numerical weather prediction forecasts. CaPA’s snowfall estimates suffer from the lack of accurate solid precipitation measurements to correct the first-guess estimate. Weather radars have the potential to add precipitation measurements to CaPA in all seasons but are not assimilated in winter due to the imprecision of radar snowfall estimates and the lack of precipitation gauges for calibration. The main objective of this study is to assess the impact of assimilating Canadian dual-polarized radar-based snowfall data in CaPA to improve precipitation estimates. Two sets of experiments were conducted to evaluate the impact of including radar snowfall retrievals: one using the high-resolution CaPA (HRDPA) with the currently operational quality control configuration, and another increasing the number of assimilated surface observations by relaxing quality control. Experiments spanned two winter seasons (2021 and 2022) in central Canada, covering part of the CaPA domain. The results showed that the assimilation of radar-based snowfall data improved CaPA’s precipitation estimates 81.75% of the time for the 0.5-mm precipitation threshold. An increase in the probability of detection together with a decrease in the false alarm ratio suggested an improvement in the precipitation spatial distribution and estimation accuracy. Additionally, the results showed improvements in both precipitation mass and frequency biases for low precipitation amounts. For larger thresholds, the frequency bias was degraded. The results also indicated that the assimilation of dual-polarization radar data is beneficial for both CaPA configurations tested in this study.
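The verification measures mentioned above (probability of detection, false alarm ratio, frequency bias) are standard categorical scores computed from a contingency table of hits, misses, and false alarms at a given precipitation threshold. A minimal sketch, not CaPA's verification code:

```python
def pod(hits, misses):
    """Probability of detection: fraction of observed events
    that were also present in the analysis."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False alarm ratio: fraction of analyzed events
    that were not actually observed."""
    return false_alarms / (hits + false_alarms)

def frequency_bias(hits, misses, false_alarms):
    """Frequency bias: number of analyzed events divided by the
    number of observed events (1.0 is unbiased)."""
    return (hits + false_alarms) / (hits + misses)
```

A POD increase combined with a FAR decrease, as reported here, means the analysis detects more of the observed snowfall events while flagging fewer spurious ones.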
-
A first run-of-river power plant in Nunavik (QC, Canada), built in a continuous permafrost zone, has been supplying the community of Inukjuak with renewable energy since 2024. Small plants of this kind have received little study in the past, particularly regarding changes to the mercury (Hg) cycle and the bioaccumulation of methylmercury (MeHg) in adjacent food webs. Permafrost, however, is a potentially substantial Hg reservoir, and impoundment could promote its thaw, remobilizing legacy Hg co-transported with old carbon (C). To better understand the impacts of flooding in a northern context, soils, surface water, and benthic invertebrates were sampled along the Innuksuac River before, during, and three months after impoundment. To investigate Hg in the water column, the quality of dissolved organic carbon (DOC) (i.e., age and composition) was studied, while the trophic transfer of MeHg within the food web was clarified using stable isotopes (δ13C and δ15N), which reflect the diet and trophic level of organisms. The Hg:C ratio suggests that soil Hg concentrations in the study area were lower than expected from previous circumpolar estimates, while most of the measured Hg was found in the active layer of the permafrost and was therefore not immobilized by freezing. Nevertheless, impoundment produced an increase in MeHg concentration (~7x) and methylation potential (~4x) in the surface organic layer of the flooded soils. This increased activity was reflected in the surface waters of the flooded bay, where MeHg concentrations were ten times higher than at the other sampled sites.
While exogenous DOC derived from the terrestrial environment appears important for the input of inorganic Hg to the river system, DOC recently degraded by microbial activity proved to be the best indicator of methylation potential. An increase in tissue MeHg concentration was ultimately observed at the bottom of the food web, in primary consumers (~4x) and in benthic invertebrates with an omnivorous diet (~3x), but not in predatory organisms, suggesting a lag in trophic transfer. Among primary consumers, this increase was most apparent in invertebrates closely associated with the benthic environment of the newly flooded bay, where δ13C signatures were also lowest. These results provide a first short-term picture of Hg transport and transformation during impoundment in a subarctic region; the increases recorded, while not negligible, are so far confined to a small area (< 1 km²) and do not appear to propagate downstream of the small flooded bay.
-
ABSTRACT: Floods are recognized as one of the most frequent and destructive natural disasters globally. Their severity is exacerbated by the effects of climate change (increased precipitation) and human construction (reduced natural capacity to absorb water). Structures built in water-prone areas, such as bridges and dams, are usually vulnerable to severe flood events. For problems involving fluid water, hydraulic researchers commonly assume that structures are "infinitely" rigid and use impervious wall boundaries to represent the structures in numerical models. However, structures will deform, vibrate, and may even be damaged during a severe flood event. From a structural engineer's perspective, it is important to incorporate structural flexibility into the fluid-structure interaction (FSI) analysis. Because the size of the fluid domain is significantly larger than that of the structures, a large number of elements is generated, making the FSI analysis time-consuming, especially for cases with a long 3D channel and refined meshes. As a result, an efficient and accurate simplified modeling method is needed. In addition, the hydrodynamic behavior of structures such as a bridge in a stream or a dam at the end of a partially ice-covered channel is not well known.
To address this problem, this research numerically investigated the structural responses to flood flow impact while accounting for structural flexibility, focusing on the dynamic interaction between fluid water and solid structures, the 3D effects of fluids and structures, the sliding of structures (e.g., sliding of a bridge deck), and the presence of partial ice cover at the top of the water in a channel.
-
QUIC is a modern transport-layer internet protocol designed to be more efficient and secure than TCP. It has quickly gained popularity in recent years and has been adopted by a number of prominent tech companies. Its efficiency comes from its handshake design: the server and the client complete both the transport-layer acknowledgment and the TLS agreement during the same round trip. However, this process makes the packets heavy and requires more processing on the server side than TCP, a characteristic an attacker can leverage to exhaust a victim's computing resources. This thesis investigates the resilience of the QUIC protocol against handshake flood attacks and proposes a detection mechanism (QUICShield). We conducted comprehensive experiments to evaluate the resource consumption of both the attacker and the target during incomplete-handshake attacks, including CPU, memory, and bandwidth, and compared the results against TCP SYN cookies under SYN flood attacks. The DDoS amplification factor was measured and analyzed from these results. This work also proposes a detection mechanism based on a Bloom filter combined with the Generalized Likelihood Ratio Cumulative Sum (GLR-CUSUM) test to adapt to evolving attack patterns; it was implemented and deployed against real attacks to evaluate its efficiency. We showed that the QUIC protocol design has a much larger DDoS amplification factor than TCP, meaning QUIC is more vulnerable to handshake DDoS attacks; however, the proposed mechanism is accurate and efficient in terms of resources.
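The detection idea (a space-efficient Bloom filter tracking clients with unfinished handshakes, plus a cumulative-sum change detector on their rate) can be illustrated with a minimal sketch. This is a simplification: the thesis uses GLR-CUSUM, whereas the plain one-sided CUSUM below assumes a known pre-change mean and drift, and every name and parameter here is hypothetical rather than QUICShield's actual code:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter to remember client addresses with
    unfinished handshakes (illustrative only)."""
    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes = size, hashes
        self.bits = [False] * size

    def _indexes(self, item):
        # Derive k indexes from salted SHA-256 digests of the item.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = True

    def contains(self, item):
        # May return false positives, never false negatives.
        return all(self.bits[idx] for idx in self._indexes(item))

def cusum(samples, mean, drift, threshold):
    """One-sided CUSUM: raise an alarm when the cumulative positive
    deviation of per-interval incomplete-handshake counts from the
    expected mean exceeds a threshold. Returns the alarm index, or -1."""
    g = 0.0
    for i, x in enumerate(samples):
        g = max(0.0, g + (x - mean - drift))
        if g > threshold:
            return i
    return -1
```

A sudden jump in incomplete-handshake counts drives the CUSUM statistic above the threshold within a few intervals, while normal fluctuations around the mean keep it pinned at zero.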
-
Abstract: In Canada, annual runoff is predominantly influenced by snowmelt following the winter season, with a substantial portion (40–80%) occurring during the spring period, leading to flooding in low-lying areas. Accurate streamflow prediction is essential for hydropower production and effective flood management, necessitating the incorporation of comprehensive, spatially distributed snow observations into hydrological models. This draws attention to the research question: "How can we utilize spatially distributed snow information at various spatial and temporal scales to enhance our understanding of snow processes and apply it for enhanced model calibration to improve hydrological model performance?" The first objective of this thesis is to investigate the use of spatially distributed snow information (SNODAS, the SNOw Data Assimilation System) for the calibration of a hydrological model and to determine its impact on model performance. A distributed hydrological model, HYDROTEL, has been implemented in the Au Saumon River watershed using ERA-5 Land temperature data and MSWEP precipitation data. Seven calibration experiments are conducted, employing three objective functions: Nash-Sutcliffe Efficiency (NSE), Root Mean Square Error (RMSE), and the SPAtial EFficiency metric (SPAEF), used individually or in combination as part of multi-objective calibration processes. This study indicates that using SPAEF for spatial calibration of snow parameters improves streamflow prediction compared to the conventional practice of calibrating with RMSE. SPAEF also appears to be a more effective metric than RMSE for both sequential and multi-objective calibration. During validation, the calibration experiment incorporating multi-objective SPAEF exhibits enhanced performance in terms of NSE and KGE compared to the calibration experiment based solely on NSE.
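SPAEF, the spatial efficiency metric used here, combines three components: the Pearson correlation between the fields, the ratio of their coefficients of variation, and the histogram overlap of the z-scored fields; a perfect match scores 1. The sketch below is an illustrative implementation (binning choices vary across published implementations), not the thesis's code:

```python
import numpy as np

def spaef(obs, sim, bins=20):
    """SPAtial EFficiency metric: 1 - sqrt((alpha-1)^2 + (beta-1)^2
    + (gamma-1)^2), where alpha is correlation, beta the ratio of
    coefficients of variation, gamma the histogram overlap of
    z-scored fields. Perfect agreement gives 1."""
    obs, sim = obs.ravel(), sim.ravel()
    alpha = np.corrcoef(obs, sim)[0, 1]
    beta = (np.std(sim) / np.mean(sim)) / (np.std(obs) / np.mean(obs))
    # Histogram overlap of standardized fields over a common range.
    z_obs = (obs - np.mean(obs)) / np.std(obs)
    z_sim = (sim - np.mean(sim)) / np.std(sim)
    lo = min(z_obs.min(), z_sim.min())
    hi = max(z_obs.max(), z_sim.max())
    h_obs, _ = np.histogram(z_obs, bins=bins, range=(lo, hi))
    h_sim, _ = np.histogram(z_sim, bins=bins, range=(lo, hi))
    gamma = np.minimum(h_obs, h_sim).sum() / h_obs.sum()
    return 1.0 - np.sqrt((alpha - 1)**2 + (beta - 1)**2 + (gamma - 1)**2)
```

Unlike RMSE, which rewards matching cell-by-cell magnitudes, SPAEF rewards reproducing the spatial pattern, variability, and distribution of the snow field, which is why it suits spatial calibration of snow parameters.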
The findings of this study hold significant relevance and potential applicability for emerging satellite technology, particularly the future Terrestrial Snow Mass Mission (TSMM). The study then explores the impact of temporal resolution and signal saturation on model calibration by using SNODAS data as proxy SWE observations, mimicking the characteristics of the TSMM product, to calibrate the HYDROTEL model. Despite the limitations of its temporal resolution and signal saturation, TSMM data exhibit significant potential for enhancing model performance, thereby highlighting their utility for hydrological modeling. The study then focuses on the spatio-temporal analysis of snow processes influencing the spatial variability and distribution of snow depth in a small-scale experimental watershed. Drone photogrammetry is employed to capture spatially distributed snow information over the watershed during the winter seasons of 2022 and 2023. The photogrammetric data facilitate the generation of high-resolution digital surface models (DSMs). Empirical Orthogonal Function (EOF) analysis is applied to understand the spatial distribution of snow, enabling a detailed examination of various snow processes at the watershed scale. Overall, this thesis explores the added value of spatially distributed snow cover information in predicting spring runoff; each part of the study contributes to a comprehensive understanding of the spatial distribution of snow and its significance in hydrology.
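EOF analysis, as applied above to the drone-derived snow depth maps, decomposes a time-by-space anomaly matrix into orthogonal spatial patterns and their temporal amplitudes, typically via singular value decomposition. A minimal sketch under that standard formulation (not the thesis's processing chain):

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """EOF decomposition of a (time x space) data matrix via SVD of
    the temporal anomalies. Returns the leading spatial patterns
    (EOFs), their temporal amplitudes (PCs), and the fraction of
    variance each mode explains."""
    anomalies = field - field.mean(axis=0)       # remove time mean per cell
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    var_explained = s**2 / np.sum(s**2)
    eofs = vt[:n_modes]                          # spatial patterns
    pcs = u[:, :n_modes] * s[:n_modes]           # temporal amplitudes
    return eofs, pcs, var_explained[:n_modes]
```

The leading EOFs isolate recurring spatial patterns of snow accumulation and ablation, and their explained-variance fractions indicate how few modes capture most of the watershed-scale variability.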
-
Climate change is an imminent global challenge whose consequences are already being observed. We know these effects will intensify, bringing more frequent and severe extreme weather events, substantial disruption of food production, and the displacement of tens of millions of people due to deadly heat waves and droughts. The question, then, is: what can be done about it? In this thesis, we make climate change our central object and explore the ways in which deep learning research can contribute to its mitigation. A known obstacle to ambitious climate policies is the lack of popular support and demand. This can be attributed, in part, to the extremely complex and intertwined causes and consequences of climate change. A common misconception is that it will mainly affect people other than oneself, people distant in time or space. To address this, research has shown that presenting people with authentic, local, and relevant images of a concept helps them better understand and grasp what is at stake. In our first contribution, we therefore explore how recent advances in deep learning for computer vision and generative adversarial networks can be used to generate personalized images depicting the impacts of climate change. Our goal with ClimateGAN is to visualize what a one-meter flood could look like at any address, regardless of its actual flood risk under climate change. This approach aims to elicit empathy by making the abstract impacts of climate change more tangible and personal.
Using a Google Street View image and processing it with ClimateGAN, we generate physically plausible and visually realistic flood images based on domain adaptation from a simulated environment, depth prediction, and semantic segmentation. This model was deployed on a website with the goal of raising awareness and encouraging climate action. Beyond helping people better visualize what a runaway climate future might look like, this thesis also studies how deep learning can improve existing technologies. A major research direction here is the search for new materials. In this thesis, we focus in particular on materials property prediction as a way to accelerate the discovery of electrocatalysts, a family of materials involved in hydrogen-based energy storage. We present two contributions, PhAST and FAENet, which focus on improving the performance/scalability trade-off in geometric graph neural networks (GNNs). With PhAST, we introduce a set of methods to adapt the classical GNN pipeline (from input-graph creation to output energy and force predictions) to the specific task of predicting the energy of a relaxed adsorbate-catalyst atomic system. We demonstrate how, in addition to improving performance, these modifications improve efficiency and enable competitive GNN training in CPU environments. In FAENet, we present a new efficient GNN for E(3)-equivariant predictions. In particular, we shift the burden of equivariance onto the data representation in order to reduce the constraints on the model itself.
This approach allows us to introduce a new lightweight and expressive architecture aimed at making better and faster predictions of various materials properties. Finally, we take a critical look at our own field and discuss the environmental impacts associated with AI technologies. We examine how practitioners can estimate their carbon emissions, what steps they can take today to reduce them, and what further steps are needed toward more accurate environmental reporting and accountability.
-
Dams are vital national assets that play a crucial role in water storage, hydroelectric power generation, and flood control. Globally, over 61,000 large dams have surpassed 50 years of service, and many show signs of deterioration. With over 300 dam failures recorded worldwide, the potential for catastrophic damage remains alarmingly high if these aging structures are not properly maintained and upgraded. Further, many existing dams were built to outdated standards, and seismic hazards have increased, making it imperative to re-evaluate their seismic performance against current safety standards. The need for improved dam safety measures is urgent, as dam owners, regulators, and policymakers grapple with the challenge of ensuring the structural integrity of aging dams in the face of growing risks. A key solution is shifting from traditional safety approaches to a modern, risk-based methodology, which addresses safety concerns more efficiently and economically. Various global agencies have developed risk-based safety assessment guidelines; however, these often lack systematic implementation frameworks and sufficient reference studies, making them difficult for dam owners to adopt effectively. Furthermore, various uncertainties can affect the risk assessment and complicate efforts to ensure dam safety. In this context, this research investigates uncertainties impacting seismic risk assessments for dams, including modeling choices, ground motion selection, aging, and material variability. Case studies of the Koyna Dam and Pine Flat Dam were used to evaluate these factors at each stage of performance evaluation: system response, fragility, and risk assessment. Key findings indicate that dam-foundation-reservoir (DFR) models incorporating acoustic elements exhibit less variability in system response, regardless of model complexity and solution procedure.
Ground motions derived from the conditional mean spectrum (CMS) method yield better fragility estimates than those selected per the ASCE 7-16 standard, particularly for moderate to severe damage states. Additionally, aging and material variability significantly affect the dynamic characteristics of dams, with failure probabilities increasing with both age and return period. Based on these findings, the research proposes a comprehensive, systematic framework for risk-based seismic safety evaluation that aligns with safety assessment objectives and ensures optimal use of computational resources.
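Fragility estimates of the kind discussed above are commonly expressed with a lognormal curve giving the probability of reaching a damage state as a function of a ground motion intensity measure. The sketch below uses that standard form as an illustration; the parameter names are generic and it is not necessarily the exact model used in this research:

```python
from math import erf, log, sqrt

def fragility(im, theta, beta):
    """Lognormal fragility function: probability of reaching or
    exceeding a damage state given intensity measure im, median
    capacity theta (same units as im), and lognormal standard
    deviation beta. Uses the standard normal CDF via erf."""
    z = log(im / theta) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

At the median capacity (im = theta) the exceedance probability is exactly 0.5, and beta controls how sharply the curve rises; aging or material variability would typically be reflected as shifts in theta and beta.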