Standard geomagnetic observatory data in tectonomagnetism: case study related to the M 5.7 Timisoara, Romania, earthquake

1997 ◽  
Vol 40 (2) ◽  
Author(s):  
M. Popeskov

There has recently been much discussion of large-scale interactions of fault zones and the influence of large-scale processes in the preparation and triggering of earthquakes. As a consequence, an official recommendation was issued to set up observational networks at regional scale. In this context, the existing network of standard geomagnetic observatories might play a more important role in future tectonomagnetic studies. The data from standard geomagnetic observatories are basically not appropriate for the detection of small-magnitude and, in most cases, spatially very localized geomagnetic field changes. However, their advantage is continuity over a long time period, which enables the study of regional tectonomagnetic features and long-term precursory changes. As a first step of a more extensive study aimed at examining the features of observatory data for this purpose, a three-year data set from five European observatories has been analyzed. Some common statistical procedures have been applied along with a simple difference technique and multivariate linear regression to define local geomagnetic field changes. The distribution of M ≥ 4.5 earthquakes in Europe, in a corresponding period, was also taken into account. No pronounced field variation, related in time to the M 5.7 Timisoara (Romania) earthquake on July 12, 1991, was found at Grocka observatory, located about 80 km from the earthquake epicenter. However, an offset in the level of the declination differences involving Grocka observatory, not seen in the differences between other observatories, could be associated with a possible regional effect of the M 4.8 earthquake which occurred in September 1991 about 70 km SE of Grocka.
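The difference technique mentioned above can be sketched as follows. This is a minimal illustration with synthetic series, not the study's actual data or code: subtracting a reference observatory's declination record removes the external-field variation common to both stations, so that a step (offset) of local origin stands out in the residual.

```python
import numpy as np

# Hypothetical daily-mean declination series (minutes of arc) for two
# observatories over ~3 years; real values would come from observatory records.
rng = np.random.default_rng(0)
external = rng.normal(0.0, 0.5, 1095)              # shared external-field signal
decl_grocka = external + rng.normal(0.0, 0.05, 1095)
decl_reference = external + rng.normal(0.0, 0.05, 1095)

# Simple difference technique: the common external variation cancels,
# leaving only local (e.g. tectonomagnetic) changes plus noise.
diff = decl_grocka - decl_reference

# A level offset around a candidate earthquake date can be screened for by
# comparing the mean of the difference series before and after that date.
k = 600
offset = diff[k:].mean() - diff[:k].mean()
```

The residual series has far lower variance than either raw record, which is what makes small local offsets detectable at all.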

Author(s):  
Sebastian Brehm ◽  
Felix Kern ◽  
Jonas Raub ◽  
Reinhard Niehuis

The Institute of Jet Propulsion at the University of the German Federal Armed Forces Munich has developed and patented a novel concept of air injection systems for active aerodynamic stabilization of turbo compressors. This so-called Ejector Injection System (EIS) utilizes the ejector effect to enhance the efficiency and impact of the aerodynamic stabilization of the Larzac 04 two-spool turbofan engine's LPC. The recently manufactured EIS design has been subject to CFD and experimental pre-investigations in which the expected ejector performance was confirmed and the CFD set-up was validated. Subsequently, optimization of the EIS ejector geometry comes into focus in order to enhance its performance. In this context, CFD parameter studies on the influence of 16 geometric and several aerodynamic parameters on the ejector effect are required. However, the existing and validated CFD set-up of the EIS comprises not only the mainly axisymmetric ejector geometry but also the highly complex 3D supply components upstream of it. This hinders large-scale CFD parameter studies due to the numerical effort required for full 3D CFD simulations. Therefore, this paper presents an approach that exploits the overall axisymmetry of the ejector geometry and reduces the numerical effort required for CFD simulations of the EIS by more than 90%. The approach is verified by means of both experimental results and CFD predictions of the full 3D set-up. The comprehensive verification data set contains wall pressure distributions and the mass flow rates involved at various Aerodynamic Operating Points (AOP). Furthermore, limitations of the approach are revealed, e.g., concerning its suitability for judging the response of the attached compressor of future EIS designs with respect to aerodynamic stability or cyclic loading.


Geology ◽  
2020 ◽  
Author(s):  
Mikkel Fruergaard ◽  
Lasse Sander ◽  
Jérôme Goslin ◽  
Thorbjørn J. Andersen

Understanding the coupling between sediment availability and sea-level change is important for forecasting coastal-barrier (barrier islands and barrier spits) response to future sea-level rise (SLR). An extensive data set of sediment cores, seismic profiles, and a high-resolution chronology from the Wadden Sea (southeastern North Sea) documents that long-term barrier-chain progradation was interrupted by a period of widespread barrier deterioration between ca. 3.5 and 2.0 ka. The decay of the barrier islands resulted from a decrease in littoral drift triggered by regional-scale coastal reconfiguration. The formation of a large cuspate foreland updrift caused the depositional locus to shift away from the barrier coast. Our results demonstrate that the resulting reduction in marine sediment availability substantially decreased the stability of the barrier chain, causing the regional SLR thresholds to fall from between 2 and 9 mm yr−1 to ~0.9 mm yr−1, and thus below contemporary rates of SLR. Hence, we argue that predicting the response of barrier coasts to ongoing SLR requires consideration of possible changes in sediment availability and the role of large-scale geomorphological feedbacks due to human and natural causes.


2017 ◽  
Vol 14 (21) ◽  
pp. 5003-5014 ◽  
Author(s):  
Katrin Magin ◽  
Celia Somlai-Haase ◽  
Ralf B. Schäfer ◽  
Andreas Lorke

Abstract. Inland waters play an important role in regional to global-scale carbon cycling by transporting, processing and emitting substantial amounts of carbon, which originate mainly from their catchments. In this study, we analyzed the relationship between terrestrial net primary production (NPP) and the rate at which carbon is exported from the catchments in a temperate stream network. The analysis included more than 200 catchment areas in southwest Germany, ranging in size from 0.8 to 889 km2, for which CO2 evasion from stream surfaces and downstream transport with stream discharge were estimated from water quality monitoring data, while NPP in the catchments was obtained from a global data set based on remote sensing. We found that on average 13.9 g C m−2 yr−1 (corresponding to 2.7 % of terrestrial NPP) are exported from the catchments by streams and rivers, to which flux CO2 evasion and downstream transport contributed about equally. The average carbon fluxes in the catchments of the study area resembled global and large-scale zonal mean values in many respects, including NPP, stream evasion and the carbon export per catchment area in the fluvial network. A review of existing studies on aquatic–terrestrial coupling in the carbon cycle suggests that the carbon export per catchment area varies in a relatively narrow range, despite a broad range of different spatial scales and hydrological characteristics of the study regions.
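The two reported figures can be cross-checked with a one-line calculation (values taken from the abstract; this is only a consistency check, not part of the study's analysis):

```python
# Back-of-the-envelope check of the reported export figures.
carbon_export = 13.9        # g C m^-2 yr^-1 exported via streams and rivers
fraction_of_npp = 0.027     # the reported 2.7 % of terrestrial NPP

# Implied area-averaged terrestrial NPP of the catchments.
implied_npp = carbon_export / fraction_of_npp   # g C m^-2 yr^-1
```

The implied NPP of roughly 515 g C m−2 yr−1 is consistent with typical remote-sensing estimates for temperate catchments, so the two numbers in the abstract agree with each other.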


2017 ◽  
Author(s):  
Katrin Magin ◽  
Celia Somlai-Haase ◽  
Ralf B. Schäfer ◽  
Andreas Lorke

Abstract. Inland waters play an important role in regional to global-scale carbon cycling by transporting, processing and emitting substantial amounts of carbon, which originate mainly from their catchments. In this study, we analyzed the relationship between terrestrial net primary production (NPP) and the rate at which carbon is exported from the catchments in a temperate stream network. The analysis included more than 200 catchment areas in southwest Germany, ranging in size from 0.8 to 889 km2, for which CO2 evasion from stream surfaces and downstream transport with stream discharge were estimated from water quality monitoring data, while NPP in the catchments was obtained from a global data set based on remote sensing. We found that on average 2.7 % of terrestrial NPP (13.9 g C m−2 yr−1) are exported from the catchments by streams and rivers, to which flux CO2 evasion and downstream transport contributed about equally. The average carbon fluxes in the catchments of the study area resembled global and large-scale zonal mean values in many respects, including NPP, stream evasion as well as the catchment-specific total export rate of carbon in the fluvial network. A review of existing studies on aquatic–terrestrial coupling in the carbon cycle suggests that the catchment-specific carbon export varies in a relatively narrow range, despite a broad range of different spatial scales and hydrological characteristics of the study regions.


2012 ◽  
Vol 8 (1) ◽  
pp. 481-503 ◽  
Author(s):  
J. D. Annan ◽  
J. C. Hargreaves

Abstract. We investigate the identifiability of the climate by limited proxy data. We test a data assimilation approach through perfect model pseudoproxy experiments, using a simple likelihood-based weighting based on the particle filtering process. Our experimental set-up enables us to create a massive 10 000-member ensemble at modest computational cost, thus enabling us to generate statistically robust results. We find that the method works well when data are sparse and imprecise, but in this case the reconstruction has a rather low accuracy as indicated by residual RMS errors. Conversely, when data are relatively plentiful and accurate, the estimate tracks the target closely, at least when considering the hemispheric mean. However, in this case, our prior ensemble size of 10 000 appears to be inadequate to correctly represent the true posterior, and the regional performance is poor. Using correlations to assess performance gives a more encouraging picture, with significant correlations ranging from about 0.3 when data are sparse to values over 0.7 when data are plentiful, but the residual RMS errors are substantial in all cases. Our results imply that caution is required in interpreting climate reconstructions, especially when considering the regional scale, as skill on this basis is markedly lower than on the large scale of hemispheric mean temperature.
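The likelihood-based particle weighting described above can be sketched in a few lines. This is a minimal illustration with synthetic numbers, not the authors' code: each prior ensemble member is weighted by the Gaussian likelihood of the pseudoproxy observations given that member, and the normalized weights define the posterior estimate.

```python
import numpy as np

# Synthetic stand-ins for the experiment: a large prior ensemble of states
# and noisy pseudoproxy observations of an unknown truth.
rng = np.random.default_rng(1)
n_members, n_obs = 10_000, 20
ensemble = rng.normal(0.0, 1.0, (n_members, n_obs))
truth = rng.normal(0.0, 1.0, n_obs)
obs_error = 0.5
pseudoproxy = truth + rng.normal(0.0, obs_error, n_obs)

# Log-likelihood of the observations under each member, assuming independent
# Gaussian observation errors (an assumption of this sketch).
log_w = -0.5 * ((ensemble - pseudoproxy) ** 2).sum(axis=1) / obs_error**2
log_w -= log_w.max()            # guard against numerical underflow
weights = np.exp(log_w)
weights /= weights.sum()

# Weighted posterior estimate of the state (e.g. a hemispheric mean would be
# the average of this vector).
posterior_mean = weights @ ensemble
```

With precise, plentiful observations the likelihood becomes sharply peaked and effectively only a handful of members carry weight, which is exactly why even a 10 000-member prior can fail to represent the posterior at regional scale.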


2012 ◽  
Vol 16 (11) ◽  
pp. 4143-4156 ◽  
Author(s):  
F. Pappenberger ◽  
E. Dutra ◽  
F. Wetterhall ◽  
H. L. Cloke

Abstract. Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale, physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maximum flows to derive a number of flood return periods. The return periods are calculated initially for a 25 × 25 km grid, which is then reprojected onto a 1 × 1 km grid to derive maps of higher resolution and estimate flooded fractional area for the individual 25 × 25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
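The Gumbel post-processing step described above can be sketched as follows. This is a hedged illustration with synthetic annual maxima, not the study's code; method-of-moments parameter estimates are used here for simplicity, since the abstract does not state the fitting method.

```python
import numpy as np

# Synthetic annual maximum flows (m^3/s) for one grid cell, one value per
# year of the 30-ish-year simulation period.
rng = np.random.default_rng(2)
annual_maxima = rng.gumbel(loc=100.0, scale=20.0, size=32)

# Method-of-moments Gumbel fit: scale from the sample standard deviation,
# location from the sample mean and the Euler-Mascheroni constant.
euler_gamma = 0.5772156649
scale = annual_maxima.std(ddof=1) * np.sqrt(6.0) / np.pi
loc = annual_maxima.mean() - euler_gamma * scale

def return_level(T):
    """Flow exceeded on average once every T years (inverse Gumbel CDF
    evaluated at the non-exceedance probability 1 - 1/T)."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

# Return levels for the range of return periods mentioned in the abstract.
levels = {T: return_level(T) for T in (2, 5, 25, 100, 500)}
```

Inverting the fitted distribution per grid cell like this is what turns a 30-year simulation into hazard maps for return periods (such as 500 yr) far longer than the simulation itself.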


2018 ◽  
Author(s):  
Andy Aeberhard ◽  
Leo Gschwind ◽  
Joe Kossowsky ◽  
Gediminas Luksys ◽  
Dominique de Quervain ◽  
...  

We have established the COgnitive Science Metrics Online Survey (COSMOS) platform that contains a digital psychometrics toolset in the guise of applied games measuring a wide range of cognitive functions. Here we outline this online research endeavor designed for automatized psychometric data collection and scalable assessment: Once set up, the low costs and expenditure associated with individual psychometric testing allow substantially increased study cohorts and thus contribute to enhancing study outcome reliability. We are leveraging gamification of the data acquisition method to make the tests suitable for online administration. By putting a strong focus on entertainment and individually tailored feedback, we aim to maximize subjects' incentives for repeated and continued participation. The objective of measuring repeatedly is obtaining more revealing multi-trial average scores and measures from various operationalizations of the same psychological construct instead of relying on single-shot measurements. COSMOS is set up to acquire an automatically and continuously growing dataset that can be used to answer a wide variety of research questions. Following the principles of the open science movement, this data set will also be made accessible to other publicly funded researchers, given that all precautions for individual data protection are fulfilled. We have developed a secure hosting platform and a series of digital gamified testing instruments that can measure theory of mind, attention, working memory, episodic long- and short-term memory, spatial memory, reaction times, eye-hand coordination, impulsivity, humor appreciation, altruism, fairness, strategic thinking, decision making and risk-taking behavior. Furthermore, some of the game-based testing instruments also offer the possibility of using classical questionnaire items.
A subset of these gamified tests is already implemented in the COSMOS platform, publicly accessible and currently undergoing evaluation and calibration as normative data is being collected. In summary, our approach can be used to accomplish a detailed and reliable psychometric characterization of thousands of individuals to supply various studies with large-scale neuro-cognitive phenotypes. Our game-based online testing strategy can also guide recruitment for studies, as it allows very efficient screening and sample composition. Finally, this setup also allows evaluation of potential cognitive training effects and of whether improvements are merely task-specific or whether generalization effects occur within or even across cognitive domains.


2012 ◽  
Vol 8 (4) ◽  
pp. 1141-1151 ◽  
Author(s):  
J. D. Annan ◽  
J. C. Hargreaves

Abstract. We investigate the identifiability of the climate by limited proxy data. We test a data assimilation approach through perfect model pseudoproxy experiments, using a simple likelihood-based weighting based on the particle filtering process. Our experimental set-up enables us to create a massive 10 000-member ensemble at modest computational cost, thus enabling us to generate statistically robust results. We find that the method works well when data are sparse and imprecise, but in this case the reconstruction has a rather low accuracy as indicated by residual RMS errors. Conversely, when data are relatively plentiful and accurate, the estimate tracks the target closely, at least when considering the hemispheric mean. However, in this case, our prior ensemble size of 10 000 appears to be inadequate to correctly represent the true posterior, and the regional performance is poor. Using correlations to assess performance gives a more encouraging picture, with significant correlations ranging from about 0.3 when data are sparse to values over 0.7 when data are plentiful, but the residual RMS errors are substantial in all cases. Our results imply that caution is required in interpreting climate reconstructions, especially when considering the regional scale, as skill on this basis is markedly lower than on the large scale of hemispheric mean temperature.


2012 ◽  
Vol 9 (5) ◽  
pp. 6615-6647 ◽  
Author(s):  
F. Pappenberger ◽  
E. Dutra ◽  
F. Wetterhall ◽  
H. Cloke

Abstract. Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale, physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maximum flows to derive a number of flood return periods. The return periods are calculated initially for a 25 × 25 km grid, which is then reprojected onto a 1 × 1 km grid to derive maps of higher resolution and estimate flooded fractional area for the individual 25 × 25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.


1996 ◽  
Vol 76 (06) ◽  
pp. 0939-0943 ◽  
Author(s):  
B Boneu ◽  
G Destelle ◽  

Summary. The anti-aggregating activity of five rising doses of clopidogrel has been compared to that of ticlopidine in atherosclerotic patients. The aim of this study was to determine the dose of clopidogrel which should be tested in a large-scale clinical trial of secondary prevention of ischemic events in patients suffering from vascular manifestations of atherosclerosis [CAPRIE (Clopidogrel vs Aspirin in Patients at Risk of Ischemic Events) trial]. A multicenter study involving 9 haematological laboratories and 29 clinical centers was set up. One hundred and fifty ambulatory patients were randomized into one of the seven following groups: clopidogrel at doses of 10, 25, 50, 75 or 100 mg OD, ticlopidine 250 mg BID or placebo. ADP- and collagen-induced platelet aggregation tests were performed before starting treatment and after 7 and 28 days. Bleeding time was measured on days 0 and 28. Patients were seen on days 0, 7 and 28 to check the clinical and biological tolerability of the treatment. Clopidogrel exerted a dose-related inhibition of ADP-induced platelet aggregation and bleeding time prolongation. In the presence of ADP (5 µM) this inhibition ranged between 29% and 44% in comparison to pretreatment values. The bleeding times were prolonged by 1.5 to 1.7 times. These effects were not significantly different from those produced by ticlopidine. The clinical tolerability was good or fair in 97.5% of the patients. No haematological adverse events were recorded. These results allowed the selection of 75 mg once a day to evaluate and compare the antithrombotic activity of clopidogrel to that of aspirin in the CAPRIE trial.

