Nonuniqueness in traveltime tomography: Ensemble inference and cluster analysis

Geophysics ◽  
1996 ◽  
Vol 61 (4) ◽  
pp. 1209-1227 ◽  
Author(s):  
Don W. Vasco ◽  
John E. Peterson ◽  
Ernest L. Majer

We examine the nonlinear aspects of seismic traveltime tomography. This is accomplished by completing an extensive set of conjugate gradient inversions on a parallel virtual machine, with each initiated by a different starting model. The goal is an exploratory analysis of a set of conjugate gradient solutions to the traveltime tomography problem. We find that distinct local minima are generated when prior constraints are imposed on traveltime tomographic inverse problems. Methods from cluster analysis determine the number and location of the isolated solutions to the traveltime tomography problem. We apply the cluster analysis techniques to a cross‐borehole traveltime data set gathered at the Gypsy Pilot Site in Pawnee County, Oklahoma. We find that the 1075 final models, satisfying the traveltime data and a model norm penalty, form up to 61 separate solutions. All solutions appear to contain a central low velocity zone bounded above and below by higher velocity layers. Such a structure agrees with well‐logs, hydrological well tests, and a previous seismic inversion.
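The workflow described above can be illustrated with a small, hedged sketch: many regularized inversions are run from different random starting models and the final models are then clustered to count apparently distinct solutions. This is not the authors' parallel conjugate-gradient code; the damped least-squares solver, the synthetic ray matrix, and the clustering cut-off are all assumptions made for illustration.

```python
import numpy as np
from scipy.sparse.linalg import lsqr
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Hypothetical linearized tomography problem: G m ~ t, with G a ray-path matrix
# and t the observed traveltimes (all values here are synthetic stand-ins).
n_cells, n_rays = 100, 300
G = rng.random((n_rays, n_cells))
m_true = rng.normal(0.25, 0.02, n_cells)        # "true" slowness model
t_obs = G @ m_true

mu = 1.0                                        # weight on the model-norm penalty
final_models = []
for _ in range(200):                            # the paper used 1075 starting models
    m0 = rng.normal(0.25, 0.05, n_cells)        # random starting model
    # Damped least squares about m0: minimize ||G(m0 + dm) - t||^2 + mu^2 ||dm||^2.
    A = np.vstack([G, mu * np.eye(n_cells)])
    b = np.concatenate([t_obs - G @ m0, np.zeros(n_cells)])
    dm = lsqr(A, b)[0]
    final_models.append(m0 + dm)

# Agglomerative clustering of the final models; the cut distance that separates
# "distinct solutions" is a tuning choice, not a value taken from the paper.
Z = linkage(np.asarray(final_models), method="ward")
labels = fcluster(Z, t=0.5, criterion="distance")
print("apparent number of distinct solutions:", len(np.unique(labels)))
```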

2007 ◽  
Vol 227 (3) ◽  
Author(s):  
Wolf Dieter Heinbach ◽  
Stefanie Schröpfer

Summary. The introduction of opening clauses in collective wage agreements, allowing firms to deviate from their collective bargaining agreements, has become widely accepted over the last fifteen years. With respect to the flexibility agreed through collective bargaining, the differences between individual collective bargaining areas within the same industry have increased. Hence, the economic notion of uniform, industry-wide central collective bargaining agreements is no longer tenable. The IAW data set used in this article provides differentiated information about opening clauses in collective wage agreements. By means of correspondence and cluster analysis, seven groups of collective bargaining areas are identified, which differ in the type of opening clauses introduced. Over the period from 1991 to 2004, an examination of the dynamics of these seven groups reveals typical paths of development towards greater flexibility agreed through collective bargaining. Furthermore, linking the data set with the German Structure of Earnings Survey of 1995 and 2001 makes it possible to show the relevance of the different types of collective bargaining areas for employment and industries in the German manufacturing sector.
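A minimal sketch of the grouping step, under loose assumptions: bargaining areas are encoded as binary profiles of opening-clause types and grouped by hierarchical clustering. The clause labels, the number of areas, and the distance measure are illustrative stand-ins, not the IAW data or the study's exact correspondence-analysis procedure.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
clause_types = ["working_time", "pay_reduction", "hardship", "opt_out"]   # illustrative labels
areas = [f"area_{i}" for i in range(40)]                                  # hypothetical bargaining areas

# One row per bargaining area, one column per opening-clause type (1 = clause present).
X = rng.integers(0, 2, size=(len(areas), len(clause_types))).astype(bool)

# Hamming distance on the presence/absence profiles, then an agglomerative tree
# cut into seven groups, matching the number of groups reported in the paper.
Z = linkage(pdist(X, metric="hamming"), method="average")
groups = fcluster(Z, t=7, criterion="maxclust")
for g in np.unique(groups):
    members = [a for a, lab in zip(areas, groups) if lab == g]
    print(f"group {g}: {len(members)} areas, e.g. {members[:3]}")
```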


2020 ◽  
Vol Special issue on... ◽  
Author(s):  
Hermann Moisl

Discovery of the chronological or geographical distribution of collections of historical text can be more reliable when based on multivariate rather than univariate data, because multivariate data provide a more complete description. Where the data are high-dimensional, however, their complexity can defy analysis using traditional philological methods. The first step in dealing with such data is to visualize them using graphical methods in order to identify any latent structure. If found, such structure facilitates the formulation of hypotheses which can be tested using a range of mathematical and statistical methods. Where the dimensionality is greater than 3, however, direct graphical investigation is impossible. The present discussion offers a roadmap for overcoming this obstacle and is in three main parts: the first presents some fundamental data concepts, the second describes an example corpus and a high-dimensional data set derived from it, and the third outlines two approaches to visualizing that data set: dimensionality reduction and cluster analysis.
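The two visualization routes named at the end of the abstract can be sketched as follows on a toy corpus. The example texts, the character-bigram features, and the choice of PCA and Ward clustering are assumptions for illustration, not the author's corpus or exact methods.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, dendrogram
import matplotlib.pyplot as plt

docs = ["tha king wes her", "the king was here", "se cyning waes her",
        "the quene was there", "tha quene wes thaer"]          # stand-in historical texts

# High-dimensional representation: character-bigram counts per document.
X = CountVectorizer(analyzer="char", ngram_range=(2, 2)).fit_transform(docs).toarray().astype(float)

# Route 1: dimensionality reduction to 2-D for direct graphical inspection.
coords = PCA(n_components=2).fit_transform(X)
plt.scatter(coords[:, 0], coords[:, 1])
for i, (x, y) in enumerate(coords):
    plt.annotate(f"doc{i}", (x, y))

# Route 2: cluster analysis of the same documents, visualized as a dendrogram.
plt.figure()
dendrogram(linkage(X, method="ward"), labels=[f"doc{i}" for i in range(len(docs))])
plt.show()
```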


Geophysics ◽  
2006 ◽  
Vol 71 (1) ◽  
pp. H1-H11 ◽  
Author(s):  
Fuchun Gao ◽  
Alan R. Levander ◽  
R. Gerhard Pratt ◽  
Colin A. Zelt ◽  
Gian Luigi Fradelizio

Application of 2D frequency-domain waveform tomography to a data set from a high-resolution vertical seismic profiling (VSP) experiment at a groundwater contamination site at Hill Air Force Base (HAFB), Utah, reveals a surprisingly complicated shallow substructure with a resolution of approximately 1.5 m. Variance in the waveform misfit function is reduced by 69.4% by using an initial velocity model from first-arrival traveltime tomography. The waveform tomography model suggests (1) a low-velocity layer at 1 to 4 m depth, (2) a high vertical velocity gradient, averaging 80 (m/s)/m, and (3) severe lateral variations, with velocity contrasts as large as about 200 m/s over distances as short as 1.5 m. The model correlates well with lithologic logs and is interpreted geologically. A Q-value of 20 is estimated for the target area. The extreme lateral and vertical variations of the subsurface compromise many standard seismic processing methods.
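For readers unfamiliar with the variance-reduction figure quoted above, a minimal sketch of one common definition follows: the fraction of observed-data variance removed by the predictions of the final model. Whether the paper uses exactly this form, and the data shown, are assumptions; the numbers here are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(2)
d_obs = rng.normal(size=500)                              # observed waveform samples (synthetic)
d_pred = 0.8 * d_obs + rng.normal(scale=0.3, size=500)    # predictions from a final model (synthetic)

residual = d_obs - d_pred
variance_reduction = 1.0 - np.var(residual) / np.var(d_obs)
print(f"variance reduction: {100 * variance_reduction:.1f}%")
```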


2005 ◽  
Vol 9 (1/2) ◽  
pp. 67-80 ◽  
Author(s):  
J. Pempkowiak ◽  
J. Beldowski ◽  
K. Pazdro ◽  
A. Staniszewski ◽  
A. Zaborska ◽  
...  

Abstract. Factors conditioning the formation and properties of suspended matter resting on the sea floor (Fluffy Layer Suspended Matter, FLSM) in the Odra river mouth - Arkona Deep system (southern Baltic Sea) were investigated. Thirty FLSM samples were collected from four sampling stations during nine cruises in the period 1996-1998. Twenty-six chemical properties of the fluffy material were measured (total organic matter, humic substances, a variety of fatty acid fractions, P, N, δ13C, δ15N, Li; heavy metals: Co, Cd, Pb, Ni, Zn, Fe, Al, Mn, Cu, Cr). The resulting data set was subjected to statistical evaluation. Comparison of mean values of the measured properties led to the conclusion that both seasonal and spatial differences occurred in the fluffy material collected at the stations. Application of principal component analysis and cluster analysis to the data set, amended with environmental characteristics (depth, salinity, chlorophyll a, distance from the river mouth), led to quantification of the factors conditioning FLSM formation. The five most important factors were: the contribution of the lithogenic component (responsible for 25% of the data set variability), time-dependent factors (including primary productivity, mass exchange with the fine sediment fraction, atmospheric deposition, and the contribution of material originating from abrasion; altogether 21%), the contribution of fresh autochthonous organic matter (9%), the influence of microbial activity (8%), and seasonality (8%).
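A hedged sketch of the kind of analysis described above, on synthetic stand-in data with the same shape (30 samples by 26 measured properties): principal components give per-factor shares of variability, and the samples are then clustered in the reduced space. The random data, the number of retained components, and the four-group cut are assumptions, not results from the study.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 26))                  # 30 FLSM samples, 26 chemical properties (synthetic)

# PCA on standardized variables: each component's explained-variance ratio plays
# the role of "percent of data set variability" attributed to a factor.
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=5).fit(Xs)
for i, r in enumerate(pca.explained_variance_ratio_, start=1):
    print(f"factor {i}: {100 * r:.0f}% of variability")

# Cluster analysis of the samples in the reduced space, e.g. to separate stations
# or seasons; the cut into 4 groups mirrors the number of sampling stations.
scores = pca.transform(Xs)
groups = fcluster(linkage(scores, method="ward"), t=4, criterion="maxclust")
print("group sizes:", np.bincount(groups)[1:])
```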


2003 ◽  
Vol 45 (2) ◽  
pp. 1-19 ◽  
Author(s):  
Agnes Nairn ◽  
Paul Bottomley

The customer relationship management (CRM) industry is set to be worth $76.3 billion by 2005, but over 50% of projects will fail to meet benefit objectives. While CRM nirvana is the attainment of profitable one-to-one relationships, current activity is concentrated on segmentation. As technology has moved segmentation from simple classification towards more complex predictive modelling, the use of CRM analytics suites comprising statistical techniques such as decision trees, neural networks and cluster analysis is increasing. It is suggested that the subjective nature of cluster analysis may be overlooked when the technique is integrated with other ‘tools’ into a data-mining package and, consequently, that inadequately tested cluster analysis solutions may be contributing to CRM dissatisfaction. This paper reports the findings of a study which subjected a data set designed for segmentation purposes to a series of rigorous validity and reliability tests, and went as far as to randomise the data to ascertain whether current methods could detect ‘false’ data. The study shows, alarmingly, that under certain conditions random data can ‘pass’ standard tests, and highlights just how meticulously and thoroughly cluster analysis solutions must be tested before they can be safely used in formulating marketing strategy. Practical, theoretical and technical advice is offered for managers working with CRM analytics suites, and avenues are suggested for future research into improved CRM performance through effective management of the IT/marketing interface.
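The validity check the study describes can be sketched in a few lines: cluster both a structured data set and a randomised counterpart, then compare an internal validity index. The data, the use of k-means, and the choice of the silhouette score are assumptions for illustration, not the study's test battery.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(4)

# Structured "segmentation" data versus uniformly random data over the same range.
X_real, _ = make_blobs(n_samples=500, centers=4, n_features=6, random_state=0)
X_rand = rng.uniform(X_real.min(axis=0), X_real.max(axis=0), size=X_real.shape)

for name, X in [("segmentation data", X_real), ("randomised data", X_rand)]:
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
    print(f"{name}: silhouette = {silhouette_score(X, labels):.2f}")

# A k-means run always returns k clusters, even on random data; only an explicit
# comparison like this reveals whether the "segments" reflect real structure.
```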


2020 ◽  
Vol 16 (7) ◽  
pp. 1223-1245
Author(s):  
V.V. Smirnov

Subject. The article focuses on the modern financial system of Russia. Objectives. I determine the limits of the contemporary financial system in Russia. Methods. The study is based on descriptive statistics and on statistical and cluster analysis. Results. The article shows that the scope of the contemporary financial system in Russia can be determined by establishing monetary relations as the order of the internal system and the concerted operation of its subsystems, preserving the structure of the financial system, maintaining its operational regime, implementing the program and achieving the goal. I found that the Russian financial system correlates with that of Angola, and I determined the real scope of the contemporary financial system in Russia. Conclusions and Relevance. As an attempt to effectively establish and manage monetary relations, the limit of the contemporary financial system is related to the possibility of using monetary aggregate M0 to maintain the balance of the Central Bank of Russia. To move beyond the current scope of Russia's financial system, the economy would have to change its specialization, refocusing on high-tech exports and increasing foreign currency reserves. This can be done if the amendments to Russia's Constitution are adopted. The findings expand the scope of knowledge and create new competence in the establishment of monetary relations, the order of the internal system and the concerted interaction of its subsystems, the structural preservation of the financial system, and the maintenance of its operational regime.

