Optimization of seismic sparse data acquisition based on an interpolation algorithm – Numerical data verification

2019
Author(s): D. H. Hien, T. Q. Minh, M. T. Lua, N. T. Tung, S. Jang
2000
Author(s): Jeroen Groenenboom, Evert C. Slob

2021, Vol 6 (1), pp. 9-18
Author(s): Setiadi Setiadi, Bagus Wicaksono, Kurdianto Kurdianto, Bagus H. Jihad

The data acquisition system plays a significant role in the static testing of a rocket, since the static test determines whether the rocket is declared eligible to fly. Static testing of the RX320 rocket involves several numerical data instrumentation components, including the Yokogawa DL850, the CDA900 signal conditioner, and the PT750 pressure sensor. Together these sense the physical force that occurs and measure and record the pressure in the RX320 rocket chamber while the static test is performed over the burning time. From the recorded chamber pressure it can be judged whether the RX320 is suitable for a rocket flight test. The calculated design chamber pressure and the pressure measured and recorded during the RX320 static test indicate that the RX320 chamber pressure remains within the safe limits of the strength of the RX320 rocket motor tube material.
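As a rough illustration of the kind of check described above, the sketch below compares a recorded chamber-pressure trace against a design limit with a safety margin. The limit value, margin, and synthetic trace are hypothetical placeholders, not figures from the RX320 test.

```python
import numpy as np

# Hypothetical design values (placeholders, not actual RX320 figures).
DESIGN_LIMIT_MPA = 6.0   # allowable chamber pressure from tube material strength
SAFETY_MARGIN = 0.9      # accept peaks up to 90 % of the allowable pressure

def chamber_pressure_ok(pressure_mpa: np.ndarray) -> bool:
    """Return True if the recorded pressure trace stays within the safe limit."""
    peak = float(np.max(pressure_mpa))
    print(f"peak chamber pressure: {peak:.2f} MPa "
          f"(limit {SAFETY_MARGIN * DESIGN_LIMIT_MPA:.2f} MPa)")
    return peak <= SAFETY_MARGIN * DESIGN_LIMIT_MPA

# Example with a synthetic burn-time trace as recorded by the DAQ.
t = np.linspace(0.0, 4.0, 4000)                  # seconds
trace = 5.0 * np.exp(-((t - 1.5) / 1.2) ** 2)    # synthetic pressure curve, MPa
print("eligible to fly:", chamber_pressure_ok(trace))
```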


2005, Vol 127 (8), pp. 2767-2775
Author(s): Vitali Tugarinov, Lewis E. Kay, Ilghiz Ibraghimov, Vladislav Yu. Orekhov

2019, Vol 65 (3), pp. 1578-1588
Author(s): Ahmed Elzanaty, Andrea Giorgetti, Marco Chiani

2018
Author(s): Tu-Thach Quach, Sapan Agarwal, Conrad D. James, Matthew J. Marinella, James Bradley Aimone

IEEE Access, 2019, Vol 7, pp. 1685-1693
Author(s): Tu-Thach Quach, Sapan Agarwal, Conrad D. James, Matthew J. Marinella, James B. Aimone

2019, Vol 8 (1), pp. 207-214
Author(s): David Thomas Marehn, Detlef Wilhelm, Heike Pospisil, Roberto Pizzoferrato

Abstract. The importance of software validation grows as the demands on the usability and suitability of software applications increase. In order to reduce costs and manage risk factors, more and more recommendations and rules have been established. In the field of pharmacy, vendors of so-called chromatography data systems (CDSs) have had to implement the guidelines of the Code of Federal Regulations Title 21 (CFR 21) during the last few years in order to fulfill these increasing requirements. CFR 21 part 11 deals with electronic records and signatures and is binding for every company in the regulated environment that wishes to create, edit, and sign electronic information instead of printing it on paper. Subsection CFR 21 part 11.10(h) explains how to perform an input check for manual user entries as well as for data collected from an external device. In this article we present an approach that applies the double entry method to data provided by the hardware instrument in order to investigate possible influences of the handling CDS on the raw data. A software tool was written that allows us to communicate with a high-performance liquid chromatography (HPLC) detector and acquire data from it. The communication is completely independent of the CDS, which is started separately and connected to the same system. Using this configuration, we made parallel data acquisition by two instances at the same time possible. Two CDSs were tested, and for at least one of them it was shown that the acquired data can be compared as in the double entry method for data verification. For the second CDS we checked whether the approach would be applicable after a few modifications. The given approach could be used either for live verification of produced raw data or as a single test during a software operational qualification to verify the data acquisition functionality of the software.
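A minimal sketch of the double-entry idea described above: two independently acquired copies of the same detector signal are aligned by sample index and compared within a tolerance, so any mismatch flags a possible alteration of the raw data by the handling CDS. The tolerance and the synthetic traces are assumptions for illustration, not the authors' tool.

```python
import numpy as np

def double_entry_check(trace_a: np.ndarray, trace_b: np.ndarray,
                       tol: float = 1e-6) -> bool:
    """Compare two independently acquired copies of the same detector signal.

    Returns True when every sample agrees within `tol`, mimicking the
    double entry method used for data verification.
    """
    if trace_a.shape != trace_b.shape:
        print("length mismatch:", trace_a.shape, trace_b.shape)
        return False
    mismatches = np.flatnonzero(np.abs(trace_a - trace_b) > tol)
    if mismatches.size:
        print(f"{mismatches.size} samples differ, first at index {mismatches[0]}")
        return False
    return True

# Synthetic stand-ins; in practice one trace would come from the CDS export and
# the other from the independent acquisition tool described in the article.
t = np.linspace(0.0, 10.0, 2001)
trace_tool = np.exp(-((t - 4.0) ** 2) / 0.1)     # an HPLC-like peak
trace_cds = trace_tool.copy()
trace_cds[1200] += 1e-3                          # simulate an altered sample
print("raw data verified:", double_entry_check(trace_tool, trace_cds))
```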


2021, Vol 2068 (1), pp. 012010
Author(s): Bolun Wang, Xin Jiang, Guanying Huo, Cheng Su, Dongming Yan, ...

Abstract. B-splines are widely used in the fields of reverse engineering and computer-aided design due to their superior properties. Traditional B-spline surface interpolation algorithms usually assume regularity of the data distribution. In this paper, we introduce a novel B-spline surface interpolation algorithm, KPI, which can interpolate sparsely and non-uniformly distributed data points. As a two-stage algorithm, our method first densifies the sparse data using Kriging and then uses the proposed KPI (Key-Point Interpolation) method to generate the control points. The algorithm can be extended to higher-dimensional data interpolation, such as reconstructing dynamic surfaces. We apply the method to interpolating the temperature of Shanxi Province. The generated dynamic surface accurately interpolates the temperature data provided by the weather stations, and the preserved dynamic characteristics can be useful for meteorology studies.
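To make the two-stage idea concrete, here is a minimal sketch in Python. It is not the authors' KPI implementation: the Kriging stage is replaced by scipy's RBFInterpolator as a readily available stand-in, the control-point construction is replaced by an ordinary bicubic spline fit on the densified grid, and the data are synthetic.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator, RectBivariateSpline

# Sparse, non-uniformly scattered samples of an unknown surface
# (a synthetic stand-in for the weather-station data).
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(60, 2))           # scattered (x, y) locations
vals = np.sin(pts[:, 0]) + 0.5 * np.cos(pts[:, 1])   # sampled values

# Stage 1: densify the scattered data onto a regular grid
# (the paper uses Kriging for this step).
gx = np.linspace(0.0, 10.0, 40)
gy = np.linspace(0.0, 10.0, 40)
GX, GY = np.meshgrid(gx, gy, indexing="ij")
grid_pts = np.column_stack([GX.ravel(), GY.ravel()])
dense = RBFInterpolator(pts, vals)(grid_pts).reshape(40, 40)

# Stage 2: fit a bicubic B-spline surface through the gridded data
# (a stand-in for the KPI control-point construction in the paper).
surface = RectBivariateSpline(gx, gy, dense, kx=3, ky=3)

# The fitted surface can now be evaluated anywhere in the domain.
print(surface.ev(3.3, 7.1))
```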


Author(s): W.M. Stobbs

I do not have access to the abstracts of the first meeting of EMSA, but at this, the 50th Anniversary meeting of the Electron Microscopy Society of America, I have an excuse to consider the historical origins of the approaches we take to the use of electron microscopy for the characterisation of materials. I have myself been actively involved in the use of TEM for the characterisation of heterogeneities for little more than half of that period. My own view is that it was between the 3rd International Meeting at London and the 1956 Stockholm meeting, the first of the European series, that the foundations of the approaches we now take to the characterisation of a material using the TEM were laid down. (This was 10 years before I took dynamical theory to be etched in stone.) It was at the 1956 meeting that Menter showed lattice resolution images of sodium faujasite and Hirsch, Horne and Whelan showed images of dislocations in the XIVth session on “metallography and other industrial applications”. I have always, incidentally, been delighted by the way the latter authors misinterpreted astonishingly clear thickness fringes in a beaten foil of Al as being contrast due to “large strains”, an error which they corrected with admirable rapidity as the theory developed. At the London meeting the research described covered a broad range of approaches, including many that are only now being rediscovered as worth further effort; however, such is the power of “the image” to persuade that the above two papers set trends which influence, perhaps too strongly, the approaches we take now. Menter was clear that the way the planes in his image tended to be curved was associated with the imaging conditions rather than with lattice strains, and yet it now seems to be common practice to assume that the dots in an “atomic resolution image” can faithfully represent the variations in atomic spacing at a localised defect. Even when the more reasonable approach is taken of matching the image details with a computed simulation for an assumed model, the non-uniqueness of the interpreted fit seems to be rather rarely appreciated. Hirsch et al., on the other hand, made a point of using their images to get numerical data on characteristics of the specimen they examined, such as its dislocation density, which would not be expected to be influenced by uncertainties in the contrast. Nonetheless the trends were set, with microscope manufacturers producing higher and higher resolution microscopes, while the users' blind faith in the image produced as a near directly interpretable representation of reality seems to have increased rather than been generally questioned. But if we want to test structural models we need numbers, and it is the analogue-to-digital conversion of the information in the image that is required.

