Systematic assessment of acquisition and data-processing parameters in the suspect screening of veterinary drugs in archive matrices using LC-HRMS

Author(s):  
Larissa J. M. Jansen ◽  
Rosalie Nijssen ◽  
Yvette J. C. Bolck ◽  
Robin S. Wegh ◽  
Milou G. M. van de Schans ◽  
...  
2020 ◽  
Vol 9 (10) ◽  
pp. 606

Author(s):  
Ahmed Karam ◽  
Thorbjørn M. Illemann ◽  
Kristian Hegner Reinau ◽  
Goran Vuk ◽  
Christian O. Hansen

To make the right decisions on investments, operations, and policies in the public road sector, decision makers need knowledge of truck traffic measures, such as average travel time and the frequency of trips between geographical zones. Private logistics companies collect large amounts of freight global positioning system (GPS) and shipment data every day. Processing such data can provide public decision makers with detailed freight traffic measures, which are necessary for making different planning decisions. The present paper proposes a system framework to be used in the research project “A new system for sharing data between logistics companies and public infrastructure authorities: improving infrastructure while maintaining competitive advantage”. Previous studies ignored the fact that the primary step for delivering valuable and usable data processing systems is to consider the end user’s needs when developing the system framework. Unlike existing studies, this paper develops the system framework through a user-centred design approach combining three main steps. The first step is to identify the specific traffic measures that satisfy the public decision makers’ planning needs. The second step identifies the different types of freight data required as inputs to the data processing system, while the third step illustrates the procedures needed to process the shared freight data. To do so, the current work employs literature review and user-needs identification methods in applying the user-centred approach. In addition, we conduct a systematic assessment of the coverage and sufficiency of the currently acquired data. Finally, we illustrate the detailed functionality of the data processing system and provide an application case demonstrating its procedures.
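
As a concrete illustration of the kind of processing such a system would perform, the minimal sketch below aggregates zone-to-zone trip records into two of the traffic measures named above: trip frequency and average travel time. The `Trip` schema and its field names are assumptions made for illustration only, not the project’s actual data model.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Trip:
    """A single truck trip reconstructed from GPS/shipment records (hypothetical schema)."""
    origin_zone: str        # geographical zone where the trip starts
    destination_zone: str   # geographical zone where the trip ends
    travel_time_min: float  # door-to-door travel time in minutes


def zone_to_zone_measures(trips):
    """Aggregate trip frequency and average travel time per origin-destination zone pair."""
    totals = defaultdict(lambda: [0, 0.0])  # (trip count, summed travel time)
    for t in trips:
        key = (t.origin_zone, t.destination_zone)
        totals[key][0] += 1
        totals[key][1] += t.travel_time_min
    return {
        key: {"trips": count, "avg_travel_time_min": time_sum / count}
        for key, (count, time_sum) in totals.items()
    }


if __name__ == "__main__":
    sample = [
        Trip("Zone A", "Zone B", 42.0),
        Trip("Zone A", "Zone B", 48.0),
        Trip("Zone B", "Zone C", 95.0),
    ]
    for od, measures in zone_to_zone_measures(sample).items():
        print(od, measures)
```

The trips themselves would first have to be reconstructed from raw GPS traces (for example by detecting stops), a step this sketch assumes has already been performed.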


Geophysics ◽  
1971 ◽  
Vol 36 (6) ◽  
pp. 1043-1073 ◽  
Author(s):  
William A. Schneider

The subject matter of this review paper pertains to developments during the past several years in the area of reflection seismic data processing and analysis. While this subject area is extensive in both its breadth and scope, one indisputable fact emerges: the computer is now more pervasive than ever. Processing areas which were computer intensive, such as signal enhancement, are now even more so; and those formerly exclusive domains of man, such as seismic interpretation, are beginning to feel the encroachment of the large number crunchers. What the future holds is anyone’s guess, but it is quite probable that man and computer will share the throne if the interactive seismic processing systems on the drawing boards come to pass. For the present and recent past, however, the most intensively developed areas of seismic data processing and analysis include 1) computer extraction of processing parameters such as stacking velocity and statics, 2) automated detection and tracking of reflections in multidimensional parameter space to provide continuous estimates of traveltime, amplitude, moveout (velocity), dip, etc., 3) direct digital migration in two dimensions, giving improved subsurface “pictures” and utilizing diffraction energy normally lost by specular processing techniques, and 4) development of quantitative understanding of the limitations imposed by current seismic processing practice and assumptions with regard to structural and stratigraphic model building, and recognition of the ultimate need for an iterative signal processing—information extraction—model building closed loop system.
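
One of the parameter-extraction tasks mentioned above, estimating stacking velocity, is commonly automated with a semblance scan over trial normal-moveout (NMO) velocities on a common-midpoint gather. The NumPy sketch below shows that generic technique, not the specific algorithms surveyed in the review; the gather layout and parameter names are assumed for illustration.

```python
import numpy as np


def nmo_traveltime(t0, offset, velocity):
    """Hyperbolic NMO traveltime for a reflection at zero-offset time t0."""
    return np.sqrt(t0**2 + (offset / velocity) ** 2)


def semblance_scan(gather, offsets, dt, velocities):
    """Scan trial stacking velocities and return semblance(t0, v) for one CMP gather.

    gather: 2-D array (n_samples, n_traces); offsets: metres; dt: sample interval (s).
    """
    n_samples, n_traces = gather.shape
    t0_axis = np.arange(n_samples) * dt
    semblance = np.zeros((n_samples, len(velocities)))
    for iv, v in enumerate(velocities):
        # NMO-correct each trace with the trial velocity, then measure coherence.
        corrected = np.zeros_like(gather)
        for itr, x in enumerate(offsets):
            t = nmo_traveltime(t0_axis, x, v)
            idx = np.clip(np.round(t / dt).astype(int), 0, n_samples - 1)
            corrected[:, itr] = gather[idx, itr]
        stack = corrected.sum(axis=1)
        power = (corrected**2).sum(axis=1)
        semblance[:, iv] = stack**2 / (n_traces * power + 1e-12)
    return semblance
```

Picking, at each zero-offset time, the velocity where semblance peaks yields a stacking-velocity function for that midpoint.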


Author(s):  
B. Boriak

Introduction. In this article I analyze the relationships between the parameters of an adaptive exponential smoothing filter and the quality of filtering and forecasting. The adaptation principle of the exponential filter is based on the method of least squares. Aims. To analyze the quality of the algorithm’s adaptation and define the conditions in which it can work appropriately, and to obtain information describing the connection between data-processing parameters and the quality of filtering and forecasting. Methodology. I applied concepts of time-series analysis and mathematical simulation in the Matlab package. Results. I obtained approximate values of the parameters that describe the conditions of the system or device into which this filtering and forecasting algorithm can be integrated, and assessed the quality of the smoothing-factor adaptation method by establishing relationships between the parameters. Originality. For the first time, I have defined the relationships between the RMS errors (of filtering and of forecasting) and the following parameters: the number of steps ahead for which the forecast is made, the number of steps the data-processing algorithm uses to estimate the quality of the filtering process, the value that defines the acceptable error, and the initial value of the smoothing factor. I also obtained information describing the connections between some of the parameters listed above. Practical value. I have built a structure for the data-processing algorithm that can be integrated into different automated control systems. This research makes it possible to choose parameters of the filtering and forecasting algorithm such that signals in measurement and control channels can be filtered and predicted. The proposed data-processing algorithm can be implemented as a program for a microcontroller.
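
The abstract does not reproduce the adaptation rule in detail, so the sketch below shows one plausible reading: simple exponential smoothing whose factor is re-selected, by a least-squares criterion over a recent window, whenever the RMS one-step error exceeds an acceptable threshold. The parameter names (`window`, `acceptable_rms`, `forecast_steps`), the candidate grid, and the adaptation trigger are all illustrative assumptions, not the author’s published algorithm.

```python
import numpy as np


def adaptive_exp_smoothing(signal, alpha0=0.3, window=20,
                           acceptable_rms=0.05, forecast_steps=5):
    """Exponential smoothing with a least-squares re-selection of the smoothing factor.

    Whenever the RMS of the last `window` one-step-ahead errors exceeds
    `acceptable_rms`, alpha is re-chosen to minimise the sum of squared
    one-step-ahead errors over that window (a least-squares criterion).
    """
    def sse_for_alpha(a, hist):
        level, sse = hist[0], 0.0
        for x in hist[1:]:
            sse += (x - level) ** 2
            level += a * (x - level)
        return sse

    alpha, level = alpha0, signal[0]
    smoothed, errors = [level], []
    grid = np.linspace(0.05, 0.95, 19)           # candidate smoothing factors
    for k in range(1, len(signal)):
        err = signal[k] - level                  # one-step-ahead error
        errors.append(err)
        if len(errors) >= window:
            recent = np.asarray(errors[-window:])
            if np.sqrt(np.mean(recent ** 2)) > acceptable_rms:
                hist = signal[k - window:k + 1]
                alpha = min(grid, key=lambda a: sse_for_alpha(a, hist))
        level += alpha * err                     # smoothing update
        smoothed.append(level)
    forecast = np.full(forecast_steps, level)    # flat multi-step forecast
    return np.asarray(smoothed), forecast, alpha
```

For simple exponential smoothing the multi-step forecast is flat, so the prediction `forecast_steps` ahead simply repeats the latest smoothed level; the lightweight structure is why such an algorithm can run on a microcontroller.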


2020 ◽  
Vol 6 (2) ◽  
pp. 101-112
Author(s):  
Syamsurijal Rasimeng ◽  
Amelia Isti Ekarena ◽  
Bagus Sapto Mulyanto ◽  
Subarsyah Subarsyah ◽  
Andrian Wilyan Djaja

Migration is one of the stages in seismic data processing aimed at returning diffraction effects to the actual reflector point. Seismic data processing is adjusted to the problems present in the data itself, so the choice of migration technique and the determination of data-processing parameters greatly affect the resulting seismic cross-section. Kirchhoff pre-stack time migration is one of the most widely used migration methods in seismic data processing because it yields better results than conventional stacking methods. The key parameter to consider in Kirchhoff migration is the migration aperture. Accordingly, migration aperture values of 75 m, 200 m, and 512.5 m were tested. The 512.5 m aperture yields the best seismic cross-section. This is evidenced by its ability to eliminate bow-tie effects around CDP 600 to CDP 800, to eliminate diffraction effects around CDP 3900 to CDP 4050, and to produce a seismic cross-section with better lateral resolution than the 75 m and 200 m apertures. Based on the migrated seismic cross-section, the identifiable geological structures are faults found at several CDPs.
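
To make the role of the aperture concrete, the sketch below implements a simplified constant-velocity, zero-offset Kirchhoff time-migration kernel in which diffraction energy is summed only within `aperture_m` metres of each output trace; a wider aperture lets the summation collapse steeper diffraction tails, which is consistent with the larger aperture resolving the bow-tie and diffraction effects better. This is a generic textbook kernel with assumed parameter names, not the authors’ pre-stack workflow.

```python
import numpy as np


def kirchhoff_time_migration(section, dx, dt, velocity, aperture_m):
    """Simplified zero-offset Kirchhoff time migration with an aperture limit.

    section: 2-D array (n_samples, n_traces) of stacked amplitudes;
    dx: trace spacing (m); dt: sample interval (s); velocity: constant
    migration velocity (m/s); aperture_m: half-aperture, the maximum lateral
    distance over which diffraction energy is summed. All values are illustrative.
    """
    n_t, n_x = section.shape
    migrated = np.zeros_like(section)
    max_traces = int(aperture_m / dx)            # aperture expressed in traces
    for ix in range(n_x):                        # output (image) trace
        for it in range(n_t):                    # output time sample t0
            t0 = it * dt
            total = 0.0
            lo, hi = max(0, ix - max_traces), min(n_x, ix + max_traces + 1)
            for jx in range(lo, hi):             # traces inside the aperture
                offset = (jx - ix) * dx
                # zero-offset diffraction hyperbola: t = sqrt(t0^2 + (2*offset/v)^2)
                t = np.sqrt(t0 ** 2 + (2.0 * offset / velocity) ** 2)
                jt = int(round(t / dt))
                if jt < n_t:
                    total += section[jt, jx]
            migrated[it, ix] = total
    return migrated
```

A full pre-stack implementation would additionally sum over source-receiver offsets and use a laterally and vertically varying velocity field rather than a single constant velocity.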

