Recording Studio: Data Acquisition and Data Processing

Author(s):  
Edilson de Aguiar
2012 ◽  
Vol 546-547 ◽  
pp. 1393-1397
Author(s):  
Zhi Wen Xiong ◽  
Chen Guang Xu ◽  
Hong Zeng

Data acquisition begins with the physical phenomenon or physical property to be measured, such as temperature, gas pressure, light intensity, force, or fluid flow. The measured quantity is converted into digital form and then stored, processed, displayed, or printed by a computer; the corresponding system is called a data acquisition system. With the rapid development of computer technology, data acquisition systems have quickly gained popularity, and a wide variety of products based on digital technology have been created. Digital systems have spread quickly mainly because of two advantages: digital processing is flexible and convenient, and digital systems are highly reliable. The main idea of reconfigurable computing technology [1] is to use FPGAs [2][3] to give the system a dynamically configurable capacity, making it suitable for harsh-environment applications and improving the speed of data processing. Dynamically reconfigurable FPGA devices allow the hardware logic functions to be modified in the field, so applying reconfigurable computing technology can increase data-processing speed. Data acquisition systems are widely applied in many fields and are often deployed in harsh working environments; reconfigurable computing technology can greatly improve their reliability and safety. This paper introduces a multi-channel data acquisition system based on the USB bus and an FPGA, discusses the factors affecting system performance, and describes how reconfigurable computing technology can be used to improve the efficiency of the data acquisition system while reducing energy consumption. The system described in this paper uses AD's AD9220, Altera's EP1C6-8, IDT's IDT70V24, and Cypress's CY7C68013.
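
As an illustration of how a host PC might pull sample blocks from such a board, the following is a minimal sketch using PyUSB; the paper gives no host-side code, so the vendor/product IDs, endpoint address and sample packing below are assumptions chosen for a generic Cypress FX2 (CY7C68013) firmware, not values from the system described above.

```python
# Hedged sketch: host-side bulk read from a CY7C68013 (Cypress FX2) based
# acquisition board using PyUSB. The IDs, endpoint and packing are assumptions.
import usb.core

VENDOR_ID = 0x04B4    # Cypress vendor ID (assumed default FX2 enumeration)
PRODUCT_ID = 0x8613   # unconfigured FX2LP product ID (real firmware may differ)
EP_IN = 0x86          # bulk-IN endpoint address (assumed)

dev = usb.core.find(idVendor=VENDOR_ID, idProduct=PRODUCT_ID)
if dev is None:
    raise RuntimeError("acquisition board not found")
dev.set_configuration()

def read_block(n_bytes=512, timeout_ms=1000):
    """Read one block of ADC samples from the bulk-IN endpoint."""
    raw = dev.read(EP_IN, n_bytes, timeout_ms)
    # The AD9220 delivers 12-bit samples; assume the firmware packs each
    # sample little-endian into a 16-bit word.
    return [raw[i] | (raw[i + 1] << 8) for i in range(0, len(raw), 2)]

print(read_block()[:8])
```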


Author(s):  
F. Tsai ◽  
T.-S. Wu ◽  
I.-C. Lee ◽  
H. Chang ◽  
A. Y. S. Su

This paper presents a data acquisition system consisting of multiple RGB-D sensors and digital single-lens reflex (DSLR) cameras. A systematic data processing procedure for integrating these two kinds of devices to generate three-dimensional point clouds of indoor environments is also developed and described. In the developed system, DSLR cameras are used to bridge the Kinects and provide a more accurate ray intersection condition, taking advantage of the higher resolution and image quality of the DSLR cameras. Structure from Motion (SFM) reconstruction is used to link and merge multiple Kinect point clouds and dense point clouds (from DSLR color images) to generate initial integrated point clouds. Bundle adjustment is then used to resolve the exterior orientation (EO) of all images, and these exterior orientations are used as initial values to combine the point clouds of each frame into the same coordinate system using a Helmert (seven-parameter) transformation. Experimental results demonstrate that the design of the data acquisition system and the data processing procedure can successfully generate dense and fully colored point clouds of indoor environments, even in featureless areas. The accuracy of the generated point clouds was evaluated by comparing the widths and heights of identified objects, as well as the coordinates of pre-set independent check points, against in situ measurements. Based on the generated point clouds, complete and accurate three-dimensional models of indoor environments can be constructed effectively.
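
For reference, a Helmert (seven-parameter) similarity transformation of the kind used above can be written as X' = t + s·R·X. The following minimal NumPy sketch applies one set of parameters to an (N, 3) point array; the parameter ordering and rotation convention are assumptions, not taken from the paper.

```python
# Hedged sketch of the Helmert (seven-parameter) similarity transformation used
# to bring per-frame point clouds into a common coordinate system.
import numpy as np

def helmert_transform(points, scale, rx, ry, rz, tx, ty, tz):
    """Apply X' = t + scale * Rz @ Ry @ Rx @ X to an (N, 3) point array."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx
    t = np.array([tx, ty, tz])
    return scale * points @ R.T + t

# Example: rotate three points by 10 degrees about Z and shift along X.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(helmert_transform(pts, 1.0, 0.0, 0.0, np.deg2rad(10), 0.5, 0.0, 0.0))
```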


2022 ◽  
Author(s):  
Abby Moore

This is a bogus workflow that demonstrates a sequence of resources covering sample preparation, data acquisition and data processing.


2018 ◽  
Vol 210 ◽  
pp. 05016
Author(s):  
Mariusz Chmielewski ◽  
Damian Frąszczak ◽  
Dawid Bugajewski

This paper discusses experiences and architectural concepts developed and tested for the acquisition and processing of biomedical data in a large-scale system for monitoring elderly patients. Major assumptions of the research included the use of wearable and mobile technologies supporting the maximum number of inertial and biomedical data streams to feed decision algorithms. Although medical diagnostics and decision algorithms were not the main aim of the research, this preliminary phase was crucial for testing the capabilities of existing off-the-shelf technologies and the functional responsibilities of the system's logic components. The architecture variants contained several schemes for data processing, moving the responsibility for signal feature extraction, data classification and pattern recognition from the wearable to the mobile device and up to server facilities. Analysis of transmission and processing delays revealed the pros and cons of each architecture variant and, most of all, provided knowledge about applicability in the medical, military and fitness domains. To evaluate and construct the architecture, a set of alternative technology stacks and quantitative measures was defined. The major architecture characteristics (high availability, scalability, reliability) were defined by imposing asynchronous processing of sensor data, efficient data representation, iterative reporting, event-driven processing, and restricted pulling operations. Sensor data processing persists the original data on handhelds but is mainly aimed at extracting a chosen set of signal features calculated over specific time windows, which vary with the analysed signals and the sensor data acquisition rates. Long-term monitoring of patients also requires mechanisms that probe the patient and, upon detecting anomalies or drastic changes in signal characteristics, tune the data acquisition process. This paper describes experiences connected with the design of a scalable decision support tool and evaluation techniques for the architectural concepts implemented within the mobile and server software.
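
A minimal sketch of the per-window signal-feature extraction described above follows; the window length, sampling rate and the particular feature set are illustrative assumptions rather than the configuration used in the study.

```python
# Hedged sketch of per-window feature extraction from a wearable sensor stream,
# the kind of processing the paper moves between wearable, mobile and server tiers.
import numpy as np

def window_features(signal, fs_hz=50, window_s=2.0):
    """Split a 1-D signal into non-overlapping windows and compute simple features."""
    n = int(fs_hz * window_s)
    features = []
    for start in range(0, len(signal) - n + 1, n):
        w = np.asarray(signal[start:start + n], dtype=float)
        features.append({
            "mean": w.mean(),
            "std": w.std(),
            "rms": np.sqrt(np.mean(w ** 2)),
            "peak_to_peak": w.max() - w.min(),
        })
    return features

# Example: 10 s of synthetic accelerometer magnitude sampled at 50 Hz.
t = np.arange(0, 10, 1 / 50)
accel = 1.0 + 0.2 * np.sin(2 * np.pi * 1.5 * t)
print(window_features(accel)[0])
```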


2004 ◽  
Vol 75 (10) ◽  
pp. 4261-4264 ◽  
Author(s):  
M. Ruiz ◽  
E. Barrera ◽  
S. López ◽  
D. Machón ◽  
J. Vega ◽  
...  

Geophysics ◽  
1981 ◽  
Vol 46 (8) ◽  
pp. 1088-1099 ◽  
Author(s):  
Robert B. Rice ◽  
Samuel J. Allen ◽  
O. James Gant ◽  
Robert N. Hodgson ◽  
Don E. Larson ◽  
...  

Advances in exploration geophysics have continued apace during the last six years. We have entered a new era of exploration maturity which will be characterized by the extension of our technologies to their ultimate limits of precision. In gravity and magnetics, new inertial navigation systems permit the very rapid helicopter‐supported land acquisition of precise surface gravity data which is cost‐effective in regions of severe topography. Considerable effort is being expended to obtain airborne gravity data via helicopter which is of exploration quality. Significant progress has also been made in processing and interpreting potential field data. The goal of deriving the maximum amount of accurate subsurface information from seismic data has led to much more densely sampled and precise 2- and 3-D land data acquisition techniques. Land surveying accuracy has been greatly improved. The number of individually recorded detector channels has been increased dramatically (up to 1024) in order to approximate much more accurately a point‐source, point‐detector system. Much more powerful compressional‐wave vibrators can now maintain full force while sweeping up or down from 5 Hz to over 200 Hz. In marine surveying, new streamer cables and shipboard instrumentation permit the recording and limited processing of 96 to 480 channels. Improvements have also been made in marine sources and arrays. The most important developments in seismic data processing—wave‐equation based imaging and inversion methods—may be the forerunners of a totally new processing methodology. Wave‐equation methods have been formulated for migration before and after stack, multiples suppression, datum and replacement statics, velocity estimation, and seismic inversion. Inversion techniques which provide detailed acoustic‐impedance or velocity estimates have found widespread commercial application. Wavelet processing has greatly expanded our stratigraphic analysis capabilities. Much more sophisticated 1-, 2-, and 3-D modeling techniques are being used effectively to guide data acquisition and processing, as direct interpretation aids, and to teach basic interpretation concepts. Some systems can now handle vertical and lateral velocity changes, inelastic attenuation, curved reflection horizons, transitional boundaries, time‐variant waveforms, ghosting, multiples, and array‐response effects. Improved seismic display formats and the extensive use of color have been valuable in data processing, modeling, and interpretation. Stratigraphic interpretation has evolved into three major categories: (1) macrostratigraphy, where regional and basinal depositional patterns are analyzed to describe the broad geologic depositional environment; (2) qualitative stratigraphy, where specific rock units and their properties are analyzed qualitatively to delineate lithology, porosity, structural setting, and areal extent and shape; and (3) quantitative stratigraphy, where anomalies are mapped at a specific facies level to define net porosity‐feet distribution, gas‐fluid contacts, and probable pore fill. In essence, what began as direct hydrocarbon‐indicator technology applicable primarily to Upper Tertiary clastics has now matured to utility in virtually every geologic province. Considerable effort has been expended on the direct generation and recording of shear waves in an attempt to obtain more information about stratigraphy, porosity, and oil and gas saturation. 
Seismic service companies now offer shear‐wave prospecting using vibrator, horizontal‐impact, or explosive sources. Well logging has seen the acceleration of computerization. Wellsite tape recorders and minicomputers with relatively simple interpretation algorithms are routinely available. More sophisticated computerized interpretation methods are offered as a service at data processing centers.
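
As a concrete instance of the acoustic-impedance inversion mentioned above, the following minimal sketch applies the standard recursive relation Z(i+1) = Z(i) · (1 + r(i)) / (1 - r(i)) to a reflection-coefficient series; it is a textbook formulation, not any particular commercial implementation from the period.

```python
# Hedged sketch of recursive acoustic-impedance inversion from a
# reflection-coefficient series (textbook formulation, illustrative values).
import numpy as np

def impedance_from_reflectivity(r, z0):
    """Given reflection coefficients r_i and a starting impedance z0,
    return the impedance series via Z_{i+1} = Z_i * (1 + r_i) / (1 - r_i)."""
    z = [z0]
    for ri in np.asarray(r, dtype=float):
        z.append(z[-1] * (1.0 + ri) / (1.0 - ri))
    return np.array(z)

# Example: a small reflectivity series and an assumed starting impedance
# (units of kg m^-2 s^-1, chosen only for illustration).
print(impedance_from_reflectivity([0.05, -0.02, 0.10], z0=2.5e6))
```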


2016 ◽  
Vol 49 (3) ◽  
pp. 1035-1041 ◽  
Author(s):  
Takanori Nakane ◽  
Yasumasa Joti ◽  
Kensuke Tono ◽  
Makina Yabashi ◽  
Eriko Nango ◽  
...  

A data processing pipeline for serial femtosecond crystallography at SACLA was developed, based on Cheetah [Barty et al. (2014). J. Appl. Cryst. 47, 1118–1131] and CrystFEL [White et al. (2016). J. Appl. Cryst. 49, 680–689]. The original programs were adapted for data acquisition through the SACLA API, thread and inter-node parallelization, and efficient image handling. The pipeline consists of two stages: the first, online stage can analyse all images in real time, with a latency of less than a few seconds, to provide feedback on hit rate and detector saturation. The second, offline stage converts hit images into HDF5 files and runs CrystFEL for indexing and integration. The size of the filtered compressed output is comparable to that of a synchrotron data set. The pipeline enables real-time feedback and rapid structure solution during beamtime.
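
To make the online stage concrete, the following is a minimal sketch of the kind of per-frame hit finding and saturation flagging such a stage performs; the thresholds and the bright-pixel criterion are illustrative assumptions, not the actual Cheetah/CrystFEL parameters used at SACLA.

```python
# Hedged sketch of online per-frame monitoring: hit finding and detector
# saturation flagging. Threshold values are illustrative assumptions.
import numpy as np

def analyse_frame(image, photon_threshold=500, min_bright_pixels=20,
                  saturation_level=16000):
    """Classify one detector frame as hit/miss and report the saturated fraction."""
    bright = np.count_nonzero(image > photon_threshold)
    is_hit = bright >= min_bright_pixels
    saturated_fraction = np.count_nonzero(image >= saturation_level) / image.size
    return is_hit, saturated_fraction

# Example: random background plus a small block of bright "peak" pixels.
frame = np.random.poisson(10, size=(512, 512)).astype(np.int32)
frame[100:105, 200:205] = 4000
print(analyse_frame(frame))
```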


The 197 lb British-built Ariel III satellite was put into a near-Earth circular orbit on 5 May 1967 by a Scout rocket as part of the joint Anglo-American cooperative space research programme. The scientific objectives were a continuation and extension of the investigations of ionospheric, atmospheric and space phenomena undertaken on Ariel I and Ariel II. As an introduction to the ‘Ariel III discussion meeting’ on 24 April 1968, the following aspects of the programme were described: (a) the scientific objectives and how they were related to the earlier work on Ariel I and Ariel II; (b) the general management structure of the project and the division of responsibilities; (c) the orbit and launch window requirements; (d) the performance of the satellite in orbit; (e) the organization and effectiveness of the data acquisition and data processing activities.

