AYDAR, Special Purpose Analog Machine for Raw Data Reduction

1958 ◽  
Vol 5 (1) ◽  
pp. 89-99
Author(s):  
Serge J. Zaroodny ◽  
Tadeusz Leser

2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Abhik Datta ◽  
Kian Fong Ng ◽  
Deepan Balakrishnan ◽  
Melissa Ding ◽  
See Wee Chee ◽  
...  

Fast, direct electron detectors have significantly improved the spatio-temporal resolution of electron microscopy movies. Preserving both spatial and temporal resolution in extended observations, however, requires storing prohibitively large amounts of data. Here, we describe an efficient and flexible data reduction and compression scheme (ReCoDe) that retains both spatial and temporal resolution by preserving individual electron events. Running ReCoDe on a workstation, we demonstrate on-the-fly reduction and compression of raw data streaming off a detector at 3 GB/s, for hours of uninterrupted data collection. The output was 100-fold smaller than the raw data and saved directly onto network-attached storage drives over a 10 GbE connection. We discuss calibration techniques that support electron detection and counting (e.g., estimating electron backscattering rates, false-positive rates, and data compressibility), and novel data analysis methods enabled by ReCoDe (e.g., recalibration of data post-acquisition, and accurate estimation of coincidence loss).
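The event-preserving idea described in the abstract can be sketched in miniature. The snippet below is an illustrative encoding, not the actual ReCoDe format: a counted, sparse frame is stored as losslessly compressed electron-event coordinates instead of full pixel data, which is where the large reduction factor for sparse data comes from.

```python
# Illustrative sketch of event-based reduction for electron-counted frames
# (NOT the actual ReCoDe on-disk format): a sparse binary frame is stored
# as zlib-compressed (row, col) event coordinates.
import zlib
import numpy as np

def reduce_frame(frame: np.ndarray) -> bytes:
    """Encode a binary electron-counted frame as compressed event coordinates."""
    rows, cols = np.nonzero(frame)
    coords = np.stack([rows, cols], axis=1).astype(np.uint16)
    return zlib.compress(coords.tobytes())

def restore_frame(blob: bytes, shape: tuple) -> np.ndarray:
    """Invert reduce_frame: decompress coordinates and rebuild the frame."""
    coords = np.frombuffer(zlib.decompress(blob), dtype=np.uint16).reshape(-1, 2)
    frame = np.zeros(shape, dtype=np.uint8)
    frame[coords[:, 0], coords[:, 1]] = 1
    return frame

rng = np.random.default_rng(0)
frame = (rng.random((512, 512)) < 0.001).astype(np.uint8)  # ~0.1% occupancy
blob = reduce_frame(frame)
assert np.array_equal(restore_frame(blob, frame.shape), frame)  # lossless
print(f"raw {frame.nbytes} B -> reduced {len(blob)} B")
```

At this occupancy the event list is orders of magnitude smaller than the dense frame; the reduction is lossless because every individual event position is preserved.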


2002 ◽  
Vol 185 ◽  
pp. 624-625
Author(s):  
E. Poretti ◽  
D. Buzasi ◽  
R. Laher ◽  
J. Catanzarite ◽  
T. Conrow

Soon after launch in March 1999, the primary science instrument onboard the Wide-Field Infrared Explorer (WIRE) satellite failed due to loss of coolant. However, it proved possible to begin an asteroseismology program using the 52-mm aperture star camera. A few bright stars were monitored with the 512×512 SITe CCD in a bandpass approximately equivalent to V + R; further details about the orbit, the detector, and the raw data reduction can be found in Buzasi et al. (2000) and Buzasi (2000). We included the binary star θ2 Tauri among the targets. It is composed of an A7IV primary and an A5V secondary (P = 140.728 d). The primary is a δ Scuti star which has been observed several times in the last twenty years. Five terms have been determined in its light curve (Breger, 1989 and references therein); Li et al. (1997) demonstrated the amplitude variability of some of these terms by comparing different campaigns.
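Extracting the "terms" of such a light curve amounts to fitting a sum of sinusoids at the pulsation frequencies. A minimal sketch of that fit follows; the frequencies and amplitudes are synthetic, not the published θ2 Tauri values.

```python
# Illustrative least-squares fit of a multi-term sinusoidal model to a
# light curve, as used to characterise delta Scuti pulsation terms.
# Frequencies/amplitudes here are synthetic, not the theta2 Tau solution.
import numpy as np

def fit_sinusoids(t, y, freqs):
    """Fit y(t) = c0 + sum_k [a_k sin(2 pi f_k t) + b_k cos(2 pi f_k t)]."""
    cols = [np.ones_like(t)]
    for f in freqs:
        cols.append(np.sin(2 * np.pi * f * t))
        cols.append(np.cos(2 * np.pi * f * t))
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    # Amplitude of each term from its sine/cosine coefficients.
    amps = [np.hypot(coef[1 + 2 * k], coef[2 + 2 * k]) for k in range(len(freqs))]
    return coef, amps

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10, 500))       # unevenly sampled times (days)
freqs = [13.2, 13.7]                       # trial frequencies, cycles/day
y = 0.02 * np.sin(2 * np.pi * 13.2 * t) + 0.01 * np.cos(2 * np.pi * 13.7 * t)
y += rng.normal(0, 0.002, t.size)          # photometric noise
coef, amps = fit_sinusoids(t, y, freqs)
print([round(a, 3) for a in amps])         # recovered amplitudes, ~[0.02, 0.01]
```

Comparing amplitudes recovered from separate campaigns in this way is exactly how amplitude variability of individual terms is demonstrated.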


2017 ◽  
pp. 15-20
Author(s):  
V. V. Lypchuk ◽  
О. M. Krupa

The article is devoted to the problem of data reduction as an important step in ensuring the reliability and efficiency of socio-economic studies. Through reduction, the large amounts of raw data generated from different sources become more useful, convenient, and clear for use. Meanwhile, data reduction is not treated as a separate phase of study in national statistical practice. The aim of the article is to substantiate the importance of data reduction in economic studies and to systematize and generalize the essence and components of the data-reduction phase, as well as the ways of implementing them. The study is based on methods of theoretical generalization, abstract-logical reasoning, analogy, and others. The essence of data reduction is defined as the process of converting raw data into clean form and reducing the number of unit attributes (features) that are not significant for further analysis. In effect, this is the part of the analysis that involves selecting the data most important from the viewpoint of the study's goals. The significance of data reduction in economic studies is outlined: it ensures the validity of results, reduces their time and cost, simplifies the representational complexity of the problem being addressed, eliminates errors and redundant data from the investigated set, and relaxes the requirements on computational tools. The data resulting from reduction are much more informative, and many dependencies and relationships become more readable (visual). It is emphasized that reduction applies to current data (on-line) as well as to historical data (off-line) contained in already created databases. The phases of data reduction are described: control of data collection, data editing, classification, data construction and grouping, and coding and transmission (transfer of the data to the processing tools, i.e. computers). 
The data reduction techniques and methods most common in global practice are surveyed. Future studies of data reduction are expected to focus on ways to implement its advanced methods in the domestic practice of statistical science, which would significantly enhance the speed and efficiency of economic analysis and the reliability of its results.
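The editing, grouping, and coding phases listed above can be sketched on a toy record set. All field names, validity rules, and code mappings below are hypothetical, chosen only to make each phase concrete.

```python
# Minimal sketch of three data-reduction phases from the abstract:
# editing (drop invalid records), grouping (bin a continuous attribute),
# and coding (map string categories to compact numeric codes).
raw = [
    {"region": "west", "income": 1200},
    {"region": "east", "income": -50},   # invalid: negative income
    {"region": "west", "income": 3400},
    {"region": "east", "income": 2100},
]

# 1. Editing: remove records that fail a validity rule.
edited = [r for r in raw if r["income"] >= 0]

# 2. Grouping: reduce the continuous attribute to interval classes.
def income_group(value):
    return "low" if value < 2000 else "high"

# 3. Coding: replace string categories with numeric codes for processing.
region_codes = {"west": 0, "east": 1}

reduced = [
    {"region": region_codes[r["region"]], "income": income_group(r["income"])}
    for r in edited
]
print(reduced)
```

The reduced set is smaller, error-free by construction, and ready for transmission to the processing tools, which is precisely the role the abstract assigns to this phase.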


2020 ◽  
Vol 635 ◽  
pp. A90
Author(s):  
Anthony Berdeu ◽  
Ferréol Soulez ◽  
Loïc Denis ◽  
Maud Langlois ◽  
Éric Thiébaut

Context. Improvements in large-format detectors have enabled the development of integral field spectrographs (IFSs) in astronomy. Spectral information for each spatial element of a two-dimensional field of view is obtained thanks to integral field units that spread the spectra over the 2D grid of the sensor. Aims. Here we aim to solve the inherent issues raised by standard data-reduction algorithms based on direct mapping of the 2D + λ data cube: the spectral cross-talk due to the overlap of neighbouring spectra, the spatial correlations of the noise due to the re-interpolation of the cube on a Cartesian grid, and the artefacts due to the influence of defective pixels. Methods. The proposed method, Projection, Interpolation, and Convolution (PIC), is based on an “inverse-problems” approach. By accounting for the overlap of neighbouring spectra as well as the spatial extension of a given wavelength within a spectrum, the model inversion reduces the spectral cross-talk while deconvolving the spectral dispersion. Treated as missing data, defective pixels undetected during calibration are discarded on the fly via a robust penalisation of the data-fidelity term. Results. The calibration of the proposed model is presented for the Spectro-Polarimetric High-contrast Exoplanet REsearch instrument (SPHERE). This calibration was applied to extended objects as well as to coronagraphic acquisitions dedicated to exoplanet detection or disc imaging. Artefacts due to badly corrected defective pixels, and artificial background structures observed in cubes reduced by the SPHERE data-reduction pipeline, are suppressed, while the reconstructed spectra are sharper. This reduces false detections by standard exoplanet-detection algorithms. Conclusions. These results show the pertinence of the inverse-problems approach for reducing the raw data produced by IFSs and for compensating for some of their imperfections. Our modelling forms an initial building block for methods that reconstruct and/or detect sources directly from the raw data.
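The core inverse-problems mechanism, fitting a forward model to the raw data while a robust data-fidelity term down-weights defective pixels as if they were missing, can be sketched on a toy linear model. This is an illustration of iteratively reweighted least squares with Huber-type weights, not the actual PIC algorithm or SPHERE forward model.

```python
# Sketch of robust inversion of a linear forward model y = H x + noise:
# iteratively reweighted least squares (IRLS) with Huber-type weights,
# so that grossly deviant ("defective") pixels are effectively discarded.
# Toy model only -- not the PIC algorithm or the SPHERE instrument model.
import numpy as np

def robust_solve(H, y, delta=0.1, iters=10):
    """IRLS: minimise a Huber penalty on the residuals y - H x."""
    x = np.linalg.lstsq(H, y, rcond=None)[0]      # ordinary LS start
    for _ in range(iters):
        r = y - H @ x
        # Huber weights: 1 inside the quadratic zone, delta/|r| outside.
        w = np.where(np.abs(r) <= delta, 1.0,
                     delta / np.maximum(np.abs(r), 1e-12))
        sw = np.sqrt(w)                            # weighted LS via scaling
        x = np.linalg.lstsq(H * sw[:, None], sw * y, rcond=None)[0]
    return x

rng = np.random.default_rng(0)
H = rng.normal(size=(200, 20))       # forward model: pixels <- spectrum
x_true = rng.normal(size=20)
y = H @ x_true + rng.normal(0, 0.01, 200)
y[::25] += 5.0                       # a few "defective" pixels
x_hat = robust_solve(H, y)
print(np.max(np.abs(x_hat - x_true)))  # small despite the outliers
```

The outlier pixels end up with weights near delta/5, so they barely influence the solution; a plain least-squares fit to the same data would be visibly biased by them.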


1994 ◽  
Author(s):  
Klaus Strodl ◽  
Ursula C. Benz ◽  
Alberto Moreira
1962 ◽  
Vol 17 (9) ◽  
pp. 657-658 ◽  
Author(s):  
Leroy Wolins
