A New Method of Low Amplitude Signal Detection and Its Application in Acoustic Emission

2019 ◽  
Vol 10 (1) ◽  
pp. 73 ◽  
Author(s):  
Einar Agletdinov ◽  
Dmitry Merson ◽  
Alexei Vinogradov

A novel methodology is proposed to enhance the reliability of detection of low-amplitude transients in a noisy time series. Such time series often arise in a wide range of practical situations where different sensors are used for condition monitoring of mechanical systems, integrity assessment of industrial facilities and/or microseismicity studies. In all these cases, the early and reliable detection of possible damage is of paramount importance and is practically limited by the detectability of transient signals against a background of random noise. The proposed triggering algorithm is based on a logarithmic derivative of the power spectral density function. It was tested on synthetic data mimicking an actual ultrasonic acoustic emission signal recorded continuously at different signal-to-noise ratios (SNRs). Comparative tests demonstrate considerable advantages of the proposed method over the established fixed amplitude threshold and STA/LTA (Short Time Average / Long Time Average) techniques.
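For context on the baseline used in the comparative tests, a minimal STA/LTA trigger can be sketched as follows. This is an illustrative NumPy version only: the squared-amplitude characteristic function, centered averaging windows, and the window/threshold values are assumptions for the sketch, and the paper's own log-derivative-of-PSD algorithm is not reproduced here.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Classical STA/LTA ratio for a 1-D signal (characteristic function = x**2)."""
    cf = np.asarray(x, float) ** 2
    sta = np.convolve(cf, np.ones(n_sta) / n_sta, mode="same")  # short-time average
    lta = np.convolve(cf, np.ones(n_lta) / n_lta, mode="same")  # long-time average
    eps = 1e-12  # avoid division by zero on silent segments
    return sta / (lta + eps)

def trigger_onsets(ratio, threshold):
    """Indices where the STA/LTA ratio first crosses the threshold upward."""
    above = ratio >= threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1
```

A transient buried in noise raises the short-window average much faster than the long-window average, so an upward threshold crossing of the ratio flags a candidate onset.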

2017 ◽  
Author(s):  
Frank Oppermann ◽  
Thomas Günther

Abstract. We present a new versatile datalogger that can be used for a wide range of possible applications in geosciences. It is adjustable in signal strength and sampling frequency, battery-saving, and can be controlled remotely over a Global System for Mobile Communication (GSM) connection so that it saves running costs, particularly in monitoring experiments. The internet connection allows for checking functionality, controlling schedules and optimizing preamplification. We mainly use it for large-scale Electrical Resistivity Tomography (ERT), where it independently registers voltage time series on three channels while a square-wave current is injected. For the analysis of these time series we present a new approach that is based on the Lock-In (LI) method, mainly known from electronic circuits. The method searches the working point (phase) using three different functions based on a mask signal, and determines the amplitude using a direct current (DC) correlation function. We use synthetic data with different types of noise to compare the new method with existing approaches, i.e. selective stacking and a modified Fast Fourier Transformation (FFT) based approach that assumes a 1/f noise characteristic. All methods give comparable results, the LI being better than the well-established stacking method. The FFT approach can be even better, but only if the noise strictly follows the assumed characteristic. If overshoots are present in the data, which is typical in the field, FFT performs worse even with good data, which is why we conclude that the new LI approach is the most robust solution. This is also confirmed by a field data set from a long 2D ERT profile.


2006 ◽  
Vol 45 (3) ◽  
pp. 476-490 ◽  
Author(s):  
Sue Ellen Haupt ◽  
George S. Young ◽  
Christopher T. Allen

Abstract A methodology for characterizing emission sources is presented that couples a dispersion and transport model with a pollution receptor model. This coupling allows the use of the backward (receptor) model to calibrate the forward (dispersion) model, potentially across a wide range of meteorological conditions. Moreover, by using a receptor model one can calibrate from observations taken in a multisource setting. This approach offers practical advantages over calibrating via single-source artificial release experiments. A genetic algorithm is used to optimize the source calibration factors that couple the two models. The ability of the genetic algorithm to correctly couple these two models is demonstrated for two separate source–receptor configurations using synthetic meteorological and receptor data. The calibration factors underlying the synthetic data are successfully reconstructed by this optimization process. A Monte Carlo technique is used to compute error bounds for the resulting estimates of the calibration factors. By creating synthetic data with random noise, it is possible to quantify the robustness of the model's results in the face of variability. When white noise is incorporated into the synthetic pollutant signal at the receptors, the genetic algorithm is still able to compute the calibration factors of the coupled model down to a signal-to-noise ratio of about 2. Beyond that level of noise, the average of many coupled model optimization runs still provides a reasonable estimate of the calibration factor until the noise is an order of magnitude greater than the signal. The calibration factor linking the dispersion to the receptor model provides an estimate of the uncertainty in the combined monitoring and modeling process. This approach acknowledges the mismatch between ensemble-average dispersion modeling and the single realization represented by a time series of monitored data.
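The genetic-algorithm coupling can be illustrated with a toy setup: assume receptor concentrations are a linear transfer matrix applied to calibrated source emissions, and evolve the calibration factors to match the observations. The transfer matrix, GA operators and hyperparameters below are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def fitness(c, transfer, emissions, observed):
    """Negative squared error between modeled and observed receptor concentrations."""
    modeled = transfer @ (c * emissions)
    return -np.sum((modeled - observed) ** 2)

def ga_calibrate(transfer, emissions, observed, pop=60, gens=200, seed=0):
    """Evolve source calibration factors with a simple real-coded GA."""
    rng = np.random.default_rng(seed)
    n = len(emissions)
    P = rng.uniform(0.0, 3.0, (pop, n))            # candidate calibration factors
    for _ in range(gens):
        f = np.array([fitness(c, transfer, emissions, observed) for c in P])
        elite = P[np.argsort(f)[::-1][: pop // 2]]  # truncation selection
        parents = elite[rng.integers(0, len(elite), (pop, 2))]
        alpha = rng.random((pop, n))
        P = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
        P += rng.normal(0, 0.05, P.shape)           # Gaussian mutation
        P[0] = elite[0]                             # elitism: keep current best
    f = np.array([fitness(c, transfer, emissions, observed) for c in P])
    return P[np.argmax(f)]
```

The Monte Carlo error bounds described in the abstract would amount to repeating this optimization over many noisy realizations of `observed` and summarizing the spread of the returned factors.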


2018 ◽  
Vol 7 (1) ◽  
pp. 55-66 ◽  
Author(s):  
Frank Oppermann ◽  
Thomas Günther

Abstract. We present a new versatile datalogger that can be used for a wide range of possible applications in geosciences. It is adjustable in signal strength and sampling frequency, battery saving and can remotely be controlled over a Global System for Mobile Communication (GSM) connection so that it saves running costs, particularly in monitoring experiments. The internet connection allows for checking functionality, controlling schedules and optimizing pre-amplification. We mainly use it for large-scale electrical resistivity tomography (ERT), where it independently registers voltage time series on three channels, while a square-wave current is injected. For the analysis of these time series we present a new approach that is based on the lock-in (LI) method, mainly known from electronic circuits. The method searches the working point (phase) using three different functions based on a mask signal, and determines the amplitude using a direct current (DC) correlation function. We use synthetic data with different types of noise to compare the new method with existing approaches, i.e. selective stacking and a modified fast Fourier transformation (FFT)-based approach that assumes a 1∕f noise characteristic. All methods give comparable results, but the LI is better than the well-established stacking method. The FFT approach can be even better, but only if the noise strictly follows the assumed characteristic. If overshoots are present in the data, which is typical in the field, FFT performs worse even with good data, which is why we conclude that the new LI approach is the most robust solution. This is also confirmed by a field data set from a long 2-D ERT profile.
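The lock-in idea described above, searching the working point (phase) with a mask signal and then estimating the amplitude by a DC correlation, can be sketched for a square-wave injection. This is a simplified single-mask version; the paper's three different search functions are not reproduced here, and the period is assumed known.

```python
import numpy as np

def lockin_amplitude(signal, period):
    """Estimate phase and amplitude of a square-wave component of known period.

    Scans the working point (phase) by correlating with a ±1 mask signal,
    then estimates the amplitude with a DC correlation at the best phase.
    """
    n = len(signal)
    t = np.arange(n)
    best_phase, best_corr = 0, -np.inf
    for phase in range(period):
        mask = np.where(((t + phase) % period) < period // 2, 1.0, -1.0)
        corr = signal @ mask                     # phase search by correlation
        if corr > best_corr:
            best_corr, best_phase = corr, phase
    mask = np.where(((t + best_phase) % period) < period // 2, 1.0, -1.0)
    amplitude = np.mean(signal * mask)           # DC correlation at working point
    return best_phase, amplitude
```

Because the noise averages out in the correlation while the square wave accumulates coherently, the amplitude estimate stays accurate even at low SNR.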


2021 ◽  
Vol 13 (16) ◽  
pp. 3069
Author(s):  
Yadong Liu ◽  
Junhwan Kim ◽  
David H. Fleisher ◽  
Kwang Soo Kim

Seasonal forecasts of crop yield are important components of agricultural policy decisions and farmer planning. A wide range of input data are often needed to forecast crop yield in a region where sophisticated approaches such as machine learning and process-based models are used. This requires considerable effort for data preparation in addition to identifying data sources. Here, we propose a simpler approach called the Analogy-Based Crop-yield (ABC) forecast scheme to make timely and accurate predictions of regional crop yield using a minimum set of inputs. In the ABC method, a growing season from a prior long-term period, e.g., 10 years, is first identified as analogous to the current season by the use of a similarity index based on time-series leaf area index (LAI) patterns. Crop yield in the given growing season is then forecast using the weighted average of yields reported in the analogous seasons for the area of interest. The ABC approach was used to predict corn and soybean yields in the Midwestern U.S. at the county level for the period 2017–2019. MOD15A2H, a satellite LAI data product, was used to compile the inputs. The mean absolute percentage error (MAPE) of the crop yield forecasts was <10% for corn and soybean in each growing season when the LAI time series from day of year 89 to 209 was used as input to the ABC approach. The prediction error of the ABC approach was comparable to results from a deep neural network model in a previous study that relied on soil and weather data as well as satellite data. These results indicate that the ABC approach allows for crop yield forecasts with a lead time of at least two months before harvest. In particular, the ABC scheme would be useful for regions where crop yield forecasts are limited by the availability of reliable environmental data.
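A stripped-down sketch of the ABC idea, ranking historical seasons by similarity of their LAI time series and averaging their reported yields with similarity weights, might look like this. The inverse-RMSE similarity index and the choice of k analogous seasons are assumptions for illustration, not necessarily the authors' exact formulation.

```python
import numpy as np

def abc_forecast(lai_current, lai_history, yields_history, k=3):
    """Analogy-based yield forecast from LAI time series (illustrative sketch).

    lai_history: (seasons, timesteps) array; yields_history: (seasons,) array.
    """
    rmse = np.sqrt(np.mean((lai_history - lai_current) ** 2, axis=1))
    nearest = np.argsort(rmse)[:k]            # the k most analogous seasons
    w = 1.0 / (rmse[nearest] + 1e-9)          # closer seasons weigh more
    return np.sum(w * yields_history[nearest]) / np.sum(w)

def mape(y_true, y_pred):
    """Mean absolute percentage error, the evaluation metric used above."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
```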


2021 ◽  
Vol 13 (15) ◽  
pp. 2967
Author(s):  
Nicola Acito ◽  
Marco Diani ◽  
Gregorio Procissi ◽  
Giovanni Corsini

Atmospheric compensation (AC) allows the retrieval of the reflectance from the measured at-sensor radiance and is a fundamental and critical task for the quantitative exploitation of hyperspectral data. Recently, a learning-based (LB) approach, named LBAC, has been proposed for the AC of airborne hyperspectral data in the visible and near-infrared (VNIR) spectral range. LBAC makes use of a parametric regression function whose parameters are learned by a strategy based on synthetic data that accounts for (1) a physics-based model for the radiative transfer, (2) the variability of the surface reflectance spectra, and (3) the effects of random noise and spectral miscalibration errors. In this work we extend LBAC with respect to two different aspects: (1) the platform for data acquisition and (2) the spectral range covered by the sensor. Particularly, we propose the extension of LBAC to spaceborne hyperspectral sensors operating in the VNIR and short-wave infrared (SWIR) portions of the electromagnetic spectrum. We specifically refer to the sensor of the PRISMA (PRecursore IperSpettrale della Missione Applicativa) mission, the recent Earth Observation mission of the Italian Space Agency that offers a great opportunity to improve knowledge of the scientific and commercial applications of spaceborne hyperspectral data. In addition, we introduce a curve-fitting procedure for the estimation of the column water vapor content of the atmosphere that directly exploits the reflectance data provided by LBAC. Results obtained on four different PRISMA hyperspectral images are presented and discussed.
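The learning-based principle, fitting a regression from simulated at-sensor radiance to surface reflectance on synthetic pairs and then applying it to new data, can be caricatured with a linear toy forward model. LBAC's actual parametric regression, radiative-transfer model and noise treatment are far richer; every function name and value below is an illustrative assumption.

```python
import numpy as np

def simulate_radiance(reflectance, path_radiance, gain):
    """Toy linear radiative-transfer forward model: L = L_path + g * rho."""
    return path_radiance + gain * reflectance

def fit_ac_regression(radiance, reflectance):
    """Learn a linear inverse mapping rho = a*L + b from synthetic pairs
    (a toy stand-in for the parametric regression function learned in LBAC)."""
    a, b = np.polyfit(radiance, reflectance, 1)
    return a, b

def apply_ac(radiance, a, b):
    """Apply the learned atmospheric compensation to new radiance data."""
    return a * radiance + b
```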


2019 ◽  
Vol 12 (11) ◽  
pp. 4661-4679 ◽  
Author(s):  
Bin Cao ◽  
Xiaojing Quan ◽  
Nicholas Brown ◽  
Emilie Stewart-Jones ◽  
Stephan Gruber

Abstract. Simulations of land-surface processes and phenomena often require driving time series of meteorological variables. Corresponding observations, however, are unavailable in most locations, even more so when considering the duration, continuity and data quality required. Atmospheric reanalyses provide global coverage of relevant meteorological variables, but their use is largely restricted to grid-based studies. This is because technical challenges limit the ease with which reanalysis data can be applied to models at the site scale. We present the software toolkit GlobSim, which automates the downloading, interpolation and scaling of different reanalyses – currently ERA5, ERA-Interim, JRA-55 and MERRA-2 – to produce meteorological time series for user-defined point locations. The resulting data have consistent structure and units to efficiently support ensemble simulation. The utility of GlobSim is demonstrated using an application in permafrost research. We perform ensemble simulations of ground-surface temperature for 10 terrain types in a remote tundra area in northern Canada and compare the results with observations. Simulation results reproduced seasonal cycles and variation between terrain types well, demonstrating that GlobSim can support efficient land-surface simulations. Ensemble means often yielded better accuracy than individual simulations, and ensemble ranges additionally provide indications of uncertainty arising from uncertain input. By improving the usability of reanalyses for research requiring time series of climate variables for point locations, GlobSim can enable a wide range of simulation studies and model evaluations that previously were impeded by technical hurdles in obtaining suitable data.
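Once per-reanalysis simulations share a consistent structure and units, as GlobSim's output does, combining them into an ensemble mean and range is straightforward. A minimal sketch (the dictionary keys simply mirror the four reanalyses named above; the values would be simulated time series on a common time axis):

```python
import numpy as np

def ensemble_stats(series):
    """Ensemble mean and range across per-reanalysis simulated time series.

    `series` maps reanalysis name -> 1-D array; all arrays must share
    the same time axis.
    """
    stacked = np.vstack(list(series.values()))
    mean = stacked.mean(axis=0)                          # ensemble mean
    spread = stacked.max(axis=0) - stacked.min(axis=0)   # uncertainty indicator
    return mean, spread
```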


1989 ◽  
Vol 111 (3) ◽  
pp. 199-205 ◽  
Author(s):  
S. Y. Liang ◽  
D. A. Dornfeld

This paper discusses the monitoring of cutting tool wear based on time series analysis of acoustic emission signals. In cutting operations, acoustic emission provides useful information concerning the tool wear condition because of the fundamental differences between its source mechanisms in the rubbing friction on the wear land and the dislocation action in the shear zones. In this study, a signal processing scheme is developed which uses an autoregressive time series model of the acoustic emission generated during cutting. The modeling scheme is implemented with a stochastic gradient algorithm to update the model parameters adaptively and is thus a suitable candidate for in-process sensing applications. This technique encodes the acoustic emission signal features into a time-varying model parameter vector. Experiments indicate that the parameter vector is insensitive to changes in the cutting parameters but shows a strong sensitivity to the progress of cutting tool wear. This result suggests that tool wear detection can be achieved by monitoring the evolution of the model parameter vector during machining processes.
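The adaptive scheme described, an autoregressive model whose parameters are updated by a stochastic gradient step at each new sample, corresponds to the classic LMS recursion and can be sketched as follows (model order and step size are illustrative, not the study's values):

```python
import numpy as np

def lms_ar(x, order=4, mu=0.01):
    """Track AR model parameters of a signal with a stochastic-gradient (LMS)
    update, so the parameter vector adapts sample by sample."""
    x = np.asarray(x, float)
    a = np.zeros(order)
    history = []
    for n in range(order, len(x)):
        past = x[n - order:n][::-1]      # most recent sample first
        e = x[n] - a @ past              # one-step prediction error
        a = a + mu * e * past            # stochastic gradient step
        history.append(a.copy())
    return np.array(history)             # (samples, order) parameter trajectory
```

For in-process monitoring, one would watch the trajectory of `a` over time: a drift of the parameter vector signals a change in the AE source mechanism such as tool wear.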


Geophysics ◽  
2006 ◽  
Vol 71 (3) ◽  
pp. V79-V86 ◽  
Author(s):  
Hakan Karsli ◽  
Derman Dondurur ◽  
Günay Çifçi

Time-dependent amplitude and phase information of stacked seismic data are processed independently using complex trace analysis in order to facilitate interpretation by improving resolution and decreasing random noise. We represent seismic traces using their envelopes and instantaneous phases obtained by the Hilbert transform. The proposed method reduces the amplitudes of the low-frequency components of the envelope, while preserving the phase information. Several tests are performed in order to investigate the behavior of the present method for resolution improvement and noise suppression. Applications on both 1D and 2D synthetic data show that the method is capable of reducing the amplitudes and temporal widths of the side lobes of the input wavelets, and hence, the spectral bandwidth of the input seismic data is enhanced, resulting in an improvement in the signal-to-noise ratio. The bright-spot anomalies observed on the stacked sections become clearer because the output seismic traces have a simplified appearance allowing an easier data interpretation. We recommend applying this simple signal processing for signal enhancement prior to interpretation, especially for single channel and low-fold seismic data.
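The decomposition underlying the method, splitting a trace into its Hilbert envelope and instantaneous phase and recombining after the envelope is modified, can be sketched with a NumPy FFT-based analytic signal. This is a minimal sketch; the paper's attenuation of the envelope's low-frequency components is not reproduced here.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (NumPy-only Hilbert-transform construction)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0      # double positive frequencies, zero negative ones
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def envelope_phase(trace):
    """Split a seismic trace into its envelope and instantaneous phase."""
    a = analytic_signal(np.asarray(trace, float))
    return np.abs(a), np.angle(a)

def recombine(envelope, phase):
    """Rebuild a real trace from a (possibly filtered) envelope and the
    preserved instantaneous phase."""
    return envelope * np.cos(phase)
```

With the envelope untouched, `recombine` reproduces the input exactly; the processing gain comes from filtering the envelope between the two steps while the phase is preserved.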


2021 ◽  
Vol 13 (9) ◽  
pp. 1743
Author(s):  
Daniel Paluba ◽  
Josef Laštovička ◽  
Antonios Mouratidis ◽  
Přemysl Štych

This study deals with a local incidence angle correction method, i.e., the land cover-specific local incidence angle correction (LC-SLIAC), based on the linear relationship between the backscatter values and the local incidence angle (LIA) for a given land cover type in the monitored area. Using the combination of CORINE Land Cover and Hansen et al.’s Global Forest Change databases, a wide range of different LIAs for a specific forest type can be generated for each scene. The algorithm was developed and tested in the cloud-based platform Google Earth Engine (GEE) using Sentinel-1 open access data, Shuttle Radar Topography Mission (SRTM) digital elevation model, and CORINE Land Cover and Hansen et al.’s Global Forest Change databases. The developed method was created primarily for time-series analyses of forests in mountainous areas. LC-SLIAC was tested in 16 study areas over several protected areas in Central Europe. The results after correction by LC-SLIAC showed a reduction of variance and range of backscatter values. Statistically significant reduction in variance (of more than 40%) was achieved in areas with LIA range >50° and LIA interquartile range (IQR) >12°, while in areas with low LIA range and LIA IQR, the decrease in variance was very low and statistically not significant. Six case studies with different LIA ranges were further analyzed in pre- and post-correction time series. Time-series after the correction showed a reduced fluctuation of backscatter values caused by different LIAs in each acquisition path. This reduction was statistically significant (with up to 95% reduction of variance) in areas with a difference in LIA greater than or equal to 27°. LC-SLIAC is freely available on GitHub and GEE, making the method accessible to the wide remote sensing community.
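The core of a land cover-specific LIA correction, fitting the linear backscatter-versus-LIA relation for one cover class and normalizing each observation to a reference angle, can be sketched as follows. The reference angle and the single-class regression are illustrative assumptions; LC-SLIAC itself builds the per-class sample from the CORINE and Global Forest Change masks in GEE.

```python
import numpy as np

def lia_correction(backscatter_db, lia_deg, reference_lia=40.0):
    """Normalize backscatter (dB) to a reference local incidence angle using
    the fitted linear backscatter-LIA relation for one land cover class."""
    slope, intercept = np.polyfit(lia_deg, backscatter_db, 1)
    corrected = backscatter_db - slope * (lia_deg - reference_lia)
    return corrected, slope
```

Removing the fitted LIA trend is exactly what shrinks the variance and range of backscatter values reported above, most noticeably where the LIA spread is large.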


2013 ◽  
Vol 58 (4) ◽  
pp. 1331-1336 ◽  
Author(s):  
J. Berdowski ◽  
S. Berdowska ◽  
F. Aubry

Abstract The purpose of this paper was to investigate the physical and mechanical properties of compressed expanded graphite (CEG) and its porous derivatives after impregnation, polymerization and carbonization by means of the acoustic emission (AE) method. The mechanical and structural characteristics of compressed expanded graphite and of its three groups of porous composites after each technological process are presented and discussed. The AE parameters of these materials were measured over a wide range of wave frequencies (0.1–2.5 MHz). Changes in two parameters, the AE pulse count rate and the spectral distribution of the AE waves, are presented in this paper. Analysis of the respective AE parameters also makes it possible to determine the micro- and macrostructural changes of the materials at different stages of the technological processes. Applications of these materials as catalysts with a high specific surface make them a very interesting subject of study. Compressed expanded graphite composite membranes prepared from furfuryl alcohol polymers are also promising for gas separation.

