irregular sampling
Recently Published Documents


TOTAL DOCUMENTS: 228 (five years: 46)

H-INDEX: 25 (five years: 5)

Author(s):  
Yingnan Nie ◽  
Xuanjun Guo ◽  
Xiao Li ◽  
Xinyi Geng ◽  
Yan Li ◽  
...  

Abstract Objective. Closed-loop deep brain stimulation (DBS) with neural feedback has shown great potential for improving therapeutic effects and reducing side effects. However, the amplitude of stimulation artifacts is much larger than that of the local field potentials, which remains a bottleneck in developing closed-loop stimulation strategies with varied parameters. Approach. We proposed an irregular sampling method for the real-time removal of stimulation artifacts. Artifact peaks were detected by applying a threshold to the raw recordings, and the samples within the contaminated period of each stimulation pulse were excluded and replaced by interpolation of the samples immediately before and after the artifact duration. The method was evaluated with both simulated signals and in vivo closed-loop DBS applications in Parkinsonian animal models. Main results. The irregular sampling method effectively removed the stimulation artifacts from the simulated signals. The relative errors between the power spectral densities of the recovered and true signals within a wide frequency band (2-150 Hz) were 2.14%, 3.93%, 7.22%, 7.97% and 6.25% for stimulation at 20 Hz, 60 Hz, 130 Hz, 180 Hz, and stimulation with variable low and high frequencies, respectively. The artifact removal method was also verified in a real-time closed-loop DBS application in vivo, where artifacts were effectively removed during stimulation whose frequency changed continuously from 130 Hz to 1 Hz and during stimulation adaptive to beta oscillations. Significance. The proposed method provides an approach for real-time artifact removal in closed-loop DBS applications and is effective for low-frequency, high-frequency, and variable-frequency stimulation. It can facilitate the development of more advanced closed-loop DBS strategies.
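A minimal sketch of this blank-and-interpolate idea in Python, assuming a single-channel recording and a fixed post-pulse blanking window; the threshold, window length, and function name are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def remove_stim_artifacts(signal, fs, threshold, blank_ms=2.0):
    """Blank and interpolate samples around threshold-detected stimulation artifacts.

    Peaks exceeding `threshold` mark stimulation pulses; samples within `blank_ms`
    after each pulse are discarded and replaced by linear interpolation between
    the clean samples just before and just after the blanked window.
    """
    x = signal.astype(float).copy()
    blank_n = int(round(blank_ms * 1e-3 * fs))       # samples to discard per pulse
    peaks = np.flatnonzero(np.abs(x) > threshold)    # crude artifact detection
    bad = np.zeros_like(x, dtype=bool)
    for p in peaks:
        bad[p:p + blank_n] = True                    # mark the contaminated window
    good = ~bad
    # Replace contaminated samples with interpolation of the surrounding clean samples
    x[bad] = np.interp(np.flatnonzero(bad), np.flatnonzero(good), x[good])
    return x
```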


Author(s):  
Qingxiong Tan ◽  
Mang Ye ◽  
Grace Lai-Hung Wong ◽  
PongChi Yuen

Due to the dynamic health status of patients and the discrepant stability of physiological variables, health data often present as irregular multi-rate multivariate time series (IMR-MTS) with significantly varying sampling rates. Existing methods mainly study changes in IMR-MTS values in the time domain, without considering their different dominant frequencies and varying data quality. Hence, we propose a novel Cooperative Joint Attentive Network (CJANet) to analyze IMR-MTS in the frequency domain, which adaptively handles discrepant dominant frequencies while tackling the diverse data quality caused by irregular sampling. In particular, a novel dual-channel joint attention mechanism is designed to jointly identify important magnitude and phase signals while detecting their dominant frequencies, automatically enlarging the positive influence of key variables and frequencies. Furthermore, a new cooperative learning module is introduced to enhance information exchange between the magnitude and phase channels, effectively integrating global signals to optimize the network. A frequency-aware fusion strategy is finally designed to aggregate the learned features. Extensive experimental results on real-world medical datasets indicate that CJANet significantly outperforms existing methods and provides highly interpretable results.
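To make the dual-channel idea concrete, here is a hedged sketch of the frequency-domain representation it builds on: splitting each variable into magnitude and phase channels and weighting frequency bins with an attention-like softmax. It assumes the series has already been aligned to a common time grid; the array shapes, names, and fixed (non-learned) weights are assumptions for illustration only, not CJANet's architecture:

```python
import numpy as np

def magnitude_phase_channels(x):
    """x: (n_variables, n_timesteps) -> magnitude and phase, each (n_variables, n_freqs)."""
    spectrum = np.fft.rfft(x, axis=-1)          # frequency-domain view of each variable
    return np.abs(spectrum), np.angle(spectrum)

def frequency_attention(features, scores):
    """Weight frequency bins by softmax-normalized scores (one score per bin)."""
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return features * weights                   # broadcasts over variables
```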


2021 ◽  
Vol 5 (1) ◽  
pp. 57
Author(s):  
Sophie Castel ◽  
Wesley S. Burr

Real-world time series data often contain missing values due to human error, irregular sampling, or unforeseen equipment failure. The ability of a computational interpolation method to repair such data depends greatly on the characteristics of the time series itself, such as the number of periodic and polynomial trend components and the noise structure, as well as on the particular configuration of the missing values. The interpTools package presents a systematic framework for analyzing the statistical performance of a time series interpolator in light of such data features. Its utility and features are demonstrated through the evaluation of a novel algorithm, the Hybrid Wiener Interpolator.
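interpTools is an R package; the Python snippet below is only a conceptual illustration of the evaluation loop it formalizes (simulate a series with known structure, impose a gap configuration, interpolate, score against the truth), not its API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a series with a polynomial trend, a periodic component, and noise,
# then impose a gap configuration and score a simple linear interpolator.
n = 500
t = np.arange(n)
truth = 0.01 * t + np.sin(2 * np.pi * t / 50) + rng.normal(scale=0.3, size=n)

missing = rng.choice(n, size=int(0.2 * n), replace=False)   # 20% missing at random
observed = np.setdiff1d(t, missing)

recovered = np.interp(t, observed, truth[observed])          # linear interpolation
rmse = np.sqrt(np.mean((recovered[missing] - truth[missing]) ** 2))
print(f"RMSE on the gapped samples: {rmse:.3f}")
```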


Wavelets ◽  
2021 ◽  
pp. 305-363
Author(s):  
Hans G. Feichtinger ◽  
Karlheinz Gröchenig

Author(s):  
Eunseob Kim ◽  
Huitaek Yun ◽  
Martin Byung-Guk Jun ◽  
Kyunghyun Kim ◽  
Suk Won Cha

Abstract In the new era of manufacturing with Industry 4.0, Smart Manufacturing (SM) is growing in popularity as a vision for the factory of the future. A critical component of SM is effective machine monitoring. For legacy machines, indirect monitoring using Internet of Things (IoT) sensors is preferred over modifying the hardware directly. Machine tools are composed of rotary components and therefore emit acoustic and vibratory signals. However, sound data cannot easily serve as a direct representation of machine status because of noise, variable time course, and irregular sampling. In this paper, we attempt to bridge this gap through machine learning techniques and auditory monitoring of auxiliary components (i.e., coolant, chip conveyor, and mist collector) as well as the main spindle running state of machine tools. Multi-label classification and a Convolutional Neural Network (CNN) were used to train models that monitor machine tools from sound features. An external microphone and three internal sound sensors were attached to both mill and lathe machines. Mel-frequency cepstrum (MFCC) features were extracted as sound features. Classification performance was compared across individual sensor locations and early sensor fusion. The results showed that the sensor fusion approach achieved the highest F1 score on both machine systems.
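A hedged sketch of the MFCC-plus-CNN pipeline described above, using librosa for feature extraction and a small Keras network with sigmoid outputs for multi-label prediction. The clip length, label set (spindle, coolant, chip conveyor, mist collector), and network size are assumptions for illustration, not the authors' exact configuration:

```python
import librosa
import tensorflow as tf

def mfcc_features(path, sr=22050, n_mfcc=13, duration=2.0):
    """Load a fixed-length audio clip and return its MFCC matrix (n_mfcc, n_frames)."""
    y, _ = librosa.load(path, sr=sr, duration=duration)
    y = librosa.util.fix_length(y, size=int(sr * duration))   # pad/trim to fixed length
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)

def build_model(input_shape, n_labels=4):
    """Small CNN with one sigmoid output per machine state (multi-label)."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=input_shape + (1,)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(n_labels, activation="sigmoid"),
    ])

model = build_model(input_shape=(13, 87))                     # ~87 MFCC frames for a 2 s clip
model.compile(optimizer="adam", loss="binary_crossentropy")   # one loss per label
```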


Geophysics ◽  
2021 ◽  
pp. 1-44
Author(s):  
Mengli Zhang

The time-lapse seismic method plays a critical role in reservoir monitoring and characterization. However, time-lapse data acquisitions are costly. Sparse acquisitions combined with post-acquisition data reconstruction could reduce the cost and facilitate more frequent application of time-lapse seismic monitoring. We present a sparse time-lapse seismic data reconstruction methodology based on compressive sensing. The method works with a hybrid of repeated and non-repeated sample locations. To make use of the additional information from the non-repeated locations, we present the view that non-repeated samples in space are equivalent to irregular samples in calendar time. We therefore use these irregular samples in time, derived from the non-repeated samples in space, to improve the performance of the compressive sensing reconstruction. Tests on synthetic and field datasets indicate that our method can achieve a sufficiently accurate reconstruction using as few as 10% of the receivers or traces. The method not only works with spatially irregular sampling, addressing land accessibility problems and reducing the number of nodal sensors, but also utilizes the non-repeated measurements to improve reconstruction accuracy.
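For readers unfamiliar with compressive sensing, the toy example below shows the generic idea of recovering a trace that is sparse in a transform domain from roughly 10% of its samples via L1-regularized inversion. It is a textbook illustration under assumed names and parameters, not the reconstruction scheme developed in the paper:

```python
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n = 400
Psi = idct(np.eye(n), axis=0, norm="ortho")                  # orthonormal cosine synthesis basis

coef_true = np.zeros(n)
coef_true[[16, 40]] = [1.0, 0.5]                             # trace is sparse in the cosine domain
trace = Psi @ coef_true

keep = np.sort(rng.choice(n, size=n // 10, replace=False))   # irregular 10% of the samples
A = Psi[keep, :]                                             # sensing matrix: kept rows of the basis

lasso = Lasso(alpha=0.001, fit_intercept=False, max_iter=50000).fit(A, trace[keep])
reconstructed = Psi @ lasso.coef_
print("relative error:", np.linalg.norm(reconstructed - trace) / np.linalg.norm(trace))
```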


Geophysics ◽  
2021 ◽  
pp. 1-71
Author(s):  
Fang Ouyang ◽  
Jianguo Zhao ◽  
Shikun Dai ◽  
Longwei Chen ◽  
Shangxu Wang

The multi-dimensional Fourier transform on an irregular grid is a useful tool for various seismic forward problems involving complex media and wavefield distributions. Using a shape-function-based strategy, we develop four different algorithms for 1D and 2D non-uniform Fourier transforms: two high-accuracy Fourier transforms (LSF-FT and QSF-FT) and two non-uniform fast Fourier transforms (LSF-NUFFT and QSF-NUFFT), based on linear and quadratic shape functions, respectively. The main advantage of incorporating shape functions into the Fourier transform is that, in the 2D case, triangular elements can be used to mesh any complex wavefield distribution. These algorithms can therefore be used in conjunction with any irregular sampling strategy. The accuracy and efficiency of the four non-uniform Fourier transforms are investigated and compared by applying them to frequency-domain seismic wave modeling, and all algorithms are compared with exact solutions. Numerical tests show that the quadratic shape-function-based algorithms are more accurate than those based on linear shape functions. Moreover, LSF-FT and QSF-FT exhibit higher accuracy but much slower calculation speeds, while LSF-NUFFT and QSF-NUFFT are highly efficient but less accurate at near-source points. A combination of these algorithms, using QSF-FT at near-source points and LSF-NUFFT/QSF-NUFFT elsewhere, achieves satisfactory efficiency and high accuracy at all points. Although our tests are restricted to seismic models, these improved non-uniform fast Fourier transform algorithms may also have potential applications in other geophysical problems, such as forward modeling of complex gravity and magnetic models.
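As context for what these algorithms approximate, the brute-force non-uniform discrete Fourier sum over irregularly placed samples can be written directly; its O(NM) cost is what NUFFT-type methods such as those in the paper avoid. The function name, normalization, and test signal below are illustrative assumptions:

```python
import numpy as np

def nudft(values, positions, freqs):
    """Evaluate F(k) = sum_j f(x_j) * exp(-2i*pi*k*x_j) for each frequency k (direct sum)."""
    return np.exp(-2j * np.pi * np.outer(freqs, positions)) @ values

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, size=300))    # irregular sample locations
f = np.sin(2 * np.pi * 7 * x)                   # wavefield sampled at those locations
k = np.arange(-16, 17)                          # output frequencies
spectrum = nudft(f, x, k)                       # magnitude peaks expected near k = +/-7
```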


2021 ◽  
Author(s):  
Bushra Atfeh ◽  
Erzsébet Kristóf ◽  
Róbert Mészáros ◽  
Zoltán Barcza

This work focuses on indoor air quality measurements carried out in an apartment in the suburban region of Budapest. The measurements were made with an IQAir AirVisual node air quality monitor, a so-called low-cost sensor capable of monitoring PM2.5 and carbon dioxide concentrations. In this study we analyze data measured during January 2017, a month characterized by an extreme air pollution episode in Budapest. The aim of the study was to calculate daily indoor PM2.5 concentrations that are comparable with the outdoor concentrations provided by the official Hungarian Air Quality Monitoring Network. Given that the AirVisual Pro provides data with an irregular sampling frequency, data processing is expected to influence the calculated daily mean concentrations. The results indicated that the uneven sampling frequency characteristic of the AirVisual node indeed causes problems during data processing and affects the calculated means. We propose a 'best method' for data processing for sensors with irregular sampling frequency.
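The abstract does not spell out the proposed processing method, so the snippet below only illustrates one common way to handle irregularly timestamped readings when computing daily means: interpolate onto a regular time grid first, so that dense bursts of readings do not outweigh sparse periods. The column names and the 1-minute grid are assumptions:

```python
import numpy as np
import pandas as pd

def daily_means(df, value_col="pm25", grid="1min"):
    """Daily means from irregular readings via interpolation onto an even time grid."""
    s = df.set_index("timestamp")[value_col].sort_index()
    regular = s.resample(grid).mean().interpolate(method="time")   # even grid in time
    return regular.resample("1D").mean()

# Hypothetical usage with irregularly spaced timestamps over three days:
rng = np.random.default_rng(0)
ts = pd.to_datetime("2017-01-01") + pd.to_timedelta(np.sort(rng.uniform(0, 3 * 24 * 60, 500)), unit="m")
df = pd.DataFrame({"timestamp": ts, "pm25": rng.gamma(5.0, 10.0, 500)})
print(daily_means(df))
```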


2021 ◽  
Author(s):  
Tobias Braun ◽  
Sebastian F. M. Breitenbach ◽  
Erin Ray ◽  
James U. L. Baldini ◽  
Lisa M. Baldini ◽  
...  

The reconstruction and analysis of palaeoseasonality from speleothem records remains a notoriously challenging task. Although the seasonal cycle is obscured by noise, dating uncertainties, and irregular sampling, its extraction can identify regime transitions and enhance the understanding of long-term climate variability. Shifts in the seasonal predictability of hydroclimatic conditions have immediate and serious repercussions for agricultural societies.

We present a highly resolved speleothem record of palaeoseasonality (ca. 0.22-year temporal resolution, with some episodes at twice that resolution) from Yok Balum cave in Belize covering the Common Era (400-2006 CE) and demonstrate how seasonal-scale hydrological variability can be extracted from δ13C and δ18O isotope records. We employ a Monte Carlo-based framework in which dating uncertainties are transferred into magnitude uncertainty and propagated. Regional historical proxy data enable us to relate climate variability to agricultural disasters throughout the Little Ice Age and to population size variability during the Terminal Classic Maya collapse.

Spectral analysis reveals the seasonal cycle as well as nonstationary ENSO- and multi-decadal-scale variability. Variations in both the subannual distribution of rainfall and the mean hydroclimate limit how reliably farmers can predict crop yield. A characterization of year-to-year predictability, as well as of the complexity of seasonal patterns, uncovers shifts in seasonal-scale variability. These are discussed in the context of their implications for rainfall-dependent agricultural societies.
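The abstract does not name a specific spectral estimator; a standard choice for irregularly sampled proxy series is the Lomb-Scargle periodogram, sketched below with scipy. The ages, isotope values, and period range are synthetic placeholders, not the Yok Balum data or the authors' Monte Carlo framework:

```python
import numpy as np
from scipy.signal import lombscargle

def ls_periodogram(age_years, values, periods_years):
    """Lomb-Scargle power at the requested periods for an irregularly sampled series."""
    ang_freqs = 2 * np.pi / np.asarray(periods_years)    # angular frequency in rad/year
    detrended = values - values.mean()
    return lombscargle(age_years, detrended, ang_freqs, normalize=True)

# Hypothetical usage: scan periods from 6 months to 10 years for a peak near 1 year.
rng = np.random.default_rng(3)
ages = np.sort(rng.uniform(400.0, 2006.0, size=2000))              # irregular ages (years CE)
d18o = np.sin(2 * np.pi * ages) + 0.3 * rng.normal(size=ages.size) # annual cycle plus noise
periods = np.linspace(0.5, 10.0, 500)
power = ls_periodogram(ages, d18o, periods)                        # peak expected near 1 year
```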

