irregularly sampled data
Recently Published Documents


TOTAL DOCUMENTS: 80 (five years: 13)

H-INDEX: 16 (five years: 2)

2021 ◽  
Author(s):  
Inês Silva ◽  
Christen H. Fleming ◽  
Michael J. Noonan ◽  
Jesse Alston ◽  
Cody Folta ◽  
...  

1. Modern tracking devices allow for the collection of high-volume animal tracking data at sampling rates far beyond those of VHF radiotelemetry. Home range estimation is a key output from these tracking datasets, but the inherent properties of animal movement can lead traditional statistical methods to under- or overestimate home range areas. 2. The Autocorrelated Kernel Density Estimation (AKDE) family of estimators was designed to be statistically efficient while explicitly handling the complexities of modern movement data: autocorrelation, small sample sizes, and missing or irregularly sampled data. Although each of these estimators has been described in separate technical papers, here we review how they work and provide a user-friendly guide on how they may be combined to reduce multiple biases simultaneously. 3. We describe the magnitude of the improvements offered by these estimators and their impact on home range area estimates, using both empirical case studies and simulations, and contrast their computational costs. 4. Finally, we provide guidelines for researchers to choose among alternative estimators and an R script to facilitate the application and interpretation of AKDE home range estimates.
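The core problem the abstract describes, autocorrelation shrinking the number of effectively independent locations that a conventional kernel density estimate implicitly assumes, can be illustrated in a few lines. This is a minimal sketch with simulated AR(1) positions and the classic AR(1) effective-sample-size correction; it is not the authors' AKDE implementation, and all parameter values are illustrative.

```python
import numpy as np

# Simulate an autocorrelated 1-D track with an AR(1) process
# (a discrete-time analogue of Ornstein-Uhlenbeck movement).
rng = np.random.default_rng(0)
n, rho = 1000, 0.9                      # track length, lag-1 autocorrelation
x = np.zeros(n)
for i in range(1, n):
    x[i] = rho * x[i - 1] + rng.normal()

# Lag-1 autocorrelation estimated from the track itself
r1 = np.corrcoef(x[:-1], x[1:])[0, 1]

# Classic AR(1) correction: far fewer effectively independent locations
# than the nominal sample size, which is why a conventional KDE misbehaves.
n_eff = n * (1 - r1) / (1 + r1)
print(f"nominal n = {n}, effective n ~ {n_eff:.0f}")
```

With strong autocorrelation (rho = 0.9), a thousand fixes carry roughly the information of a few dozen independent ones, which is the bias AKDE-family estimators are built to correct.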


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
David Uhlig ◽  
Michael Heizmann

Abstract. Sophisticated and highly specialized optical measuring devices are becoming increasingly important for high-precision manufacturing and environment perception. In particular, light field cameras are attracting ever-increasing interest in research and industry, as they enable a variety of new measurement methods. Unfortunately, due to their complex structure, their calibration is very difficult and usually precisely tailored to the particular type of light field camera. To overcome these difficulties, we present a method that decodes a light field from the raw data of any light field imaging system without knowing or modeling the internal optical elements. We calibrate the camera using a precise generic calibration method and transform the obtained ray set into an equivalent light field representation. Finally, we reconstruct a rectified light field from the irregularly sampled data, and in addition we derive the geometric ray properties as intrinsic camera parameters. Experimental results validate the method by showing that both the information of the observed scene and the geometric structure of the light field are preserved by an adequate rectification and calibration.
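The final rectification step, resampling irregularly positioned ray samples onto a regular grid, can be sketched with generic scattered-data interpolation. This is only an illustration of that one step under assumed 2-D coordinates and a synthetic intensity function; it is not the authors' calibration pipeline.

```python
import numpy as np
from scipy.interpolate import griddata

# Irregular (u, v) sample positions with intensities, standing in for the
# decoded ray set of a light field camera (synthetic, illustrative data).
rng = np.random.default_rng(1)
uv = rng.uniform(0.0, 1.0, size=(2000, 2))
intensity = np.sin(4 * uv[:, 0]) * np.cos(4 * uv[:, 1])

# Resample onto a regular 32 x 32 grid, kept inside the sampled region so
# linear interpolation has full coverage.
gu, gv = np.meshgrid(np.linspace(0.05, 0.95, 32),
                     np.linspace(0.05, 0.95, 32))
rectified = griddata(uv, intensity, (gu, gv), method="linear")

print(f"NaN fraction on the rectified grid: {np.isnan(rectified).mean():.3f}")
```

Points outside the convex hull of the samples would come back as NaN with linear interpolation, which is why the target grid here stays inside the sampled region.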


Author(s):  
Nils Damaschke ◽  
Volker Kühn ◽  
Holger Nobach

Abstract. The prediction and correction of systematic errors in direct spectral estimation from irregularly sampled data taken from a stochastic process is investigated. Different sampling schemes that lead to such an irregular sampling of the observed process are considered. Both kinds of sampling schemes are treated: stochastic sampling with non-equidistant sampling intervals from a continuous distribution and, on the other hand, nominally equidistant sampling with missing individual samples, yielding a discrete distribution of sampling intervals. For both distributions of sampling intervals, continuous and discrete, different sampling rules are investigated. On the one hand, purely random and independent sampling times are considered. This occurs only in cases where the occurrence of one sample at a certain time has no influence on other samples in the sequence, which excludes any preferred delay intervals or external selection processes that introduce correlations between the sampling instances. On the other hand, sampling schemes with interdependency, and thus correlation, between the individual sampling instances are investigated. This occurs whenever the occurrence of one sample in any way influences further sampling instances, e.g., through recovery times after one instance, preferences for certain sampling intervals (including sampling jitter), or an external correlated source influencing the validity of samples. A bias-free estimation of the spectral content of the observed random process from such irregularly sampled data is the goal of this investigation.
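A standard direct spectral estimator for irregularly sampled data is the Lomb-Scargle periodogram. The sketch below applies it to a sinusoid observed at purely random, independent sampling times (the first sampling rule mentioned above); the signal and all parameters are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.signal import lombscargle

# A 0.7 Hz sinusoid observed at random (uniformly distributed) times,
# i.e. stochastic sampling with independent sampling instances.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 100.0, 500))
f0 = 0.7
x = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.normal(size=t.size)

# Direct spectral estimate; lombscargle expects angular frequencies.
freqs_hz = np.linspace(0.05, 2.0, 400)
pgram = lombscargle(t, x - x.mean(), 2 * np.pi * freqs_hz)

peak_hz = freqs_hz[np.argmax(pgram)]
print(f"spectral peak at ~ {peak_hz:.2f} Hz")
```

Because the sampling times are random and independent, the estimator recovers the underlying frequency without the aliasing a regular grid of the same mean rate would impose; correlated sampling schemes, the paper's main subject, are exactly where such estimators pick up systematic errors.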


2021 ◽  
Vol 54 (20) ◽  
pp. 26-31
Author(s):  
Satya Prasad Maddipatla ◽  
Hossein Haeri ◽  
Kshitij Jerath ◽  
Sean Brennan

Energies ◽  
2020 ◽  
Vol 13 (21) ◽  
pp. 5600
Author(s):  
Saeed Mian Qaisar

Lithium-ion batteries are deployed in a range of modern applications. Their utilization is evolving with the aim of achieving a greener environment. Batteries are costly, and battery management systems (BMSs) ensure long life and proper battery utilization. Modern BMSs are complex and cause a notable overhead consumption on batteries. In this paper, the time-varying aspect of battery parameters is used to reduce the power consumption overhead of BMSs. The aim is to use event-driven processing to realize effective BMSs. Unlike the conventional approach, parameters of battery cells, such as voltages and currents, are no longer regularly measured at a predefined time step and are instead recorded on the basis of events. This yields considerable real-time compression. An inventive event-driven coulomb counting method is then presented, which employs the irregularly sampled data for effective online state of charge (SOC) determination. A high-energy battery model for electric vehicle (EV) applications is studied in this work. It is implemented using the equivalent circuit modeling (ECM) approach. The developed framework is compared with conventional fixed-rate counterparts. The results show that the devised solution achieves gains exceeding two orders of magnitude in terms of compression and computational complexity. The SOC estimation error is also quantified, and the system attains a ≤4% SOC estimation error bound.
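Coulomb counting over event-driven samples differs from the fixed-rate version only in that the integration step varies from event to event. A minimal sketch with illustrative numbers (not the paper's framework or its error bounds):

```python
# Event-driven coulomb counting: SOC is updated only when an event fires,
# so each update integrates over an irregular interval dt.
def soc_update(soc, current_a, dt_s, capacity_ah):
    """Integrate charge over one irregular interval (discharge: current > 0)."""
    return soc - current_a * (dt_s / 3600.0) / capacity_ah

capacity_ah = 50.0
soc = 1.0                                   # start fully charged
# (timestamp in s, measured current in A), recorded only at event instants
events = [(0.0, 10.0), (4.0, 12.0), (13.0, 9.0), (30.0, 11.0)]

# Hold each measured current until the next event (zero-order hold).
for (t0, i0), (t1, _) in zip(events, events[1:]):
    soc = soc_update(soc, i0, t1 - t0, capacity_ah)

print(f"SOC after {events[-1][0]:.0f} s: {soc:.4f}")
```

Four event records here replace what a fixed-rate BMS sampling every second would log as thirty measurements, which is the source of the compression gain the abstract reports.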


2020 ◽  
pp. 1-44
Author(s):  
Jia Li ◽  
Yunxiao Liu

Abstract We provide an asymptotic theory for the estimation of a general class of smooth nonlinear integrated volatility functionals. Such functionals are broadly useful for measuring financial risk and estimating economic models using high-frequency transaction data. The theory is valid under general volatility dynamics, which accommodates both Itô semimartingales (e.g., jump-diffusions) and long-memory processes (e.g., fractional Brownian motions). We establish the semiparametric efficiency bound under a nonstandard nonergodic setting with infill asymptotics, and show that the proposed estimator attains this efficiency bound. These results on efficient estimation are further extended to a setting with irregularly sampled data.
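The simplest member of the class of integrated volatility functionals is realized variance, the sum of squared log-price increments. The sketch below computes it from irregularly timed simulated prices with constant volatility; the paper's general nonlinear functionals and efficiency theory go well beyond this illustration.

```python
import numpy as np

# Simulate log-prices observed at irregular transaction times on [0, 1]
# with constant spot volatility sigma (illustrative toy model).
rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0.0, 1.0, 2000))
sigma = 0.2
dt = np.diff(t, prepend=0.0)
log_p = np.cumsum(sigma * np.sqrt(dt) * rng.normal(size=t.size))

# Realized variance: sum of squared log-returns over the irregular grid.
rv = np.sum(np.diff(log_p) ** 2)
print(f"realized variance ~ {rv:.4f}, true integrated variance = {sigma**2:.4f}")
```

Under infill asymptotics (ever more observations on a fixed interval), realized variance converges to the integrated variance regardless of how irregular the sampling times are, which is the setting the abstract's efficiency results extend.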


2020 ◽  
Vol 15 (7) ◽  
pp. 1167-1175
Author(s):  
Agnieszka Barbara Szczotka ◽  
Dzhoshkun Ismail Shakir ◽  
Daniele Ravì ◽  
Matthew J. Clarkson ◽  
Stephen P. Pereira ◽  
...  

2020 ◽  
Vol 37 (3) ◽  
pp. 449-465 ◽  
Author(s):  
Jeffrey J. Early ◽  
Adam M. Sykulski

Abstract. A comprehensive method is provided for smoothing noisy, irregularly sampled data with non-Gaussian noise using smoothing splines. We demonstrate how the spline order and tension parameter can be chosen a priori from physical reasoning. We also show how to allow for non-Gaussian noise and outliers that are typical in global positioning system (GPS) signals. We demonstrate the effectiveness of our methods on GPS trajectory data obtained from oceanographic floating instruments known as drifters.
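The basic ingredient is a smoothing spline fitted directly to irregularly spaced samples. This sketch uses Gaussian noise for brevity and a rule-of-thumb smoothing parameter; the paper's contributions (a priori tension choice, robust handling of non-Gaussian noise and outliers) are not reproduced here.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy observations of a smooth trajectory at irregular times.
rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0.0, 10.0, 200))
truth = np.sin(t)
noise_std = 0.1
y = truth + noise_std * rng.normal(size=t.size)

# Cubic smoothing spline; s sets the smoothing/fidelity trade-off and is
# chosen here by the common rule of thumb s ~ n * noise_variance.
spline = UnivariateSpline(t, y, k=3, s=t.size * noise_std**2)

rmse = np.sqrt(np.mean((spline(t) - truth) ** 2))
print(f"RMSE against the true trajectory: {rmse:.3f}")
```

The fitted spline can then be evaluated, or differentiated via `spline.derivative()`, at arbitrary times, which is what makes splines convenient for velocity estimates from drifter GPS tracks.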


Geophysics ◽  
2020 ◽  
Vol 85 (2) ◽  
pp. V119-V130 ◽  
Author(s):  
Yingying Wang ◽  
Benfeng Wang ◽  
Ning Tu ◽  
Jianhua Geng

Seismic trace interpolation is an important technique because irregular or insufficient sampling along the spatial direction may lead to inevitable errors in multiple suppression, imaging, and inversion. Many interpolation methods have been studied for irregularly sampled data. Inspired by the working principles of the autoencoder and the convolutional neural network, we have performed seismic trace interpolation using a convolutional autoencoder (CAE). The irregularly sampled data are treated as corrupted data. Using a training data set of pairs of corrupted and complete data, the CAE automatically learns to extract features from the corrupted data and reconstruct the complete data from the extracted features. It thereby avoids assumptions made by traditional trace interpolation methods, such as linearity of events, low-rankness, or sparsity. In addition, once the CAE network training is completed, corrupted seismic data can be interpolated immediately at very low computational cost. A CAE network composed of three convolutional layers and three deconvolutional layers is designed to explore the capabilities of CAE-based seismic trace interpolation on an irregularly sampled data set. To address the scarcity of complete shot gathers in field data applications, the network trained on synthetic data is used to initialize the network training on field data, a transfer learning strategy. Experiments on synthetic and field data sets indicate the validity and flexibility of the trained CAE. Compared with the curvelet-transform-based method, the CAE achieves comparable or better interpolation performance efficiently. The transfer learning strategy enhances the training efficiency on field data and improves the interpolation performance of the CAE with limited training data.
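The "corrupted data" framing can be illustrated by how training pairs are typically formed: whole traces (columns of a gather) are randomly zeroed to mimic irregular spatial sampling, and the network learns to map the masked gather back to the full one. This sketch shows only the pair construction on synthetic arrays; the three-layer CAE itself is not reproduced here.

```python
import numpy as np

# A synthetic "complete" shot gather: n_t time samples by n_x traces.
rng = np.random.default_rng(5)
n_t, n_x = 256, 64
complete = rng.normal(size=(n_t, n_x))

# Simulate irregular spatial sampling by dropping ~30% of traces at random;
# zeroed traces stand in for missing data, as in the corrupted/complete pairs.
keep = rng.random(n_x) > 0.3
corrupted = complete * keep

print(f"kept {keep.sum()} of {n_x} traces")
```

In training, `corrupted` is the network input and `complete` the target; at inference time, only the corrupted field gather is needed, which is why interpolation after training is nearly free.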


2019 ◽  
Vol 632 ◽  
pp. A37 ◽  
Author(s):  
Stefan S. Brems ◽  
Martin Kürster ◽  
Trifon Trifonov ◽  
Sabine Reffert ◽  
Andreas Quirrenbach

Context. Stars show various amounts of radial-velocity (RV) jitter due to varying stellar activity levels. The typical amount of RV jitter as a function of stellar age and observational timescale has not yet been systematically quantified, although it is often larger than the instrumental precision of modern high-resolution spectrographs used for Doppler planet detection and characterization. Aims. We aim to empirically determine the intrinsic stellar RV variation for mostly G and K dwarf stars on different timescales and for different stellar ages independently of stellar models. We also focus on young stars (≲30 Myr), where the RV variation is known to be large. Methods. We use archival FEROS and HARPS RV data of stars which were observed at least 30 times spread over at least two years. We then apply the pooled variance (PV) technique to these data sets to identify the periods and amplitudes of underlying, quasiperiodic signals. We show that the PV is a powerful tool to identify quasiperiodic signals in highly irregularly sampled data sets. Results. We derive activity-lag functions for 20 putative single stars, where lag is the timescale on which the stellar jitter is measured. Since the ages of all stars are known, we also use this to formulate an activity–age–lag relation which can be used to predict the expected RV jitter of a star given its age and the timescale to be probed. The maximum RV jitter on timescales of decades decreases from over 500 m s⁻¹ for 5 Myr-old stars to 2.3 m s⁻¹ for stars with ages of around 5 Gyr. The decrease in RV jitter when considering a timescale of only 1 d instead of 1 yr is smaller by roughly a factor of 4 for stars with an age of about 5 Myr, and a factor of 1.5 for stars with an age of 5 Gyr. The rate at which the RV jitter increases with lag strongly depends on stellar age and reaches 99% of the maximum RV jitter over a timescale of a few days for stars that are a few million years old, up to presumably decades or longer for stars with an age of a few gigayears.
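The pooled variance idea is simple: for each timescale (lag), average the variances computed within data windows of that length; the pooled variance saturates once the window exceeds the period of the underlying quasiperiodic signal. This sketch uses a regularly sampled toy series for clarity, whereas the paper applies the technique to highly irregularly sampled RV time series; all numbers are illustrative.

```python
import numpy as np

# Toy series: a quasiperiodic signal (period 100 samples) plus noise.
rng = np.random.default_rng(6)
t = np.arange(2000)
x = np.sin(2 * np.pi * t / 100) + 0.1 * rng.normal(size=t.size)

def pooled_variance(x, window):
    """Average the sample variances of consecutive windows of given length."""
    n = (x.size // window) * window
    chunks = x[:n].reshape(-1, window)
    return chunks.var(axis=1, ddof=1).mean()

# Short windows see mostly noise; windows longer than the period see the
# full signal variance, so the PV-vs-lag curve flattens there.
pv_short = pooled_variance(x, 10)
pv_long = pooled_variance(x, 500)
print(f"PV(lag=10) = {pv_short:.3f}, PV(lag=500) = {pv_long:.3f}")
```

Reading off where the pooled variance saturates gives the signal's timescale, and the saturation level gives its amplitude, which is how the activity-lag functions above are constructed.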

