Two-dimensional Kalman filter approach to airborne vector gravimetry

2019 ◽  
Vol 9 (1) ◽  
pp. 87-96 ◽  
Author(s):  
V.S. Vyazmin ◽  
Yu.V. Bolotin

Abstract The paper presents a new approach to the airborne vector gravimetry problem. The idea of the approach is to take into account the spatial correlation of the gravity field in order to improve the observability of the horizontal components of the gravity disturbance vector (GDV). We consider the GDV determination problem given airborne data from a set of parallel survey lines, assuming that the lines are flown in the same direction at a constant height above the reference ellipsoid. We use a 2-D random field model for the gravity field at the flight height, governed by two autoregressive equations (one in the direction along the lines, the other across the lines). We then pose the estimation problem for the GDV horizontal components and the systematic errors of an inertial navigation system at all lines simultaneously. The developed estimation algorithm is based on 2-D Kalman filtering and smoothing techniques. Numerical results obtained by processing simulated data show improved accuracy in the determination of the horizontal gravity components.
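
As an illustration of the filtering idea, the following sketch estimates a gravity disturbance modeled as a first-order autoregressive (Gauss-Markov) process along a single survey line with a scalar Kalman filter. The correlation length, noise levels, and measurement model are illustrative assumptions and are far simpler than the 2-D formulation of the paper.

```python
import numpy as np

# Minimal sketch: a gravity disturbance modeled as a first-order Gauss-Markov
# (AR(1)) process along one survey line, estimated with a scalar Kalman filter.
# All parameters below are illustrative assumptions.
rng = np.random.default_rng(0)

n = 500            # samples along the line
dx = 50.0          # along-track spacing [m]
corr_len = 5000.0  # assumed correlation length of the field [m]
sigma_g = 10.0     # assumed gravity disturbance std [mGal]
sigma_m = 2.0      # assumed measurement noise std [mGal]

phi = np.exp(-dx / corr_len)        # AR(1) transition coefficient
q = sigma_g**2 * (1.0 - phi**2)     # process noise variance

# Simulate a "true" disturbance and noisy along-track observations.
g_true = np.zeros(n)
for k in range(1, n):
    g_true[k] = phi * g_true[k - 1] + rng.normal(0.0, np.sqrt(q))
z = g_true + rng.normal(0.0, sigma_m, size=n)

# Scalar Kalman filter (predict / update).
g_hat, P = 0.0, sigma_g**2
estimates = np.empty(n)
for k in range(n):
    g_hat, P = phi * g_hat, phi**2 * P + q      # predict
    K = P / (P + sigma_m**2)                    # update with measurement z[k]
    g_hat += K * (z[k] - g_hat)
    P *= (1.0 - K)
    estimates[k] = g_hat

print("RMS error [mGal]:", np.sqrt(np.mean((estimates - g_true)**2)))
```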

2021 ◽  
Vol 13 (11) ◽  
pp. 2069
Author(s):  
M. V. Alba-Fernández ◽  
F. J. Ariza-López ◽  
M. D. Jiménez-Gamero

The usefulness of the parameters (e.g., slope, aspect) derived from a Digital Elevation Model (DEM) is limited by its accuracy. In this paper, a thematic-like (class-based) quality control of slope and aspect classes is proposed. A product can be compared against a reference dataset, which provides the quality requirements to be achieved, by comparing the product proportions of each class with those of the reference set. If the distance between the product proportions and the reference proportions is smaller than a positive tolerance fixed by the user, the degree of similarity between product and reference is considered acceptable, and hence the product quality meets the requirements. A formal statistical procedure, based on a hypothesis test that uses the Hellinger distance between the proportions, is developed, and its performance is analyzed using simulated data. The application to slope and aspect is illustrated using data derived from a 2×2 m DEM (reference) and a 5×5 m DEM in Allo (province of Navarra, Spain).
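
A minimal sketch of the similarity check described above: compute the Hellinger distance between product and reference class proportions and compare it with a user-fixed tolerance. The class counts and the tolerance value are made up for illustration, and the formal hypothesis test of the paper is not reproduced.

```python
import numpy as np

# Compare slope/aspect class proportions of a product DEM against a
# reference using the Hellinger distance and a user-fixed tolerance.
def hellinger(p, q):
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q))**2))

reference_counts = [1200, 3400, 2800, 900]   # e.g. slope classes of the 2x2 m DEM (made up)
product_counts   = [1100, 3550, 2700, 950]   # same classes from the 5x5 m DEM (made up)
tolerance = 0.05                             # user-defined tolerance

d = hellinger(product_counts, reference_counts)
print(f"Hellinger distance = {d:.4f}",
      "-> quality acceptable" if d < tolerance else "-> quality not met")
```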


1998 ◽  
Vol 09 (01) ◽  
pp. 71-85 ◽  
Author(s):  
A. Bevilacqua ◽  
D. Bollini ◽  
R. Campanini ◽  
N. Lanconelli ◽  
M. Galli

This study investigates the possibility of using an Artificial Neural Network (ANN) for reconstructing Positron Emission Tomography (PET) images. The network is trained with simulated data that include physical effects such as attenuation and scattering. Once the training ends, the weights of the network are held constant. The network is able to reconstruct any type of source distribution contained inside the area mapped during the learning. The reconstruction of a simulated brain phantom in a noiseless case shows an improvement over Filtered Back-Projection (FBP) reconstruction. In noisy cases there is still an improvement, even without compensating for noise fluctuations. These results show that it is possible to reconstruct PET images using ANNs. Initially we used a DEC Alpha; then, owing to the high data parallelism of this reconstruction problem, we ported the learning to a Quadrics (SIMD) machine, suited to the realization of a small dedicated medical system. These results encourage further studies that will make it possible to reconstruct images of larger dimensions than those used in the present work (32 × 32 pixels).
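
The sketch below illustrates only the general principle (train a network to map projection data to an image, then hold the weights fixed for reconstruction) on a toy linear projection model; the image size, projection operator, and network architecture are assumptions and bear no relation to the PET simulation used in the study.

```python
import numpy as np

# Toy sketch: train a one-hidden-layer network to map "projection" data to
# images, then freeze the weights and reconstruct an unseen source.
rng = np.random.default_rng(1)

img_size = 8 * 8          # toy 8x8 images instead of 32x32
n_proj = 48               # toy number of projection bins
A = rng.normal(size=(n_proj, img_size)) / np.sqrt(img_size)  # fixed toy "projection" operator

# Training set: random source distributions and their projections.
X_img = rng.random((2000, img_size))
X_sino = X_img @ A.T

# One-hidden-layer MLP trained with plain gradient descent on MSE.
W1 = rng.normal(0, 0.1, size=(n_proj, 64)); b1 = np.zeros(64)
W2 = rng.normal(0, 0.1, size=(64, img_size)); b2 = np.zeros(img_size)
lr = 1e-2
for epoch in range(200):
    H = np.tanh(X_sino @ W1 + b1)        # hidden layer
    Y = H @ W2 + b2                      # reconstructed images
    err = Y - X_img
    # backpropagation of the mean squared error
    gW2 = H.T @ err / len(X_img); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)
    gW1 = X_sino.T @ dH / len(X_img); gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# After training the weights are held constant and used for reconstruction.
test_img = rng.random(img_size)
recon = np.tanh((test_img @ A.T) @ W1 + b1) @ W2 + b2
print("toy reconstruction RMSE:", np.sqrt(np.mean((recon - test_img)**2)))
```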


Author(s):  
Jiayi Su ◽  
Yuqin Weng ◽  
Susan C. Schneider ◽  
Edwin E. Yaz

Abstract In this work, a new approach to detecting sensor and actuator intrusions in Cyber-Physical Systems using a bank of Kalman filters is presented. The case where the type of the intrusion signal is unknown is considered first: two Kalman filters in a bank provide the conditional state estimates, and the type of the intrusion signal is then detected via an adaptive estimation algorithm. The case where the target of the intrusion signal (either a sensor or an actuator) is unknown is also considered, using a bank of four Kalman filters designed to determine whether the intrusion signal is affecting a healthy sensor or actuator signal. To test these methods, a DC motor speed control system subject to attack by different types of sensor and actuator intrusion signals is simulated. Simulations show that different types of sensor and actuator intrusion signals can be detected properly without knowledge of the nature or type of these signals.
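
A minimal sketch of the underlying multiple-model idea: a bank of two Kalman filters, one assuming a healthy sensor and one assuming a constant sensor-bias attack, with hypothesis probabilities updated from the residual likelihoods. The scalar plant, noise levels, and bias magnitude are illustrative assumptions, not the DC motor model of the paper.

```python
import numpy as np

# Bank of two Kalman filters with adaptive hypothesis weighting:
# hypothesis 0 = healthy sensor, hypothesis 1 = sensor with a constant bias attack.
rng = np.random.default_rng(2)

a, q, r = 0.9, 1e-4, 0.04          # plant coefficient, process/measurement noise variances
bias = 1.0                         # assumed magnitude of the sensor attack (hypothesis 1)
n, attack_start = 200, 100

# Simulate the plant; the sensor intrusion (constant bias) begins at k = attack_start.
x = np.zeros(n); z = np.zeros(n)
for k in range(1, n):
    x[k] = a * x[k - 1] + rng.normal(0.0, np.sqrt(q))
    z[k] = x[k] + rng.normal(0.0, np.sqrt(r)) + (bias if k >= attack_start else 0.0)

# Each hypothesis assumes a different measurement offset.
offsets = np.array([0.0, bias])
xh = np.zeros(2); P = np.ones(2); prob = np.array([0.5, 0.5])
for k in range(1, n):
    xh = a * xh; P = a * a * P + q              # prediction (same dynamics for both filters)
    innov = z[k] - (xh + offsets)               # residual under each hypothesis
    S = P + r
    K = P / S
    xh = xh + K * innov; P = (1.0 - K) * P      # conditional state estimates
    like = np.exp(-0.5 * innov**2 / S) / np.sqrt(2.0 * np.pi * S)
    prob = np.maximum(prob * like, 1e-12)       # floor keeps dormant hypotheses alive
    prob /= prob.sum()
    if k in (attack_start - 10, attack_start + 10):
        print(f"k={k}: P(sensor attack) = {prob[1]:.3f}")
```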


Geophysics ◽  
1988 ◽  
Vol 53 (10) ◽  
pp. 1355-1361 ◽  
Author(s):  
Steven J. Brzezowski ◽  
Warren G. Heller

Gradiometer system noise, sampling effects, downward continuation, and limited data extent are the important contributors to moving‐base gravity gradiometer survey error. We apply a two‐dimensional frequency‐domain approach in simulations of several sets of airborne survey conditions to assess the significance of the first two sources. A special error allocation technique is used to account for the downward continuation and limited extent effects. These two sources cannot be modeled adequately as measurement noise in a linear error estimation algorithm. For a typical characterization of the Earth’s gravity field, our modeling indicates that limited data extent generally contributes about one‐half of the total error variance associated with recovery of the gravity disturbance vector at the Earth’s surface; gradiometer system noise typically contributes about one‐third. However, sampling effects are also very important (and are controlled through the survey track spacing). A 5 km track spacing provides a reasonable tradeoff between survey cost and errors due to track spacing. Furthermore, our results indicate that a moving‐base gravity gradiometer system can recover each component of the gravity disturbance vector with an rms accuracy better than 1.0 mGal.
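
The sketch below shows the 2-D frequency-domain operation at the core of such analyses: continuation of a gridded field through a height h via the factor exp(±2πfh), which illustrates how downward continuation amplifies high-frequency measurement noise. The grid size, spacing, and noise level are assumptions for illustration only.

```python
import numpy as np

# Downward continuation of gridded white noise in the 2-D frequency domain.
rng = np.random.default_rng(3)

nx = ny = 128
dx = 5000.0           # grid spacing [m] (e.g. 5 km track spacing)
h = 6000.0            # survey altitude above the surface [m]

fx = np.fft.fftfreq(nx, d=dx)
fy = np.fft.fftfreq(ny, d=dx)
f = np.sqrt(fx[None, :]**2 + fy[:, None]**2)   # radial frequency [cycles/m]

noise = rng.normal(0.0, 1.0, size=(ny, nx))    # white measurement noise at flight height
spec = np.fft.fft2(noise)

# Downward continuation multiplies each frequency by exp(2*pi*f*h),
# so the high-frequency part of the noise grows sharply.
down = np.real(np.fft.ifft2(spec * np.exp(2.0 * np.pi * f * h)))
print("noise std at altitude   :", noise.std())
print("noise std continued down:", down.std())
```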


2011 ◽  
Vol 8 (2) ◽  
pp. 237-252
Author(s):  
Mauro M. Sette ◽  
Hendrik Van Brussel ◽  
Jos Vander Sloten

Tactile feedback is a major missing feature in minimally invasive procedures; it is an essential means of diagnosis and orientation during surgery. Previous works have presented a remote palpation feedback system based on the coupling between a pressure sensor and a general haptic interface. Here a new approach is presented, based on the direct estimation of the tissue mechanical properties and their presentation to the operator by means of a haptic interface. The approach presents several technical difficulties, for which solutions are proposed: a fast Young's modulus estimation algorithm, a real-time finite element model, and a stiffness estimation approach that guarantees the system's stability. The work is concluded with an experimental evaluation of the whole system.
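
A minimal sketch of a fast Young's modulus estimate from force-indentation samples, assuming a Hertzian spherical contact model F = (4/3)·E_eff·sqrt(R)·d^1.5 with E_eff = E/(1 − ν²). The contact model, probe radius, and noise level are assumptions chosen for illustration and do not reproduce the estimation algorithm of the paper.

```python
import numpy as np

# Least-squares Young's modulus estimate under an assumed Hertzian contact model.
rng = np.random.default_rng(4)

R, nu = 0.005, 0.45                 # probe radius [m], Poisson ratio (soft tissue, assumed)
E_true = 20e3                       # "true" Young's modulus [Pa]
d = np.linspace(0.0, 0.004, 60)     # indentation depth samples [m]
E_eff_true = E_true / (1.0 - nu**2)
F = (4.0 / 3.0) * E_eff_true * np.sqrt(R) * d**1.5
F_noisy = F + rng.normal(0.0, 0.002, size=F.size)   # force sensor noise [N]

# Linear least squares in the transformed variable x = (4/3) sqrt(R) d^1.5.
x = (4.0 / 3.0) * np.sqrt(R) * d**1.5
E_eff_hat = np.sum(x * F_noisy) / np.sum(x * x)
E_hat = E_eff_hat * (1.0 - nu**2)
print(f"estimated E = {E_hat / 1e3:.1f} kPa (true {E_true / 1e3:.1f} kPa)")
```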


2018 ◽  
Author(s):  
Jasmijn A. Baaijens ◽  
Alexander Schönhuth

Abstract Haplotype-aware genome assembly plays an important role in genetics, medicine, and various other disciplines, yet the generation of haplotype-resolved de novo assemblies remains a major challenge. Beyond distinguishing between errors and true sequential variants, one needs to assign the true variants to the different genome copies. Recent work has pointed out that the enormous quantities of traditional NGS read data have so far been greatly underexploited for haplotig computation, which reflects that methodology for reference-independent haplotig computation has not yet reached maturity. We present POLYTE (POLYploid genome fitTEr) as a new approach to the de novo generation of haplotigs for diploid and polyploid genomes. Our method follows an iterative scheme in which reads or contigs are joined in each iteration, based on their interplay in an underlying haplotype-aware overlap graph. Along the iterations, contigs grow while preserving their haplotype identity. Benchmarking experiments on both real and simulated data demonstrate that POLYTE establishes new standards in terms of error-free reconstruction of haplotype-specific sequence. As a consequence, POLYTE outperforms state-of-the-art approaches in various relevant aspects, with advantages becoming particularly distinct in polyploid settings. POLYTE is freely available as part of the HaploConduct package at https://github.com/HaploConduct/HaploConduct, implemented in Python and C++.
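
The sketch below illustrates only the generic idea of iteratively joining sequences that share a sufficiently long exact suffix-prefix overlap; POLYTE's haplotype-aware overlap graph, error handling, and statistical overlap scoring are not reproduced.

```python
# Toy iterative joining of sequences by exact suffix-prefix overlaps.
def overlap(a: str, b: str, min_len: int) -> int:
    """Length of the longest suffix of a that is a prefix of b (>= min_len), else 0."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-k:] == b[:k]:
            return k
    return 0

def iterative_join(seqs, min_overlap=4):
    seqs = list(seqs)
    merged = True
    while merged:                        # each iteration joins the best-overlapping pair
        merged = False
        best = (0, None, None)
        for i, a in enumerate(seqs):
            for j, b in enumerate(seqs):
                if i != j:
                    k = overlap(a, b, min_overlap)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        if k:
            joined = seqs[i] + seqs[j][k:]
            seqs = [s for idx, s in enumerate(seqs) if idx not in (i, j)] + [joined]
            merged = True
    return seqs

reads = ["ACGTACGTGG", "GTGGTTCAAC", "TCAACGGATT"]   # made-up toy reads
print(iterative_join(reads))
```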


2010 ◽  
Vol 25 ◽  
pp. 3-9 ◽  
Author(s):  
R. S. Chadwick ◽  
D. I. F. Grimes ◽  
R. W. Saunders ◽  
P. N. Francis ◽  
T. A. Blackmore

Abstract. A multi-spectral rainfall estimation algorithm has been developed for the Sahel region of West Africa with the purpose of producing accumulated rainfall estimates for drought monitoring and food security. Radar data were used to calibrate multi-channel SEVIRI data from MSG, and a probability of rainfall at several different rain rates was established for each combination of SEVIRI radiances. Radar calibrations from both Europe (the SatPrecip algorithm) and Niger (the TAMORA algorithm) were used. Ten-day estimates were accumulated from SatPrecip and TAMORA and compared with kriged gauge data and TAMSAT satellite rainfall estimates over West Africa. SatPrecip was found to produce large overestimates for the region, probably because of its non-local calibration. TAMORA was negatively biased for areas of West Africa with relatively high rainfall, but its skill was comparable to TAMSAT for the low-rainfall region climatologically similar to its calibration area around Niamey. These results confirm the importance of local calibration for satellite-derived rainfall estimates. As TAMORA shows no improvement in skill over TAMSAT for dekadal estimates, the extra cloud-microphysical information provided by multi-spectral data may not be useful in determining rainfall accumulations at a ten-day timescale. Work is ongoing to determine whether it shows improved accuracy at shorter timescales.
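
A minimal sketch of the calibration principle: bin radiances from two channels and tabulate, from co-located "radar" data, the probability of rain in each bin. The channels, bin edges, and the synthetic radar relation are assumptions for illustration; they do not reproduce the SatPrecip or TAMORA calibrations.

```python
import numpy as np

# Build a probability-of-rain lookup table from binned two-channel radiances
# calibrated against (synthetic) radar observations, then apply it to a pixel.
rng = np.random.default_rng(5)

n = 20000
tb_ir = rng.uniform(200.0, 300.0, n)       # IR brightness temperature [K] (toy)
tb_wv = rng.uniform(220.0, 260.0, n)       # water-vapour channel [K] (toy)
# Synthetic "radar" rain occurrence: colder cloud tops rain more often (toy relation).
rain = (rng.random(n) < np.clip((260.0 - tb_ir) / 120.0, 0.0, 1.0)).astype(float)

ir_edges = np.linspace(200.0, 300.0, 11)
wv_edges = np.linspace(220.0, 260.0, 5)
i = np.clip(np.digitize(tb_ir, ir_edges) - 1, 0, 9)
j = np.clip(np.digitize(tb_wv, wv_edges) - 1, 0, 3)

counts = np.zeros((10, 4)); rainy = np.zeros((10, 4))
np.add.at(counts, (i, j), 1.0)
np.add.at(rainy, (i, j), rain)
prob_rain = np.divide(rainy, counts, out=np.zeros_like(rainy), where=counts > 0)

# Apply the calibration to a new pixel.
pi = np.clip(np.digitize(215.0, ir_edges) - 1, 0, 9)
pj = np.clip(np.digitize(240.0, wv_edges) - 1, 0, 3)
print("P(rain) for a 215 K / 240 K pixel:", round(prob_rain[pi, pj], 3))
```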


2019 ◽  
Author(s):  
T. Delabastita ◽  
M. Afschrift ◽  
B. Vanwanseele ◽  
F. De Groote

We present and evaluate a new approach to estimate calf muscle-tendon parameters and calculate calf muscle-tendon function during walking. We used motion analysis, ultrasound, and EMG data of the calf muscles, collected in six young and six older adults during treadmill walking, as inputs to a new optimal estimation algorithm. We used either the estimated parameters or scaled generic parameters in an existing approach to calculate muscle fiber lengths and activations. We quantified the fit with experimental data in terms of root mean squared differences (RMSD) and coefficients of determination (R2). We also calculated the calf muscle metabolic energy cost. RMSD between measured and calculated fiber lengths and activations decreased, and R2 increased, when using estimated parameters instead of scaled generic parameters. Moreover, R2 between measured and calculated gastrocnemius medialis fiber lengths and soleus activations increased by 19% and 70%, respectively, and calf muscle metabolic energy decreased by 25% when using estimated parameters instead of scaled generic parameters at speeds not used for estimation. The new approach estimates calf muscle-tendon parameters in good accordance with values reported in the literature. The approach improves calculations of calf muscle-tendon interaction during walking and highlights the importance of individualizing calf muscle-tendon parameters.
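
For reference, the two goodness-of-fit measures quoted above can be computed as follows; the arrays are placeholders standing in for measured and model-calculated fiber lengths or activations.

```python
import numpy as np

# Root mean squared difference (RMSD) and coefficient of determination (R^2)
# between measured and calculated time series.
def rmsd(measured, calculated):
    measured, calculated = np.asarray(measured), np.asarray(calculated)
    return np.sqrt(np.mean((measured - calculated)**2))

def r_squared(measured, calculated):
    measured, calculated = np.asarray(measured), np.asarray(calculated)
    ss_res = np.sum((measured - calculated)**2)
    ss_tot = np.sum((measured - measured.mean())**2)
    return 1.0 - ss_res / ss_tot

t = np.linspace(0.0, 1.0, 101)                       # one normalized gait cycle
measured = 0.05 * np.sin(2 * np.pi * t) + 0.04       # placeholder "measured" fiber length [m]
calculated = 0.048 * np.sin(2 * np.pi * t) + 0.041   # placeholder "calculated" fiber length [m]
print(f"RMSD = {rmsd(measured, calculated):.4f}, R^2 = {r_squared(measured, calculated):.3f}")
```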


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Bin Huang ◽  
Guozheng Wei ◽  
Bing Wang ◽  
Fusong Ju ◽  
Yi Zhong ◽  
...  

Abstract
Background: Optical maps record the locations of specific enzyme recognition sites within long genome fragments. This long-distance information enables aligning genome assembly contigs onto optical maps and ordering the contigs into scaffolds. The generated scaffolds, however, often contain a large number of gaps. To fill these gaps, a feasible way is to search the genome assembly graph for the best-matching contig paths that connect the boundary contigs of a gap. The combination of searching and evaluation procedures may be “searching followed by evaluation”, which is infeasible for long gaps, or “searching by evaluation”, which relies heavily on heuristics and thus usually yields unreliable contig paths.
Results: We report an accurate and efficient approach to filling gaps in genome scaffolds with the aid of optical maps. Using simulated data from 12 species and real data from 3 species, we demonstrate the successful application of our approach to gap filling with improved accuracy and completeness of the genome scaffolds.
Conclusion: Our approach applies a sequential Bayesian updating technique to measure the similarity between optical maps and candidate contig paths. Using this similarity to guide path searching, our approach achieves higher accuracy than the existing “searching by evaluation” strategy that relies on heuristics. Furthermore, unlike the “searching followed by evaluation” strategy, which enumerates all possible paths, our approach prunes the unlikely sub-paths and extends only the highly probable ones, thus significantly increasing searching efficiency.
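
A minimal sketch of the "prune unlikely sub-paths, extend the probable ones" strategy: a beam search over a toy contig graph in which each partial path carries a cumulative score. The graph and per-edge scores stand in for the sequential Bayesian similarity to the optical map, which is not reproduced here.

```python
import heapq

# Toy contig graph between the left (L) and right (R) boundary contigs of a gap:
# contig -> list of (successor, log-likelihood of taking that step). Scores are made up.
graph = {
    "L": [("a", -0.2), ("b", -1.5)],
    "a": [("c", -0.3)],
    "b": [("c", -0.4)],
    "c": [("R", -0.1)],
}

def beam_search(start, goal, beam_width=2):
    beam = [(0.0, [start])]                     # (cumulative log-likelihood, path)
    while beam:
        best_score, best_path = beam[0]
        if best_path[-1] == goal:               # most probable path has reached the goal
            return best_score, best_path
        candidates = []
        for score, path in beam:
            if path[-1] == goal:                # keep completed paths in the beam
                candidates.append((score, path))
            else:
                for nxt, logl in graph.get(path[-1], []):
                    candidates.append((score + logl, path + [nxt]))
        # pruning step: keep only the most probable partial paths
        beam = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
    return None

print(beam_search("L", "R"))
```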

