Optimization Between Estimation Error and Transmit Energy in Cross-Correlation Based Underwater Network Cardinality Estimation

2017 ◽  
Vol 97 (4) ◽  
pp. 5797-5816
Author(s):  
S. A. H. Chowdhury ◽  
J. E. Giti ◽  
M. S. Anower
2014 ◽  
Vol 1006-1007 ◽  
pp. 815-820
Author(s):  
Zhen Wang ◽  
Lan Xiang Zhu ◽  
Feng Yu ◽  
Lei Gu

Based on Electromagnetic Environmental Sensing (EES) and Multiple-Input Multiple-Output (MIMO) radar sensing, this paper presents an SVD-TLD perception algorithm. The algorithm first performs cross-spectrum AR model parameter estimation, then considers the disturbance of the cross-correlation matrix by the estimation-error function, and finally accounts for errors on both sides of the estimation equation by incorporating the cross-correlation function of the measurement errors into a Total Least Squares (TLS) solution. Compared with plain AR model parameter estimation, the SVD-based cross-spectral estimation is significantly more accurate, greatly reduces the amount of computation, and is better suited to real-time online processing.
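The abstract gives no implementation details, but the TLS step it mentions is classically solved through the SVD of an augmented data matrix, which is presumably why SVD appears in the algorithm's name. Below is a minimal sketch of that generic SVD-based TLS solver; the function name `tls_solve` and the AR(2) toy data are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def tls_solve(A, b):
    """Total Least Squares solution of A x ~= b via the SVD of [A | b].

    TLS allows errors in both A and b, which matches the situation where the
    regressors and the targets are both noisy estimates.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    n = A.shape[1]
    # Right singular vectors of the augmented matrix [A | b].
    _, _, Vt = np.linalg.svd(np.hstack([A, b]))
    V = Vt.T
    v_xy = V[:n, n:]   # top block of the last right singular vector
    v_yy = V[n:, n:]   # bottom block (a scalar for a single right-hand side)
    if np.isclose(v_yy, 0.0).all():
        raise np.linalg.LinAlgError("TLS solution does not exist")
    return (-v_xy / v_yy).ravel()

# Toy AR(2) example: both the lagged regressors and the target are noisy.
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal(0.0, 0.1)
y = x + rng.normal(0.0, 0.05, x.shape)      # noisy observations of the process
A = np.column_stack([y[1:-1], y[:-2]])      # lag-1 and lag-2 regressors
print(tls_solve(A, y[2:]))                  # roughly [0.6, -0.3]
```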


2021 ◽  
Vol 14 (11) ◽  
pp. 2019-2032
Author(s):  
Parimarjan Negi ◽  
Ryan Marcus ◽  
Andreas Kipf ◽  
Hongzi Mao ◽  
Nesime Tatbul ◽  
...  

Recently there has been significant interest in using machine learning to improve the accuracy of cardinality estimation. This work has focused on improving average estimation error, but not all estimates matter equally for downstream tasks like query optimization. Since learned models inevitably make mistakes, the goal should be to improve the estimates that make the biggest difference to an optimizer. We introduce a new loss function, Flow-Loss, for learning cardinality estimation models. Flow-Loss approximates the optimizer's cost model and search algorithm with analytical functions, which it uses to optimize explicitly for better query plans. At the heart of Flow-Loss is a reduction of query optimization to a flow routing problem on a certain "plan graph", in which different paths correspond to different query plans. To evaluate our approach, we introduce the Cardinality Estimation Benchmark (CEB) which contains the ground truth cardinalities for sub-plans of over 16K queries from 21 templates with up to 15 joins. We show that across different architectures and databases, a model trained with Flow-Loss improves the plan costs and query runtimes despite having worse estimation accuracy than a model trained with Q-Error. When the test set queries closely match the training queries, models trained with both loss functions perform well. However, the Q-Error-trained model degrades significantly when evaluated on slightly different queries (e.g., similar but unseen query templates), while the Flow-Loss-trained model generalizes better to such situations, achieving 4-8x better 99th percentile runtimes on unseen templates with the same model architecture and training data.
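Flow-Loss itself requires the optimizer's plan graph and cost model, so it is not reproduced here; for reference, the Q-Error baseline the paper compares against is the standard multiplicative error metric, sketched below (the function name and the epsilon floor are assumptions for the example).

```python
import numpy as np

def q_error(estimated, true, eps=1.0):
    """Q-Error: symmetric multiplicative error, max(est/true, true/est).

    Every estimate is penalised by how far off it is as a ratio, regardless
    of whether that estimate actually changes the chosen query plan.
    """
    est = np.maximum(np.asarray(estimated, dtype=float), eps)
    tru = np.maximum(np.asarray(true, dtype=float), eps)
    return np.maximum(est / tru, tru / est)

# Two sub-plan estimates with the same Q-Error can matter very differently
# to the optimizer; Flow-Loss instead weights errors by their effect on plan cost.
print(q_error([10, 1_000_000], [1_000, 10_000]))   # -> [100., 100.]
```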


Author(s):  
Douglas L. Dorset ◽  
Barbara Moss

A number of computing systems devoted to the averaging of electron images of two-dimensional macromolecular crystalline arrays have facilitated the visualization of negatively-stained biological structures. Either by simulation of optical filtering techniques or, in more refined treatments, by cross-correlation averaging, an idealized representation of the repeating asymmetric structure unit is constructed, eliminating image distortions due to radiation damage, stain irregularities and, in the latter approach, imperfections and distortions in the unit cell repeat. In these analyses it is generally assumed that the electron scattering from the thin, negatively-stained object is well-approximated by a phase object model. Even when absorption effects are considered (i.e. "amplitude contrast"), the expansion of the transmission function, q(x, y) = exp(iσφ(x, y)), does not exceed the first (kinematical) term. Furthermore, in reconstruction of electron images, kinematical phases are applied to diffraction amplitudes and obey the constraints of the plane group symmetry.
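A short numerical check makes the weak-phase (kinematical) approximation concrete: the exact transmission function q(x, y) = exp(iσφ(x, y)) is compared with its first-order expansion 1 + iσφ(x, y). The values of σ and φ below are illustrative assumptions, not data from the article.

```python
import numpy as np

# Weak-phase-object check: exact transmission function vs. its first-order
# (kinematical) expansion.  sigma and phi are illustrative values only.
sigma = 0.01                       # interaction constant (arbitrary units)
phi = np.linspace(0.0, 50.0, 6)    # projected potential samples

q_exact = np.exp(1j * sigma * phi)
q_linear = 1.0 + 1j * sigma * phi  # first (kinematical) term of the expansion

# The discrepancy grows roughly as (sigma * phi)**2 / 2, showing where the
# single-term expansion stops being a good approximation.
for p, qe, ql in zip(phi, q_exact, q_linear):
    print(f"sigma*phi = {sigma * p:4.2f}   |exact - linear| = {abs(qe - ql):.2e}")
```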


Author(s):  
D. E. Luzzi ◽  
L. D. Marks ◽  
M. I. Buckett

As the HREM becomes increasingly used for the study of dynamic localized phenomena, the development of techniques to recover the desired information from a real image is important. Often the important features scatter only weakly in comparison to the matrix material, in addition to being masked by statistical and amorphous noise. The desired information will usually involve the accurate knowledge of the position and intensity of the contrast. In order to decipher the desired information from a complex image, cross-correlation (xcf) techniques can be utilized. Unlike other image processing methods which rely on data massaging (e.g. high/low pass filtering or Fourier filtering), the cross-correlation method is a rigorous data reduction technique with no a priori assumptions. We have examined basic cross-correlation procedures using images of discrete gaussian peaks and have developed an iterative procedure to greatly enhance the capabilities of these techniques when the contrast from the peaks overlap.
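As a concrete illustration of the basic xcf step (not the authors' iterative overlap-handling procedure), the sketch below locates a single weak Gaussian peak in a noisy synthetic image by FFT-based cross-correlation with a Gaussian template; all parameter values are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian2d(shape, center, sigma):
    """Unit-height 2D Gaussian centred at `center`."""
    y, x = np.indices(shape)
    cy, cx = center
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * sigma ** 2))

# Noisy image containing one weak peak at a known position (70, 45).
image = 0.5 * gaussian2d((128, 128), (70, 45), sigma=3.0) \
        + rng.normal(0.0, 0.2, (128, 128))

# Zero-mean template of the same size; FFT-based (circular) cross-correlation.
template = gaussian2d((128, 128), (64, 64), sigma=3.0)
template -= template.mean()
xcf = np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(template))))

# The xcf maximum gives the shift of the feature relative to the template centre.
shift = np.unravel_index(np.argmax(xcf), xcf.shape)
print("estimated peak position:",
      ((64 + shift[0]) % 128, (64 + shift[1]) % 128))   # expected (70, 45)
```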

