Data smoothing and end correction using entropic kernel compression

Stat ◽  
2014 ◽  
Vol 3 (1) ◽  
pp. 250-257 ◽  
Author(s):  
Thomas Bläsche ◽  
Roger J. Bowden ◽  
Peter N. Posch

2021 ◽  
Vol 15 ◽  
pp. 174830262110084
Author(s):  
Bishnu P Lamichhane ◽  
Elizabeth Harris ◽  
Quoc Thong Le Gia

We compare a recently proposed multivariate spline based on mixed partial derivatives with two other standard splines for the scattered data smoothing problem. The splines are defined as the minimisers of a penalised least squares functional. The penalties are based on partial differential operators and are integrated using the finite element method. We apply the three methods to two problems: removing a mixture of Gaussian and impulsive noise from an image, and recovering a continuous function from a set of noisy observations.
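The penalised least squares idea can be sketched in one dimension with a discrete second-difference penalty standing in for the paper's derivative-based penalties; the splines and finite element integration themselves are not reproduced here, and the function name `penalised_ls_smooth` and weight `lam` are illustrative:

```python
import numpy as np

def penalised_ls_smooth(y, lam=5.0):
    """Minimise ||f - y||^2 + lam * ||D2 f||^2, where D2 is the
    discrete second-difference operator (a 1-D stand-in for the
    derivative-based penalties in the abstract)."""
    n = len(y)
    # Second-difference matrix D2: (n-2) x n, rows [1, -2, 1]
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    # Normal equations of the penalised least squares problem:
    # (I + lam * D2^T D2) f = y
    A = np.eye(n) + lam * D2.T @ D2
    return np.linalg.solve(A, y)

# Noisy samples of a smooth function
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
f = penalised_ls_smooth(y)
```

Larger `lam` trades fidelity to the data for smoothness, which is the same trade-off the penalised functional in the abstract controls.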


Open Physics ◽  
2018 ◽  
Vol 16 (1) ◽  
pp. 1033-1045
Author(s):  
Guodong Zhou ◽  
Huailiang Zhang ◽  
Raquel Martínez Lucas

Abstract The SURF operator describes local image features well but is weak at describing global features. To address this, a compressed sensing image restoration algorithm based on an improved SURF operator is proposed. The SURF feature vector set of the image is extracted and reduced to a single high-dimensional feature vector using a histogram algorithm, and the HSV color histogram of the image is then extracted. An MSA image decomposition algorithm is used to obtain a sparse representation of the image feature vectors. The total variation curvature diffusion method and a Bayesian weighting method perform image restoration on the data smoothing feature and the local similarity feature of the texture part, respectively. A compressed sensing image restoration model is obtained using the Schatten-p norm, and image color supplementation is performed on the model. The compressed sensing image is solved iteratively by an alternating optimization method and thereby restored. Experimental results show that the proposed algorithm has good restoration performance: the restored image has finer edge and texture structure and better visual effect.
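Of the several components combined here, the total variation diffusion step is the most self-contained. Below is a minimal sketch of TV-style denoising by gradient descent on a smoothed total-variation energy; it is not the paper's full model (the SURF, MSA, Bayesian-weighting and Schatten-p components are omitted), and `tv_denoise` and its parameters are illustrative assumptions:

```python
import numpy as np

def tv_denoise(img, lam=0.1, step=0.2, iters=200, eps=1e-3):
    """Gradient descent on ||u - img||^2 / 2 + lam * TV_eps(u),
    where TV_eps is a smoothed total-variation penalty."""
    u = img.copy()
    for _ in range(iters):
        # forward differences; zero gradient past the last row/column
        gx = np.diff(u, axis=1, append=u[:, -1:])
        gy = np.diff(u, axis=0, append=u[-1:, :])
        norm = np.sqrt(gx ** 2 + gy ** 2 + eps)
        px, py = gx / norm, gy / norm
        # backward-difference divergence of the normalised gradient
        # (np.roll brings in the zero flux stored in the border entries)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u = u - step * ((u - img) - lam * div)
    return u

# A noisy step image: two flat halves with an edge down the middle
clean = np.zeros((16, 16))
clean[:, 8:] = 1.0
rng = np.random.default_rng(0)
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
restored = tv_denoise(noisy)
```

TV diffusion smooths noise in flat regions while keeping the edge relatively sharp, which is why the paper assigns it the data smoothing feature rather than the texture part.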


1991 ◽  
Vol 113 (3) ◽  
pp. 348-351 ◽  
Author(s):  
W. Simons ◽  
K. H. Yang

A differentiation method, which combines the concepts of least squares and splines, has been developed to analyze human motion data. This data smoothing technique does not depend on the choice of a cut-off frequency, yet it closely reflects the nature of the phenomenon. Two sets of published benchmark data were used to evaluate the new algorithm.
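The exact least-squares/spline hybrid is not specified in the abstract. A rough stand-in is sliding-window least-squares polynomial differentiation, which likewise needs no cut-off frequency; `lsq_derivative` and the window half-width `half` are assumptions for illustration:

```python
import numpy as np

def lsq_derivative(t, y, half=10):
    """Differentiate noisy samples by fitting a least-squares quadratic
    in a sliding window and reading off its slope at the centre."""
    n = len(y)
    dy = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        # centre the window at t[i] so the fitted slope at x = 0
        # is the derivative estimate
        c = np.polyfit(t[lo:hi] - t[i], y[lo:hi], 2)
        dy[i] = c[1]  # d/dx of c0*x^2 + c1*x + c2 at x = 0
    return dy

# Noisy motion-like signal whose true derivative is known
rng = np.random.default_rng(1)
t = np.linspace(0.0, 2 * np.pi, 200)
y = np.sin(t) + 0.05 * rng.standard_normal(t.size)
dy = lsq_derivative(t, y)
```

The window size plays the role a smoothing parameter plays in spline methods: wider windows suppress noise in the derivative at the cost of blurring rapid changes.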


2017 ◽  
Vol 34 (1) ◽  
pp. 123-133 ◽  
Author(s):  
Zeguang Yi ◽  
Nan Pan ◽  
Yi Liu ◽  
Yu Guo

Purpose This paper aims to reduce and eliminate the abnormal peaks caused by reflection during laser detection, so that further analysis can proceed more easily. Design/methodology/approach To solve this problem, an abnormal data correction algorithm based on histograms, K-Means clustering and improved robust locally weighted scatterplot smoothing (LOWESS) is put forward. The proposed algorithm first performs section leveling on the sheared surface, then applies a histogram to demarcate abnormally fluctuating data between neighboring points and uses K-Means clustering to eliminate the abnormal data. After that, the improved robust LOWESS method, which is based on Euclidean distance, removes the noise interference and finally yields the waveform characteristics for the next stage of data processing. Findings The experimental results of linear tool mark laser test data correction demonstrate the accuracy and reliability of the proposed algorithm. Originality/value The study enables the following: automatic leveling of the detection signal; identification and demarcation of abnormal data using K-Means clustering and histograms; and data smoothing using LOWESS.
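The pipeline (flag abnormal jumps, eliminate them, then robust local smoothing) can be sketched as follows, with a percentile threshold on the jump histogram standing in for the histogram + K-Means step and a tricube-weighted local linear fit standing in for improved robust LOWESS; all names and parameters are illustrative:

```python
import numpy as np

def correct_profile(y, jump_pct=95, frac=0.1):
    """Flag points whose point-to-point jump sits in the upper tail of
    the jump histogram, patch them by linear interpolation, then apply
    a tricube-weighted local linear fit (LOWESS-style)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # 1) demarcate abnormal jumps between neighbouring points
    jumps = np.abs(np.diff(y, prepend=y[0]))
    bad = jumps > np.percentile(jumps, jump_pct)
    idx = np.arange(n)
    # 2) eliminate flagged points by interpolating from the rest
    y_patched = np.interp(idx, idx[~bad], y[~bad])
    # 3) smoothing: local linear fit with tricube weights
    half = max(2, int(frac * n / 2))
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        d = np.arange(lo, hi) - i
        w = (1.0 - (np.abs(d) / (half + 1)) ** 3) ** 3
        coef = np.polyfit(d, y_patched[lo:hi], 1, w=w)
        out[i] = coef[1]  # fitted value at the window centre (d = 0)
    return out

# Smooth profile contaminated with three reflection-like spikes
rng = np.random.default_rng(2)
t = np.linspace(0.0, 4 * np.pi, 200)
clean = np.cos(t)
y = clean + 0.05 * rng.standard_normal(t.size)
y[[40, 90, 150]] += 5.0
out = correct_profile(y)
```

Removing the spikes before smoothing matters: a local linear fit applied directly to the raw profile would smear each spike across its whole window.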


1992 ◽  
Vol 02 (03) ◽  
pp. 313-323 ◽  
Author(s):  
I.F. WEST ◽  
G.E. COOTE ◽  
R.W. GAULDIE

We use a proton microprobe to examine the distribution of elements in otoliths and scales of teleost (bony) fish. The elements of principal interest are calcium and strontium in otoliths, and calcium and fluorine in scales. Changes in the distribution of these elements across hard structures may allow inferences about the life histories of fish. Otoliths and scales of interest are up to a centimeter in linear dimension, and up to 200 sampling points are required in each dimension to reveal the structures of interest. The time needed to accumulate high X-ray counts at each sampling point can be large, particularly for strontium. To reduce microprobe usage we use data smoothing techniques to reveal changing patterns from modest X-ray count accumulations at individual data points. In this paper we review the performance, at modest levels of X-ray count accumulation, of a selection of digital filters (moving average smoothers), running median filters, robust locally weighted regression filters and adaptive spline filters in revealing pattern.
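Two of the filter families reviewed, the moving average and the running median, are easy to contrast on low-count data with an isolated outlier: the median rejects the spike outright, while the average spreads it across the window. A minimal sketch (function names are illustrative):

```python
import numpy as np

def moving_average(y, k=5):
    """Centred moving average, a basic digital filter."""
    pad = k // 2
    yp = np.pad(y, pad, mode="edge")
    return np.convolve(yp, np.ones(k) / k, mode="valid")

def running_median(y, k=5):
    """Running median; robust to isolated impulsive spikes."""
    pad = k // 2
    yp = np.pad(y, pad, mode="edge")
    return np.array([np.median(yp[i:i + k]) for i in range(len(y))])

# One isolated spike in an otherwise flat signal
y = np.zeros(50)
y[25] = 10.0
ma = moving_average(y)
med = running_median(y)
```

This difference is why running medians are attractive for count data with occasional anomalous points, whereas moving averages are better at suppressing symmetric counting noise.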

