Robust Hessian Locally Linear Embedding Techniques for High-Dimensional Data

Algorithms ◽  
2016 ◽  
Vol 9 (2) ◽  
pp. 36 ◽  
Author(s):  
Xianglei Xing ◽  
Sidan Du ◽  
Kejun Wang
Author(s):  
Jing Chen ◽  
Zhengming Ma

The goal of nonlinear dimensionality reduction is to find the meaningful low-dimensional structure of a nonlinear manifold from high-dimensional data. As a classic method of nonlinear dimensionality reduction, locally linear embedding (LLE) is increasingly attractive to researchers due to its ability to deal with large amounts of high-dimensional data and its non-iterative way of finding the embeddings. However, several problems in the LLE algorithm remain open, such as its sensitivity to noise, inevitable ill-conditioned eigenproblems, and its inability to deal with novel data. In this paper, the existing extensions are comprehensively reviewed, discussed, and classified into different categories. Their strategies, advantages and disadvantages, and performances are elaborated. By generalizing the tactics used in various extensions at different stages of LLE and evaluating their performances, several promising directions for future research are suggested.
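As background for the extensions surveyed above, the three stages of the original LLE algorithm (neighbor search, local reconstruction weights, and the final eigenproblem) can be sketched in a few lines of NumPy. The regularization constant and neighbor count below are illustrative choices, not values from the paper; the regularization term addresses exactly the ill-conditioned Gram matrices mentioned in the abstract:

```python
import numpy as np

def lle(X, n_neighbors=5, n_components=2, reg=1e-3):
    """Minimal Locally Linear Embedding sketch (Roweis & Saul, 2000).

    1. Find each point's k nearest neighbors.
    2. Solve for reconstruction weights W (each row sums to 1).
    3. Embed via the bottom eigenvectors of M = (I - W)^T (I - W).
    """
    n = X.shape[0]
    # Step 1: k nearest neighbors by Euclidean distance (index 0 is self).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]

    # Step 2: local reconstruction weights from regularized Gram matrices.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                          # center neighbors on x_i
        G = Z @ Z.T
        G += reg * np.trace(G) * np.eye(n_neighbors)   # guard against ill-conditioning
        w = np.linalg.solve(G, np.ones(n_neighbors))
        W[i, nbrs[i]] = w / w.sum()                    # sum-to-one constraint

    # Step 3: bottom eigenvectors of M, skipping the constant eigenvector
    # associated with the (near-)zero eigenvalue.
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, 1:n_components + 1]
```

Note that the whole embedding is obtained from one eigendecomposition, which is the non-iterative property the review highlights.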


2010 ◽  
Vol 139-141 ◽  
pp. 2599-2602
Author(s):  
Zheng Wei Li ◽  
Ru Nie ◽  
Yao Fei Han

Fault diagnosis is a kind of pattern recognition problem, and how to extract diagnostic features and improve recognition performance is a difficult problem. Locally Linear Embedding (LLE) is an unsupervised nonlinear technique that extracts useful features from high-dimensional data sets while preserving the local topology. However, the original LLE method does not take the known class label information of the input data into account. A new characteristics-similarity-based supervised locally linear embedding (CSSLLE) method for fault diagnosis is proposed in this paper. The CSSLLE method attempts to extract the intrinsic manifold features from high-dimensional fault data by computing Euclidean distances based on characteristics similarity, translating the complex mode space into a low-dimensional feature space in which fault classification and diagnosis are carried out easily. Experiments on benchmark data and a real fault dataset demonstrate that the proposed approach obtains better performance than SLLE and is an accurate technique for fault diagnosis.
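The abstract does not give the exact characteristics-similarity measure used by CSSLLE. A common way to sketch the supervised-distance idea behind SLLE-style methods is to inflate distances between points of different classes so that neighbors tend to come from the same class; the inflation factor `alpha` below is a hypothetical choice, not a value from the paper:

```python
import numpy as np

def supervised_distances(X, y, alpha=0.3):
    """Label-adjusted pairwise distances in the spirit of supervised LLE.

    Distances between points with different labels are inflated by a
    fraction of the largest pairwise distance, so neighbor search then
    prefers same-class points. This is a generic SLLE heuristic, not
    the paper's exact characteristics-similarity formula.
    """
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    inflate = alpha * d.max() * (y[:, None] != y[None, :])
    return d + inflate
```

The resulting matrix can replace the plain Euclidean distances in the neighbor-search stage of an otherwise unmodified LLE routine.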


Author(s):  
Yuan Li ◽  
Chengcheng Feng

Aiming at fault detection in industrial processes that are nonlinear or high-dimensional, a novel fault detection method based on locally linear embedding preserve neighborhood is proposed in this paper. Locally linear embedding preserve neighborhood is a feature-mapping method that combines the locally linear embedding and Laplacian eigenmaps algorithms. First, two weight matrices are obtained by locally linear embedding and Laplacian eigenmaps, respectively. Subsequently, the two weight matrices are combined through a balance factor to form the objective function. The locally linear embedding preserve neighborhood method can effectively preserve the characteristics of the data in the high-dimensional space. The purpose of dimensionality reduction is to map the high-dimensional data to a low-dimensional space by optimizing this objective function. Process monitoring is then performed by constructing the T² and Q statistics. To demonstrate its effectiveness and superiority, the proposed method is tested on the Swiss Roll dataset and an industrial case study. Compared with traditional fault detection methods, the proposed method effectively improves the detection rate and reduces the false alarm rate.
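A minimal sketch of the two ingredients described above. The convex combination with balance factor `eta`, and the omission of the Q statistic and control limits, are assumptions for illustration, since the abstract does not give the exact formulas:

```python
import numpy as np

def blended_objective(M_lle, L_lap, eta=0.5):
    """Combine the LLE alignment matrix M = (I - W)^T (I - W) with the
    graph Laplacian L from Laplacian eigenmaps via a balance factor eta.
    The low-dimensional coordinates are the bottom eigenvectors of the
    blend. The convex-combination form is an assumption."""
    return eta * M_lle + (1.0 - eta) * L_lap

def hotelling_t2(Y, y_new):
    """Hotelling T^2 statistic of a new embedded sample against the
    training embedding Y (rows = samples). A monitoring sketch only:
    the Q (SPE) statistic on the residual space is omitted."""
    mu = Y.mean(axis=0)
    S = np.cov(Y, rowvar=False)
    diff = y_new - mu
    return float(diff @ np.linalg.solve(S, diff))
```

A sample whose T² exceeds a control limit estimated from the training data would be flagged as a fault.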


2021 ◽  
pp. 1-11
Author(s):  
Guo Niu ◽  
Zhengming Ma

Locally Linear Embedding (LLE) is honored as the first algorithm of manifold learning. Generally speaking, the relation between a data point and its nearest neighbors is nonlinear, and LLE only extracts its linear part. Local nonlinear embedding is therefore an important direction of improvement to LLE. However, any attempt in this direction may lead to a significant increase in computational complexity. In this paper, a novel algorithm called local quasi-linear embedding (LQLE) is proposed. In LQLE, each high-dimensional data vector is first expanded using the Kronecker product. The expanded vector contains not only the components of the original vector but also polynomials of its components. Then, each expanded vector is linearly approximated with the expanded vectors of its nearest neighbors. In this way, LQLE achieves a certain degree of local nonlinearity and learns the dimensionality reduction results under the principle of keeping the local nonlinearity unchanged. More importantly, LQLE does not increase computational complexity: it only replaces the data vectors with their Kronecker product expansions in the original LLE program. Experimental results comparing the proposed method with four comparison algorithms on various datasets demonstrate its good performance.
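The Kronecker-product expansion described above can be sketched directly: for a vector x, appending x ⊗ x adds all degree-2 monomials of its components, and the expanded vectors are then fed to an unmodified LLE routine. The exact expansion order used by LQLE is not stated in the abstract; second order is assumed here:

```python
import numpy as np

def kron_expand(X):
    """Expand each row x of X into [x, x kron x], appending every
    degree-2 monomial x_i * x_j of its components. The expanded data
    matrix is then passed to an ordinary LLE implementation, which is
    how LQLE gains local nonlinearity without changing the algorithm."""
    quad = np.einsum('ni,nj->nij', X, X).reshape(X.shape[0], -1)
    return np.hstack([X, quad])
```

For d-dimensional input this produces d + d² features per point, so the only extra cost is the larger ambient dimension of the Gram matrices inside LLE.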


Author(s):  
Jin-Hang Liu ◽  
Tao Peng ◽  
Xiaogang Zhao ◽  
Kunfang Song ◽  
Minghua Jiang ◽  
...  

Data in a high-dimensional data space may reside in a low-dimensional manifold embedded within the high-dimensional space. Manifold learning discovers intrinsic manifold data structures to facilitate dimensionality reduction. We propose a novel manifold learning technique called fast k selection for locally linear embedding, or FSLLE, which judiciously chooses an appropriate number k of neighboring points such that the local geometric properties are maintained by the locally linear embedding (LLE) criterion. To measure the spatial distribution of a group of neighboring points, FSLLE relies on relative variance and mean difference to form a spatial correlation index characterizing the neighbors' data distribution. The goal of FSLLE is to quickly identify the value of the parameter k that minimizes the spatial correlation index. FSLLE optimizes k by using the spatial correlation index to discover the intrinsic structure of a data point's neighbors. After implementing FSLLE, we conduct extensive experiments to validate its correctness and evaluate its performance. Our experimental results show that FSLLE outperforms existing solutions (i.e., LLE and ISOMAP) in manifold learning and dimensionality reduction. We apply FSLLE to face recognition, where it achieves higher accuracy than state-of-the-art face recognition algorithms because it makes a good tradeoff between classification precision and performance.
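The abstract does not give the exact spatial correlation index. As a hedged sketch of the k-selection loop, each candidate k can be scored by the relative variance of neighbor distances (variance over squared mean), preferring neighborhoods whose distances are most uniform; the scoring below is a simplification of the paper's index, which also involves a mean-difference term:

```python
import numpy as np

def select_k(X, k_min=4, k_max=12):
    """Pick the LLE neighbor count k that minimizes a spatial-correlation
    style index over the candidate range. Scoring by the mean relative
    variance of each point's k nearest-neighbor distances is an assumed
    stand-in for the paper's exact index."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    ordered = np.sort(np.sqrt(d2), axis=1)[:, 1:]   # self-distance removed
    best_k, best_score = k_min, np.inf
    for k in range(k_min, k_max + 1):
        nd = ordered[:, :k]                          # k nearest distances per point
        score = (nd.var(axis=1) / (nd.mean(axis=1) ** 2 + 1e-12)).mean()
        if score < best_score:
            best_k, best_score = k, score
    return best_k
```

The selected k would then be passed to a standard LLE routine, replacing manual tuning of the neighborhood size.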


2009 ◽  
Vol 20 (9) ◽  
pp. 2376-2386 ◽  
Author(s):  
Gui-Hua WEN ◽  
Ting-Hui LU ◽  
Li-Jun JIANG ◽  
Jun WEN
