Quantum locally linear embedding for nonlinear dimensionality reduction

2020 ◽  
Vol 19 (9) ◽  
Author(s):  
Xi He ◽  
Li Sun ◽  
Chufan Lyu ◽  
Xiaoting Wang

2013 ◽
Vol 677 ◽  
pp. 436-441 ◽  
Author(s):  
Kang Hua Hui ◽  
Chun Li Li ◽  
Xin Zhong Xu ◽  
Xiao Rong Feng

The locally linear embedding (LLE) algorithm is considered a powerful method for nonlinear dimensionality reduction. In this paper, a new method called Self-Regulated LLE is proposed. It addresses the problem of choosing an appropriate neighborhood parameter for LLE by finding the local patch that is closest to being linear. Experimental results show that, under different evaluation criteria, self-regulated LLE performs better than LLE in most cases and requires less time on several data sets.
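The abstract does not spell out the self-regulation criterion, so the following is a minimal sketch of one plausible reading, not the paper's method: grow the neighborhood size k for as long as the local patches still look approximately linear, judged here by the fraction of local-patch variance that falls outside the leading principal components. The data set, the candidate range of k, and the threshold are illustrative assumptions.

import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.decomposition import PCA
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.neighbors import NearestNeighbors

def mean_nonlinearity(X, k, n_components=2):
    """Mean fraction of local-patch variance not captured by the first
    n_components principal components (0 means the patch is perfectly flat)."""
    idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X, return_distance=False)
    ratios = []
    for i in range(X.shape[0]):
        patch = X[idx[i]]  # the point together with its k nearest neighbors
        pca = PCA(n_components=n_components).fit(patch)
        ratios.append(1.0 - pca.explained_variance_ratio_.sum())
    return float(np.mean(ratios))

X, _ = make_swiss_roll(n_samples=600, random_state=0)

# Pick the largest k whose local patches remain nearly linear
# (the threshold and candidate range are guesses, not values from the paper).
threshold = 0.05
candidates = [k for k in range(5, 31, 5) if mean_nonlinearity(X, k) < threshold]
best_k = max(candidates, default=5)

Y = LocallyLinearEmbedding(n_neighbors=best_k, n_components=2).fit_transform(X)
print(f"selected k = {best_k}, embedding shape = {Y.shape}")

With the neighborhood size fixed this way, the embedding itself is plain LLE; the self-regulation only replaces the usual hand-tuned choice of k.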


2010 ◽  
Vol 23 (5-6) ◽  
pp. 327-338 ◽  
Author(s):  
Peter Mannfolk ◽  
Ronnie Wirestam ◽  
Markus Nilsson ◽  
Freddy Ståhlberg ◽  
Johan Olsrud

2014 ◽  
Vol 1014 ◽  
pp. 375-378 ◽  
Author(s):  
Ri Sheng Huang

To effectively improve speech emotion recognition performance, nonlinear dimensionality reduction is needed for speech feature data that lie on a nonlinear manifold embedded in a high-dimensional acoustic space. This paper proposes an improved SLLE algorithm that enhances the discriminating power of the low-dimensional embedded data and improves generalization ability. The proposed algorithm performs nonlinear dimensionality reduction on 48-dimensional speech emotional feature data, including prosodic features, in order to recognize three emotions: anger, joy, and neutral. Experimental results on a natural speech emotion database demonstrate that the proposed algorithm achieves the highest accuracy, 90.97%, with only 9 embedded features, an 11.64% improvement over the SLLE algorithm.
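As a rough sketch of the workflow the abstract describes, and only that: scikit-learn provides no supervised LLE, so plain LLE stands in for the paper's improved SLLE below. Synthetic 48-dimensional feature vectors with three classes (stand-ins for anger, joy, and neutral) are reduced to 9 embedded features and then classified; the data, neighbor counts, and classifier are all assumptions, not the paper's setup.

from sklearn.datasets import make_classification
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for 48-dimensional prosodic speech emotion features
# with three classes (stand-ins for anger, joy, and neutral).
X, y = make_classification(n_samples=900, n_features=48, n_informative=12,
                           n_classes=3, n_clusters_per_class=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Reduce to 9 embedded features, then classify the three emotion classes.
model = make_pipeline(
    LocallyLinearEmbedding(n_neighbors=15, n_components=9),
    KNeighborsClassifier(n_neighbors=5),
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")

A supervised variant would additionally use the class labels when building the neighborhood graph (for example by inflating distances between differently labeled points), which is what gives SLLE-style methods their extra discriminating power.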

