Coordinate descent based ontology sparse vector computing strategy and its applications

2017 ◽  
Vol 22 (S4) ◽  
pp. 10309-10323
Author(s):  
Wei Gao ◽  
Muhammad Shoaib Sardar ◽  
Sohail Zafar ◽  
Zohaib Zahid
Symmetry ◽  
2020 ◽  
Vol 12 (9) ◽  
pp. 1562
Author(s):  
Jianzhang Wu ◽  
Arun Kumar Sangaiah ◽  
Wei Gao

The ontology sparse vector learning algorithm is essentially a dimensionality reduction technique: the key components of a p-dimensional vector are retained and the remaining components are set to zero, so as to extract the key information for a given ontology application. If the concepts and structural information of each ontology vertex are expressed as a p-dimensional vector, then in the early stage of ontology data processing the goal of the algorithm is to locate the key components by learning from a set of ontology sample points. The ontology sparse vector itself carries structure, such as symmetry between components and binding relationships among certain components, and the algorithm can also be used to uncover correlations and decisive components among them. In this paper, a graph structure is used to express these components and their interrelationships, and the optimal solution is obtained using spectral graph theory and graph optimization techniques. The essence of the proposed ontology learning algorithm is to find the decisive vertices in the graph Gβ. Finally, two experiments show that the given ontology learning algorithm is effective for similarity calculation and ontology mapping in specific engineering fields.
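The core sparsification step the abstract describes, keeping the key components of a p-dimensional vector and zeroing the rest, can be sketched as follows. This is a minimal illustration of that idea only; the function name, the top-k selection rule, and the example vector are assumptions for illustration, not the authors' algorithm for locating key components.

```python
import numpy as np

def sparsify_topk(v, k):
    """Keep the k largest-magnitude components of v and set the rest to zero.

    This mirrors the sparse-vector step in the abstract: key components
    are retained, all other components become zero.
    """
    v = np.asarray(v, dtype=float)
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]  # indices of the k dominant components
    out[idx] = v[idx]
    return out

# Hypothetical ontology vertex vector (p = 6)
v = np.array([0.1, -2.0, 0.05, 1.5, -0.2, 0.7])
print(sparsify_topk(v, 2))  # only the two dominant components survive
```

In the paper itself the locations of the key components are learned from ontology sample points and from the graph structure among components, rather than chosen by a simple magnitude threshold as above.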


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Fanhua Shang ◽  
Zhihui Zhang ◽  
Yuanyuan Liu ◽  
Hongying Liu ◽  
Jing Xu

IEEE Access ◽  
2021 ◽  
Vol 9 ◽  
pp. 48544-48554
Author(s):  
Pyunghwan Ahn ◽  
Hyeong Gwon Hong ◽  
Junmo Kim

Mathematics ◽  
2021 ◽  
Vol 9 (5) ◽  
pp. 540
Author(s):  
Soodabeh Asadi ◽  
Janez Povh

This article uses the projected gradient (PG) method for a non-negative matrix factorization (NMF) problem in which one or both matrix factors must have orthonormal columns or rows. We penalize the orthonormality constraints and apply the PG method via a block coordinate descent approach: at each step, one matrix factor is fixed and the other is updated by moving along the steepest descent direction of the penalized objective function and projecting onto the space of non-negative matrices. Our method is tested on two sets of synthetic data for various values of the penalty parameters. Its performance is compared with the well-known multiplicative update (MU) method of Ding (2006) and with a modified, globally convergent variant of the MU algorithm recently proposed by Mirzal (2014). We provide extensive numerical results, coupled with appropriate visualizations, which demonstrate that our method is very competitive and usually outperforms the other two methods.
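The block coordinate descent / projected gradient scheme described in the abstract can be sketched as below: alternately fix one factor, take a steepest descent step on the penalized objective, and project onto the non-negative orthant. This is a simplified sketch under assumed choices; the penalty form lam·||HHᵀ − I||²_F, the fixed step size, and the iteration count are illustrative and not the authors' settings.

```python
import numpy as np

def penalized_pg_nmf(X, r, lam=0.1, step=5e-3, iters=2000, seed=0):
    """Sketch of penalized projected gradient NMF via block coordinate descent.

    Minimizes ||X - W H||_F^2 + lam * ||H H^T - I||_F^2 over W, H >= 0,
    where the penalty (approximately) enforces orthonormal rows of H.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(iters):
        # Update W with H fixed: steepest descent + projection onto >= 0.
        grad_W = (W @ H - X) @ H.T
        W = np.maximum(W - step * grad_W, 0.0)
        # Update H with W fixed, including the orthonormality penalty term.
        grad_H = W.T @ (W @ H - X) + 2.0 * lam * (H @ H.T - np.eye(r)) @ H
        H = np.maximum(H - step * grad_H, 0.0)
    return W, H

# Synthetic low-rank non-negative data, as in the abstract's experiments
rng = np.random.default_rng(1)
X = rng.random((8, 2)) @ rng.random((2, 6))
W, H = penalized_pg_nmf(X, r=2)
```

A fixed step size keeps the sketch short; the paper's PG method selects steps more carefully, which matters for convergence guarantees.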


Author(s):  
Feiping Nie ◽  
Jingjing Xue ◽  
Danyang Wu ◽  
Rong Wang ◽  
Hui Li ◽  
...  
