local invariance
Recently Published Documents


TOTAL DOCUMENTS: 27 (last five years: 4)
H-INDEX: 5 (last five years: 1)

Symmetry, 2019, Vol. 11 (12), 1429
Author(s): Torsten Asselmeyer-Maluga, Jerzy Król

Category theory allows one to treat logic and set theory as internal to certain categories. What is internal to Set is two-valued logic with classical Zermelo–Fraenkel set theory, while for general toposes it is typically intuitionistic logic and set theory. We extend symmetries of smooth manifolds, whose atlases are defined in Set, to atlases with some of their local maps in a topos T. In the case of the Basel topos and R^4, local invariance with respect to the corresponding atlases implies exotic smoothness on R^4. These smoothness structures do not refer directly to Casson handles or handle decompositions, which may be useful for describing the so far merely putative exotic R^4 underlying an exotic S^4 (should it exist). The tovariance principle claims that (physical) theories should be invariant with respect to the choice of topos with a natural numbers object and of the geometric morphisms changing the toposes. We show that local T-invariance breaks tovariance even in the weaker sense.
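As a brief aside (an explanatory sketch, not part of the paper's abstract), the contrast between the classical internal logic of Set and the intuitionistic logic of a general topos can be made concrete via the subobject classifier: in Set it is the two-element set, and every subobject is recovered from its characteristic map, while in a general topos it is only an internal Heyting algebra, so excluded middle and double-negation elimination need not hold internally.

\[
  \Omega_{\mathbf{Set}} = \{\bot, \top\}, \qquad
  \chi_A \colon X \to \Omega_{\mathbf{Set}}, \qquad
  \chi_A(x) = \top \iff x \in A,
\]
\[
  \text{whereas in a general topos } \mathcal{T},\ \Omega_{\mathcal{T}} \text{ is an internal Heyting algebra, so }
  p \vee \neg p \ \text{and}\ \neg\neg p \Rightarrow p \ \text{need not hold.}
\]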


Author(s): Zhenyu Huang, Joey Tianyi Zhou, Xi Peng, Changqing Zhang, Hongyuan Zhu, ...

Multi-view clustering aims to cluster data from diverse sources or domains and has drawn considerable attention in recent years. In this paper, we propose a novel multi-view clustering method named the multi-view spectral clustering network (MvSCN), which, to the best of our knowledge, is the first deep version of multi-view spectral clustering. To cluster multi-view data with a deep network, MvSCN incorporates the local invariance within each single view and the consistency across different views into a novel objective function, where the local invariance is defined by a deep metric learning network rather than the Euclidean distance adopted by traditional approaches. In addition, we enforce an orthogonality constraint by reformulating it as a novel layer stacked on the embedding network, which brings two advantages: the neural network and the matrix decomposition are optimized jointly, and trivial solutions are avoided. Extensive experiments on four challenging datasets demonstrate the effectiveness of our method against 10 state-of-the-art approaches in terms of three evaluation metrics.
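To illustrate the role of the orthogonality layer mentioned above, the NumPy sketch below shows one standard way (in the spirit of SpectralNet-style layers; it is not the authors' MvSCN code, and the function names, the Gaussian affinity, and the random stand-in embeddings are assumptions for the example) to orthogonalize a batch of embeddings via a Cholesky factor of the batch Gram matrix and then evaluate the classical spectral-clustering objective on the result; the orthogonalization is what rules out the trivial collapsed solution.

import numpy as np

def orthogonalize(Y):
    """Orthogonalization-layer sketch: map embeddings Y (m x k) to Y_orth
    with Y_orth.T @ Y_orth = I, via a Cholesky factor of the batch Gram
    matrix. (Hypothetical helper, not the MvSCN implementation.)"""
    m = Y.shape[0]
    L = np.linalg.cholesky(Y.T @ Y / m)           # Y.T @ Y / m = L @ L.T
    return Y @ np.linalg.inv(L).T / np.sqrt(m)    # columns become orthonormal

def spectral_objective(Y_orth, W):
    """Classical spectral-clustering loss: sum_ij W_ij * ||y_i - y_j||^2,
    small when points the affinity W deems similar embed close together."""
    diff = Y_orth[:, None, :] - Y_orth[None, :, :]
    return np.sum(W * np.sum(diff ** 2, axis=-1))

# Toy usage with random data standing in for one view's learned features.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 8))                      # 32 samples, 8-dim features
W = np.exp(-np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))  # Gaussian affinity
Y = rng.normal(size=(32, 3))                      # stand-in for network embeddings
Y_orth = orthogonalize(Y)
print(np.allclose(Y_orth.T @ Y_orth, np.eye(3), atol=1e-6))   # True
print(spectral_objective(Y_orth, W))

In MvSCN the affinity and embeddings would come from the learned deep metric network for each view rather than from raw Euclidean distances and random vectors, but the orthogonality mechanism plays the same role.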

