Local topological moves determine global diffusion properties of hyperbolic higher-order networks

2021 ◽ Vol 104 (5) ◽ Author(s): Ana P. Millán, Reza Ghorbanchian, Nicolò Defenu, Federico Battiston, Ginestra Bianconi
Big Data ◽ 2020 ◽ Vol 8 (4) ◽ pp. 255-269 ◽ Author(s): Mandana Saebi, Giovanni Luca Ciampaglia, Lance M. Kaplan, Nitesh V. Chawla

2021 ◽ Vol 127 (15) ◽ Author(s): Guillaume St-Onge, Hanlin Sun, Antoine Allard, Laurent Hébert-Dufresne, Ginestra Bianconi

2021 ◽ Author(s): Ginestra Bianconi

Higher-order networks describe the many-body interactions of a large variety of complex systems, ranging from the brain to collaboration networks. Simplicial complexes are generalized network structures that capture the combinatorial properties, the topology, and the geometry of higher-order networks. Having been used extensively in quantum gravity to describe discrete or discretized space-time, simplicial complexes have only recently become the representation of choice for capturing the underlying network topology and geometry of complex systems. This Element provides an in-depth introduction to this rapidly developing area of network theory, covering subjects that range from emergent hyperbolic geometry and topological data analysis to higher-order dynamics. The Element aims to demonstrate that simplicial complexes provide a very general mathematical framework for revealing how higher-order dynamics depends on simplicial network topology and geometry.
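As a minimal illustration of the representation discussed above, the sketch below builds a tiny simplicial complex from its facets (maximal simplices) and computes a node's generalized degree, i.e. the number of simplices of a given dimension incident to it. The facet list, function names, and the generalized-degree definition used here are our own illustrative choices, not taken from the Element itself.

```python
from itertools import combinations

# Hypothetical toy complex: two triangles sharing a link, plus one extra link.
facets = [(0, 1, 2), (1, 2, 3), (3, 4)]

def all_simplices(facets):
    """Close the facet list under taking faces (all sub-simplices)."""
    simplices = set()
    for facet in facets:
        for d in range(1, len(facet) + 1):
            simplices.update(combinations(sorted(facet), d))
    return simplices

def generalized_degree(simplices, node, dim):
    """Count the dim-dimensional simplices incident to `node`."""
    return sum(1 for s in simplices if len(s) == dim + 1 and node in s)

complex_ = all_simplices(facets)
# Node 1 belongs to both triangles; node 4 belongs to none.
print(generalized_degree(complex_, 1, 2))  # 2
print(generalized_degree(complex_, 4, 2))  # 0
```

Closing the facet list under faces is what distinguishes a simplicial complex from a generic hypergraph: every face of an included simplex must itself be included.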


2021 ◽ Vol 103 (3) ◽ Author(s): Guillaume St-Onge, Vincent Thibeault, Antoine Allard, Louis J. Dubé, Laurent Hébert-Dufresne

1992 ◽ Vol 03 (04) ◽ pp. 323-350 ◽ Author(s): Joydeep Ghosh, Yoan Shin

This paper introduces a class of higher-order networks called pi-sigma networks (PSNs). PSNs are feedforward networks with a single "hidden" layer of linear summing units and with product units in the output layer. A PSN uses these product units to indirectly incorporate the capabilities of higher-order networks while greatly reducing network complexity. PSNs have only one layer of adjustable weights and exhibit fast learning. A PSN with K summing units provides a constrained Kth-order approximation of a continuous function. A generalization of the PSN is presented that can uniformly approximate any continuous function defined on a compact set. The use of linear hidden units makes it possible to mathematically study the convergence properties of various LMS-type learning algorithms for PSNs. We show that it is desirable to update only a partial set of weights at a time rather than synchronously updating all the weights. Bounds on learning rates that guarantee convergence are derived. Several simulation results on pattern classification and function approximation problems highlight the capabilities of the PSN. Extensive comparisons are made with other higher-order networks and with multilayered perceptrons. The neurobiological plausibility of PSN-type networks is also discussed.
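The architecture described above can be sketched as follows: K linear summing units feed a single product unit, so only one layer of weights is adjustable. This is a minimal sketch under our own naming and initialization choices (class name, sigmoid squashing at the output, random weight scale); it is not the authors' reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class PiSigmaNetwork:
    """Minimal pi-sigma network sketch: K linear summing units
    whose outputs are multiplied by a weightless product unit."""

    def __init__(self, n_inputs, k_summing_units):
        # The only adjustable weights are in the hidden summing layer;
        # the product unit itself has no parameters.
        self.W = rng.normal(scale=0.1, size=(k_summing_units, n_inputs))
        self.b = np.zeros(k_summing_units)

    def forward(self, x):
        h = self.W @ x + self.b          # linear summing units
        return sigmoid(np.prod(h))       # product of unit outputs, squashed

psn = PiSigmaNetwork(n_inputs=3, k_summing_units=2)
y = psn.forward(np.array([0.5, -1.0, 2.0]))
print(0.0 < y < 1.0)  # True: sigmoid keeps the output in (0, 1)
```

Because the product of K affine functions of the input is a polynomial of degree K, this network realizes the constrained Kth-order approximation mentioned in the abstract while using far fewer weights than a general Kth-order network.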

