Asymptotic normality of the size of the giant component in a random hypergraph

2012, Vol. 41 (4), pp. 441–450
Author(s): Béla Bollobás, Oliver Riordan

1998, Vol. 14 (4), pp. 833–848
Author(s): Malcolm P. Quine, Władysław Szczotka

Author(s): Mark Newman

A discussion of the most fundamental of network models, the configuration model, which is a random graph model of a network with a specified degree sequence. Following a definition of the model, a number of basic properties are derived, including the probability of an edge, the expected number of multiedges, the excess degree distribution, the friendship paradox, and the clustering coefficient. This is followed by derivations of some more advanced properties, including the condition for the existence of a giant component, the size of the giant component, the average size of a small component, and the expected diameter. Generating function methods for network models are also introduced and used to perform further calculations, such as the distribution of the number of second neighbors of a node and the complete distribution of sizes of small components. The chapter ends with a brief discussion of extensions of the configuration model to directed networks, bipartite networks, networks with degree correlations, networks with high clustering, and networks with community structure, among other possibilities.
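A minimal sketch in code may make the model concrete: the snippet below builds a configuration-model network from a prescribed degree sequence with NetworkX and measures the fraction of nodes in the giant component. The Poisson degree sequence and all parameter values are illustrative assumptions, not taken from the chapter.

    # A minimal sketch, assuming NetworkX and NumPy are available; the
    # Poisson degree sequence and parameters are illustrative choices.
    import networkx as nx
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    degrees = rng.poisson(3, size=n)
    if degrees.sum() % 2:                 # total degree must be even
        degrees[0] += 1

    G = nx.configuration_model(degrees.tolist(), seed=0)  # multigraph with given degrees
    G = nx.Graph(G)                                       # collapse multiedges
    G.remove_edges_from(nx.selfloop_edges(G))             # drop self-loops

    giant = max(nx.connected_components(G), key=len)
    print(f"giant component fraction: {len(giant) / n:.3f}")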


Author(s): Mark Newman

An introduction to the mathematics of the Poisson random graph, the simplest model of a random network. The chapter starts with a definition of the model, followed by derivations of basic properties such as the mean degree, degree distribution, and clustering coefficient. This is followed by a detailed derivation of the large-scale structural properties of random graphs, including the position of the phase transition at which a giant component appears, the size of the giant component, the average size of the small components, and the expected diameter of the network. The chapter ends with a discussion of some of the shortcomings of the random graph model.
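To make the phase-transition discussion concrete, the sketch below (an illustration under assumed parameters, not the chapter's own code) compares the measured giant-component fraction of a Poisson random graph G(n, p) with the standard self-consistency prediction S = 1 - exp(-cS), where c is the mean degree; a giant component exists only for c > 1.

    # A minimal sketch, assuming NetworkX and SciPy; parameter values are
    # illustrative. For mean degree c > 1 the giant component fraction S
    # solves S = 1 - exp(-c*S).
    import math
    import networkx as nx
    from scipy.optimize import brentq

    n, c = 10_000, 2.0
    G = nx.gnp_random_graph(n, c / (n - 1), seed=0)
    measured = len(max(nx.connected_components(G), key=len)) / n

    # Root of S - (1 - exp(-c*S)) on (0, 1]; the bracket is valid for c > 1
    predicted = brentq(lambda S: S - (1 - math.exp(-c * S)), 1e-9, 1.0)
    print(f"measured S = {measured:.3f}, predicted S = {predicted:.3f}")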


2021, Vol. 11 (9), pp. 3867
Author(s): Zhewei Liu, Zijia Zhang, Yaoming Cai, Yilin Miao, Zhikun Chen

Extreme Learning Machine (ELM) is characterized by simplicity, strong generalization ability, and computational efficiency. However, previous ELMs fail to consider the inherent high-order relationships among data points, leaving them ineffective on structured data and fragile under noisy data. This paper presents a novel semi-supervised ELM, termed Hypergraph Convolutional ELM (HGCELM), which uses hypergraph convolution to extend ELM into the non-Euclidean domain. The method inherits the advantages of ELM and consists of a random hypergraph convolutional layer followed by a hypergraph convolutional regression layer, enabling it to model complex intraclass variations. We show that the traditional ELM is a special case of HGCELM in the regular Euclidean domain. Extensive experimental results show that HGCELM remarkably outperforms eight competitive methods on 26 classification benchmarks.
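Since the abstract gives no implementation details, the following is only a hedged NumPy sketch of the stated architecture: a hypergraph-convolutional feature map with random, untrained weights (the ELM part), followed by a closed-form ridge-regression readout. The HGNN-style normalization and every parameter value are assumptions, not the authors' code.

    # A hedged sketch, not the authors' implementation; assumes only NumPy.
    import numpy as np

    def hypergraph_operator(H):
        # Normalized propagation operator Dv^{-1/2} H De^{-1} H^T Dv^{-1/2};
        # this normalization is an assumption, not taken from the paper.
        Dv = np.maximum(H.sum(axis=1), 1e-12)   # vertex degrees
        De = np.maximum(H.sum(axis=0), 1e-12)   # hyperedge degrees
        Dv_is = np.diag(1.0 / np.sqrt(Dv))
        De_i = np.diag(1.0 / De)
        return Dv_is @ H @ De_i @ H.T @ Dv_is

    rng = np.random.default_rng(0)
    n, d, hidden, m, k = 200, 16, 64, 50, 3      # nodes, features, hidden units, hyperedges, classes

    X = rng.standard_normal((n, d))              # node features
    H = (rng.random((n, m)) < 0.1).astype(float) # random incidence matrix
    y = rng.integers(0, k, n)
    Y = np.eye(k)[y]                             # one-hot labels

    L = hypergraph_operator(H)
    W = rng.standard_normal((d, hidden))         # random, never trained (the ELM part)
    Z = np.tanh(L @ X @ W)                       # random hypergraph convolutional layer

    A = L @ Z                                    # hypergraph convolutional regression layer
    lam = 1e-2                                   # ridge penalty; closed-form ELM readout
    # In the semi-supervised setting, only the labeled rows of A and Y
    # would enter this solve; all rows are labeled in this toy example.
    beta = np.linalg.solve(A.T @ A + lam * np.eye(hidden), A.T @ Y)
    pred = (A @ beta).argmax(axis=1)
    print("train accuracy:", (pred == y).mean())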

