A Natural Representation of Functions that Facilitates `Exact Learning'

2021 ◽  
Author(s):  
Benedict William John Irwin

Abstract: We present a collection of mathematical tools and emphasise a fundamental representation of analytic functions. Connecting these concepts leads to a framework for `exact learning', where an unknown numeric distribution could in principle be assigned an exact mathematical description. This is a new perspective on machine learning with potential applications in all domains of the mathematical sciences, and the generalised representations presented here have not yet been widely considered in the context of machine learning and data analysis. The moments of a multivariate function or distribution are extracted using a Mellin transform, and the generalised form of the coefficients is trained assuming a highly generalised Mellin-Barnes integral representation. The functions use many fewer parameters than contemporary machine learning methods, and any implementation that connects these concepts successfully will likely carry across to non-exact problems and provide approximate solutions. We compare the equations for the exact learning method with those for a neural network, which leads to a new perspective on understanding what a neural network may be learning and how to interpret the parameters of those networks.
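For reference, the Mellin transform pair underlying this representation, stated in standard textbook form (not quoted from the paper itself):

    \mathcal{M}[f](s) = \int_0^{\infty} x^{s-1} f(x)\, dx, \qquad
    f(x) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} \mathcal{M}[f](s)\, x^{-s}\, ds,

where the first integral extracts the (complex) moments of f and the second, a Mellin-Barnes contour integral, reconstructs f from them. In this framing, `exact learning' amounts to fitting a closed form for \mathcal{M}[f](s) from sampled moments.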


1994 ◽  
Vol 09 (12) ◽  
pp. 2103-2115 ◽  
Author(s):  
D.G. BARCI ◽  
L.E. OXMAN

We consider a fermionic field obeying a second-order equation containing a pair of complex conjugate mass parameters. After obtaining a natural representation for the different degrees of freedom, we are able to construct a unique vacuum as the most symmetric state (zero energy-momentum, charge, and spin). This representation, unlike that of the real-mass case, is not holomorphic in the Grassmann variables. The vacuum eigenstate allows the calculation of the field propagator, which turns out to be half advanced plus half retarded.
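In symbols, the propagator structure stated here is

    G(x - y) = \tfrac{1}{2}\, G_{\mathrm{adv}}(x - y) + \tfrac{1}{2}\, G_{\mathrm{ret}}(x - y),

a restatement of the abstract's claim in standard notation (the labels G_adv and G_ret for the advanced and retarded Green functions are ours, not the authors').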


2015 ◽  
Vol 22 (6) ◽  
pp. 463-473 ◽  
Author(s):  
Hamidreza Chitsaz ◽  
Mohammad Aminisharifabad

1996 ◽  
Vol 52 (3) ◽  
pp. 421-433 ◽  
Author(s):  
Nader H. Bshouty ◽  
Richard Cleve ◽  
Ricard Gavaldà ◽  
Sampath Kannan ◽  
Christino Tamon
2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Hao Hua ◽  
Ludger Hovestadt

Abstract: The Erdős-Rényi (ER) random graph G(n, p) analytically characterizes the behavior of complex networks. However, attempts to fit real-world observations need more sophisticated structures (e.g., multilayer networks), rules (e.g., Achlioptas processes), and projections onto geometric, social, or geographic spaces. The p-adic number system offers a natural representation of the hierarchical organization of complex networks. The p-adic random graph interprets n as the cardinality of a set of p-adic numbers. Constructing a vast space of hierarchical structures is equivalent to combining number sequences. Although the giant component is vital in the dynamic evolution of networks, the structure of multiple big components is also essential, yet fitting the sizes of the few largest components to empirical data has rarely been demonstrated. The p-adic ultrametric enables the ER model to simulate multiple big components from observations of genetic interaction networks, social networks, and epidemics. Community structures lead to multimodal distributions of the big-component sizes in networks, which has important implications for intervention in spreading processes.
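For comparison with the component-size statistics discussed above, a minimal Python sketch of the classical ER baseline (standard G(n, p) only; the paper's p-adic construction is not reproduced here, and networkx is an assumed dependency):

import networkx as nx

# Sample a slightly supercritical G(n, p) and report the sizes of the
# few largest connected components -- the quantity the abstract notes is
# rarely fitted to data. The paper's p-adic variant replaces the uniform
# vertex set with a hierarchy of p-adic numbers; this is only the baseline.
n = 10_000
p = 1.2 / n  # expected degree 1.2, just above the percolation threshold 1/n
G = nx.gnp_random_graph(n, p, seed=42)

sizes = sorted((len(c) for c in nx.connected_components(G)), reverse=True)
print("five largest component sizes:", sizes[:5])

In the classical model a single giant component dominates; the abstract's point is that an ultrametric (hierarchical) vertex space can instead yield several big components, matching empirical networks.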


1993 ◽  
Vol 25 (6) ◽  
pp. 3-14
Author(s):  
Aubrey McIntosh ◽  
Jennifer Brodbelt
