How to Get an Efficient yet Verified Arbitrary-Precision Integer Library

Author(s):  
Raphaël Rieu-Helft ◽  
Claude Marché ◽  
Guillaume Melquiond

1969 ◽  
Vol 12 (4) ◽  
pp. 213-214 ◽  
Author(s):  
Georges Schwachheim

2018 ◽  
Vol 40 (6) ◽  
pp. C726-C747 ◽  
Author(s):  
Fredrik Johansson ◽  
Marc Mezzarobba

2003 ◽  
Vol 15 (8) ◽  
pp. 1897-1929 ◽  
Author(s):  
Barbara Hammer ◽  
Peter Tiňo

Recent experimental studies indicate that recurrent neural networks initialized with "small" weights are inherently biased toward definite memory machines (Tiňo, Čerňanský, & Beňušková, 2002a, 2002b). This article establishes a theoretical counterpart: the transition function of a recurrent network with small weights and a squashing activation function is a contraction. We prove that recurrent networks with a contractive transition function can be approximated arbitrarily well on input sequences of unbounded length by a definite memory machine. Conversely, every definite memory machine can be simulated by a recurrent network with a contractive transition function. Hence, initialization with small weights induces an architectural bias into learning with recurrent neural networks. This bias might have benefits from the point of view of statistical learning theory: it emphasizes one possible region of the weight space where generalization ability can be formally proved. It is well known that standard recurrent neural networks are not distribution-independent learnable in the probably approximately correct (PAC) sense if arbitrary precision and inputs are considered. We prove that recurrent networks with a contractive transition function with a fixed contraction parameter fulfill the so-called distribution-independent uniform convergence of empirical distances property and hence, unlike general recurrent networks, are distribution-independent PAC learnable.
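A minimal numerical sketch of the contraction claim, not taken from the paper: since tanh is 1-Lipschitz, the transition map h ↦ tanh(Wh + Ux + b) has Lipschitz constant at most the spectral norm ‖W‖₂, so "small" recurrent weights (norm below 1) force a contraction. All names below (step, the 0.5 norm bound) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # hidden state dimension

# "Small" recurrent weights: rescale so the spectral norm is 0.5 < 1.
W = rng.normal(size=(n, n))
W *= 0.5 / np.linalg.norm(W, 2)
U = rng.normal(size=(n, n))             # input weights (any size works)
b = rng.normal(size=n)

def step(h, x):
    """One transition of a squashing (tanh) recurrent network."""
    return np.tanh(W @ h + U @ x + b)

# For any fixed input x, distances between hidden states shrink by at
# least the factor ||W||_2 = 0.5 in a single step.
x = rng.normal(size=n)
ratios = []
for _ in range(1000):
    h1, h2 = rng.normal(size=n), rng.normal(size=n)
    ratios.append(np.linalg.norm(step(h1, x) - step(h2, x))
                  / np.linalg.norm(h1 - h2))
print(max(ratios))                      # <= 0.5, up to numerical noise
```

Because distances contract geometrically, an input more than about log(ε)/log(‖W‖₂) steps in the past can move the current hidden state by at most ε, which is exactly the definite-memory behavior the abstract formalizes.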


2011 ◽  
Vol 84 (1) ◽  
Author(s):  
Alberto Abad ◽  
Roberto Barrio ◽  
Ángeles Dena

1983 ◽  
Vol 29 (3) ◽  
pp. 237-244 ◽  
Author(s):  
J. Demsky ◽  
M. Schlesinger ◽  
R.D. Kent

2014 ◽  
Vol 21 (04) ◽  
pp. 1450010 ◽  
Author(s):  
Toru Fuda

By carrying out appropriate continuous quantum measurements with a family of projection operators, a unitary channel can be approximated to arbitrary precision in the trace-norm sense. In particular, the quantum Zeno effect is described as an application. In the infinite-dimensional case, although the von Neumann entropy is not necessarily continuous, the difference between the entropies of the states mentioned above can be made arbitrarily small under some conditions.
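A hedged sketch of the approximation in the simplest finite-dimensional case, a qubit: projecting successively onto the rotated states |ψ_k⟩ = e^{-iθ(k/N)σ_y}|ψ₀⟩ steers the input toward U|ψ₀⟩ with U = e^{-iθσ_y}, and the trace-norm error of the success branch vanishes as N grows (roughly like θ²/N). This is the standard Zeno construction, not code from the paper; the helper names (rot_y, zeno_branch) are invented for illustration.

```python
import numpy as np

sigma_y = np.array([[0, -1j], [1j, 0]])

def rot_y(phi):
    """exp(-i*phi*sigma_y) = cos(phi)*I - i*sin(phi)*sigma_y, since sigma_y^2 = I."""
    return np.cos(phi) * np.eye(2) - 1j * np.sin(phi) * sigma_y

theta = np.pi / 3
U = rot_y(theta)                        # target unitary channel rho -> U rho U†
psi0 = np.array([1.0, 0.0], dtype=complex)
rho0 = np.outer(psi0, psi0.conj())
target = U @ rho0 @ U.conj().T

def zeno_branch(rho, N):
    """Success branch of N projective measurements onto |psi_k> = rot_y(theta*k/N)|psi0>.
    Output is un-normalized; its trace deficit is the total failure probability."""
    out = rho
    for k in range(1, N + 1):
        psi_k = rot_y(theta * k / N) @ psi0
        P = np.outer(psi_k, psi_k.conj())        # projector |psi_k><psi_k|
        out = P @ out @ P
    return out

for N in (1, 10, 100, 1000):
    diff = zeno_branch(rho0, N) - target         # Hermitian difference
    dist = np.abs(np.linalg.eigvalsh(diff)).sum()  # trace norm ||.||_1
    print(N, dist)                               # shrinks roughly like theta**2 / N
```

A single measurement (N = 1) misses badly, while frequent measurements pin the state to the rotating axis: the survival probability cos^{2N}(θ/N) tends to 1, which is the quantum Zeno effect the abstract invokes.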

