Lower Bounds on the VC Dimension of Smoothly Parameterized Function Classes

1995 ◽  
Vol 7 (5) ◽  
pp. 1040-1053 ◽  
Author(s):  
Wee Sun Lee ◽  
Peter L. Bartlett ◽  
Robert C. Williamson

We examine the relationship between the VC dimension and the number of parameters of a thresholded smoothly parameterized function class. We show that the VC dimension of such a function class is at least k if there exists a k-dimensional differentiable manifold in the parameter space such that each member of the manifold corresponds to a different decision boundary. Using this result, we are able to obtain lower bounds on the VC dimension proportional to the number of parameters for several thresholded function classes including two-layer neural networks with certain smooth activation functions and radial basis functions with a Gaussian basis. These lower bounds hold even if the magnitudes of the parameters are restricted to be arbitrarily small. In Valiant's probably approximately correct learning framework, this implies that the number of examples necessary for learning these function classes is at least linear in the number of parameters.
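As an illustrative sketch of what "VC dimension at least linear in the number of parameters" means (this is a generic textbook example, not the paper's manifold construction): linear threshold units on the plane, sign(w1·x1 + w2·x2 + b), have three parameters and shatter a suitable three-point set, witnessing VC dimension at least 3. A brute-force check over a coarse parameter grid, using only small parameter magnitudes as in the abstract's restricted setting:

```python
# Illustrative sketch (not the paper's construction): brute-force check that
# linear threshold units on R^2 shatter a 3-point set, so their VC dimension
# is at least 3 = number of parameters (w1, w2, b).
from itertools import product

points = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]

def classify(w1, w2, b, p):
    return 1 if w1 * p[0] + w2 * p[1] + b > 0 else 0

# Coarse grid over parameter space; small magnitudes suffice, echoing the
# abstract's point that the bound holds for arbitrarily small parameters.
grid = [k / 10.0 for k in range(-10, 11)]
achieved = set()
for w1, w2, b in product(grid, repeat=3):
    achieved.add(tuple(classify(w1, w2, b, p) for p in points))

print(len(achieved))  # all 8 labelings occur: the 3-point set is shattered
```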

1996 ◽  
Vol 8 (3) ◽  
pp. 625-628 ◽  
Author(s):  
Peter L. Bartlett ◽  
Robert C. Williamson

We give upper bounds on the Vapnik-Chervonenkis dimension and pseudodimension of two-layer neural networks that use the standard sigmoid function or radial basis function and have inputs from {−D, …, D}^n. In Valiant's probably approximately correct (pac) learning framework for pattern classification, and in Haussler's generalization of this framework to nonlinear regression, the results imply that the number of training examples necessary for satisfactory learning performance grows no more rapidly than W log(WD), where W is the number of weights. The previous best bound for these networks was O(W^4).


1997 ◽  
Vol 9 (4) ◽  
pp. 771-776 ◽  
Author(s):  
Yossi Erlich ◽  
Dan Chazan ◽  
Scott Petrack ◽  
Avraham Levy

We show that the VC-dimension of a smoothly parameterized function class is not less than the dimension of any manifold in the parameter space, as long as distinct parameter values induce distinct decision boundaries. A similar theorem was published recently and used to introduce lower bounds on VC-dimension for several cases (Lee, Bartlett, & Williamson, 1995). This theorem is not correct, but our theorem could replace it for those cases and many other practical ones.


1997 ◽  
Vol 9 (4) ◽  
pp. 765-769 ◽  
Author(s):  
Wee Sun Lee ◽  
Peter L. Bartlett ◽  
Robert C. Williamson

The earlier article gives lower bounds on the VC-dimension of various smoothly parameterized function classes. The results were proved by showing a relationship between the uniqueness of decision boundaries and the VC-dimension of smoothly parameterized function classes. The proof is incorrect; there is no such relationship under the conditions stated in the article. For the case of neural networks with tanh activation functions, we give an alternative proof of a lower bound for the VC-dimension proportional to the number of parameters, which holds even when the magnitude of the parameters is restricted to be arbitrarily small.


1993 ◽  
Vol 5 (3) ◽  
pp. 371-373 ◽  
Author(s):  
Peter L. Bartlett

We show that the Vapnik-Chervonenkis dimension of the class of functions that can be computed by arbitrary two-layer or some completely connected three-layer threshold networks with real inputs is at least linear in the number of weights in the network. In Valiant's "probably approximately correct" learning framework, this implies that the number of random training examples necessary for learning in these networks is at least linear in the number of weights.


2021 ◽  
Vol 19 (1) ◽  
pp. 329-337 ◽
Author(s):  
Huo Tang ◽  
Kaliappan Vijaya ◽  
Gangadharan Murugusundaramoorthy ◽  
Srikandan Sivasubramanian

Abstract Let $f_k(z) = z + \sum_{n=2}^{k} a_n z^n$ be the sequence of partial sums of the analytic function $f(z) = z + \sum_{n=2}^{\infty} a_n z^n$. In this paper, we determine sharp lower bounds for $\mathrm{Re}\{f(z)/f_k(z)\}$, $\mathrm{Re}\{f_k(z)/f(z)\}$, $\mathrm{Re}\{f'(z)/f_k'(z)\}$ and $\mathrm{Re}\{f_k'(z)/f'(z)\}$, where $f(z)$ belongs to the subclass $\mathcal{J}_{p,q}^{m}(\mu,\alpha,\beta)$ of analytic functions, defined by the Sălăgean $(p,q)$-differential operator. In addition, the inclusion relations involving $N_{\delta}(e)$ of this generalized function class are considered.


Author(s):  
Anne Driemel ◽  
André Nusser ◽  
Jeff M. Phillips ◽  
Ioannis Psarros

Abstract The Vapnik–Chervonenkis dimension provides a notion of complexity for systems of sets. If the VC dimension is small, then knowing this can drastically simplify fundamental computational tasks such as classification, range counting, and density estimation through the use of sampling bounds. We analyze set systems where the ground set $X$ is a set of polygonal curves in $\mathbb{R}^d$ and the sets $\mathcal{R}$ are metric balls defined by curve similarity metrics, such as the Fréchet distance and the Hausdorff distance, as well as their discrete counterparts. We derive upper and lower bounds on the VC dimension that imply useful sampling bounds in the setting where the number of curves is large but the complexity of the individual curves is small. Our upper and lower bounds are either near-quadratic or near-linear in the complexity of the curves that define the ranges, and they are logarithmic in the complexity of the curves that define the ground set.
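A much simpler instance of the same kind of set system can make "metric balls as ranges" concrete (this is an illustrative toy, not the curve metrics of the abstract): take the ground set to be points on the real line and the ranges to be metric balls, i.e. intervals. Brute force shows a two-point set is shattered but no three-point set is, so this system has VC dimension exactly 2:

```python
# Illustrative toy example: ranges are metric balls B(c, r) = {p : |p - c| <= r}
# on the real line, i.e. intervals. Every subset of a point set that a ball
# can pick out is a contiguous run, realizable by an interval whose endpoints
# lie among the points (or by the empty ball).
def shatters(pts):
    needed = 2 ** len(pts)
    got = {frozenset()}
    for lo in pts:
        for hi in pts:
            got.add(frozenset(p for p in pts if lo <= p <= hi))
    return len(got) == needed

assert shatters([1.0, 2.0])              # some 2-point set is shattered
assert not shatters([1.0, 2.0, 3.0])     # no ball selects {1, 3} without 2
print("VC dimension of balls on the line is 2")
```

The same brute-force shattering test scales (in principle) to richer ground sets and ranges, which is exactly what becomes expensive when the ranges are balls under the Fréchet or Hausdorff distance.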


10.37236/3262 ◽  
2013 ◽  
Vol 20 (3) ◽  
Author(s):  
Simon R. Blackburn

A rack of order $n$ is a binary operation $\vartriangleright$ on a set $X$ of cardinality $n$, such that right multiplication is an automorphism. More precisely, $(X,\vartriangleright)$ is a rack provided that the map $x\mapsto x\vartriangleright y$ is a bijection for all $y\in X$, and $(x\vartriangleright y)\vartriangleright z=(x\vartriangleright z)\vartriangleright (y\vartriangleright z)$ for all $x,y,z\in X$. The paper provides upper and lower bounds of the form $2^{cn^2}$ on the number of isomorphism classes of racks of order $n$. Similar results on the number of isomorphism classes of quandles and kei are obtained. The results of the paper are established by first showing how an arbitrary rack is related to its operator group (the permutation group on $X$ generated by the maps $x\mapsto x\vartriangleright y$ for $y\in X$), and then applying some of the theory of permutation groups. The relationship between a rack and its operator group extends results of Joyce and of Ryder; this relationship might be of independent interest.
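The two rack axioms are easy to check by brute force on a standard small example, the dihedral quandle on Z/n with $x\vartriangleright y = 2y - x \pmod n$ (this example is folklore, not taken from the paper itself):

```python
# A minimal sketch of the rack axioms, verified by brute force on the
# dihedral quandle: X = Z/5 with x ▷ y = 2y - x (mod 5). This is in fact
# a kei (an involutory quandle), so it satisfies the rack axioms.
n = 5
X = range(n)

def op(x, y):
    return (2 * y - x) % n

# Axiom 1: for each y, right multiplication x -> x ▷ y is a bijection of X.
for y in X:
    assert sorted(op(x, y) for x in X) == list(X)

# Axiom 2: right self-distributivity.
for x in X:
    for y in X:
        for z in X:
            assert op(op(x, y), z) == op(op(x, z), op(y, z))

print("Z/5 with x ▷ y = 2y - x is a rack")
```

Iterating this kind of check over all binary operations on a small set is the naive way to count racks of order $n$; the paper's $2^{cn^2}$ bounds show why that enumeration explodes so quickly.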


2019 ◽  
Vol 2019 (4) ◽  
pp. 132-151 ◽  
Author(s):  
Raphael Bost ◽  
Pierre-Alain Fouque

Abstract Besides their security, the efficiency of searchable encryption schemes is a major criterion when it comes to their adoption: in order to replace an unencrypted database by a more secure construction, it must scale to the systems which rely on it. Unfortunately, the relationship between the efficiency and the security of searchable encryption has not been widely studied, and the minimum cost of some crucial security properties is still unclear. In this paper, we present new lower bounds on the trade-offs between the size of the client state, the efficiency, and the security of searchable encryption schemes. These lower bounds target two kinds of schemes: schemes hiding the repetition of search queries, and forward-private dynamic schemes, for which updates are oblivious. We also show that these lower bounds are tight, by either constructing schemes matching them, or by showing that even a small increase in the amount of leaked information allows for constructing schemes breaking the lower bounds.

