Gillis: Serving Large Neural Networks in Serverless Functions with Automatic Model Partitioning

Author(s):  
Minchen Yu ◽  
Zhifeng Jiang ◽  
Hok Chun Ng ◽  
Wei Wang ◽  
Ruichuan Chen ◽  
...  
2001 ◽  
Vol 10 (03) ◽  
pp. 345-371

Author(s):  
George D. Manioudakis ◽  
Spiridon D. Likothanassis

Neural networks are massively parallel processing systems that require expensive, and often unavailable, hardware to be realized. Fortunately, the development of effective and accessible software makes their simulation easy. Various neural-network implementation tools therefore exist on the market, but they are oriented to one specific learning algorithm and can simulate only fixed-size networks. In this work, we present object-oriented techniques used to define types of neuron and network objects that realize, in a localized approach, fast and powerful learning algorithms combining results from optimal filtering and multi-model partitioning theory. With these objects, one can build and implement intelligent learning algorithms that handle both training and the on-line adjustment of the network size. Furthermore, the design methodology results in a system modeled as a collection of concurrently executable objects, which eases parallel implementation. The overall design yields a general-purpose toolbox characterized by maintainability, reusability, and increased modularity. These features are demonstrated through several practical applications.
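To make the object-oriented design concrete, the following is a minimal sketch, not the toolbox described above: an abstract neuron class supplies the learning-rule interface, a concrete subclass implements a simple LMS update (standing in for the optimal-filtering and multi-model-partitioning rules of the paper), and a network object owns a growable collection of neurons so the network size can be adjusted at run time. All class and method names (Neuron, LinearNeuron, Network, activate, train) are hypothetical illustrations, and C++ is assumed only because of the object-oriented setting.

```cpp
// Minimal sketch of neuron/network objects with run-time growth.
// Names and the LMS rule are illustrative assumptions, not the paper's API.
#include <cstddef>
#include <iostream>
#include <memory>
#include <vector>

// Abstract neuron object: each learning algorithm supplies its own subclass.
class Neuron {
public:
    virtual ~Neuron() = default;
    virtual double activate(const std::vector<double>& input) const = 0;
    virtual void train(const std::vector<double>& input, double target) = 0;
};

// One concrete neuron type: a linear unit trained with a simple LMS rule.
class LinearNeuron : public Neuron {
public:
    explicit LinearNeuron(std::size_t inputs, double lr = 0.01)
        : weights_(inputs, 0.0), lr_(lr) {}

    double activate(const std::vector<double>& input) const override {
        double sum = 0.0;
        for (std::size_t i = 0; i < weights_.size(); ++i)
            sum += weights_[i] * input[i];
        return sum;
    }

    void train(const std::vector<double>& input, double target) override {
        const double error = target - activate(input);   // LMS update
        for (std::size_t i = 0; i < weights_.size(); ++i)
            weights_[i] += lr_ * error * input[i];
    }

private:
    std::vector<double> weights_;
    double lr_;
};

// Network object: owns a collection of neuron objects; its size is not
// fixed at construction time, so neurons can be added on-line.
class Network {
public:
    void add(std::unique_ptr<Neuron> n) { neurons_.push_back(std::move(n)); }
    std::size_t size() const { return neurons_.size(); }

    std::vector<double> forward(const std::vector<double>& input) const {
        std::vector<double> out;
        for (const auto& n : neurons_) out.push_back(n->activate(input));
        return out;
    }

    void train(const std::vector<double>& input,
               const std::vector<double>& targets) {
        for (std::size_t i = 0; i < neurons_.size(); ++i)
            neurons_[i]->train(input, targets[i]);
    }

private:
    std::vector<std::unique_ptr<Neuron>> neurons_;
};

int main() {
    Network net;
    net.add(std::make_unique<LinearNeuron>(2));
    net.add(std::make_unique<LinearNeuron>(2));   // size adjusted at run time

    const std::vector<double> x = {1.0, 0.5};
    const std::vector<double> y = {0.3, -0.2};
    for (int epoch = 0; epoch < 100; ++epoch) net.train(x, y);

    for (double o : net.forward(x)) std::cout << o << ' ';
    std::cout << '\n';
    return 0;
}
```

Because each neuron object encapsulates its own state and update, such objects could in principle be dispatched to separate threads, which is in the spirit of the collection of concurrently executable objects described in the abstract.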

