Growing Neural Networks
Recently Published Documents


TOTAL DOCUMENTS: 17 (FIVE YEARS: 1)

H-INDEX: 4 (FIVE YEARS: 0)

2021, pp. 222-234
Author(s): Paul Caillon, Christophe Cerisara


Complexity, 2020, Vol 2020, pp. 1-10
Author(s): Shize Huang, Xiaowen Liu, Xiaolu Yang, Zhaoxin Zhang, Lingyu Yang

Trams increasingly deploy object detectors to perceive running conditions, and deep learning networks have been widely adopted in those detectors. Growing neural networks have attracted severe attacks, such as adversarial example attacks, that threaten tram safety. Only if adversarial attacks are studied thoroughly can researchers devise better defence methods against them. However, most existing methods of generating adversarial examples are devoted to classification, and none of them targets tram environment perception systems. In this paper, we propose an improved projected gradient descent (PGD) algorithm and an improved Carlini and Wagner (C&W) algorithm to generate adversarial examples against Faster R-CNN object detectors. Experiments verify that both algorithms can successfully conduct nontargeted and targeted white-box digital attacks while trams are running. We also compare the performance of the two methods, including attack effectiveness, similarity to clean images, and generation time. The results show that both algorithms can generate adversarial examples within 220 seconds, a much shorter time, without a decrease in success rate.
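The paper's improved PGD variant for Faster R-CNN is not reproduced here, but the basic untargeted PGD loop it builds on can be sketched in a few lines. A minimal NumPy sketch, assuming a generic differentiable loss supplied as `grad_fn`; the toy logistic "model", the helper names, and all parameter values are illustrative assumptions, not the paper's detector setup:

```python
import numpy as np

def pgd_attack(x, y, grad_fn, eps=0.1, alpha=0.02, steps=10):
    """Untargeted L-infinity PGD: repeatedly step in the sign of the
    loss gradient w.r.t. the input, projecting back into the eps-ball
    around the clean input x and into the valid pixel range [0, 1]."""
    x_adv = x.copy()
    for _ in range(steps):
        g = grad_fn(x_adv, y)                      # loss gradient w.r.t. input
        x_adv = x_adv + alpha * np.sign(g)         # ascent step on the loss
        x_adv = np.clip(x_adv, x - eps, x + eps)   # project into the eps-ball
        x_adv = np.clip(x_adv, 0.0, 1.0)           # keep a valid pixel range
    return x_adv

# Toy differentiable "model": logistic regression on a 4-pixel input.
w = np.array([2.0, -1.5, 1.0, -2.0])

def loss_grad(x, y):
    # d/dx of binary cross-entropy through sigmoid(w @ x): (sigmoid - y) * w
    p = 1.0 / (1.0 + np.exp(-w @ x))
    return (p - y) * w

x_clean = np.full(4, 0.5)
x_adv = pgd_attack(x_clean, 1.0, loss_grad)
```

The projection step is what distinguishes PGD from a plain gradient-ascent attack: however many steps are taken, the perturbation stays within the eps-ball, which is the similarity-to-clean-image constraint the abstract alludes to.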



2017, Vol 95, pp. 29-43
Author(s): Kenneth R. Ball, Christopher Grant, William R. Mundy, Timothy J. Shafer






Author(s): Andrey Bondarenko, Arkady Borisov, Ludmila Alekseeva

Artificial neural networks (ANNs) are well known for their good classification abilities, and recent advances in deep learning have brought about a second ANN renaissance. But neural networks pose some problems, such as the choice of hyperparameters like the number and sizes of neuron layers, which can greatly influence the classification rate. Pruning techniques were therefore developed that can reduce network size, increase generalization ability, and overcome overfitting. Pruning approaches, in contrast to the growing-neural-networks approach, assume that a sufficiently large ANN has already been trained and can be simplified with an acceptable loss of classification accuracy. This paper compares node pruning against weight pruning and gives experimental results on the accuracy of pruned networks versus their non-pruned counterparts. We conclude that node pruning is the preferable solution, with some caveats.
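To make the node-versus-weight distinction concrete, here is a minimal magnitude-based sketch in NumPy: weight pruning zeroes individual small-magnitude weights anywhere in a layer's matrix, while node pruning zeroes whole hidden units (entire columns). Both helpers and the thresholding rules are illustrative assumptions, not the specific pruning criteria evaluated in the paper:

```python
import numpy as np

def prune_weights(W, frac):
    """Unstructured pruning: zero the fraction `frac` of individual
    weights with the smallest magnitudes."""
    thresh = np.quantile(np.abs(W), frac)
    return np.where(np.abs(W) < thresh, 0.0, W)

def prune_nodes(W, frac):
    """Structured (node) pruning: zero entire columns, i.e. whole
    hidden units, chosen by smallest L2 norm of their weight vector."""
    norms = np.linalg.norm(W, axis=0)
    k = int(frac * W.shape[1])
    pruned = W.copy()
    pruned[:, np.argsort(norms)[:k]] = 0.0
    return pruned

# Example: a 3x4 layer matrix pruned at 50% by each method.
W = np.arange(1.0, 13.0).reshape(3, 4)
W_weight_pruned = prune_weights(W, 0.5)
W_node_pruned = prune_nodes(W, 0.5)
```

Both calls zero the same number of parameters here, but only the node-pruned matrix lets you actually drop columns and shrink the layer, which is one practical reason structured pruning can be preferable.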




