adaptive weighted sum
Recently Published Documents


TOTAL DOCUMENTS: 19 (FIVE YEARS: 4)
H-INDEX: 8 (FIVE YEARS: 1)

2020 ◽  Vol 88 ◽  pp. 107320
Author(s): Rui Liu, Min Yuan, Huang Xu, Pinzhong Chen, Xu Steven Xu, ...

Author(s): Hanzhang Hu, Debadeepta Dey, Martial Hebert, J. Andrew Bagnell

This work considers the trade-off between accuracy and test-time computational cost of deep neural networks (DNNs) via anytime predictions from auxiliary predictors. Specifically, we optimize the auxiliary losses jointly in an adaptive weighted sum, where the weights are inversely proportional to the average of each loss; intuitively, this balances the losses to the same scale. We present theoretical considerations that motivate this approach from multiple viewpoints, including a connection to optimizing the geometric mean of the expectation of each loss, an objective that ignores the scale of the losses. Experimentally, the adaptive weights induce more competitive anytime predictions on multiple recognition datasets and models than non-adaptive approaches, including weighting all losses equally. In particular, anytime neural networks (ANNs) can achieve the same accuracy faster using adaptive weights on a small network than using static constant weights on a large one. For problems with high performance saturation, we also show that a sequence of exponentially deepening ANNs can achieve near-optimal anytime results at any budget, at the cost of a constant fraction of extra computation.
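As a concrete illustration of the adaptive weighted sum described in this abstract, the sketch below keeps a running average of each auxiliary loss and weights each loss by the reciprocal of that average, so every loss contributes at roughly the same scale. It is written in PyTorch; the class name AdaptiveLossWeights, the EMA decay, and the epsilon are illustrative assumptions, not details taken from the paper.

import torch

class AdaptiveLossWeights:
    """Combine auxiliary losses with weights inversely proportional to a
    running average of each loss (a sketch of the adaptive weighted sum;
    hyperparameter values are assumptions, not from the paper)."""

    def __init__(self, n_losses, ema_decay=0.99, eps=1e-8):
        self.avg = torch.ones(n_losses)  # running mean of each loss's value
        self.decay = ema_decay
        self.eps = eps

    def combine(self, losses):
        # Track each loss's magnitude with an exponential moving average;
        # detach so autograd treats the averages as constants.
        vals = torch.stack([l.detach() for l in losses])
        self.avg = self.decay * self.avg + (1 - self.decay) * vals
        # weight_i is proportional to 1/avg_i, so each weighted loss is
        # roughly 1 on average and all predictors are balanced in scale.
        weights = 1.0 / (self.avg + self.eps)
        return sum(w * l for w, l in zip(weights, losses))

# Usage in a training step (aux_losses holds one loss per anytime predictor):
#   balancer = AdaptiveLossWeights(n_losses=len(aux_losses))
#   total = balancer.combine(aux_losses)
#   total.backward()

This sketch also makes the geometric-mean connection concrete: since the gradient of log E[l_i] is the gradient of E[l_i] divided by E[l_i], weighting each loss by 1/E[l_i] (here, by a running estimate of it) approximates gradient descent on the sum of log E[l_i], i.e. on the log of the product, and hence the geometric mean, of the expected losses, an objective that is invariant to rescaling any individual loss.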


2016 ◽  Vol 06 (04) ◽  pp. 61-73
Author(s): Renfang Jiang, Jianping Dong, Yilin Dai

2013 ◽  Vol 33 (2) ◽  pp. 483-499
Author(s): Nian Cai, Nannan Zhu, Wenting Guo, Bingo Wing-Kuen Ling, Han Wang, ...
