Hard Thresholding
Recently Published Documents


TOTAL DOCUMENTS: 208 (FIVE YEARS: 40)

H-INDEX: 18 (FIVE YEARS: 1)

2022, Vol. 56, pp. 367-390
Author(s): Jian-Feng Cai, Jingzhi Li, Xiliang Lu, Juntao You


2021
Author(s): Mingrui Chen, Weiyu Li, Weizhi Lu

Recently, it has been observed that $\{0,\pm1\}$-ternary codes, generated simply from deep features by hard thresholding, tend to outperform $\{-1, 1\}$-binary codes in image retrieval. To obtain better ternary codes, we propose, for the first time, to jointly learn the features and the codes by appending a smoothed function to the network. During training, the function evolves into a non-smooth ternary function via a continuation method and then generates the ternary codes. This approach circumvents the difficulty of directly training discrete functions and reduces the quantization error of the ternary codes. Experiments show that the proposed joint learning indeed produces better ternary codes.
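
The abstract describes replacing the non-differentiable hard-thresholding quantizer with a smooth surrogate that is gradually sharpened during training (a continuation method). Below is a minimal, hypothetical sketch in Python/NumPy of that idea; the tanh-based surrogate, the threshold t, and the beta schedule are illustrative assumptions, not the authors' exact formulation.

import numpy as np

def hard_ternary(x, t=0.5):
    # Hard-thresholding quantizer: maps features to {-1, 0, +1}.
    return np.sign(x) * (np.abs(x) > t)

def smoothed_ternary(x, t=0.5, beta=1.0):
    # Smooth, differentiable surrogate of the ternary quantizer.
    # As beta grows, it converges pointwise to hard_ternary(x, t).
    return 0.5 * (np.tanh(beta * (x - t)) + np.tanh(beta * (x + t)))

# Continuation schedule (assumed): start training with a small beta so
# gradients flow through the surrogate, then increase beta so the learned
# codes approach the discrete {-1, 0, +1} values.
x = np.linspace(-2.0, 2.0, 9)
for beta in (1.0, 5.0, 25.0, 100.0):
    print(f"beta={beta:6.1f}:", np.round(smoothed_ternary(x, beta=beta), 2))
print("hard      :", hard_ternary(x))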




Author(s): R. Grotheer, S. Li, A. Ma, D. Needell, J. Qin



