Progressive Generative Hashing for Image Retrieval

Author(s): Yuqing Ma, Yue He, Fan Ding, Sheng Hu, Jun Li, ...

Recent years have witnessed the success of emerging hashing techniques in large-scale image retrieval. Owing to its great learning capacity, deep hashing has become one of the most promising solutions and has achieved attractive performance in practice. However, without semantic label information, unsupervised deep hashing remains an open problem. In this paper, we propose a novel progressive generative hashing (PGH) framework that helps learn a discriminative hashing network in an unsupervised way. Unlike existing studies, it first treats the hash codes as a kind of semantic condition for similar image generation, and simultaneously feeds the original image and its codes into generative adversarial networks (GANs). The real images, together with the synthetic ones, can further help train a discriminative hashing network based on a triplet loss. By iteratively feeding the learnt codes back into the hash-conditioned GANs, we can progressively enable the hashing network to discover semantic relations. Extensive experiments on widely-used image datasets demonstrate that PGH significantly outperforms state-of-the-art unsupervised hashing methods.
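No code accompanies the abstract; the snippet below is a minimal PyTorch sketch of the triplet-loss stage only, under the assumption that the GAN-synthesized images act as positives and negatives for the real anchors. HashNet, the layer sizes, and the random tensors standing in for real and synthetic images are illustrative placeholders, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a toy hashing network trained with a
# triplet loss over real anchors and (hypothetically) GAN-synthesized positives/negatives.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HashNet(nn.Module):
    """Maps images to K-bit relaxed hash codes in (-1, 1)."""
    def __init__(self, num_bits=32):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.hash_layer = nn.Linear(32, num_bits)

    def forward(self, x):
        return torch.tanh(self.hash_layer(self.backbone(x)))  # relaxed (continuous) codes

net = HashNet(num_bits=32)
anchor = torch.randn(8, 3, 64, 64)    # real images
positive = torch.randn(8, 3, 64, 64)  # stand-in for GAN images conditioned on the anchor's codes
negative = torch.randn(8, 3, 64, 64)  # stand-in for GAN images conditioned on other codes

loss = F.triplet_margin_loss(net(anchor), net(positive), net(negative), margin=1.0)
loss.backward()

with torch.no_grad():
    binary_codes = torch.sign(net(anchor))  # binarize at retrieval time
```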

Author(s): Rong-Cheng Tu, Xian-Ling Mao, Bo-Si Feng, Shu-ying Yu

Recently, similarity-preserving hashing methods have been extensively studied for large-scale image retrieval. Compared with unsupervised hashing, supervised hashing methods for labeled data usually achieve better performance by utilizing semantic label information. Intuitively, for unlabeled data, the performance of unsupervised hashing methods should improve if we can first mine some supervised semantic 'label information' from the unlabeled data and then incorporate this 'label information' into the training process. Thus, in this paper, we propose a novel Object Detection based Deep Unsupervised Hashing method (ODDUH). Specifically, a pre-trained object detection model is utilized to mine supervised 'label information', which is used to guide the learning process so as to generate high-quality hash codes. Extensive experiments on two public datasets demonstrate that the proposed method outperforms state-of-the-art unsupervised hashing methods in the image retrieval task.
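As a rough illustration of the pseudo-label mining step, the sketch below uses a pre-trained torchvision Faster R-CNN as the object detector and treats two unlabeled images as similar when their detected class sets overlap. The detector choice, the score threshold, and the similarity rule are assumptions for illustration, not details taken from the paper (assumes torchvision >= 0.13 for the weights argument).

```python
# Hypothetical sketch of pseudo-label mining with an off-the-shelf detector.
import torch
import torchvision

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

def pseudo_labels(image, score_threshold=0.5):
    """Return the set of detected class ids for one image tensor (C, H, W) scaled to [0, 1]."""
    with torch.no_grad():
        output = detector([image])[0]
    keep = output["scores"] > score_threshold
    return set(output["labels"][keep].tolist())

def similar(image_a, image_b):
    """Mined pairwise supervision: two images are 'similar' if they share any detected class."""
    return len(pseudo_labels(image_a) & pseudo_labels(image_b)) > 0

# Random tensors stand in for unlabeled images; real use would load and normalize photos.
img1, img2 = torch.rand(3, 224, 224), torch.rand(3, 224, 224)
print(similar(img1, img2))
```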


2017, Vol. 243, pp. 166-173
Author(s): Wanqing Zhao, Hangzai Luo, Jinye Peng, Jianping Fan

Author(s): Yongfei Zhang, Cheng Peng, Jingtao Zhang, Xianglong Liu, Shiliang Pu, ...

2017, Vol. 2017, pp. 1-8
Author(s): Lijuan Duan, Chongyang Zhao, Jun Miao, Yuanhua Qiao, Xing Su

Hashing has been widely deployed to perform Approximate Nearest Neighbor (ANN) search for large-scale image retrieval, addressing the problems of storage and retrieval efficiency. Recently, deep hashing methods have been proposed to perform simultaneous feature learning and hash code learning with deep neural networks. Even though deep hashing has shown better performance than traditional hashing methods with handcrafted features, the compact hash code learned by a single deep hashing network may not provide a full representation of an image. In this paper, we propose a novel hashing indexing method, called the Deep Hashing based Fusing Index (DHFI), to generate a more compact hash code with stronger expression ability and distinction capability. In our method, we train two deep hashing subnetworks with different architectures and fuse the hash codes generated by the two subnetworks into a unified image representation. Experiments on two real datasets show that our method outperforms state-of-the-art image retrieval methods.
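A minimal sketch of the fusion idea, assuming (since the abstract does not specify the mechanism) that the bit strings from the two subnetworks are simply concatenated. HashSubnet and its layer sizes are placeholders rather than the DHFI architectures.

```python
# Illustrative sketch: concatenating codes from two differently sized hashing subnetworks.
import torch
import torch.nn as nn

class HashSubnet(nn.Module):
    """One deep hashing subnetwork; the two instances below differ only in width."""
    def __init__(self, hidden, num_bits):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, hidden, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.hash_layer = nn.Linear(hidden, num_bits)

    def forward(self, x):
        # sign() binarizes the output; suitable for indexing, not for gradient-based training
        return torch.sign(torch.tanh(self.hash_layer(self.features(x))))

subnet_a = HashSubnet(hidden=16, num_bits=24)  # stand-in for the first architecture
subnet_b = HashSubnet(hidden=32, num_bits=24)  # stand-in for the second architecture

images = torch.rand(4, 3, 64, 64)
fused_code = torch.cat([subnet_a(images), subnet_b(images)], dim=1)  # 48-bit fused index
print(fused_code.shape)  # torch.Size([4, 48])
```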


IEEE Access, 2020, Vol. 8, pp. 167504-167515
Author(s): Wei Yao, Feifei Lee, Lu Chen, Chaowei Lin, Shuai Yang, ...

2017, Vol. 77 (9), pp. 10471-10484
Author(s): Liangfu Cao, Lianli Gao, Jingkuan Song, Fumin Shen, Yuan Wang
