
SUPERVISED HASHING WITH JOINTLY LEARNING EMBEDDING AND QUANTIZATION

Citation Author(s):
Xiang Xiang; Feng Wang; Trac D. Tran
Submitted by:
Hao Zhu
Last updated:
15 September 2017 - 11:51am
Document Type:
Poster
Document Year:
2017
Paper Code:
TQ-PG.10

Compared with unsupervised hashing, supervised hashing commonly achieves better accuracy in many real applications by leveraging semantic (label) information. However, the supervised hashing problem is hard to solve directly because it is essentially a discrete optimization problem. Some works attack the discrete problem head-on, e.g., via binary quadratic programming, but such approaches are typically complicated and time-consuming; most supervised hashing methods instead solve a relaxed continuous problem by dropping the discrete constraints, and consequently suffer from the errors introduced by the relaxation. In this paper, building on the general two-step framework of first learning binary embedding codes and then learning hash functions, we propose a new method that mitigates the error introduced by relaxing the cost function. Inspired by the rotation invariance of the learned embedding features, our method alternately and jointly learns a similarity-preserving representation and a rotation transformation that yields better quantization. In experiments, our method shows significant improvement over relaxation-based baselines; compared with methods based on discrete optimization, it obtains competitive performance and even achieves state-of-the-art results on some image retrieval applications.
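The alternation between representation and rotation can be illustrated with an ITQ-style quantization step (a minimal sketch under assumed notation, not the authors' exact formulation): given zero-centered embedded features V, fix the orthogonal rotation R and take binary codes B = sign(V R); then fix B and update R by solving the orthogonal Procrustes problem. The function name and the NumPy setup below are illustrative assumptions.

    import numpy as np

    def learn_rotation_and_codes(V, n_iters=50, seed=0):
        """Alternately minimize the quantization loss ||B - V R||_F^2
        over binary codes B in {-1, +1} and an orthogonal rotation R
        (ITQ-style stand-in for the paper's rotation/quantization step)."""
        rng = np.random.default_rng(seed)
        d = V.shape[1]
        # Start from a random orthogonal rotation.
        R, _ = np.linalg.qr(rng.standard_normal((d, d)))
        for _ in range(n_iters):
            # Fix R, update B: the loss is minimized elementwise by the sign.
            B = np.sign(V @ R)
            B[B == 0] = 1
            # Fix B, update R: orthogonal Procrustes, solved via SVD of V^T B.
            U, _, Wt = np.linalg.svd(V.T @ B)
            R = U @ Wt
        B = np.sign(V @ R)
        B[B == 0] = 1
        return B, R

    # Usage: V holds n zero-centered d-dimensional embedded features,
    # e.g., outputs of the similarity-preserving embedding (random here).
    V = np.random.default_rng(1).standard_normal((1000, 32))
    V -= V.mean(axis=0)
    B, R = learn_rotation_and_codes(V)

In the full joint method, the embedding that produces V would itself be re-learned against the label-similarity objective between rounds; only the rotation/quantization alternation is sketched here.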
