Paper ID | ARS-10.12
Paper Title | A simple supervised hashing algorithm using projected gradient and oppositional weights
Authors | Sobhan Hemati, Mohammad Hadi Mehdizavareh, Morteza Babaie, University of Waterloo, Canada; Shivam Kalra, Kimia Lab, University of Waterloo, Canada; H.R. Tizhoosh, University of Waterloo, Canada
Session | ARS-10: Image and Video Analysis and Synthesis
Location | Area H
Session Time | Monday, 20 September, 15:30 - 17:00
Presentation Time | Monday, 20 September, 15:30 - 17:00
Presentation | Poster
Topic | Image and Video Analysis, Synthesis, and Retrieval: Image & Video Storage and Retrieval
Abstract | Learning to hash is the task of generating similarity-preserving binary representations of images, which is, among other things, an efficient way to enable fast image retrieval. Two-step hashing has become a common approach because it simplifies learning by separating binary code inference from hash function training. However, binary code inference typically leads to an intractable optimization problem with binary constraints. Different relaxation methods, generally based on complicated optimization techniques, have been proposed to address this challenge. In this paper, a simple relaxation scheme based on the projected gradient is proposed. To this end, in each iteration we update the optimization variable as if there were no binary constraint and then project the updated solution onto the feasible set. We formulate the projection step as finding the closest binary matrix to the updated matrix and take advantage of the closed-form solution of this projection to complete our learning algorithm. Inspired by opposition-based learning, pairwise opposite weights between data points are incorporated into the proposed objective function to impose a stronger penalty on data instances with a higher misclassification probability. We show that this simple learning algorithm yields binary codes that achieve competitive results on both the CIFAR-10 and NUS-WIDE datasets compared to state-of-the-art benchmarks.
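The update-then-project scheme the abstract describes can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the toy objective (fitting the code inner products to a given pairwise similarity matrix `S`), the function names, and the step size are all assumptions; the paper's actual objective additionally uses oppositional pairwise weights. The one element taken directly from the abstract is the closed-form projection: the closest binary matrix to any real matrix, in Frobenius norm, is its element-wise sign.

```python
import numpy as np

def project_to_binary(B):
    """Closed-form projection onto {-1, +1}^(n x k): the closest
    binary matrix to B in Frobenius norm is its element-wise sign."""
    P = np.sign(B)
    P[P == 0] = 1.0  # break exact ties toward +1
    return P

def projected_gradient_hash(grad_fn, B0, lr=0.1, n_iters=100):
    """Generic projected-gradient loop: take an unconstrained
    gradient step, then project back onto the binary feasible set."""
    B = project_to_binary(B0)
    for _ in range(n_iters):
        B = B - lr * grad_fn(B)   # update as if unconstrained
        B = project_to_binary(B)  # projection step (closed form)
    return B

# Toy usage (illustrative objective, not the paper's):
# minimize || (1/k) B B^T - S ||_F^2 over binary B.
rng = np.random.default_rng(0)
n, k = 6, 4
S = project_to_binary(rng.standard_normal((n, n)))  # toy +/-1 similarity
S = (S + S.T) / 2

def grad(B):
    return (4.0 / k) * ((B @ B.T) / k - S) @ B

codes = projected_gradient_hash(grad, rng.standard_normal((n, k)))
```

After the loop, `codes` is guaranteed to be binary regardless of convergence, since the projection is applied last; this is the practical appeal of the scheme over relaxations that only approach binariness in the limit.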