A Simple Supervised Hashing Algorithm Using Projected Gradient And Oppositional Weights

Sobhan Hemati, Mohammad Hadi Mehdizavareh, Morteza Babaie, Shivam Kalra, H.R. Tizhoosh

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:14:36
20 Sep 2021

Learning to hash generates similarity-preserving binary representations of images, which is, among other applications, an efficient way to enable fast image retrieval. Two-step hashing has become a common approach because it simplifies learning by separating binary code inference from hash function training. However, binary code inference typically leads to an intractable optimization problem with binary constraints. Different relaxation methods, generally based on complicated optimization techniques, have been proposed to address this challenge. In this paper, a simple relaxation scheme based on the projected gradient is proposed. To this end, in each iteration we update the optimization variable as if there were no binary constraint and then project the updated solution onto the feasible set. We formulate the projection step as finding the closest binary matrix to the updated matrix and exploit the closed-form solution of this projection to complete our learning algorithm. Inspired by opposition-based learning, pairwise opposite weights between data points are incorporated into the proposed objective function to impose a stronger penalty on data instances with a higher misclassification probability. We show that this simple learning algorithm yields binary codes that achieve competitive results on both the CIFAR-10 and NUS-WIDE datasets compared to state-of-the-art benchmarks.
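The projected-gradient idea in the abstract can be sketched in a few lines. The exact objective of the paper is not given here, so the snippet below uses a hypothetical similarity-preserving loss, ||BB^T − kS||², where S holds pairwise ±1 similarities; the key point it illustrates is the two-step update: an unconstrained gradient step followed by the closed-form projection onto {−1, +1} codes, which is simply the element-wise sign.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 4  # hypothetical: 8 data points, 4-bit codes

# Toy pairwise similarity matrix: +1 if same class, -1 otherwise (assumed setup)
labels = rng.integers(0, 2, n)
S = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)

# Random binary initialization in {-1, +1}
B = np.where(rng.standard_normal((n, k)) >= 0.0, 1.0, -1.0)

def loss(B):
    # Hypothetical similarity-preserving objective ||B B^T - k S||_F^2
    return float(np.linalg.norm(B @ B.T - k * S) ** 2)

lr = 0.01
for _ in range(50):
    grad = 4.0 * (B @ B.T - k * S) @ B      # gradient of the relaxed objective
    B_relaxed = B - lr * grad               # step as if there were no constraint
    # Closed-form projection: the closest binary matrix (Frobenius norm)
    # to B_relaxed is its element-wise sign
    B = np.where(B_relaxed >= 0.0, 1.0, -1.0)
```

After every iteration `B` remains a valid binary code matrix, so no post-hoc quantization step is needed; this is the appeal of the projection-based relaxation described above.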
