Contrastive loss

  • Contrastive loss is a distance-based loss function (as opposed to prediction-error-based losses such as cross entropy) used to learn discriminative features for images.
  • Like any distance-based loss, it tries to ensure that semantically similar examples are embedded close together. It is computed on pairs of examples.
  • The loss operates on the distance between the embeddings of two inputs.
  • Each sample is composed of two images, forming either a positive pair or a negative pair. Our goal is to minimize the distance between positive pairs and maximize the distance between negative pairs.
  • Concretely, we want a small distance between positive pairs (because they are similar images/inputs), and a distance greater than some margin m for negative pairs (see the sketch after this list).
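With one common sign convention (y = 1 for a positive pair, y = 0 for a negative pair), the loss for a single pair whose embeddings are a distance D apart is L = y·D² + (1 − y)·max(0, m − D)², where m is the margin. Below is a minimal PyTorch sketch of this formula; the function name `contrastive_loss`, the default margin of 1.0, and the labeling convention are illustrative assumptions rather than part of the original article.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(emb1, emb2, label, margin=1.0):
    """Contrastive loss over a batch of embedding pairs.

    Assumed convention: label = 1 marks a positive (similar) pair,
    label = 0 marks a negative (dissimilar) pair.
    """
    # Euclidean distance between the two embeddings of each pair
    dist = F.pairwise_distance(emb1, emb2)
    # Positive pairs: penalize any distance, pulling embeddings together
    pos_term = label * dist.pow(2)
    # Negative pairs: penalize only when closer than the margin,
    # pushing embeddings at least `margin` apart
    neg_term = (1 - label) * F.relu(margin - dist).pow(2)
    return 0.5 * (pos_term + neg_term).mean()

# Example usage with random embeddings (batch of 8 pairs, 128-dim)
emb1 = torch.randn(8, 128)
emb2 = torch.randn(8, 128)
label = torch.randint(0, 2, (8,)).float()  # 1 = positive pair, 0 = negative pair
loss = contrastive_loss(emb1, emb2, label)
```

Note that once a negative pair is farther apart than the margin, its term is zero, so the loss stops pushing already well-separated negatives and focuses on the hard ones.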


