@gisbi-kim
Last active April 24, 2021 04:11
Understanding how a contrastive loss works
%% Understanding how a contrastive loss works
% contrastive loss refs:
% - Kihyuk Sohn. Improved deep metric learning with multi-class n-pair loss objective. In Advances in Neural Information Processing Systems (NeurIPS), 2016.
% - Aaron van den Oord, Yazhe Li, and Oriol Vinyals. Representation learning with contrastive predictive coding. arXiv preprint arXiv:1807.03748, 2018.
% - Zaiwei Zhang et al. Self-Supervised Pretraining of 3D Features on any Point-Cloud, 2021.
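%
% The quantity computed below is the InfoNCE / N-pair softmax loss from the
% refs above:
%   L = -log( exp(s_pos/t) / ( exp(s_pos/t) + sum_i exp(s_neg_i/t) ) )
% where s_pos and s_neg_i are the cosine similarities of the positive pair
% and the negative pairs, and t is the temperature.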
%% param
num_negs = 100;
tmpr = 0.1; % temperature
% a low temperature makes the good case's loss even lower and the bad case's
% loss even higher, i.e., it widens the gap between them (see the temperature
% sweep sketch at the end of this script)
%% good case (low positive dist = high cossim_pos, high negative dists = low cossim_negs)
cossim_pos = 0.9;
cossim_neg_min = 0.2;
cossim_neg_max = 0.5;
cossim_negs = (cossim_neg_max - cossim_neg_min) .* rand(num_negs,1) + cossim_neg_min;
d_contrast = -log( exp(cossim_pos/tmpr) / (exp(cossim_pos/tmpr) + sum(exp(cossim_negs/tmpr))) );
d_contrast
%% bad case (high positive dist = low cossim_pos, low negative dists = high cossim_negs)
cossim_pos = 0.2;
cossim_neg_min = 0.5;
cossim_neg_max = 0.8;
cossim_negs = (cossim_neg_max - cossim_neg_min) .* rand(num_negs,1) + cossim_neg_min;
d_contrast = -log( exp(cossim_pos/tmpr) / (exp(cossim_pos/tmpr) + sum(exp(cossim_negs/tmpr))) );
d_contrast
%% expected results (exact values vary from run to run since rand is not seeded)
% d_contrast =
%     0.4662 % low loss: the simulated distances are good (positive near, negatives far), i.e., the estimator is well optimized
% d_contrast =
%     9.4555 % high loss: the simulated distances are bad (positive far, negatives near), i.e., the estimator is not yet well optimized
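%% temperature sweep (illustrative sketch)
% A minimal sketch, not part of the original gist, to check the temperature
% comment in the param section: recompute the good and bad cases for a few
% temperatures and watch the loss gap widen as the temperature shrinks. The
% sweep values and variable names below (tmprs, lse, loss_good, loss_bad)
% are assumptions for illustration. The loss is evaluated in the equivalent
% log-sum-exp form, L = logsumexp([s_pos; s_negs]/t) - s_pos/t, which avoids
% overflowing exp(cossim/tmpr) at very low temperatures.
tmprs = [1.0, 0.5, 0.1, 0.05];
cossim_negs_good = (0.5 - 0.2) .* rand(num_negs,1) + 0.2; % as in the good case
cossim_negs_bad  = (0.8 - 0.5) .* rand(num_negs,1) + 0.5; % as in the bad case
lse = @(z) max(z) + log(sum(exp(z - max(z)))); % numerically stable logsumexp
for t = tmprs
    loss_good = lse([0.9; cossim_negs_good] / t) - 0.9/t;
    loss_bad  = lse([0.2; cossim_negs_bad]  / t) - 0.2/t;
    fprintf('tmpr = %.2f: good = %.4f, bad = %.4f, gap = %.4f\n', ...
        t, loss_good, loss_bad, loss_bad - loss_good);
end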