
Removing L2-norm in contrastive loss (L2-norm already present in CosSim) #6550

Merged: 2 commits merged into Project-MONAI:dev on May 24, 2023

Conversation

@Lucas-rbnt Lucas-rbnt (Contributor) commented May 24, 2023

Description

The forward method of ContrastiveLoss performs an explicit L2-normalization before computing cosine similarity. However, torch.nn.functional.cosine_similarity already performs this normalization internally, ensuring that input and target lie on the surface of the unit hypersphere. The explicit step therefore adds unnecessary cost and can be removed.
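For illustration, a minimal sketch of the redundancy (not the actual ContrastiveLoss code; tensor names and shapes are made up):

```python
import torch
import torch.nn.functional as F

input = torch.randn(8, 128)   # batch of embeddings (illustrative shape)
target = torch.randn(8, 128)

# Before this PR: explicit L2-normalization followed by cosine similarity.
sim_old = F.cosine_similarity(
    F.normalize(input, dim=1), F.normalize(target, dim=1), dim=1
)

# After this PR: cosine_similarity normalizes internally, so the explicit step is redundant.
sim_new = F.cosine_similarity(input, target, dim=1)

assert torch.allclose(sim_old, sim_new, atol=1e-6)
```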

Types of changes

  • Non-breaking change (fix or new feature that would not break existing functionality).
  • Integration tests passed locally by running ./runtests.sh -f -u --net --coverage.
  • Quick tests passed locally by running ./runtests.sh --quick --unittests --disttests.

@Lucas-rbnt Lucas-rbnt changed the title from "Removing L2-norm in contrastive loss (L2-norm already present in cosine-similarity computation)" to "Removing L2-norm in contrastive loss (L2-norm already present in CosSim)" on May 24, 2023

Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
@wyli wyli (Contributor) commented May 24, 2023

/build

@wyli wyli (Contributor) commented May 24, 2023

Hi @Lucas-rbnt, when the L2 norms are large (e.g. for high-dimensional embeddings), do you think this might be less stable numerically? Have you tested this PR in end-to-end trainings?

cc @finalelement


This seems to be addressed in PyTorch 1.12 for the same reason: pytorch/pytorch@9e137ee
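For reference, the linked PyTorch change appears to normalize each input before taking the dot product, which keeps intermediate values bounded. A rough sketch of the two formulations (illustrative only, not the actual PyTorch source):

```python
import torch

def cosine_sim_norm_product(x, y, eps=1e-8):
    # Older-style formulation (sketch): dot product divided by the product of norms.
    # The product ||x|| * ||y|| can grow very large for high-dimensional,
    # large-magnitude embeddings, which is the stability concern raised above.
    return (x * y).sum(dim=-1) / (x.norm(dim=-1) * y.norm(dim=-1)).clamp_min(eps)

def cosine_sim_normalize_first(x, y, eps=1e-8):
    # Newer-style formulation (sketch): normalize each input first, then dot product.
    x = x / x.norm(dim=-1, keepdim=True).clamp_min(eps)
    y = y / y.norm(dim=-1, keepdim=True).clamp_min(eps)
    return (x * y).sum(dim=-1)

x, y = torch.randn(4, 512), torch.randn(4, 512)
assert torch.allclose(
    cosine_sim_norm_product(x, y), cosine_sim_normalize_first(x, y), atol=1e-6
)
```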

@wyli wyli (Contributor) left a comment

Thanks, it looks good to me. (Probably less stable for early versions of PyTorch, but more efficient for recent versions.)


@Lucas-rbnt Lucas-rbnt (Contributor, Author) commented

Hi @wyli!
You were faster than me! I also found this commit. In torchmetrics, for instance, the normalization also comes before the calculation.

I am currently running a SimCLR training on BraTS21 data centered on the tumor to compare results before and after this commit.
I will post the results and insights here. The change is so minor that the difference will also be minor, but we will see! First results expected in about 3 hours :)

@wyli wyli enabled auto-merge (squash) May 24, 2023 13:17
@wyli wyli (Contributor) commented May 24, 2023

/build

@wyli wyli merged commit ef2bd45 into Project-MONAI:dev May 24, 2023
@Lucas-rbnt Lucas-rbnt deleted the contrastive_loss branch October 15, 2024 13:33