Add Barlow Twins loss for representation learning #7530

Merged
10 commits merged from barlow-twins into Project-MONAI:dev on Mar 22, 2024

Conversation

Lucas-rbnt (Contributor)

Description

This PR adds the BarlowTwinsLoss class. The loss was introduced in the Barlow Twins paper (http://proceedings.mlr.press/v139/zbontar21a/zbontar21a.pdf) with the aim of disentangling the representations learned from two views of the same sample, making it a useful tool for multimodal and unsupervised learning.
It is similar to the InfoNCE loss already implemented in MONAI as ContrastiveLoss (https://docs.monai.io/en/latest/_modules/monai/losses/contrastive.html#ContrastiveLoss), but it differs in several respects: embeddings are z-normalised over the batch rather than l2-normalised, and instead of contrasting pairs of embeddings, Barlow Twins decorrelates the components of the two representations.

$$\mathcal{L}_{BT} := \sum_i (1 - \mathcal{C}_{ii})^2 + \lambda \sum_i \sum_{j\neq i} \mathcal{C}_{ij}^2$$

where $\lambda$ is a positive hyperparameter weighting the off-diagonal term and $\mathcal{C}$ is the cross-correlation matrix computed between the embeddings of the two views.
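
As a reading aid, here is a minimal PyTorch sketch of this formula. It is illustrative only and not the implementation added in this PR; the function name, the `lambd` default, and the epsilon used to stabilise the standard deviation are assumptions.

```python
import torch

def barlow_twins_loss(z_a: torch.Tensor, z_b: torch.Tensor, lambd: float = 5e-3) -> torch.Tensor:
    """Illustrative Barlow Twins loss for two (batch_size, feature_dim) batches of embeddings."""
    batch_size = z_a.shape[0]
    # z-normalise each feature dimension over the batch (the 1e-8 epsilon is an assumption)
    z_a = (z_a - z_a.mean(dim=0)) / (z_a.std(dim=0) + 1e-8)
    z_b = (z_b - z_b.mean(dim=0)) / (z_b.std(dim=0) + 1e-8)
    # empirical cross-correlation matrix C, shape (feature_dim, feature_dim)
    c = (z_a.T @ z_b) / batch_size
    diag = torch.diagonal(c)
    invariance = (1.0 - diag).pow(2).sum()           # pull C_ii towards 1
    redundancy = c.pow(2).sum() - diag.pow(2).sum()  # push C_ij (i != j) towards 0
    return invariance + lambd * redundancy
```

Note that the mean, standard deviation and cross-correlation are all estimated over the batch dimension, so the quality of the estimate depends on the batch size.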

Types of changes

  - [x] Non-breaking change (fix or new feature that would not break existing functionality).
  - [ ] Breaking change (fix or new feature that would cause existing functionality to change).
  - [x] New tests added to cover the changes.
  - [x] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
  - [x] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
  - [x] In-line docstrings updated.
  - [x] Documentation updated, tested `make html` command in the `docs/` folder.

Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Lucas-rbnt and others added 4 commits March 11, 2024 18:42
Co-authored-by: Eric Kerfoot <17726042+ericspod@users.noreply.github.com>
Signed-off-by: Lucas Robinet <67736918+Lucas-rbnt@users.noreply.github.com>
Co-authored-by: Eric Kerfoot <17726042+ericspod@users.noreply.github.com>
Signed-off-by: Lucas Robinet <67736918+Lucas-rbnt@users.noreply.github.com>
…s aim and use cases

Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Co-authored-by: Eric Kerfoot <17726042+ericspod@users.noreply.github.com>
Signed-off-by: Lucas Robinet <67736918+Lucas-rbnt@users.noreply.github.com>

@KumoLiu (Contributor) left a comment


Thanks for the PR! The overall work looks good to me.
However, I have one comment regarding the batch_size. Please take a look when you get a chance.

Two review threads on monai/losses/barlow_twins.py (outdated, resolved).
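
For context, a hedged usage sketch of the new loss: the class name and module path come from this PR, but the `lambd` keyword, its default value and the two-tensor call signature are assumptions about the final API.

```python
import torch
from monai.losses.barlow_twins import BarlowTwinsLoss  # class added by this PR

batch_size, feature_dim = 16, 256
view_a = torch.randn(batch_size, feature_dim)  # embeddings of the first augmented view
view_b = torch.randn(batch_size, feature_dim)  # embeddings of the second augmented view

loss_fn = BarlowTwinsLoss(lambd=5e-3)  # lambd weights the redundancy-reduction term (assumed keyword)
loss = loss_fn(view_a, view_b)
```

Since the z-normalisation and the cross-correlation matrix are computed over the batch dimension, very small batches give noisy estimates, which is presumably the batch_size concern above.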
Lucas-rbnt and others added 4 commits March 18, 2024 10:49
Co-authored-by: YunLiu <55491388+KumoLiu@users.noreply.github.com>
Signed-off-by: Lucas Robinet <67736918+Lucas-rbnt@users.noreply.github.com>
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>

KumoLiu commented Mar 22, 2024

/build

1 similar comment

KumoLiu commented Mar 22, 2024

/build

@KumoLiu KumoLiu merged commit c649934 into Project-MONAI:dev Mar 22, 2024
28 checks passed
juampatronics pushed a commit to juampatronics/MONAI that referenced this pull request Mar 25, 2024
Yu0610 pushed a commit to Yu0610/MONAI that referenced this pull request Apr 11, 2024
@Lucas-rbnt Lucas-rbnt deleted the barlow-twins branch October 15, 2024 13:32