Merge pull request #25 from shreydan/anthony-add-colab
add colab button
asusevski authored Feb 7, 2024
2 parents eabb8db + 7183a81 commit 33edf14
Showing 1 changed file with 6 additions and 1 deletion.
@@ -33,4 +33,9 @@ The distillation loss is formulated as:
Here the KL loss refers to the Kullback-Leibler divergence between the teacher's and the student's output distributions.
The overall loss for the student model is then formulated as the sum of this distillation loss and the standard cross-entropy loss over the ground-truth labels.
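
As a rough, minimal sketch of how that combined loss could look in PyTorch (assuming classification logits; the function name `distillation_loss` and the `temperature` hyperparameter are illustrative assumptions, not the notebook's exact code):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, temperature=2.0):
    # Hypothetical sketch: the name and the `temperature` hyperparameter are
    # assumptions for illustration, not the exact code from the notebook.
    # KL divergence between the teacher's and the student's output distributions,
    # computed on temperature-softened softmax outputs.
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Standard cross-entropy loss over the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Overall student loss: sum of the distillation term and the cross-entropy term.
    return kl + ce

# Example usage with random logits for a batch of 8 samples and 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```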

To see this loss function implemented in Python, along with a fully worked example, check out the notebook for this section, `KnowledgeDistillation.ipynb`.

<a target="_blank" href="https://colab.research.google.com/github/johko/computer-vision-course/blob/main/notebooks/Unit%203%20-%20Vision%20Transformers/KnowledgeDistillation.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
