
TSAI Group Assignment

Group Members:

  1. Arjun Gupta
  2. Himanshu
  3. Aeshna Singh
  4. Palash Baranwal

SESSION 10 - Transformers Review

ASSIGNMENT

  1. Train the same code, but on different data. If you have n classes, your accuracy MUST be more than 4 * 100 / n.
  2. Submit the GitHub link that includes your notebook with training logs and a proper README file.

DATASET USED

MANYTHINGS.ORG DUTCH-ENGLISH DATASET

Links:

(dataset link screenshot; image not rendered)
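The ManyThings.org Anki export ships as a plain-text file of tab-separated sentence pairs (English, Dutch, attribution). Below is a minimal loading sketch, assuming the archive has been extracted to a file named nld.txt; the filename, lowercasing, and 80/20 split are our assumptions, not part of the assignment notebook.

```python
# Minimal sketch: read the ManyThings.org Anki export into (Dutch, English) pairs.
# Assumes the extracted file is "nld.txt" and each line looks like:
#   English sentence<TAB>Dutch sentence<TAB>attribution
import random

def load_pairs(path="nld.txt"):
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) >= 2:
                eng, nld = parts[0], parts[1]
                pairs.append((nld.lower(), eng.lower()))  # source = Dutch, target = English
    return pairs

pairs = load_pairs()
random.seed(1234)
random.shuffle(pairs)
n_train = int(0.8 * len(pairs))
train_pairs, valid_pairs = pairs[:n_train], pairs[n_train:]
print(f"{len(train_pairs)} training pairs, {len(valid_pairs)} validation pairs")
```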


DIAGRAMS

Transformer (full model architecture)

Encoder (encoder block)

Attention (multi-head attention mechanism)

Decoder (decoder block)
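The Attention diagram above corresponds to the scaled dot-product attention computed inside each multi-head attention layer of the encoder and decoder. The following PyTorch sketch illustrates that computation; it follows the structure of the referenced pytorch-seq2seq notebook, but the class and variable names here are our own.

```python
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Illustrative multi-head attention: Q/K/V projections, scaled dot-product
    attention with an optional mask, then an output projection."""
    def __init__(self, hid_dim, n_heads, dropout=0.1):
        super().__init__()
        assert hid_dim % n_heads == 0
        self.hid_dim, self.n_heads = hid_dim, n_heads
        self.head_dim = hid_dim // n_heads
        self.fc_q = nn.Linear(hid_dim, hid_dim)
        self.fc_k = nn.Linear(hid_dim, hid_dim)
        self.fc_v = nn.Linear(hid_dim, hid_dim)
        self.fc_o = nn.Linear(hid_dim, hid_dim)
        self.dropout = nn.Dropout(dropout)
        self.scale = self.head_dim ** 0.5  # sqrt(d_k) from the paper

    def forward(self, query, key, value, mask=None):
        B = query.shape[0]
        # [B, len, hid_dim] -> [B, n_heads, len, head_dim]
        Q = self.fc_q(query).view(B, -1, self.n_heads, self.head_dim).transpose(1, 2)
        K = self.fc_k(key).view(B, -1, self.n_heads, self.head_dim).transpose(1, 2)
        V = self.fc_v(value).view(B, -1, self.n_heads, self.head_dim).transpose(1, 2)
        # attention scores: [B, n_heads, query_len, key_len]
        energy = torch.matmul(Q, K.transpose(-2, -1)) / self.scale
        if mask is not None:
            energy = energy.masked_fill(mask == 0, float("-inf"))
        attention = torch.softmax(energy, dim=-1)
        # weighted sum of values, then merge the heads back to hid_dim
        x = torch.matmul(self.dropout(attention), V)
        x = x.transpose(1, 2).contiguous().view(B, -1, self.hid_dim)
        return self.fc_o(x), attention
```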


SCREENSHOTS

TRAINING LOGS

(training log screenshot)
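For context on the logged numbers: the model is trained with teacher forcing, where the decoder sees the target sentence shifted right and the loss compares its predictions against the target shifted left. A hedged sketch of one training epoch follows; the model interface, data iterator, and gradient-clip value are assumptions, not the exact notebook code.

```python
import torch

def train_epoch(model, iterator, optimizer, criterion, clip=1.0):
    """One epoch of teacher-forced training.
    Each batch provides src and trg token-id tensors (batch-first). The decoder
    input is trg without the final token; the loss target is trg without <sos>.
    `criterion` is assumed to be a cross-entropy loss that ignores padding."""
    model.train()
    epoch_loss = 0.0
    for src, trg in iterator:
        optimizer.zero_grad()
        output, _ = model(src, trg[:, :-1])           # [batch, trg_len - 1, vocab]
        output = output.reshape(-1, output.shape[-1])
        target = trg[:, 1:].reshape(-1)
        loss = criterion(output, target)
        loss.backward()
        torch.nn.utils.clip_grad_norm_(model.parameters(), clip)
        optimizer.step()
        epoch_loss += loss.item()
    return epoch_loss / len(iterator)
```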

EVALUATION OUTPUT

(evaluation output screenshot)

TRANSLATION OUTPUT

(sample translation screenshot)
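The translations above are produced by encoding the Dutch sentence once and then greedily decoding English tokens until <eos>. A minimal sketch of that decoding loop follows; the vocabulary lookup tables and the model's make_src_mask/make_trg_mask/encoder/decoder interface are assumed to match the referenced pytorch-seq2seq notebook.

```python
import torch

def translate_sentence(src_tokens, src_stoi, trg_stoi, trg_itos, model, device, max_len=50):
    """Greedy decoding sketch. `src_tokens` is a list of Dutch tokens;
    `src_stoi`/`trg_stoi` map tokens to ids and `trg_itos` maps ids back to
    tokens (assumed lookup tables, not part of the original code)."""
    model.eval()
    src_ids = torch.tensor([[src_stoi[t] for t in ["<sos>"] + src_tokens + ["<eos>"]]],
                           device=device)
    src_mask = model.make_src_mask(src_ids)
    with torch.no_grad():
        enc_src = model.encoder(src_ids, src_mask)   # encode the source once
    trg_ids = [trg_stoi["<sos>"]]
    for _ in range(max_len):
        trg_tensor = torch.tensor([trg_ids], device=device)
        trg_mask = model.make_trg_mask(trg_tensor)
        with torch.no_grad():
            output, _ = model.decoder(trg_tensor, enc_src, trg_mask, src_mask)
        next_id = output[:, -1].argmax(-1).item()    # pick the highest-probability token
        trg_ids.append(next_id)
        if next_id == trg_stoi["<eos>"]:
            break
    return [trg_itos[i] for i in trg_ids[1:]]        # drop <sos>, keep generated tokens
```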

REFERENCES

  1. Attention Is All You Need notebook: https://github.com/ammesatyajit/pytorch-seq2seq/blob/master/6%20-%20Attention%20is%20All%20You%20Need.ipynb
  2. Paper - Attention Is All You Need: https://arxiv.org/pdf/1706.03762.pdf
  3. The Illustrated Transformer: https://jalammar.github.io/illustrated-transformer/
  4. What Do Position Embeddings Learn?: https://arxiv.org/pdf/2010.04903.pdf
  5. pytorch-seq2seq (bentrevett) - Attention Is All You Need notebook: https://github.com/bentrevett/pytorch-seq2seq/blob/master/6%20-%20Attention%20is%20All%20You%20Need.ipynb
