Release v0.8.0

@astonzhang released this 30 May 06:51
· 3892 commits to master since this release
41dad43

Highlights

D2L is now runnable on Amazon SageMaker and Google Colab.

New Contents

The following chapters have been reorganized:

  • Natural Language Processing: Pretraining
  • Natural Language Processing: Applications

The following sections are added:

  • Subword Embedding (Byte-Pair Encoding)
  • Bidirectional Encoder Representations from Transformers (BERT)
  • The Dataset for Pretraining BERT
  • Pretraining BERT
  • Natural Language Inference and the Dataset
  • Natural Language Inference: Using Attention
  • Fine-Tuning BERT for Sequence-Level and Token-Level Applications
  • Natural Language Inference: Fine-Tuning BERT
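
The new subword embedding section covers byte-pair encoding (BPE). As a rough illustration of the idea (a toy sketch, not the book's implementation), BPE repeatedly counts adjacent symbol pairs in a character-split corpus and merges the most frequent pair into a new symbol:

```python
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs, weighted by word frequency.

    `vocab` maps a word (tuple of symbols) to its corpus frequency.
    """
    pairs = Counter()
    for word, freq in vocab.items():
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of `pair` in each word with one merged symbol."""
    merged = pair[0] + pair[1]
    new_vocab = {}
    for word, freq in vocab.items():
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                out.append(merged)
                i += 2
            else:
                out.append(word[i])
                i += 1
        new_vocab[tuple(out)] = freq
    return new_vocab

# Toy corpus: words split into characters, with an end-of-word marker '</w>'.
vocab = {('l', 'o', 'w', '</w>'): 5,
         ('l', 'o', 'w', 'e', 'r', '</w>'): 2,
         ('n', 'e', 'w', 'e', 's', 't', '</w>'): 6,
         ('w', 'i', 'd', 'e', 's', 't', '</w>'): 3}

merges = []
for _ in range(3):  # a real BPE run uses thousands of merges
    pairs = get_pair_counts(vocab)
    best = max(pairs, key=pairs.get)  # most frequent adjacent pair
    merges.append(best)
    vocab = merge_pair(best, vocab)

print(merges)  # learned merge rules, e.g. ('e', 's') then ('es', 't')
```

The learned merge rules are then replayed, in order, to segment unseen words into subwords, which is what lets a fixed vocabulary cover rare and novel words.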

Improvements

Many minor revisions and improvements have been made throughout the book.