# myNotes

Notes on commonly used code, resources, and the like.

## PyTorch

In PyTorch, `F.log_softmax` and `F.nll_loss` are combined in one optimized function, `F.cross_entropy`:
`F.nll_loss(F.log_softmax(pred, -1), y_train)` is equivalent to `F.cross_entropy(pred, y_train)`.

## Fastai courses

- A Code-First Introduction to Natural Language Processing: https://www.youtube.com/playlist?list=PLtmWHNX-gukKocXQOkQjuVxglSDYWsSh9

## LM data (X, y) processing

- https://github.com/nadavbh12/Character-Level-Language-Modeling-with-Deeper-Self-Attention-pytorch/blob/master/main.py
- https://medium.com/the-artificial-impostor/notes-neural-language-model-with-pytorch-a8369ba80a5c
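The `F.cross_entropy` identity noted above can be verified directly on a toy batch; a minimal sketch (the logits shape and class count are arbitrary choices for illustration):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
pred = torch.randn(4, 10)            # logits: batch of 4, 10 classes
y_train = torch.randint(0, 10, (4,)) # integer class targets

# Two-step version: log-softmax over the class dimension, then NLL loss.
loss_two_step = F.nll_loss(F.log_softmax(pred, dim=-1), y_train)

# Combined, numerically stable version.
loss_combined = F.cross_entropy(pred, y_train)

print(torch.allclose(loss_two_step, loss_combined))  # → True
```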