From b6df36507054f0cd0d21273dc0edda3efdda1edd Mon Sep 17 00:00:00 2001
From: Thomas
Date: Mon, 10 Apr 2023 11:44:51 +0200
Subject: [PATCH] Update part2.qmd

---
 Lessons/part2.qmd | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Lessons/part2.qmd b/Lessons/part2.qmd
index 20bb729..a510a20 100644
--- a/Lessons/part2.qmd
+++ b/Lessons/part2.qmd
@@ -21,7 +21,7 @@ In this course, we'll explore diffusion methods such as Denoising Diffusion Prob
 
 Along the way, we'll cover essential deep learning topics like neural network architectures, data augmentation approaches, and various loss functions. We'll build our own models from scratch, such as Multi-Layer Perceptrons (MLPs), ResNets, and Unets, while experimenting with generative architectures like autoencoders and transformers.
 
-Throughout the course, we'll PyTorch to implement our models, and will create our own deep learning framework called `miniai`. We'll master Python concepts like iterators, generators, and decorators to keep our code clean and efficient. We'll also explore deep learning optimizers like stochastic gradient descent (SGD) accelerated approaches, learning rate annealing, and learning how to experiment with the impact different initialisers, batch sizes and learning rates. And of course, we'll make use of handy tools like the Python debugger (pdb) and nbdev for building Python modules from Jupyter notebooks.
+Throughout the course, we'll use PyTorch to implement our models, and will create our own deep learning framework called `miniai`. We'll master Python concepts like iterators, generators, and decorators to keep our code clean and efficient. We'll also explore deep learning optimizers like stochastic gradient descent (SGD) and accelerated approaches, learning rate annealing, and we'll learn how to experiment with the impact of different initialisers, batch sizes, and learning rates. And of course, we'll make use of handy tools like the Python debugger (pdb) and nbdev for building Python modules from Jupyter notebooks.
 
 Lastly, we'll touch on fundamental concepts like tensors, calculus, and pseudo-random number generation to provide a solid foundation for our exploration. We'll apply these concepts to machine learning techniques like mean shift clustering and convolutional neural networks (CNNs), and will see how to use tracking with Weights and Biases (W&B).
 