A tour of different optimization algorithms in PyTorch.
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
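Being usable "similarly to standard PyTorch optimizers" means these methods follow the usual torch.optim training loop. A minimal sketch of that loop, using torch.optim.SGD as a stand-in (the package's own optimizer classes would slot into the same place; their actual names are not shown here):

```python
import torch
import torch.nn as nn

# Toy model and data; any torch.optim-compatible optimizer slots in below.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # stand-in for a high-order optimizer
loss_fn = nn.MSELoss()

x, y = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(100):
    optimizer.zero_grad()        # clear accumulated gradients
    loss = loss_fn(model(x), y)  # forward pass
    loss.backward()              # backward pass fills .grad
    optimizer.step()             # parameter update
```

Note that some higher-order methods already in torch.optim (e.g. LBFGS) additionally require a closure argument to step(); a package of high-order methods may follow the same convention.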
SGLD (Stochastic Gradient Langevin Dynamics) and cSGLD (cyclical SGLD) implemented as PyTorch optimizers.
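For context, SGLD adds Gaussian noise scaled to the step size to each gradient update, so the iterates sample from an approximation of the posterior rather than converging to a point estimate. A minimal sketch of one common SGLD formulation as a torch.optim.Optimizer subclass (the class name, update convention, and hyperparameters here are illustrative, not the repository's actual API):

```python
import torch
from torch.optim import Optimizer

class SGLD(Optimizer):
    """Sketch of Stochastic Gradient Langevin Dynamics.

    Update (one common convention): p <- p - lr * grad + sqrt(2 * lr) * N(0, I).
    The repository's actual implementation may use a different scaling.
    """
    def __init__(self, params, lr=1e-3):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            lr = group["lr"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                noise = torch.randn_like(p) * (2.0 * lr) ** 0.5
                p.add_(p.grad, alpha=-lr).add_(noise)  # gradient step + injected noise
        return loss
```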
Zeroth-order optimizers, gradient chaining, and random gradient approximation.
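Random gradient approximation in zeroth-order methods typically replaces backpropagation with finite differences along random directions, so only function evaluations are needed. A sketch of the common two-point estimator, with hypothetical function and parameter names that are not taken from the repository:

```python
import torch

def random_gradient_estimate(f, x, mu=1e-3, n_samples=10):
    """Two-point zeroth-order gradient estimate (a common form; the
    repository's details may differ). Uses only evaluations of f, no autograd."""
    grad = torch.zeros_like(x)
    for _ in range(n_samples):
        u = torch.randn_like(x)  # random probe direction
        grad += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return grad / n_samples

# Hypothetical usage: minimize a quadratic without ever calling backward().
f = lambda v: (v - 1.0).pow(2).sum()
x = torch.zeros(5)
for _ in range(200):
    x = x - 0.1 * random_gradient_estimate(f, x)
```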