p2pc/optimchan

About

Optimization in deep neural networks: gradient descent (vanilla, momentum, Nesterov momentum), Adagrad, AdaDelta, RMSProp, and Adam, evaluated across a range of DNN architectures (CNN, ResNet, VGG, GAN, VAE, RNN).
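
For orientation, the sketch below shows two of the listed update rules, classical momentum and Adam, in plain NumPy. This is a minimal illustration, not code from this repository; the function names, signatures, and hyperparameter defaults are assumptions chosen to match common conventions.

```python
# Minimal sketch of two of the listed optimizers (illustrative only;
# not taken from p2pc/optimchan).
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
    """Classical momentum: keep a decaying running sum of past gradients."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum on the gradient plus RMSProp-style per-parameter
    scaling, with bias correction for the zero-initialized moments."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([3.0, -4.0])
m = v = np.zeros_like(w)
for t in range(1, 5001):
    w, m, v = adam_step(w, 2 * w, m, v, t)
print(w)  # ends close to [0, 0]
```

Adam effectively combines the momentum idea with RMSProp's per-parameter step scaling, which is why these methods are commonly benchmarked together.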
