- PyTorch-like automatic differentiation engine (dynamically constructed computational graph)
- Weight initialization
- Activations: ReLU, Sigmoid, Tanh, Swish, ELU, LeakyReLU
- Convolutions
- Layers: Linear, BatchNorm, Flatten, Dropout
- Optimizers: SGD, Adam, AdamW
- Losses: CrossEntropyLoss, MSELoss
- Computational Graph Visualizer
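The core idea behind the features above, a dynamically constructed computational graph with reverse-mode autodiff, can be sketched in a few dozen lines of plain Python. This is an illustrative micrograd-style sketch, not picograd's actual implementation; all names here are assumptions.

```python
class Value:
    """Scalar node in a computational graph built dynamically during the forward pass.
    Illustrative sketch; not picograd's real API."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # set by the op that produced this node
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def relu(self):
        out = Value(max(0.0, self.data), (self,))
        def _backward():
            self.grad += (out.data > 0) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph recorded during the forward pass,
        # then apply the chain rule in reverse order
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x          # y = x^2 + x
y.backward()
print(y.data, x.grad)  # y = 12.0, dy/dx = 2x + 1 = 7.0
```

Because the graph is recorded as operations execute (rather than compiled ahead of time), control flow like Python `if`/`for` works transparently, which is the same design choice PyTorch makes.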
Demonstrate the difference between picograd and PyTorch by training a basic neural network.
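Whatever the API differences, both libraries share the same training-loop shape (forward pass, loss, backward pass, optimizer step). A minimal dependency-free sketch of that loop, fitting y = 2x with one weight via SGD and an MSE loss, written by hand here since picograd's API is not shown:

```python
# Fit y = 2x with a single weight using hand-derived gradients.
# This mirrors the loop both picograd and PyTorch express as:
#   zero grads -> forward -> loss -> backward -> optimizer step
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w, lr = 0.0, 0.05

for epoch in range(200):
    grad = 0.0                                   # zero the gradient
    for x, y in data:
        pred = w * x                             # forward pass
        grad += 2 * (pred - y) * x / len(data)   # d(MSE)/dw, accumulated
    w -= lr * grad                               # SGD step

print(round(w, 3))  # converges to 2.0
```

In PyTorch the same loop would use `optimizer.zero_grad()`, `loss.backward()`, and `optimizer.step()` in place of the manual gradient arithmetic; a picograd comparison would presumably mirror that structure.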