Stochaster is a versatile and user-friendly artificial neural network framework. Designed with clarity, conciseness, and readability in mind, it aims to simplify the process of creating and understanding neural networks.
- Autograd Engine: Implements reverse-mode automatic differentiation for efficient backpropagation.
  - Customizable tensors via the `Tensor` class.
  - An autodiff wrapper for custom functions through the `Function` class.
- Neural Network Library: A brief yet powerful collection of tools for building neural networks.
  - Supports common layer types and optimizers.
  - Currently provides MNIST-level capabilities.
- Code Readability: The engine's source code prioritizes the `Tensor` and `Function` abstractions so that anyone familiar with the mathematics of backpropagation can easily understand and work with the framework.
- API Design: The API for both the engine and the neural network library is modeled after PyTorch's.
Stochaster Org. also provides secondary-level educational Jupyter notebooks on ANNs in the `Notebooks` folder, covering everything from top-level APIs (e.g. PyTorch) down to the linear algebra and calculus behind basic ANNs.
For now, the only way to use Stochaster is to build it from source. A PyPI package (installable via pip) will be published after successful tests.
- Engine Example: Section to be completed.
- Neural Network Examples: Check out the `examples` folder for practical implementations of Stochaster's NN and engine libraries. Currently, it features an MNIST classifier implemented using cross-entropy loss and SGD for optimization.
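The MNIST classifier's core recipe, softmax cross-entropy plus an SGD update, can be sketched in plain NumPy. This is a conceptual illustration of the math, not Stochaster's code; all names, shapes, and the single-layer model are assumptions for the sketch.

```python
import numpy as np

# Plain-NumPy sketch of one softmax cross-entropy + SGD step on fake,
# MNIST-shaped data. Illustrative only, not Stochaster's API.
rng = np.random.default_rng(0)
batch, n_in, n_out = 8, 784, 10          # MNIST-like dimensions
W = rng.normal(0, 0.01, (n_in, n_out))   # single linear layer's weights
b = np.zeros(n_out)
lr = 0.1                                  # SGD learning rate

x = rng.normal(size=(batch, n_in))        # fake input batch
y = rng.integers(0, n_out, size=batch)    # fake integer labels

# Forward pass: logits -> softmax probabilities -> mean cross-entropy loss.
logits = x @ W + b
logits -= logits.max(axis=1, keepdims=True)  # subtract max for stability
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
loss = -np.log(probs[np.arange(batch), y]).mean()

# Backward pass: for softmax + cross-entropy,
# dL/dlogits = (probs - one_hot(y)) / batch.
dlogits = probs.copy()
dlogits[np.arange(batch), y] -= 1.0
dlogits /= batch

# SGD update on the parameters.
W -= lr * (x.T @ dlogits)
b -= lr * dlogits.sum(axis=0)
```

In the actual example these gradients come from the autograd engine rather than being written by hand; the update rule is the same.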
- Upcoming Features: Tracing for backpropagation and its visualization to enhance understanding and debugging capabilities.
To run the unit tests you will have to install PyTorch. Then simply:

```shell
python -m pytest
```
Stochaster is made available under the MIT License, promoting open and permissive software use and redistribution.