amanmoon/micrograd

Backpropagation Algorithm Implementation

This project implements a basic backpropagation algorithm, inspired by the "micrograd" project developed by Andrej Karpathy. The core component is the Value class, a container that stores an object's value and its gradient.

The Value class supports mathematical operations such as addition, multiplication, and exponentiation, as well as activation functions like ReLU. It also supports reverse operations (e.g. 2 * value) and provides a backward pass that runs backpropagation to compute gradients. This implementation serves as a learning tool for understanding the principles behind neural networks and gradient-based optimization.
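As a rough illustration, here is a minimal usage sketch assuming the Value API mirrors Karpathy's micrograd (a `data` attribute, a `grad` attribute, and a `backward()` method); the import path and exact names in this repository may differ:

```python
# Minimal usage sketch of a micrograd-style Value class.
# Assumes Value exposes .data, .grad, and .backward(), as in
# Karpathy's micrograd; names in this repository may differ.
from micrograd import Value  # hypothetical import path

a = Value(2.0)
b = Value(-3.0)

# Build a small expression graph: addition, multiplication,
# exponentiation, and a ReLU activation.
c = a * b + a ** 2     # c = -6 + 4 = -2
d = c.relu()           # ReLU clamps negatives to zero: d = 0

# Reverse operations (e.g. 2 * a) work via __radd__ / __rmul__.
e = 2 * a + d          # e = 4

# Run backpropagation from the output; gradients accumulate in .grad.
e.backward()
print(a.grad)          # de/da = 2 (the ReLU branch contributes 0)
```

Because each operation records its inputs, calling `backward()` on the final node walks the expression graph in reverse and applies the chain rule at every step.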

Neural Network Implementation

The project also includes a neural network implementation built from three classes: Neuron, Layer, and MLP. Together they construct a multi-layer perceptron (MLP) whose weights and biases are randomly initialized, and which predicts outputs from given inputs. This implementation serves as a starting point for understanding and experimenting with neural networks.
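As a sketch, and again assuming the constructor signature follows Karpathy's micrograd (`MLP(nin, nouts)` with an input size and a list of layer sizes), a forward pass might look like this; treat the import path and signature as assumptions:

```python
# Minimal forward-pass sketch for a micrograd-style MLP.
# Assumes MLP(nin, nouts) as in Karpathy's micrograd: nin input
# features plus a list of layer sizes; actual names may differ.
from micrograd import MLP, Value  # hypothetical import path

# A 3-input network with two hidden layers of 4 neurons and 1 output.
# Weights and biases are randomly initialized by the constructor.
model = MLP(3, [4, 4, 1])

x = [Value(1.0), Value(-2.0), Value(0.5)]
y = model(x)           # forward pass; returns the output Value

print(y.data)          # prediction for this input

# Gradients for every weight and bias can then be computed with
# backpropagation, e.g. y.backward(), enabling gradient descent.
y.backward()
```

Since every weight and bias is itself a Value, one `backward()` call on the network's output yields the gradients needed to train the whole MLP.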
