
Neural Network From Scratch

Modern deep neural networks from scratch using only NumPy: a minimal, self-contained, easy-to-understand codebase, inspired by Karpathy's micrograd.

Motivation

As a somewhat experienced Machine Learning Engineer (MLE), I have always been fascinated by the inner workings of neural networks. Although I have worked with and contributed to high-level libraries like PyTorch, I believe there is immense value in understanding and building neural networks from the ground up. This project aims to create a minimal, self-contained, and beginner-friendly neural network implementation using only NumPy.

Features

  • Minimal and self-contained neural network implementation
  • Beginner-friendly with extensive documentation and examples
  • Pure NumPy-based operations

Roadmap

Initial Implementation

Basic Features

  • Linear layer
  • Forward pass
  • Autograd system
    • Define computational graph
    • Compute gradients
    • Chain rule implementation
    • Backpropagation
  • Activation functions
    • ReLU
    • Sigmoid
    • Tanh
  • Backward pass
  • Loss functions
    • Mean Squared Error (MSE)
    • Cross-Entropy Loss
  • Optimizers
    • Stochastic Gradient Descent (SGD)
    • Adam
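
To make the roadmap concrete, here is a minimal, micrograd-style sketch of how the basic pieces above could fit together in pure NumPy. All names here (`Tensor`, `matmul`, the training loop) are illustrative assumptions, not begiNNer's actual API.

```python
import numpy as np

class Tensor:
    """A tiny reverse-mode autograd node wrapping a NumPy array."""
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents   # edges of the computational graph
        self.backward_fn = None  # applies the chain rule to the parents

    def backward(self):
        # Topologically sort the graph, then backpropagate in reverse order.
        order, seen = [], set()
        def visit(t):
            if id(t) not in seen:
                seen.add(id(t))
                for p in t.parents:
                    visit(p)
                order.append(t)
        visit(self)
        self.grad = np.ones_like(self.data)  # d(out)/d(out) = 1
        for t in reversed(order):
            if t.backward_fn is not None:
                t.backward_fn()

def matmul(a, b):
    out = Tensor(a.data @ b.data, parents=(a, b))
    def backward_fn():
        a.grad += out.grad @ b.data.T
        b.grad += a.data.T @ out.grad
    out.backward_fn = backward_fn
    return out

def add_bias(a, b):
    out = Tensor(a.data + b.data, parents=(a, b))  # b broadcasts over the batch
    def backward_fn():
        a.grad += out.grad
        b.grad += out.grad.sum(axis=0)
    out.backward_fn = backward_fn
    return out

def relu(x):
    out = Tensor(np.maximum(x.data, 0.0), parents=(x,))
    def backward_fn():
        x.grad += (x.data > 0) * out.grad
    out.backward_fn = backward_fn
    return out

def mse(pred, target):
    diff = pred.data - target
    out = Tensor(np.mean(diff ** 2), parents=(pred,))
    def backward_fn():
        pred.grad += out.grad * 2.0 * diff / diff.size
    out.backward_fn = backward_fn
    return out

# A two-layer network (Linear -> ReLU -> Linear) trained with SGD on toy data.
rng = np.random.default_rng(0)
W1, b1 = Tensor(rng.normal(0, 0.5, (3, 8))), Tensor(np.zeros(8))
W2, b2 = Tensor(rng.normal(0, 0.5, (8, 1))), Tensor(np.zeros(1))
params = (W1, b1, W2, b2)

x = Tensor(rng.normal(size=(16, 3)))
y = np.sin(x.data @ np.array([[1.0], [-2.0], [0.5]]))  # synthetic targets

for step in range(200):
    for t in (W1, b1, W2, b2, x):
        t.grad = np.zeros_like(t.data)     # zero gradients
    h = relu(add_bias(matmul(x, W1), b1))  # forward pass
    loss = mse(add_bias(matmul(h, W2), b2), y)
    loss.backward()                        # backward pass via the chain rule
    for p in params:
        p.data -= 0.1 * p.grad             # SGD update

print("final loss:", float(loss.data))
```

Sigmoid, tanh, cross-entropy, and Adam slot into the same pattern: each op defines a forward computation plus a `backward_fn` that routes gradients to its parents.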

Intermediate Features

  • Conv layers
  • Scaled dot-product self-attention layers (see the sketch after this list)
  • Batch normalization
  • Dropout
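
For the attention item above, the forward pass is only a few lines of NumPy. Below is a generic sketch of single-head scaled dot-product self-attention (no masking, no batching); the weight matrices `Wq`, `Wk`, `Wv` and the shapes are assumptions, not the planned begiNNer interface.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # x: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k)
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (seq_len, seq_len)
    return softmax(scores, axis=-1) @ V      # (seq_len, d_k)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))                       # 5 tokens, d_model = 16
Wq, Wk, Wv = (rng.normal(size=(16, 8)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)         # (5, 8)
```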

Advanced Features

  • GPU support with CuPy (see the sketch after this list)
  • Model saving and loading
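
A common way to add GPU support to a NumPy codebase is to swap the array module, since CuPy mirrors a large part of the NumPy API, and saving/loading can reuse NumPy's `.npz` format. Whether begiNNer will take this route is an assumption; the sketch below just shows the pattern.

```python
# GPU support by parameterizing the array module:
try:
    import cupy as xp   # GPU arrays when CUDA and CuPy are available
except ImportError:
    import numpy as xp  # transparent CPU fallback

def linear_relu(x, W, b):
    return xp.maximum(x @ W + b, 0.0)  # identical code on CPU or GPU

# Model saving/loading with NumPy's .npz format. `params` is a hypothetical
# name -> array dict; move GPU arrays to host (cupy.asnumpy) before saving.
import numpy as np

def save_params(path, params):
    np.savez(path, **params)

def load_params(path):
    with np.load(path) as f:
        return {name: f[name] for name in f.files}
```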

Future Goals

  • Comprehensive test suite
  • Documentation and tutorials
  • Standalone Rust/C++ implementation (probably a new project)
