# microtorch

A small autograd engine with backpropagation (reverse-mode autodiff) over a dynamically built directed acyclic graph (DAG) and a small neural networks library on top of it following a PyTorch-like API. The DAG only operates over scalar values (unlike n-dimensional Tensors in PyTorch), so e.g. we chop up each neuron into all of its individual tiny adds and multiplies. However, this is enough to build up entire deep neural nets doing binary classification, as the demo notebook shows. The project is the result of ongoing research and exploration aimed at understanding DNNs at a fundamental level.
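To illustrate what such a scalar autograd node can look like, here is a minimal, self-contained sketch (illustrative only, not microtorch's actual implementation): every operation records its inputs as children in the DAG together with a small closure that propagates gradients by the chain rule, and `backward()` replays those closures in reverse topological order.

```python
class ScalarValue:
    """Illustrative scalar node: stores a value, its gradient, and how it was produced."""
    def __init__(self, data, _children=(), _op=''):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # fills in the gradients of the children
        self._prev = set(_children)
        self._op = _op

    def __add__(self, other):
        other = other if isinstance(other, ScalarValue) else ScalarValue(other)
        out = ScalarValue(self.data + other.data, (self, other), '+')
        def _backward():
            self.grad += out.grad        # d(out)/d(self) = 1
            other.grad += out.grad       # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, ScalarValue) else ScalarValue(other)
        out = ScalarValue(self.data * other.data, (self, other), '*')
        def _backward():
            self.grad += other.data * out.grad   # product rule
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically order the graph, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# z = x*y + y: dz/dx = y = 3, dz/dy = x + 1 = 3
x, y = ScalarValue(2.0), ScalarValue(3.0)
z = x * y + y
z.backward()
print(x.grad, y.grad)   # 3.0 3.0
```

Addition and multiplication are enough to show the pattern; tanh, powers, and division each contribute their own local derivative in exactly the same way.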

### Installation

Clone or download the repository and use it locally to try it out.

### Example usage

Below is an example showing a number of the supported operations (as of Nov 10, 2024):

```python
from micrograd.engine import Value

a = Value(-4.0)
b = Value(2.0)
c = a + b
d = a * b + b**3
c += c + 1
c += 1 + c + (-a)
d += d * 2 + (b + a).tanh()
d += 3 * d + (b - a).tanh()
e = c - d
f = e**2
g = f / 2.0
g += 10.0 / f
print(f'{g.data:.4f}') # prints 4.6252, the outcome of this forward pass
g.backward()
print(f'{a.grad:.4f}') # prints 27.0601, i.e. the numerical value of dg/da
print(f'{b.grad:.4f}') # prints 117.3345, i.e. the numerical value of dg/db
```
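A simple way to sanity-check the gradients reported by `backward()` is a central finite-difference approximation: rewrite the same forward pass with plain floats, nudge each input by a small `h`, and compare. This is an illustrative check, not part of microtorch:

```python
import math

def forward(a, b):
    # the same expression as above, written with plain floats
    c = a + b
    d = a * b + b ** 3
    c += c + 1
    c += 1 + c + (-a)
    d += d * 2 + math.tanh(b + a)
    d += 3 * d + math.tanh(b - a)
    e = c - d
    f = e ** 2
    g = f / 2.0
    g += 10.0 / f
    return g

h = 1e-6
a, b = -4.0, 2.0
dg_da = (forward(a + h, b) - forward(a - h, b)) / (2 * h)
dg_db = (forward(a, b + h) - forward(a, b - h)) / (2 * h)
print(f'{dg_da:.4f}, {dg_db:.4f}')  # should closely match a.grad and b.grad above
```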

### Tracing / visualization

You can visualize the expression graphs as you build them using graphviz; this is implemented in the notebook creating_microtorch.ipynb.
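For reference, a drawing routine in the style of micrograd's `draw_dot` can be sketched as below. It assumes (as in micrograd) that each Value exposes `data`, `grad`, its parent nodes `_prev`, and the producing op `_op`; the exact attribute names in microtorch may differ, so treat this as an outline of the idea rather than the notebook's code:

```python
from graphviz import Digraph

def trace(root):
    # walk the graph backwards from the output node, collecting nodes and edges
    nodes, edges = set(), set()
    def build(v):
        if v not in nodes:
            nodes.add(v)
            for child in v._prev:
                edges.add((child, v))
                build(child)
    build(root)
    return nodes, edges

def draw_dot(root):
    dot = Digraph(format='svg', graph_attr={'rankdir': 'LR'})  # left-to-right layout
    nodes, edges = trace(root)
    for n in nodes:
        uid = str(id(n))
        # one record per scalar node, showing its data and accumulated gradient
        dot.node(name=uid, label=f'data {n.data:.4f} | grad {n.grad:.4f}', shape='record')
        if n._op:
            # a small extra node for the operation that produced this value
            dot.node(name=uid + n._op, label=n._op)
            dot.edge(uid + n._op, uid)
    for child, parent in edges:
        dot.edge(str(id(child)), str(id(parent)) + parent._op)
    return dot

# e.g. draw_dot(g).render('graph')  # writes graph.svg next to the script
```

Calling `draw_dot(g)` on the `g` from the example above would render the full expression DAG, one record per scalar with its data and grad.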