
Architectures | MLX Institute

My handwritten models and experiments from the intensive ML program at MLX Institute.

Transformers

Transformer Architecture: Intensive Week 4 at MLX Institute

A GPT for generating new stories: a Transformer architecture with multi-headed self-attention, trained on the TinyStories dataset and tracked with Weights & Biases.
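The core of each decoder block is causal multi-headed self-attention. A minimal PyTorch sketch of that layer is below; the hyperparameters (embed_dim, num_heads, max_len) are illustrative and not necessarily the values used in this repo.

```python
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    """Causal multi-headed self-attention, as used in a GPT-style decoder block."""

    def __init__(self, embed_dim: int, num_heads: int, max_len: int = 256):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One linear layer produces queries, keys and values for all heads at once.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)
        # Lower-triangular mask so each token only attends to earlier positions.
        mask = torch.tril(torch.ones(max_len, max_len)).view(1, 1, max_len, max_len)
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split the embedding into heads: (B, T, C) -> (B, num_heads, T, head_dim)
        q = q.view(B, T, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(B, T, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(B, T, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention with the causal mask applied.
        scores = (q @ k.transpose(-2, -1)) / (self.head_dim ** 0.5)
        scores = scores.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        out = weights @ v                                   # (B, num_heads, T, head_dim)
        out = out.transpose(1, 2).contiguous().view(B, T, C)
        return self.out(out)
```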

PCA

CNNs and Autoencoders: Intensive Week 2 at MLX Institute

A CNN autoencoder script for visualising similarity between images from the MLX Institute TinyWorld simulation, including interactive 3D plots of Principal Component Analysis (PCA) to visualise clusters of similar encodings.
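The idea is to pass each image through the trained encoder, then project the encodings down to three principal components for an interactive plot. A minimal sketch of that pipeline, with an illustrative encoder and random placeholder images standing in for TinyWorld frames (the repo's actual architecture and plotting library may differ):

```python
import torch
import torch.nn as nn
from sklearn.decomposition import PCA
import plotly.express as px

class ConvEncoder(nn.Module):
    """Small convolutional encoder producing one flat latent vector per image."""
    def __init__(self, latent_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

# Encode a batch of images (random stand-ins for TinyWorld frames),
# reduce the latent vectors to 3 principal components, and plot them.
encoder = ConvEncoder()
images = torch.rand(200, 3, 64, 64)          # placeholder dataset
with torch.no_grad():
    latents = encoder(images).numpy()

coords = PCA(n_components=3).fit_transform(latents)
fig = px.scatter_3d(x=coords[:, 0], y=coords[:, 1], z=coords[:, 2],
                    title="PCA of autoencoder encodings")
fig.show()
```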

Numpy

CNNs and MLP in Numpy: Module 1 Project 1 at MLX Institute

Simple implementations of a multi-layer perceptron architecture using only NumPy, extended to convolutional neural networks.
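For reference, the kind of building block that project implements from scratch: a minimal sketch of a two-layer perceptron with a hand-written backward pass, using only NumPy (layer sizes and the toy data are illustrative):

```python
import numpy as np

# A minimal two-layer MLP in NumPy: forward pass, mean-squared-error loss,
# and a manual backward pass with gradient-descent updates.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Toy data: 32 samples, 4 input features, 1 target value.
X = rng.normal(size=(32, 4))
y = rng.normal(size=(32, 1))

# Parameters for input -> hidden (8 units) -> output.
W1 = rng.normal(scale=0.1, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)
lr = 0.01

for step in range(100):
    # Forward pass
    h_pre = X @ W1 + b1
    h = relu(h_pre)
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass (chain rule, written out by hand)
    d_y_hat = 2.0 * (y_hat - y) / len(X)
    dW2 = h.T @ d_y_hat
    db2 = d_y_hat.sum(axis=0)
    d_h = d_y_hat @ W2.T
    d_h_pre = d_h * (h_pre > 0)
    dW1 = X.T @ d_h_pre
    db1 = d_h_pre.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```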

The PCA and Transformers files can be run as Jupyter notebooks in VS Code with the Jupyter extension.
