This repository provides a from-scratch Python implementation of GPT-2, together with code to load the pre-trained weights released by OpenAI. This lets you generate text with GPT-2 without relying on deep-learning frameworks such as PyTorch or TensorFlow.
- GPT-2 implemented from scratch: Understand the inner workings of GPT-2 by exploring an implementation built on basic Python libraries like NumPy (see the sketch after this list).
- Load OpenAI pre-trained weights: Utilize the pre-trained weights provided by OpenAI to generate high-quality text without extensive training.
- Command-line interface: Supply your starting text and generate continuations through a simple command-line interface.
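
To give a taste of what "from scratch" means here, the sketch below shows how two GPT-2 building blocks, the GELU activation and masked (causal) self-attention, can be written in plain NumPy. It is illustrative only; the function names and tensor shapes are assumptions, not the repository's actual API.

```python
import numpy as np

def gelu(x):
    # GPT-2 uses the tanh approximation of GELU.
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(q, k, v):
    # q, k, v: [seq_len, head_dim]. The additive mask blocks attention
    # to future positions, so each token only sees itself and the past.
    seq_len = q.shape[0]
    mask = (1 - np.tri(seq_len)) * -1e10
    scores = q @ k.T / np.sqrt(q.shape[-1]) + mask
    return softmax(scores) @ v
```

Stacking blocks like these (plus layer norm, an MLP, and learned embeddings) and filling them with OpenAI's released parameters is essentially all that is needed to reproduce GPT-2 inference.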
A detailed explanation of the code is posted on Medium: "Create your own GPT and generate text with OpenAI's pre-trained parameters."