nanoGPT is a framework for training and fine-tuning medium-sized generative pretrained transformers (GPTs). Its author, Andrej Karpathy, draws on "Attention Is All You Need" and OpenAI's GPT-3 papers to build a GPT from scratch in PyTorch. With all the hype around generative AI, we want to highlight nanoGPT for its simplicity and its focus on clearly articulating the building blocks of the GPT architecture.
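To give a flavor of those building blocks, here is a minimal sketch of a decoder-only transformer block in PyTorch: causal self-attention followed by an MLP, each wrapped in a pre-norm residual connection. The class and parameter names here are illustrative, not nanoGPT's exact code, and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    """Multi-head self-attention with a causal mask (illustrative sketch)."""
    def __init__(self, n_embd: int, n_head: int, block_size: int):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.qkv = nn.Linear(n_embd, 3 * n_embd)   # project input to queries, keys, values
        self.proj = nn.Linear(n_embd, n_embd)      # output projection
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x):
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # reshape to (B, n_head, T, head_dim) for per-head attention
        q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        att = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))  # no peeking ahead
        y = F.softmax(att, dim=-1) @ v
        y = y.transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)

class Block(nn.Module):
    """One transformer block: attention and MLP, each with a residual connection."""
    def __init__(self, n_embd: int, n_head: int, block_size: int):
        super().__init__()
        self.ln1 = nn.LayerNorm(n_embd)
        self.attn = CausalSelfAttention(n_embd, n_head, block_size)
        self.ln2 = nn.LayerNorm(n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd), nn.GELU(), nn.Linear(4 * n_embd, n_embd)
        )

    def forward(self, x):
        x = x + self.attn(self.ln1(x))  # pre-norm residual attention
        x = x + self.mlp(self.ln2(x))   # pre-norm residual MLP
        return x

# Example: run a batch of 2 sequences, 64 tokens, 128-dim embeddings through one block.
block = Block(n_embd=128, n_head=4, block_size=64)
out = block(torch.randn(2, 64, 128))
print(out.shape)  # torch.Size([2, 64, 128])
```

A full GPT stacks many such blocks on top of token and position embeddings, then adds a final linear head that predicts the next token; nanoGPT's model.py lays this out in a few hundred readable lines.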