Eamon2009 / Transformer-language-model
An educational implementation of a GPT-style language model, built from scratch in PyTorch to show how transformer-based AI models work. No pre-trained weights, no fine-tuning, and it can be trained on a $300 laptop.
This repository provides a from-scratch implementation of a character-level transformer language model in PyTorch, trained on children's stories to generate simple narrative text, with detailed documentation on setup, training results, and analysis.
How It Works
This fun project lets you teach a computer to create children's stories by learning patterns from real tales.
You collect simple children's stories into a text file so the computer has examples to learn from.
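Before the model can learn anything, the raw story text has to be turned into numbers. For a character-level model like this one, that usually means assigning each unique character an integer id. A minimal sketch of that preprocessing step (the repo's actual code may name things differently; `build_vocab`, `encode`, and `decode` here are illustrative helpers):

```python
# Build a character-level vocabulary from story text.
# Hypothetical sketch -- the repository's real preprocessing may differ.

def build_vocab(text):
    """Map each unique character to an integer id, and back."""
    chars = sorted(set(text))
    stoi = {ch: i for i, ch in enumerate(chars)}
    itos = {i: ch for ch, i in stoi.items()}
    return stoi, itos

def encode(text, stoi):
    """Turn a string into a list of integer ids."""
    return [stoi[ch] for ch in text]

def decode(ids, itos):
    """Turn a list of integer ids back into a string."""
    return "".join(itos[i] for i in ids)

stories = "Once upon a time, there were two friends."
stoi, itos = build_vocab(stories)
ids = encode(stories, stoi)
assert decode(ids, itos) == stories  # encoding round-trips losslessly
```

Because the vocabulary is just the characters seen in the training file, it stays tiny (often under 100 entries), which is part of what makes training feasible on modest hardware.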
You launch the program, and it begins reading your stories to figure out how words and sentences connect.
You see progress updates as it improves step by step, getting better at predicting the next letters and words, just like a child learning to read.
It automatically keeps the smartest point it reaches during learning for later use.
Your computer now writes its own simple stories, like 'Once upon a time, there were two friends...', and you can keep creating more forever.
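The steps above can be condensed into a single runnable sketch: a tiny character-level transformer, a training loop that keeps the best checkpoint it has seen, and a sampling loop that writes new text. This is a simplified, hypothetical stand-in for the repo's implementation (model size, hyperparameters, and the `TinyGPT`/`generate` names are all assumptions), kept small enough to train in seconds on a CPU:

```python
# Minimal character-level transformer LM in PyTorch.
# Illustrative sketch only -- not the repository's actual architecture.
import torch
import torch.nn as nn

torch.manual_seed(0)

text = "once upon a time, there were two friends. " * 50
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text])

class TinyGPT(nn.Module):
    def __init__(self, vocab, d_model=32, block=32):
        super().__init__()
        self.block = block
        self.tok = nn.Embedding(vocab, d_model)      # token embeddings
        self.pos = nn.Embedding(block, d_model)      # learned positions
        layer = nn.TransformerEncoderLayer(
            d_model, nhead=4, dim_feedforward=64, batch_first=True)
        self.enc = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, vocab)        # next-char logits

    def forward(self, idx):
        T = idx.size(1)
        x = self.tok(idx) + self.pos(torch.arange(T))
        # Causal mask: each position may only attend to earlier ones.
        mask = nn.Transformer.generate_square_subsequent_mask(T)
        return self.head(self.enc(x, mask=mask))

def get_batch(batch=16, block=32):
    ix = torch.randint(len(data) - block - 1, (batch,))
    x = torch.stack([data[i:i + block] for i in ix])
    y = torch.stack([data[i + 1:i + block + 1] for i in ix])
    return x, y  # y is x shifted by one: the next-char targets

model = TinyGPT(len(chars))
opt = torch.optim.AdamW(model.parameters(), lr=3e-3)
best_loss, best_state = float("inf"), None
for step in range(200):
    x, y = get_batch()
    logits = model(x)
    loss = nn.functional.cross_entropy(
        logits.view(-1, logits.size(-1)), y.view(-1))
    opt.zero_grad(); loss.backward(); opt.step()
    if loss.item() < best_loss:  # keep the smartest point reached so far
        best_loss = loss.item()
        best_state = {k: v.clone() for k, v in model.state_dict().items()}

model.load_state_dict(best_state)  # restore the best checkpoint

@torch.no_grad()
def generate(prompt, n=100):
    """Sample n new characters, one at a time, after the prompt."""
    itos = {i: c for c, i in stoi.items()}
    idx = torch.tensor([[stoi[c] for c in prompt]])
    for _ in range(n):
        logits = model(idx[:, -model.block:])        # crop to context size
        probs = torch.softmax(logits[0, -1], dim=-1)
        idx = torch.cat([idx, torch.multinomial(probs, 1)[None]], dim=1)
    return "".join(itos[int(i)] for i in idx[0])

print(generate("once upon a "))
```

The "keeps the smartest point" behavior from the description maps to the `best_state` snapshot: whenever the loss improves, a copy of the weights is saved and restored before generation, so a late training wobble doesn't cost you the best model.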