42,420 stars | 5,102 forks | Python
🚀 Train a 26M-parameter GPT completely from scratch in just 2 hours!
What it does
MiniMind is a project that lets users train a small GPT model with only 26M parameters in about 2 hours on minimal hardware. It provides complete code for training, fine-tuning, and deploying language models, making the full pipeline accessible even to those with limited compute.
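To see why 26M parameters is plausible for a working GPT, here is a rough back-of-the-envelope parameter count for a small decoder-only transformer. The dimensions below are illustrative guesses, not MiniMind's actual configuration:

```python
# Rough parameter-count estimate for a small decoder-only transformer.
# NOTE: vocab_size, d_model, n_layers, and d_ff below are hypothetical
# example values, NOT MiniMind's real config.
def gpt_param_count(vocab_size: int, d_model: int, n_layers: int, d_ff: int) -> int:
    embed = vocab_size * d_model        # token embedding (often weight-tied with the output head)
    per_layer = (
        4 * d_model * d_model           # attention: Q, K, V, and output projections
        + 2 * d_model * d_ff            # feed-forward up- and down-projections
        + 4 * d_model                   # two layer norms per block (weight + bias)
    )
    return embed + n_layers * per_layer

# A config in the tens-of-millions range (hypothetical values):
print(gpt_param_count(vocab_size=6400, d_model=512, n_layers=8, d_ff=2048))
```

With a small vocabulary and a narrow, shallow stack like this, the total lands in the ~25-30M range, which is the scale the project targets.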
Why it matters: MiniMind lowers the barrier to hands-on LLM work, letting anyone create, train, and deploy their own small language model without large-scale hardware.
Trending today with 478 new stars




