Model Collapse: The AI Feedback Loop Problem Nobody Wants to Talk About
dev.to · 12h

AI models are eating their own tail, and it’s going to be a problem.

The entire premise of modern LLMs is that they're trained on human-generated content: books, articles, research papers, Stack Overflow answers, GitHub repositories, billions of tokens of actual human knowledge. But that assumption is breaking down faster than anyone wants to admit.

The Core Issue

As we approach the end of 2025, the web is saturated with AI-generated content:

Stack Overflow answers copy-pasted from ChatGPT

GitHub repos with AI-generated documentation and comments

Blog posts churned out by content farms using GPT

Social media posts from bots

Technical articles written entirely by LLMs

Yet AI companies still scrape the web for training data. They can’t reliably distinguish human …
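The feedback loop described above can be sketched as a toy simulation. The sketch below is illustrative only (the function and parameter names are made up, not from any real training pipeline): each "generation" retrains a unigram model on a finite sample drawn from the previous generation. A token missed once in a sample gets probability zero and can never come back, so the surviving vocabulary can only shrink, and the rare "tail" tokens disappear first.

```python
import random
from collections import Counter

def vocabulary_collapse(generations=50, sample_size=200, vocab=100, seed=42):
    """Toy model collapse: repeatedly refit a unigram 'model' on samples
    drawn from the previous one. Once a token fails to appear in a finite
    sample, its probability becomes zero permanently, so the support of
    the distribution shrinks monotonically -- tail tokens die first.
    """
    rng = random.Random(seed)
    # Generation 0: a long-tailed "human" distribution over token ids.
    weights = {tok: 1.0 / (tok + 1) for tok in range(vocab)}
    sizes = [len(weights)]
    for _ in range(generations):
        toks = list(weights)
        w = [weights[t] for t in toks]
        # "Publish" a finite corpus by sampling from the current model.
        sample = rng.choices(toks, weights=w, k=sample_size)
        # "Retrain": the next model is just the empirical distribution.
        weights = dict(Counter(sample))
        sizes.append(len(weights))
    return sizes

sizes = vocabulary_collapse()
# sizes[0] is the full vocabulary; later entries can only stay flat or drop.
```

The shrinkage is structural, not a quirk of the random seed: each generation's support is a subset of the previous one's, which is the same one-way ratchet the post describes for the web at large.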
