
If you think scrolling the internet all day is making you dumber, just imagine what it’s doing to large language models that consume a near-endless stream of absolute trash crawled from the web in the name of “training.” A research team recently proposed and tested a theory called the “LLM Brain Rot Hypothesis,” which posited that the more junk data is fed into an AI model, the worse its outputs become. It turns out the theory holds up pretty well, as a preprint paper published to arXiv…
