AI Models Show Signs of Falling Apart as They Ingest More AI-Generated Data
"This situation creates a vicious cycle in which AI-generated data leads to AI models degrading over time, resulting in further reliance on flawed data."
"As reliance on AI-generated data increases among companies, we face the risk of models being trained on increasingly unreliable information."
The article by Steven Vaughan-Nichols highlights the significant risk of model collapse in the AI industry, driven by growing reliance on AI-generated data to train large language models (LLMs). As the supply of authentic, human-generated training data dwindles, companies like Google and OpenAI have adopted retrieval-augmented generation (RAG), which supplements model outputs with content fetched from the web. However, the web is increasingly filled with inaccurate AI-generated content, which undermines that approach: one study found that leading LLMs produced more unsafe responses when retrieving from such content. This "Garbage In/Garbage Out" cycle poses a growing threat to the sustainability of AI technologies.
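The feedback loop described above can be illustrated with a toy simulation (not from the article; all names here are illustrative): treat a "model" as a Gaussian fit to data, then train each generation only on samples generated by the previous generation's model. Estimation error compounds, so the learned distribution drifts away from the original human data — a minimal sketch of model collapse.

```python
import random
import statistics

def fit(data):
    """'Train' a model: estimate mean and stdev from the data."""
    return statistics.mean(data), statistics.stdev(data)

def sample(mu, sigma, n):
    """'Generate' synthetic data from the trained model."""
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)

# Generation 0: real, human-generated data.
data = sample(0.0, 1.0, 50)
mu, sigma = fit(data)
print(f"gen  0: mean={mu:+.3f} stdev={sigma:.3f}")

# Later generations: trained only on the previous model's output.
for gen in range(1, 11):
    data = sample(mu, sigma, 50)  # synthetic, "AI-generated" data
    mu, sigma = fit(data)
    print(f"gen {gen:2d}: mean={mu:+.3f} stdev={sigma:.3f}")
```

Because each generation re-estimates its parameters from a finite synthetic sample, the errors accumulate rather than average out; over enough generations the estimated spread tends to drift and narrow, which is the statistical analogue of the degradation the article describes.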
Read at Futurism