Basically, if models keep eating each other's data, perhaps without even knowing it, they'll progressively get weirder and dumber until they collapse. The researchers demonstrate the effect and propose mitigation methods, yet they still go so far as to call model collapse "inevitable," at least in theory.
https://techcrunch.com/2024/07/24/model-collapse-scientists-warn-against-letting-ai-eat-its-own-tail/
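The feedback loop the article describes can be sketched with a toy stand-in for a model: repeatedly fit a Gaussian to samples drawn from the previous fit and watch the distribution's spread shrink over generations. This is only an illustrative sketch, not the paper's actual experimental setup; the Gaussian "model," the sample size, and the generation count are all arbitrary choices for demonstration.

```python
import random
import statistics

random.seed(0)  # fixed seed so the toy run is reproducible

def next_generation(samples, n):
    # "Train" the toy model: fit a Gaussian to the previous
    # generation's output (mean + population std).
    mu = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)
    # "Generate" the next training set from the fitted model.
    return [random.gauss(mu, sigma) for _ in range(n)]

n = 50  # small sample size per generation (arbitrary)
data = [random.gauss(0.0, 1.0) for _ in range(n)]  # generation 0: real data
stds = [statistics.pstdev(data)]
for _ in range(300):
    data = next_generation(data, n)
    stds.append(statistics.pstdev(data))

print(f"generation   0 std: {stds[0]:.3f}")
print(f"generation 300 std: {stds[-1]:.3f}")
```

Because each generation estimates its parameters from a finite sample of the last one, estimation error compounds and the variance decays toward zero: the "model" forgets the tails of the original distribution, which is the statistical core of the collapse the researchers warn about.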