Is model collapse inevitable? Breaking the curse of recursion by accumulating real and synthetic data
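The title's question can be made concrete with a toy experiment in the spirit of the paper's setup (a sketch, not the paper's actual procedure): recursively fit a Gaussian to data, comparing a loop that replaces its training data with fresh synthetic samples each generation against one that accumulates real and synthetic data together.

```python
import random
import statistics

def fit_and_sample(data, n):
    """Fit a Gaussian by maximum likelihood, then draw n synthetic samples."""
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
real = [random.gauss(0.0, 1.0) for _ in range(25)]

replace = list(real)     # each generation trains only on the last model's output
accumulate = list(real)  # real data is kept; synthetic data is appended

for generation in range(200):
    replace = fit_and_sample(replace, 25)
    accumulate += fit_and_sample(accumulate, 25)

# Replacing data collapses the fitted spread; accumulating keeps it stable.
print(f"replace std:    {statistics.pstdev(replace):.3f}")
print(f"accumulate std: {statistics.pstdev(accumulate):.3f}")
```

The replace-only chain loses variance generation after generation (the "curse of recursion"), while the accumulating chain stays anchored by the original real data.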
6 Pith papers cite this work.
6 representative citing papers
- Curated Synthetic Data Doesn't Have to Collapse: A Theoretical Study of Generative Retraining with Pluralistic Preferences
  Recursive generative retraining with pluralistic preferences converges to a stable, diverse distribution that satisfies a weighted Nash bargaining solution.
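For readers unfamiliar with the solution concept this summary names, here is a minimal numeric illustration of a weighted Nash bargaining solution (a toy two-group split, not the paper's model): maximize the weighted Nash product, i.e. the sum of weighted log-gains over the disagreement point.

```python
import math

def weighted_nash(w1, w2, steps=100_000):
    """Grid-search the weighted Nash bargaining solution for a toy problem:
    split one unit between two groups, u1(x) = x, u2(x) = 1 - x, and
    disagreement points d1 = d2 = 0. Maximizes w1*log(u1) + w2*log(u2)."""
    best_x, best_val = None, -math.inf
    for k in range(1, steps):  # open interval avoids log(0)
        x = k / steps
        val = w1 * math.log(x) + w2 * math.log(1 - x)
        if val > best_val:
            best_x, best_val = x, val
    return best_x

# Closed form for this toy problem is x* = w1 / (w1 + w2).
x = weighted_nash(2.0, 1.0)
print(round(x, 3))  # ≈ 0.667
```

Weighting the bargaining solution is what lets a retraining process privilege some preference groups without driving any group's utility to its disagreement point.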
- The Impact of AI-Generated Text on the Internet
  By mid-2025, roughly 35% of new websites are AI-generated or AI-assisted, correlating with lower semantic diversity and higher positive sentiment but showing no significant drop in factual accuracy or stylistic diversity.
- Filter Babel: The Challenge of Synthetic Media to Authenticity and Common Ground in AI-Mediated Communication
  Filter Babel explores a future of AI-personalized private experiences that may erode common ground in communication while supporting individual identity and selfhood.
- Position: the Stochastic Parrot in the Coal Mine. Model Collapse is a Threat to Low-Resource Communities
  Model collapse threatens AI democratization by disproportionately degrading data and efficiency for low-resource communities.
- Cognitive Atrophy and Systemic Collapse in AI-Dependent Software Engineering
  LLM integration in software engineering builds epistemological debt that erodes mental models and homogenizes code via recursive training, risking systemic fragility as illustrated by the 2026 Amazon outages.
- Knowledge Distillation Must Account for What It Loses
  Knowledge distillation evaluations must report lost teacher capabilities via a Distillation Loss Statement rather than relying solely on task scores.
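The reporting practice the last summary calls for can be sketched as a small comparison routine; all capability names, scores, and the threshold below are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of a "Distillation Loss Statement": rather than one
# aggregate score, report the per-capability gap between teacher and student.

def distillation_loss_statement(teacher_scores, student_scores, threshold=0.05):
    """Return capabilities whose teacher-to-student drop exceeds `threshold`."""
    statement = {}
    for capability, t in teacher_scores.items():
        s = student_scores.get(capability, 0.0)  # absent => capability lost entirely
        drop = t - s
        if drop > threshold:
            statement[capability] = {"teacher": t, "student": s, "drop": round(drop, 3)}
    return statement

# Hypothetical evaluation results for illustration only.
teacher = {"arithmetic": 0.92, "translation": 0.88, "long_context_qa": 0.81}
student = {"arithmetic": 0.90, "translation": 0.70}  # long_context_qa not retained

for capability, row in distillation_loss_statement(teacher, student).items():
    print(capability, row)
```

Here the aggregate scores would hide that `long_context_qa` was lost outright; the statement surfaces it alongside the sizeable translation drop while omitting the negligible arithmetic gap.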