Model Autophagy Disorder
Blog post on the model collapse phenomenon. The basic insight: when AI models generate text, images, or sound, and those generated outputs are then used to train a subsequent model, the new model gets worse at generating. Over a few generations it can fail completely, producing only gibberish or the same image over and over.
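
To make the self-consuming loop concrete, here is a minimal toy sketch (mine, not from the post): the "model" is just a Gaussian fit to data, each generation is trained only on samples from the previous generation's model, and the fitted spread tends to drift and shrink over generations. The sample size, number of generations, and the Gaussian stand-in for a real generative model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "real" data, drawn from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=100)

for generation in range(1, 21):
    # "Train" generation g: the model is simply a Gaussian fit to its training data.
    mu, sigma = data.mean(), data.std()

    # "Generate": the next generation's training data is sampled entirely
    # from the current model, with no fresh real data mixed in.
    data = rng.normal(loc=mu, scale=sigma, size=100)

    print(f"generation {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")
```

Because each generation estimates its parameters from a finite sample of the previous model's output, the estimates wander, and the spread tends to contract, which is a toy analogue of the tails of the distribution disappearing until the model produces near-identical outputs.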