Using model-generated content in training causes irreversible defects, a team of researchers says. “The tails of the original content distribution disappear,” writes co-author Ross Anderson from the University of Cambridge in a blog post. “Within a few generations, text becomes garbage, as Gaussian distributions converge and may even become delta functions.”

Here’s the study: http://web.archive.org/web/20230614184632/https://arxiv.org/abs/2305.17493
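The collapse the quote describes can be illustrated with a toy simulation (my own sketch, not code from the paper): repeatedly fit a Gaussian to samples, then draw the next generation of “data” from the fit. The fitted variance follows a multiplicative random walk whose log-mean is negative, so the distribution narrows toward a delta function and the tails vanish.

```python
import numpy as np

rng = np.random.default_rng(0)

def iterate_gaussian(n_samples=20, generations=500):
    """Train each generation only on the previous generation's samples."""
    mu, sigma = 0.0, 1.0  # the "real" data distribution
    history = [sigma]
    for _ in range(generations):
        samples = rng.normal(mu, sigma, n_samples)  # generate synthetic data
        mu, sigma = samples.mean(), samples.std()   # refit the model on it
        history.append(sigma)
    return history

hist = iterate_gaussian()
print(f"std after 0 generations:   {hist[0]:.4f}")
print(f"std after 500 generations: {hist[-1]:.6f}")  # far smaller than 1.0
```

With only 20 samples per generation the shrinkage is dramatic; larger samples slow the collapse but don’t stop it.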

  • Kerb@discuss.tchncs.de · 1 year ago

    Afaik, there are already solutions to that.

    You first train the model on the outdated but correct data, to establish the correct “thought” patterns.

    And then you can train the AI on the fresh but flawed data without tripping over the mistakes.
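What the commenter describes resembles the standard pretrain-then-fine-tune recipe. A minimal sketch, assuming a plain linear model trained by gradient descent (names and parameters are illustrative): phase 1 fits on clean but outdated data, then phase 2 takes a short, low-learning-rate pass on fresh but noisy data so the established weights aren’t overwritten by the flaws.

```python
import numpy as np

rng = np.random.default_rng(1)

def sgd_fit(w, X, y, lr, steps):
    """Gradient descent on mean-squared error."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

true_w = np.array([2.0, -1.0])
X = rng.normal(size=(200, 2))

y_clean = X @ true_w                                 # outdated but correct
y_fresh = X @ true_w + rng.normal(0, 2.0, size=200)  # fresh but flawed

w = np.zeros(2)
w = sgd_fit(w, X, y_clean, lr=0.1, steps=200)  # establish correct patterns
w = sgd_fit(w, X, y_fresh, lr=0.01, steps=20)  # light pass on noisy data
print(w)  # should remain near [2, -1]
```

The second phase uses a 10x smaller learning rate and far fewer steps, so the noisy data nudges the model without erasing what the clean phase established.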