Eric Schmidt says misinformation will be rampant in the 2024 election because AI tools are now so accessible. He is concerned that social media platforms have failed to protect against false AI-generated content and have cut back their trust and safety teams, and he suggests marking content and holding users accountable for violations of the law.
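
Schmidt’s “marking content” idea amounts to attaching verifiable provenance to published or generated media. Below is a rough sketch only, not any particular standard: real systems such as C2PA use public-key signatures and standardized metadata, and every name, field, and key in this example is illustrative. The idea is that a publisher signs a provenance record so later tampering is detectable.

```python
import hashlib
import hmac
import json

# Illustrative shared secret; a real provenance scheme would use
# public-key signatures issued to the publisher or tool vendor.
SECRET_KEY = b"publisher-signing-key"


def mark_content(content: str, generator: str) -> dict:
    """Attach a provenance record and an HMAC signature to a piece of content."""
    record = {"content": content, "generator": generator}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record


def verify_mark(record: dict) -> bool:
    """Return True only if the content and metadata still match the signature."""
    payload = json.dumps(
        {k: v for k, v in record.items() if k != "signature"}, sort_keys=True
    ).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record.get("signature", ""), expected)


if __name__ == "__main__":
    marked = mark_content("Example AI-generated paragraph.", generator="some-model")
    print(verify_mark(marked))   # True
    marked["content"] = "Tampered text."
    print(verify_mark(marked))   # False
```

Detection only helps if platforms actually check the mark and if stripping or forging it carries consequences, which is where the accountability half of the proposal comes in.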

  • Senicar@social.cyb3r.dog · 16 points · 1 year ago

    they haven’t solved it yet

    Google hasn’t solved it yet either. You’ve probably all experienced the deluge of AI-generated garbage with almost zero pertinent information clogging up search results. And those pages aren’t even politically motivated. It’s getting harder and harder to even find actual information from actual humans.

    • bazmatazable@reddthat.com · 7 points · 1 year ago

      Why is it expected that social media companies will find a solution for this? Political discussion is part of the democratic process, so why would any of the big social networks (which are effectively advertising companies) have an incentive to foster the fair and open exchange of ideas and information?

      • Senicar@social.cyb3r.dog · 4 points · 1 year ago

        Because we have surrendered so much control of our lives to these companies that we can hardly envision an alternative.

  • fiasco@possumpat.io · 9 points · 1 year ago

    The only thing deep learning has done is make forgery more accessible. But Stalin was airbrushing unpersons out of photos the better part of a century ago, so in principle this is nothing new.

    When it comes to politics, there’s already enough money floating around that you don’t need deep learning to clog the internet with shit. So personally I’m not expecting anything different.

    • DoucheAsaurus@kbin.social · 1 point · 1 year ago

      I’m expecting the exact same thing as every election: shitty candidates spewing lies and fake promises at each other until the voters get to choose between the two options that corporate America has decided we can vote on.

        • DoucheAsaurus@kbin.social · 1 point · 1 year ago

          That is absolutely not what I’m saying. The whole thing is a sham because of corporate interests, lobbyists, super PACs, the electoral college, super delegates, corporate media bias, etc. What I’m saying is that democracy was stolen from us.

          • mrnotoriousman@kbin.social · 0 points · 1 year ago

            Well, I can’t disagree that it’s creeping damn close to corporatocracy here. It sucks having no actual left-wing representation outside of like a handful of congresspeople.

            • DoucheAsaurus@kbin.social · 1 point · 1 year ago

              If the DNC had let us have Bernie in 2016 instead of jerking themselves off with their superdelegates, we would be in a lot better place today. I fully believe that.

    • hglman@lemmy.ml · 1 point · 1 year ago

      The volume of humanesque text that can be produced by AI is orders of magnitude greater. It will be different this time, and it will be really annoying.

      • fiasco@possumpat.io · 1 point · 1 year ago

        I understand that, but the amount of money that gets fed into political campaigns already generates staggering amounts of spurious text. It’s hard to remember what happened the day before yesterday, but “fake news” originally meant sites that were set up to vaguely look like news sites, all for the purpose of pushing one or two entirely made-up propaganda pieces. Yes, deep learning can partly automate this, but automation isn’t necessary in this case.

        There comes a point of diminishing returns with spurious text, and I feel like we’re already past that point.