• redcalcium@lemmy.institute · +130/−2 · 3 months ago

    So, it’s like folding@home, but instead of donating your spare compute to science, you sell it to generate porn?

  • cygon@lemmy.world · +84/−2 · 3 months ago

    So… this AI company gets gaming teens to “donate” their computing power, rather than pay for render farms / GPU clouds?

    And then oblivious parents pay the power bills, effectively covering the computing costs of the AI porn company?

    Sounds completely ethical to me /s.

    • fidodo@lemmy.world · +10/−1 · 3 months ago

      No no, they’re getting copies of digital images out of it. It’s a totally fair trade!

  • PDFuego@lemmy.world · +38/−1 · 3 months ago

    If I’m reading this right, it’s a program that users sign up for to donate their processing power (and can opt in or out of adult content), which is then used by client companies to generate their own users’ content? It even says that Salad can’t view or moderate the images, so what exactly are they doing wrong besides providing service to potentially questionable companies? It makes as much sense as blaming Nvidia or Microsoft, am I missing something?
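    Since the comment above sketches how such a marketplace works (workers enroll, set a content preference, and client jobs get routed accordingly), here is a minimal illustration of that routing logic in Python. Every name in it (Worker, Job, eligible_workers) is invented for illustration; this is not Salad’s actual software or API.

    ```python
    # Hypothetical sketch of preference-based job routing in a
    # compute-sharing marketplace. All names invented, not Salad's.
    from dataclasses import dataclass

    @dataclass
    class Worker:
        worker_id: str
        accepts_adult_content: bool = False  # toggle the user controls

    @dataclass
    class Job:
        job_id: str
        is_adult_content: bool

    def eligible_workers(job: Job, workers: list[Worker]) -> list[Worker]:
        """Route a job only to workers whose preferences allow it."""
        if not job.is_adult_content:
            return workers
        return [w for w in workers if w.accepts_adult_content]

    workers = [Worker("a"), Worker("b", accepts_adult_content=True)]
    print([w.worker_id for w in eligible_workers(Job("j1", True), workers)])
    # -> ['b']: only the opted-in worker ever receives the adult workload
    ```

    In this model the broker never inspects the generated images themselves, which is consistent with the comment’s point that Salad says it can’t view or moderate them.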

    • Cethin@lemmy.zip · +21 · 3 months ago

      Based on the rewards, I’m assuming it’s being done by very young people. Presumably the value of the rewards is really low, but these kids haven’t done the cost-benefit analysis. If I had to guess, for the vast majority it costs more in electricity than they get back, but the parents don’t know it’s happening.

      This could be totally wrong; I haven’t looked into it. This is how most of these things work, though: they prey on young people’s desire for these products in order to take advantage of them.
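      To make the cost-benefit point concrete, a back-of-the-envelope sketch (every figure below is an assumption for illustration, not a measurement of Salad’s payouts or of any specific GPU):

      ```python
      # Rough electricity cost of GPU crunching; all numbers are assumptions.
      gpu_draw_watts = 250   # assumed mid-range gaming GPU under load
      hours_per_day = 8      # assumed daily crunching time
      price_per_kwh = 0.15   # assumed residential rate, USD

      daily_kwh = gpu_draw_watts / 1000 * hours_per_day  # 2.0 kWh
      monthly_cost = daily_kwh * price_per_kwh * 30      # ~$9.00

      print(f"~${monthly_cost:.2f}/month in electricity")
      # If a month of crunching earns less than this in rewards, the kid
      # is effectively paying (via the parents' power bill) for the prize.
      ```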

      • CheeseNoodle@lemmy.world · +5 · 3 months ago

        Honestly, what Roblox kids are willing to do for pitiful pay is scary. If you work in any kind of creative digital medium, those kids will do days of your job for a fiver, if any real money at all. It won’t be industry quality or anything, but damn, we’ve got a whole digital version of sending kids down the mines. (And some of these Roblox games can have unexpectedly big players behind them exploiting kids.)

      • PDFuego@lemmy.world · +3 · 3 months ago

        Right, so it’s not like they’re being tricked into generating porn or anything. It’s not some option they would have turned off if they’d known about it; they just don’t care what’s happening because they only want the reward. Again, I’m not saying I agree with it or that Salad’s right to do it, but if they say that’s potentially what it can be used for (and they do, because the opt-out is available), then the focus should be on the client companies using the tool for questionable purposes.

    • fidodo@lemmy.world · +8 · 3 months ago

      “so what exactly are they doing wrong besides providing service to potentially questionable companies?”

      Well, I think that is the main point of what’s wrong. The big question is whether the mature content toggle is on by default or not. The company says it’s off, but some users said otherwise. Dunno why the author didn’t install it and check.

      • PDFuego@lemmy.world · +7 · 3 months ago

        They said they did.

        “However, by default the software settings opt users into generating adult content. An option exists to ‘configure workload types manually’ which enables users to uncheck the ‘Adult Content Workloads’ option (via 404 Media), however this is easily missed in the setup process, which I duly tested for myself to confirm.”

        Honestly, and I’m not saying I support what’s being done here, but the way I see it, if you’re tech-savvy enough to be interested in using a program like this, you should be looking through all of the options properly anyway. If users don’t care what they’re doing and are only interested in the rewards, that’s kind of on them.

        I just think the article is focused on the wrong company: Salad is selling a tool that is potentially being misused by users of their clients’ service. I can certainly see why that can be a problem, but based on the information given in the article, I don’t think the fault is really Salad’s. If that’s ALL Salad’s used for, then that’s a different story.
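        The 404 Media quote above turns on one detail: which way the flag defaults. A minimal sketch of the difference, using invented setting names rather than Salad’s real configuration:

        ```python
        from dataclasses import dataclass

        @dataclass
        class WorkloadSettings:
            # What the article reports: an opt-out default. A user who clicks
            # through setup without opening "configure workload types manually"
            # is enrolled in adult workloads.
            adult_content_workloads: bool = True

        settings = WorkloadSettings()             # user accepts all defaults
        print(settings.adult_content_workloads)   # True: enrolled silently
        # An opt-in design would default the flag to False and require an
        # explicit, visible choice during setup.
        ```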

        • fidodo@lemmy.world · +1 · 3 months ago

          Ah, thanks. I think I’d forgotten that sentence by the end of the article and thought it was just a user report that it was checked by default. I really don’t think it should be checked by default; depending on where you are, it could even get you in trouble. App setup for this kind of stuff isn’t necessarily only for power users now; it has gotten very streamlined and is tested for conversion.

  • pokexpert30@lemmy.pussthecat.org · +32/−5 · 3 months ago

    I kinda fail to see the problem. The GPU owner doesn’t see what workload they’re processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with their hardware. There’s demand, there’s supply, and nobody is getting hurt (AI pr0n is not illegal, at least for now), so let people do what they want to do.

    • mavu@discuss.tchncs.de · +15/−1 · 3 months ago

      The problem is that they are clearly targeting minors who don’t pay their own electricity bill and don’t even necessarily realize that they are paying for their Fortnite skins with their parents’ money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.

      This is a shitty grift, abusing people who don’t understand the consequences of the software.

      • blindsight@beehaw.org · +3 · 3 months ago

        Agreed. Preying on children who don’t understand what they’re signing up for is shitty to begin with.

        Then, add that deepfake AI porn is unethical and likely illegal (and who knows what other kinds of potentially-illegal images are being generated…)

        And, as you point out, the files having existed on the computer could, alone, be illegal.

        Then, as an extra fuck-you, burning GPU cycles to make AI images causes CO2 emissions, GPU wear, waste heat that might trigger the AC, and other negative externalities too, I’m sure…

        It’s shit all around.

  • Mango@lemmy.world · +11 · 3 months ago

    Great. Now we’re trading pre-made traditional artwork to kids in exchange for fresh robot porn!

    • DudeDudenson@lemmings.world · +16/−1 · 3 months ago

      I’d rather have a wealth of new porn around than thousands of random blockchains churning away.

      At least the porn will probably be useful to someone long term haha

      • Daxtron2@startrek.website · +2 · 3 months ago

        But that’s the same issue of making fakes that we’ve had for 30+ years since digital manipulation became feasible.

        • foo@programming.dev · +13/−1 · edited · 3 months ago

          Yeah, sure, except now to make deepfake porn you just need to type ‘famous star naked riding an old man’s cock’, set 8 images for each seed, queue a job of 100 images, turn the air con to Antarctic, and make misogynistic videos about why movies are woke while the job slowly cooks your studio.

          Then when you finish, you probably have some good images of whatever famous star you like getting railed by an old man, and you can hop on YouTube and complain that people don’t think you’re an artist.

          It requires almost no effort or talent to make a boatload of deepfake material. If you put any effort in, you can orchestrate an image that looks pretty good.

          • Jojo, Lady of the West@lemmy.blahaj.zone · +8 · 3 months ago

            Add to that the fact that before AI, unless you were already pretty famous, no one cared enough to make nonconsensual porn of you. Now, anyone vaguely attracted to you can snap or find a few pictures and do a decent job of it without any skill or practice.

          • Daxtron2@startrek.website · +1 · 3 months ago

            Ease of creation shouldn’t have a bearing on whether or not the final result is illegal. A handmade vs AI generated fake nude should be treated the same way.

            • foo@programming.dev · +6 · 3 months ago

              I didn’t argue that it shouldn’t. The difference is the ease of creation: it now requires no skill or talent to produce, so the game has changed and it needs to be addressed, not dismissed.

        • Jojo, Lady of the West@lemmy.blahaj.zone · +1 · 3 months ago

          Well, the term “deepfake” literally comes from the AI boom, but I understand you to mean that doctored images making it look like someone was doing porn when they weren’t were already a thing.

          And yeah, it very much was. But unless you were already a high-profile individual like a popular celebrity, or mayyybe if you happened to be attractive to the one guy making them, they didn’t tend to get made of you, and certainly not well. Now, anyone with a crush and a photo of you can make your face and a pretty decent approximation of your naked body move around and make noises while doing the nasty. And they can do it many orders of magnitude faster and with less skill than before.

          So no, you don’t need AI for it to exist and be somewhat problematic, but AI makes it much more problematic.

    • Katana314@lemmy.world · +3 · 3 months ago

      One ethics quandary is AI child porn. It at least provides a non-harmful outlet for an otherwise harmful act, but it could also feed addictions and feel insufficient.

  • Omega_Haxors@lemmy.ml · +3/−2 · edited · 3 months ago

    100% that porn is not legal. It’s also pretty easy to tell which demographic they’re targeting with this.

  • CliveRosfield@lemmy.world · +3/−2 · 3 months ago

    Eh, I agree with the reasonable takes here. Nothing wrong with generating that sort of stuff until it starts resembling the likeness of a real living person. Then I think it’s just creepy, especially if for some reason you are sharing it 💀