Imagine reading that headline 20 years ago.
God that would sound so dystopian and futuristic…but to be honest, most articles about AI today would sound like that back then. Damn people would freak out about privacy.
pretty sure they didn’t.
We should still freak out about privacy
BOINC came out 21 years ago, so it wouldn’t be that unreasonable.
So, it’s like folding@home, but instead of donating your spare compute to science, you sell it to generate porn?
Porning@home
Can we at least see it?
This… This was inevitable.
“Selling” it for digital copies of images and some variable tweaks
So… this AI company gets gaming teens to “donate” their computing power, rather than pay for render farms / GPU clouds?
And then oblivious parents pay the power bills, effectively covering the computing costs of the AI porn company?
Sounds completely ethical to me /s.
No no, they’re getting copies of digital images out of it. It’s a totally fair trade!
Capitalism breeds innovation
This feels exploitative AF on multiple levels.
If I’m reading this right, it’s a program that users sign up for to donate their processing power (and can opt in or out of adult content), which is then used by client companies to generate their own users’ content? It even says that Salad can’t view or moderate the images, so what exactly are they doing wrong besides providing service to potentially questionable companies? It makes as much sense as blaming Nvidia or Microsoft, am I missing something?
Based on the rewards, I’m assuming it’s being done by very young people. Presumably the value of rewards is really low, but these kids haven’t done the cost-benefit analysis. If I had to guess, for the vast majority it costs more in electricity than they get back, but the parents don’t know it’s happening.
This could be totally wrong. I haven’t looked into it. This is how most of these things work though. They prey on the youth and their desire for these products to take advantage of them.
Honestly, what Roblox kids are willing to do for pitiful pay is scary. If you work in any kind of creative digital medium, those kids will do days of your job for a fiver, if any real money at all. It won’t be industry quality or anything, but damn, we’ve got a whole digital version of sending kids down the mines. (And some of these Roblox games can have unexpectedly big players behind them exploiting kids.)
Right, so it’s not like they’re being tricked into generating porn or anything. It’s not some option that they would have turned off if they’d known about it, they just don’t care what’s happening because they only want the reward. Again I’m not saying I agree with it or that Salad’s right to do it, but if they say that’s potentially what it can be used for (and they do because the opt-out is available) then the focus should be on the client companies using the tool for questionable purposes.
so what exactly are they doing wrong besides providing service to potentially questionable companies?
Well I think that is the main point of what is wrong. I think the big question is whether the mature content toggle is on by default or not. The company says it’s off, but some users said otherwise. Dunno why the author didn’t install it and check.
They said they did.
However, by default the software settings opt users into generating adult content. An option exists to “configure workload types manually” which enables users to uncheck the “Adult Content Workloads” option (via 404 media), however this is easily missed in the setup process, which I duly tested for myself to confirm.
Honestly, and I’m not saying I support what’s being done here, the way I see it if you’re tech savvy enough to be interested in using a program like this you should be looking through all of the options properly anyway. If users don’t care what they’re doing and are only interested in the rewards that’s kind of on them.
I just think the article is focused on the wrong company, Salad is selling a tool that is being potentially misused by users of their client’s service. I can certainly see why that can be a problem, but based on the information given in the article I don’t think it’s really theirs. If that’s ALL Salad’s used for then that’s a different story.
It’s Roblox stuff you can buy, it’s not power users that are the target demographic
Ah thanks I think I forgot that sentence by the end of the article and thought it was just a user report that it was checked by default. I really don’t think that it should be checked by default, depending on where you are it could even get you in trouble. App setup for this kind of stuff isn’t necessarily only for power users now, it has gotten very streamlined and tested for conversion.
I kinda fail to see the problem. The GPU owner doesn’t see what workload they are processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with his hardware. There’s a demand, there’s an offer, nobody is getting hurt (ai pr0n is not illegal, at least for now), so let people do what they want to do.
The problem is that they are clearly targeting minors who don’t pay their own electricity bill, and don’t even necessarily have awareness that they are paying for their Fortnite skins with their parents’ money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.
This is a shitty grift, abusing people who don’t understand the consequences of the software.
Agreed. Preying on children who don’t understand what they’re signing up for is shitty to begin with.
Then, add that deepfake AI porn is unethical and likely illegal (and who knows what other kinds of potentially-illegal images are being generated…)
And, as you point out, the files having existed in the computer could, alone, be illegal.
Then, as an extra fuck you, burning GPU cycles to make AI images is causing CO2 emissions, GPU wear, waste heat that might trigger AC, and other negative externalities too, I’m sure…
It’s shit all around.
You would think they would do this to mine Bitcoin too.
Who’s to say they aren’t doing that on the side?
Boring Dystopia
Great. Now we’re trading pre-made traditional artwork to kids in exchange for fresh robot porn!
and the kids are getting the traditional art! would not have called it.
What? Seems like porn generation is the new crypto mining.
I’d rather have a wealth of new porn around than thousands of random blockchains going around.
At least the porn will probably be useful for someone long term haha
I don’t get the hate for AI porn.
On its own, it’s just the same as hate for porn. But there’s also deep fake porn, ai porn of real people, and that’s potentially far more problematic.
But that’s the same issue of making fakes that we’ve had for 30+ years since digital manipulation became feasible.
Yeah sure, except now to make deep fake porn you just need to prompt ‘famous star naked riding an old man’s cock’, set 8 images per seed, queue a job of 100 images, turn the air con to antarctic, and make misogynistic videos about why movies are woke while the job slowly cooks your studio.
Then when you finish you probably have some good images of whatever famous star you like getting railed by an old man and you can hop on YouTube and complain that people don’t think you are an artist.
It requires almost no effort or talent to make a boatload of deep fake material. If you put any effort in you can orchestrate an image that looks pretty good.
Add to that the fact that before ai, unless you’re already pretty famous, no one cares enough to make nonconsensual porn of you. After, anyone vaguely attracted to you can snap or find a few pictures and do a decent job of it without any skill or practice.
Ease of creation shouldn’t have a bearing on whether or not the final result is illegal. A handmade vs AI generated fake nude should be treated the same way.
I didn’t argue that it shouldn’t. The difference is the ease of creation. It now requires no skill or talent to produce it so the game has changed and it needs to be addressed and not dismissed
deepfakes predate the ai boom. you don’t need ai for deepfakes
Well, the word “deepfake” is literally from the AI boom, but I understand you to mean that doctoring images to make it look like someone did porn when they didn’t was already a thing.
And yeah, it very much was. But unless you were already a high profile individual like a popular celebrity, or mayyybe if you happened to be attractive to the one guy making them, they didn’t tend to get made of you, and certainly not well. Now, anyone with a crush and a photo of you can make your face and a pretty decent approximation of your naked body move around and make noises while doing the nasty. And they can do it many orders of magnitude faster and with less skill than before.
So no, you don’t need ai for it to exist and be somewhat problematic, but ai makes it much more problematic.
One ethics quandary is AI child porn. It at least provides a non-harmful outlet for an otherwise harmful act, but it could also feed addictions and feel insufficient.
You clearly haven’t seen it, nor know anyone affected by it. It’s like 99% noncon shit from people who are too creepy for artists to work with.
EDIT: Sums it up https://www.youtube.com/watch?v=3aS97RKjEdI
deleted by creator
100% that porn is not legal. It’s also pretty easy to tell which demographic they’re targeting with this.
Eh, I agree with the reasonable takes here. Nothing wrong with generating that sort of stuff until it starts resembling the likeness of a real living person. Then I think it’s just creepy; especially if for some reason you are sharing it 💀