this post was submitted on 20 Apr 2024
485 points (98.2% liked)

Gaming

19977 readers

Sub for any gaming related content!

founded 5 years ago
[–] [email protected] 170 points 6 months ago (2 children)

Imagine reading that headline 20 years ago.

[–] [email protected] 53 points 6 months ago (2 children)

God, that would sound so dystopian and futuristic... but to be honest, most articles about AI today would sound like that back then. Damn, people would freak out about privacy.

[–] [email protected] 8 points 6 months ago

pretty sure they didn't.

[–] [email protected] 12 points 6 months ago

BOINC came out 21 years ago, so it wouldn't be that unreasonable.

[–] [email protected] 137 points 6 months ago (4 children)

So, it's like folding@home, but instead of donating your spare compute to science, you sell it to generate porn?

[–] [email protected] 68 points 6 months ago

Porning@home

[–] [email protected] 20 points 6 months ago

Can we at least see it?

[–] [email protected] 8 points 6 months ago

This... This was inevitable.

[–] [email protected] 93 points 6 months ago (1 children)

So... this AI company gets gaming teens to "donate" their computing power, rather than pay for render farms / GPU clouds?

And then oblivious parents pay the power bills, effectively covering the computing costs of the AI porn company?

Sounds completely ethical to me /s.

[–] [email protected] 9 points 6 months ago

No no, they're getting copies of digital images out of it. It's a totally fair trade!

[–] [email protected] 84 points 6 months ago (21 children)

I’ll be a minority voice considering the other comments. But maybe just pay for onlyfans or whatever you guys use. I’m a generally attractive woman (I can surmise from interactions while trying to date) and I really don’t like the idea that my likeness would be used for something like this. Get your jollies off, but try and be a bit consensual about it. Is that so much to ask?

[–] [email protected] 67 points 6 months ago* (last edited 6 months ago) (3 children)

It isn't too much to ask. According to Dr. K of HealthyGamerGG (Harvard Psychiatrist/Instructor), research shows that the release of non-consensual porn makes the unwilling subjects suicidal over half the time. Non-consensual porn = deepfakes, revenge porn, etc. It's seriously harmful, and there are other effects like depression, shame, PTSD, anxiety, and so on. There is functionally unlimited porn out there that is made with consent, and if someone doesn't want to be publicly sexually explicit then that's their choice.

I'm not against AI porn in general (I consider it the modern version of dirty drawings/cartoons), but when it comes to specific likenesses as with deepfakes then there's clear proof of harm and that's enough for me to oppose it. I don't believe there's some inherent right to see specific people naked against their will.

[–] [email protected] 10 points 6 months ago

I think it would be too big of a privacy overreach to try to ban it outright; what people do on their own computers is their own business, and there's no way to enforce a full ban without being incredibly intrusive. But as soon as it gets distributed in any way, I think it should be prosecuted as heavily as real non-consensual porn that was taken against someone's will.

[–] [email protected] 27 points 6 months ago

I think the key is a lot of people don't want to pay for porn. And in the case of deep fakes, it's stuff they literally cannot pay money to get.

[–] [email protected] 20 points 6 months ago (7 children)

AI porn isn't deepfake porn. The default is just a random AI-generated face and body. It's difficult to deepfake someone unless you specifically set out to.

[–] [email protected] 12 points 6 months ago (1 children)

So I’m not disagreeing with you, but you’re assuming they’re making deepfake images, and the article doesn’t specify that. In fact I’d bet that it’s just AI generated “people” that don’t exist.

What about AI porn of a person that doesn’t exist?

[–] [email protected] 25 points 6 months ago (1 children)

However, one of Salad's clients is CivitAi, a platform for sharing AI-generated images which has previously been investigated by 404 Media. It found that the service hosts image-generating AI models of specific people, whose image can then be combined with pornographic AI models to generate non-consensual sexual images.

[–] [email protected] 12 points 6 months ago (1 children)

Fair, somehow I missed that

[–] [email protected] 7 points 6 months ago

I know someone who’s into really dark romance stuff, like really hardcore stuff, but she’d never do some of this due to safety reasons. I can totally see her generating scenes of herself in those situations.

[–] [email protected] 54 points 6 months ago (1 children)

Capitalism breeds innovation

[–] [email protected] 49 points 6 months ago

This feels exploitative AF on multiple levels.

[–] [email protected] 49 points 6 months ago (1 children)

I remember when GPUs were used to fold proteins...

[–] [email protected] 15 points 6 months ago (1 children)

I wore an onion on my belt

[–] [email protected] 8 points 6 months ago

As was the fashion at the time

[–] [email protected] 41 points 6 months ago (2 children)

If I'm reading this right, it's a program that users sign up for to donate their processing power (and can opt in or out of adult content), which is then used by client companies to generate their own users' content? It even says that Salad can't view or moderate the images, so what exactly are they doing wrong besides providing service to potentially questionable companies? It makes as much sense as blaming Nvidia or Microsoft, am I missing something?

[–] [email protected] 24 points 6 months ago (2 children)

Based on the rewards, I'm assuming it's being done by very young people. Presumably the value of rewards is really low, but these kids haven't done the cost-benefit analysis. If I had to guess, for the vast majority it costs more in electricity than they get back, but the parents don't know it's happening.

This could be totally wrong. I haven't looked into it. This is how most of these things work though. They prey on the youth and their desire for these products to take advantage of them.

[–] [email protected] 8 points 6 months ago (1 children)

so what exactly are they doing wrong besides providing service to potentially questionable companies?

Well I think that is the main point of what is wrong. I think the big question is whether the mature content toggle is on by default or not. The company says it's off, but some users said otherwise. Dunno why the author didn't install it and check.

[–] [email protected] 7 points 6 months ago (2 children)

They said they did.

However, by default the software settings opt users into generating adult content. An option exists to "configure workload types manually" which enables users to uncheck the "Adult Content Workloads" option (via 404 Media), however this is easily missed in the setup process, which I duly tested for myself to confirm.

Honestly, and I'm not saying I support what's being done here, the way I see it, if you're tech savvy enough to be interested in using a program like this, you should be looking through all of the options properly anyway. If users don't care what they're doing and are only interested in the rewards, that's kind of on them.

I just think the article is focused on the wrong company; Salad is selling a tool that is potentially being misused by users of its clients' services. I can certainly see why that can be a problem, but based on the information given in the article, I don't think the fault is really Salad's. If that's ALL Salad is used for, then that's a different story.

[–] [email protected] 33 points 6 months ago (1 children)

explain this to a person in 1998

[–] [email protected] 27 points 6 months ago (2 children)

I kinda fail to see the problem. The GPU owner doesn't see what workload they're processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with his hardware. There's a demand, there's an offer, nobody is getting hurt (AI pr0n is not illegal, at least for now), so let people do what they want to do.

[–] [email protected] 14 points 6 months ago (1 children)

The problem is that they are clearly targeting minors who don't pay their own electricity bill, and don't even necessarily have awareness that they are paying for their Fortnite skins with their parents' money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.

This is a shitty grift, abusing people who don't understand the consequences of the software.

[–] [email protected] 20 points 6 months ago

Wow. Imagine burning out your expensive GPU for a Fortnite skin.

[–] [email protected] 14 points 6 months ago (1 children)

Great. Now we're trading pre-made traditional artwork to kids in exchange for fresh robot porn!

[–] [email protected] 13 points 6 months ago (1 children)

You would think they would do this to mine Bitcoin too.

[–] [email protected] 12 points 6 months ago

Boring Dystopia

[–] [email protected] 11 points 6 months ago (1 children)

What? Seems like porn generation is the new crypto mining.

[–] [email protected] 16 points 6 months ago

I'd rather have a wealth of new porn around than thousands of random blockchains going around.

At least the porn will probably be useful to someone long term haha
