this post was submitted on 20 Apr 2024
485 points (98.2% liked)

Gaming

[–] [email protected] 14 points 6 months ago (1 children)

The problem is that they are clearly targeting minors who don't pay their own electricity bill, and don't even necessarily have awareness that they are paying for their Fortnite skins with their parents' money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble, if the person lives in a country where some or all kinds of pornography are illegal.

This is a shitty grift, abusing people who don't understand the consequences of the software.

[–] [email protected] 3 points 6 months ago

Agreed. Preying on children who don't understand what they're signing up for is shitty to begin with.

Then, add that deepfake AI porn is unethical and likely illegal (and who knows what other kinds of potentially-illegal images are being generated...)

And, as you point out, the files having existed in the computer could, alone, be illegal.

Then, as an extra fuck you, burning GPU cycles to make AI images is causing CO2 emissions, GPU wear, waste heat that might trigger AC, and other negative externalities too, I'm sure...

It's shit all around.