this post was submitted on 04 Nov 2023
36 points (62.9% liked)

Privacy


In this video I discuss how generative AI technology has grown far past the government's ability to effectively control it, and how the current legislative measures could lead to innocent people being jailed.

[–] [email protected] 30 points 1 year ago (3 children)

There is no such thing.

God dammit, the entire point of calling it CSAM is to distinguish photographic evidence of child rape from made-up images that make people feel icky.

If you want them treated the same, legally - go nuts. Have that argument. But stop treating the two as the same thing, and fucking up clear discussion of the worst thing on the internet.

You can't generate assault. It is impossible to abuse children who do not exist.

[–] [email protected] 30 points 1 year ago

Did nobody in this comment section watch the video at all?

The only case mentioned in this video is one where high school students distributed (counterfeit) sexually explicit images of their classmates, which had been generated by an AI model.

I don't know if it meets the definition of CSAM because the events depicted in the images are fictional, but the subjects are real.

These children do exist, and some have doubtless been traumatized by this. This crime has victims.

[–] [email protected] 7 points 1 year ago (1 children)

I think a lot of people are arguing that the models used to generate this type of content are trained on literal CSAM. So it's like CSAM with extra steps.

[–] [email protected] 4 points 1 year ago

Those people are morons.

[–] [email protected] 2 points 1 year ago

In most (all?) countries, no such distinction is made; the material is illegal all the same.