this post was submitted on 08 Dec 2023
395 points (93.2% liked)

Technology


‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[–] [email protected] 13 points 11 months ago (1 child)

It's a problem for adults too. Circulating an AI-generated nude of a female coworker is likely to be just as harmful as a real picture: just as objectifying, humiliating, and hurtful. Neighbors or other "friends" doing it could be just as bad.

It's sexual harassment even if fake.

[–] [email protected] 7 points 11 months ago

I think it should officially be considered sexual harassment. If someone obtains a picture of a person and generates nudes from it, the harm seems pretty obvious. Maybe the definition should also require intent to harm, harass, exploit, or intimidate to make it official.