
A lawsuit filed by more victims of the sex trafficking operation claims that Pornhub’s moderation staff ignored reports of their abuse videos.


Sixty-one additional women are suing Pornhub’s parent company, claiming that it failed to take down videos of the abuse they suffered as part of the sex trafficking operation Girls Do Porn. They’re suing the company and its sites for sex trafficking, racketeering, conspiracy to commit racketeering, and human trafficking.

The complaint, filed on Tuesday by plaintiffs represented by Holm Law Group, includes what it claims are internal emails between Pornhub moderation staff. The emails allegedly show that Pornhub assigned only one moderator to review 700,000 potentially abusive videos, and that the company intentionally ignored repeated reports from the victims who appeared in those videos.

The damages and restitution they seek amount to more than $311,100,000. The plaintiffs demand a jury trial and seek damages of $5 million per plaintiff, as well as restitution of all the money Aylo, the new name for Pornhub’s parent company, earned “marketing, selling and exploiting Plaintiffs’ videos in an amount that exceeds one hundred thousand dollars for each plaintiff.”

The plaintiffs are 61 more unnamed “Jane Doe” victims of Girls Do Porn, adding to the 60 who sued Pornhub in 2020 over similar claims.
Girls Do Porn was a federally convicted sex trafficking ring that coerced young women into filming pornographic videos under the pretense of “modeling” gigs. In some cases, the women were violently abused. The operators told them the videos would never appear online, so their home communities wouldn’t find out, but then uploaded the footage to sites like Pornhub, where the videos went viral and, in many instances, destroyed the women’s lives. Girls Do Porn was an official Pornhub content partner, and its videos frequently appeared on the front page, where they gathered millions of views.

read more: https://www.404media.co/girls-do-porn-victims-sue-pornhub-for-300-million/

archive: https://archive.ph/zQWt3#selection-593.0-609.599

[email protected] 22 points 1 year ago* (last edited 1 year ago)

We've had that for about a year now. YouTube is especially full of it (the somewhat-SFW kind).

[email protected] 7 points 1 year ago

Just pictures. Stable Diffusion-type models have huge problems with flickering in video right now. No temporal consistency between frames.
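
(To illustrate why: a minimal sketch, assuming the Hugging Face `diffusers` library; the model ID and parameters are illustrative, not taken from any real pipeline. Running img2img independently on each frame re-samples every fine detail from scratch, which is exactly what produces the shimmer.)

```python
# Minimal sketch, not a production pipeline: model ID and parameters
# are illustrative assumptions. Stylizing a video frame-by-frame with
# img2img causes the flicker described above, because each frame is
# denoised independently of its neighbors.
import torch
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def stylize_frames(frames, prompt):
    """frames: list of PIL.Image video frames."""
    styled = []
    for frame in frames:
        # Reusing the same seed per frame reduces flicker a bit (the
        # latent noise is at least shared), but details are still
        # re-sampled on every frame, so shimmer remains.
        generator = torch.Generator("cuda").manual_seed(0)
        out = pipe(prompt=prompt, image=frame, strength=0.4,
                   generator=generator).images[0]
        styled.append(out)
    return styled
```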

[email protected] 3 points 1 year ago* (last edited 1 year ago)

AI as a filter over real footage is getting reasonably stable.

[email protected] 7 points 1 year ago

That seems like it would defeat the humane purpose, but maybe. Perhaps use a rough CG render as the base layer?

[email protected] 0 points 1 year ago

This is the way. For other content it works reasonably well (especially if you properly mask image zones/depth), so I don't see why 18+ content would be different.
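
(A rough sketch of that base-layer idea, assuming the `diffusers` library with a depth ControlNet; the model IDs and parameters are illustrative assumptions, not anything from this thread. Conditioning generation on a depth map of the base layer pins the scene geometry, which is what keeps successive frames consistent.)

```python
# Sketch of the "rough CG as base layer" idea, assuming a depth
# ControlNet; model IDs are illustrative assumptions. The depth map of
# the base layer constrains scene geometry, so frames stay anchored to
# the same shapes instead of flickering freely.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

def generate_over_base(depth_map, prompt):
    """depth_map: PIL.Image depth render of the rough CG base layer."""
    # Same seed plus same geometry per shot keeps frames far more
    # consistent than unconditioned per-frame img2img.
    generator = torch.Generator("cuda").manual_seed(0)
    return pipe(prompt=prompt, image=depth_map,
                generator=generator).images[0]
```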