submitted 3 weeks ago* (last edited 3 weeks ago) by Ja7sh_The_Donkey@piefed.social to c/fuck_ai@lemmy.world

A substantial number of AI images generated or edited with Grok are targeting women in religious and cultural clothing.

Wow, Grok is very safe and follows the App Store and Play Store rules.

top 3 comments
Darkcoffee@sh.itjust.works 20 points 3 weeks ago

So, Google and Apple are apparently fine with this staying on their app stores. Canada and other countries aren't even considering kicking it to the curb, either.

CannonFodder@lemmy.world 16 points 3 weeks ago

Yeah, that's fucked up. But how can it possibly create an image even related to the person if their face and body are covered in the original image?

Ja7sh_The_Donkey@piefed.social 28 points 3 weeks ago

It would make up details from the billions of porn videos it's trained on. Very unfortunate that Apple and Google are allowing this. 😢

this post was submitted on 12 Jan 2026
94 points (98.0% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago