this post was submitted on 25 Apr 2024
180 points (100.0% liked)

Technology


Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but actively promoted to users by social media companies that are unable or unwilling to enforce their policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has taken down several of these ads before. Even so, many ads that explicitly invited users to create nudes, and some of the ad buyers behind them, remained live until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.


God damn I hate this fucking AI bullshit so god damn much.

[–] [email protected] 49 points 6 months ago (18 children)

I can't help but wonder how deepfakes are going to change society in the long term. I've seen this article making the rounds on other social media, and there's inevitably some dude who shows up and claims this will make nudes more acceptable, because there will be no way to know whether a nude is deepfaked or not. It's sadly a rather privileged take from someone who faces no possible consequences from nude photos of themselves on the internet, but I do think in the long run (20+ years) they might be right. Unfortunately, between now and some ephemeral then, many women, POC, and other folks will get fired, harassed, blackmailed, and otherwise hurt by people using tools like these to make fake nude images of them.

But it also makes me think a lot about fake news and AI, and how we've increasingly been interacting in a world in which "real" things are just harder to find. Want to find someone's actual opinion on something? Too bad: for-profit companies don't want that, and instead you're going to get an AI-generated website, spun up under a fake alias, that offers a "best of" list where their product is the first option. Want to understand an issue better? Too bad: political actors are throwing money left and right at news platforms and using AI to write biased articles that poison the well with information meant to emotionally charge you toward their side. Pretty soon you'll have no idea whether pictures or videos of events really happened, and inevitably some of them will turn out to be viral marketing or other forms of coercion.

It's hard to watch all these misuses of information and technology, especially ones like this that are clearly malicious in nature, alongside the complete inaction of governments and corporations to regulate or stop them, and not wonder how much worse things need to get before people bother to take action.

[–] [email protected] 8 points 6 months ago (8 children)

Time to go back to books for information.

[–] [email protected] 21 points 6 months ago (2 children)

There are already AI-written books flooding the market, not to mention other forms of written misinformation.

[–] [email protected] 5 points 6 months ago

The flood of AI-generated anything is really annoying when you're looking for an actual human-created source.
