This post was submitted on 29 Dec 2023
412 points (90.7% liked)

AI-created “virtual influencers” are stealing business from humans: Brands are turning to hyper-realistic, AI-generated influencers for promotions.

[–] [email protected] 3 points 10 months ago (1 children)

Whoa there: who says AI influencers aren't the result of an individual's honest work? You don't need an entire data center of computers to make your own AI influencer!

Don't assume there's a corporation behind every AI persona. It could just be one guy with a lot of VRAM getting creative with prompts in his parents' basement.

[–] [email protected] -5 points 10 months ago (1 children)

Well, they are products of the tech industry, so they are inherently not honest or ethical.

[–] [email protected] 6 points 10 months ago (1 children)

Ah yes. Like that damn internet and those cursed devices people use to access it. Anyone using those is inherently not honest or ethical.

[–] [email protected] -5 points 10 months ago (1 children)

The internet is the worst mistake in human history. I’m surprised you’d use that as your example.

[–] [email protected] 5 points 10 months ago

No worries, my fellow unethical, dishonest, internet-using homie. It's not like nuance exists and things can be both good and bad. Everything is black and white, after all.