this post was submitted on 23 Mar 2025
1237 points (98.2% liked)

[–] [email protected] 34 points 5 days ago (4 children)

Idk if it’s the biggest problem, but it’s probably top three.

Other problems could include:

  • Power usage
  • Adding noise to our communication channels
  • AGI fears if you buy that (I don’t personally)
[–] [email protected] 18 points 5 days ago (1 children)

Dead Internet theory has never been a bigger threat. I believe that’s the number one danger - endless quantities of advertising and spam shoved down our throats from every possible direction.

[–] [email protected] 7 points 5 days ago

We’re pretty close to it; most videos on YouTube and most websites exist purely so that an advertiser will pay the creator for a review or recommendation.

[–] [email protected] 3 points 5 days ago* (last edited 5 days ago)

Could also put up:

  • Massive numbers of people are exploited in order to train various AI systems.
  • Machine learning apps that create text or images from prompts are supposed to be supplementary tools, but businesses are actively trying to replace their workers with this software.
  • Machine learning image generation currently shows diminishing returns as we pump exponentially more training content into it.
  • Machine-learning-generated text and images poison their own generators' sample pools, greatly diminishing these systems' ability to learn from real-world content (see the toy sketch below).
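
For that last point, here's a toy sketch of the self-poisoning feedback loop in about the simplest case possible: the "model" is just a fitted 1-D Gaussian that gets retrained each generation on its own samples, with the real data thrown away. The sample size, generation count, and seed are arbitrary illustration choices, not taken from any real system; actual generative models are vastly more complex, but the same basic feedback problem applies.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 20                           # small sample size exaggerates the effect
data = rng.normal(0.0, 1.0, n_samples)   # generation 0: "real world" data

for generation in range(1, 201):
    # "Train" the model: estimate the distribution from the current data.
    mu, sigma = data.mean(), data.std()
    if generation == 1 or generation % 40 == 0:
        print(f"gen {generation:3d}: mean={mu:+.3f}  std={sigma:.3f}")
    # The next generation trains only on the model's own output; the real data is gone.
    data = rng.normal(mu, sigma, n_samples)
```

The estimated spread tends to drift toward zero over enough generations: each round can only preserve whatever diversity survived the previous round, so the tails of the original distribution are gradually lost.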

There's actually a much longer list if we expand to other AI systems, like the robotic systems we're currently training for automated warfare. There's also the angle of these image and text generation systems being used for political manipulation and scams. There are a lot of terrible problems created by this tech.

[–] [email protected] 2 points 5 days ago

Power usage

I'm generally a huge eco guy, but on power usage in particular I view this largely as a government failure. We have had incredible energy resources available that the government has chosen not to develop or has effectively dismantled.

It reminds me a lot of how recycling has been pushed so hard onto the general public instead of passing government laws on plastic usage and waste disposal.

It's always easier to wave your hands and blame "society" than it is to hold the actual wealthy and powerful accountable.

[–] [email protected] 1 points 4 days ago* (last edited 4 days ago)

Power usage probably won't be a major issue; the main take-home message of the Deepseek brouhaha is that training and inference can be done much more efficiently than we had thought (our estimates had been based on well-funded Western companies that didn't have to bother with optimization).

AI spam is an annoyance, but it's not really AI-specific; it's the continuation of a trend - the Internet was already drowning in human-created slop before LLMs came along. At some point, we will probably all have to rely on AI tools to filter it out. This isn't something that can be unwound, any more than you can undo computers being able to play chess well.