this post was submitted on 05 Sep 2024
924 points (96.7% liked)

Technology

59143 readers
2299 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
 
[–] [email protected] 6 points 2 months ago (1 children)

LLMs don't "understand" anything, and it's unfortunate that we've taken to using language related to human thinking to talk about software. It's all data processing and models.
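To make the "it's all data processing" point concrete, here's a toy sketch (my own illustration, not how a real LLM is built — actual models use learned neural networks over huge corpora, not bigram counts): at its core, a language model maps a context to a probability distribution over next tokens and picks from it. Pattern statistics, no comprehension required.

```python
from collections import Counter

# Tiny hypothetical "training corpus" for illustration only.
corpus = "the cat sat on the mat the cat ate the food".split()

# Count bigram statistics: which token tends to follow which.
following = {}
for prev, nxt in zip(corpus, corpus[1:]):
    following.setdefault(prev, Counter())[nxt] += 1

def predict_next(token):
    """Return the most frequent follower of `token` in the corpus."""
    counts = following.get(token)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # -> "cat" (seen twice, vs. "mat"/"food" once)
```

Scale that idea up by many orders of magnitude and swap the count table for a trained network, and you get fluent output — but the mechanism is still "predict the next token", which is why "understanding" is the wrong word for it.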

[–] [email protected] 4 points 2 months ago (1 children)

Yup, 100% this. And there's a crowd of muppets arguing "ackshyually wut u're definishun of unrurrstandin/intellijanse?" or "but hyumans do...", but come on - that's bullshit, and more often than not sealioning.

Don't get me wrong - model-based data processing is still useful in quite a few situations. But those situations are only a fraction of what big tech pretends LLMs are useful for.

[–] [email protected] 4 points 2 months ago

Yeah, I'm far from anti-AI, but we're just not anywhere close to where people think we are with it. And I'm pretty sick of corporate leadership saying "We need to make more use of AI" without knowing the difference between an LLM and a machine learning application, or having any idea *how* their company could make use of either technology.

It really feels like one of those hammer-in-search-of-a-nail situations.