this post was submitted on 20 Apr 2025
25 points
TechTakes
The link opened up another Google search with the same query, though without the AI summary.
[image: Google search result for the query "a bear fries bacon meaning", showing an AI summary]
It really aggressively tries to match it up to something with similar keywords and structure, which is kind of interesting in its own right. It pattern-matched every variant I could come up with for "when all you have is..." for example.
Honestly it's kind of an interesting question and limitation for this kind of LLM. How should you respond when someone asks about an idiom neither of you knows? The answer is really contextual. Sometimes it's better to try and help them piece together what it means; other times it's more important to acknowledge that this isn't actually a common expression, or to try and provide accurate sourcing. The LLM, of course, has none of that context, and because the patterns it replicates don't allow expressions of uncertainty or digressions, it can't actually do both.
You, a human, can respond like that; an LLM, especially a search one with the implied authority it carries, should admit it doesn't know things. It shouldn't make things up, or use sensational clickbait headlines to invent a story.