submitted 1 month ago* (last edited 1 month ago) by [email protected] to c/[email protected]

This is peak laziness. It seems the reading list's author used autoplag to extrude the entire 60-page supplemental insert. The author also super-promises this has never happened before.

[-] [email protected] 8 points 1 month ago

AI assistants such as ChatGPT are well-known for creating plausible-sounding errors known as confabulations, especially when lacking detailed information on a particular topic.

No, they are hallucinations or bullshit. I won’t accept any other terms.

[-] [email protected] 12 points 1 month ago* (last edited 1 month ago)

If it makes you feel better, I've heard good folks like Emily Bender of Stochastic Parrots fame suggest confabulation is a better term. "Hallucination" implies that LLMs have qualia and are accidentally sprinkling falsehoods over a true story. Confabulation better illustrates that it's producing a bullshit milkshake from its training data that can only be correct accidentally.

[-] [email protected] 7 points 1 month ago

You’ve swayed me. I’m now down with all three. Thanks for the explanation.
