submitted 2 months ago* (last edited 2 months ago) by BodyBySisyphus@hexbear.net to c/slop@hexbear.net

Another example of how easy it must be to run scams. The bot is literally just repeating the author's own journal entries back to him, and he's responding as if he's having a religious revelation.

It pulled out this quote I wrote down years ago:

The one thing people need in life is not ambition, not smarts, not hustle. They need clarity.
If you had complete clarity on what you want and what is necessary right now, knowing you are on the right path, you would be happy.
— Journal Entry, August 2019

I leaned forward. Every response featured a quote that made sense. Perfectly timed. It held up a mirror to my soul.

...

In 2017, I founded an ed-tech startup. I met my wife, a fellow ed-tech founder, during that time. Both of our startups failed. We grew cynical of education.
But recently, we had a few conversations about school choices for our toddler. None of them seemed good. I suggested she might want to open a school.
I’d shoot her an interesting article. Or a cool startup’s website. And then move on.
I discussed this with Claude. Which led to this exchange:

It’s not your wife who should start a school.
It’s you. Always has been. astronaut-1 [Ed.: emoji added for emphasis. I couldn't resist]

This sentence hit me like a sack of bricks.
Tears started rolling down my cheeks. thonk-cri

Stuff like this just makes me wonder what the interior experience of your usual techbro is (or isn't) like. I mean, I've had the experience of reading over things I wrote years ago, ambitions I've since given up on, ways of seeing the world I've abandoned, and I know the nostalgia that creates, the grief for a past self. But you don't really need a chatbot for that.

[-] TrashGoblin@hexbear.net 23 points 2 months ago

Since LLMs don't deal with meaning, the work of creating meaning is done partly by the past writers whose work was ingested, but also very much by the reader. This is not unlike scapulomancy or Tarot reading, except that the role of the reader is obscured, because the message from the bot falsely appears complete.

[-] LeninWeave@hexbear.net 5 points 2 months ago

I think this comment really encapsulates why people doing computer-touching degrees should be forced to study at least a little bit of literary analysis or philosophy, because you make an excellent point that I guarantee LLM hype artists like the author will never consider.

this post was submitted on 11 Dec 2025
43 points (100.0% liked)