submitted 1 week ago by [email protected] to c/[email protected]
[-] [email protected] 56 points 6 days ago* (last edited 6 days ago)

I love how the LLM just tells you that it has done something bad, with no emotion, and then proceeds to give detailed information and steps on how.

It feels like mockery.

[-] [email protected] 30 points 6 days ago
[-] [email protected] 2 points 5 days ago

Yes man would do this for sure, but only if you actually gave it permission. Hence the name.

[-] [email protected] 29 points 6 days ago

I wouldn’t even trust what it tells you it did, since that is based on what you asked it and what it thinks you expect.

[-] [email protected] 9 points 5 days ago

It doesn’t think.

It has no awareness.

It has no way of forming memories.

It is autocorrect with enough processing power to make the NSA blush. It just guesses what the next word in a sentence should be. Just because it sounds like a human doesn’t mean it has any capacity to have human memory or thought.
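The "guesses what the next word should be" description is the core of how these models work. A toy bigram predictor (an illustrative stand-in, vastly simpler than a real LLM, which uses a neural network over tokens rather than raw counts) shows the idea:

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in a tiny corpus,
# then "guess" the most frequent successor. This is the same job an LLM does,
# just with counting instead of a learned model.
corpus = "the cat sat on the mat and the cat slept".split()

successors = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    successors[word][nxt] += 1

def predict_next(word):
    # Return the most common word seen after `word`, or None if unseen.
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" — it follows "the" most often here
```

There is no memory or awareness anywhere in this loop: the "prediction" is just whatever continuation was most frequent in the training data, which is why fluent output is not evidence of thought.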

[-] [email protected] 1 points 4 days ago

Okay, what it predicts you to expect /s

[-] [email protected] 1 points 5 days ago

It's just a prank bro

this post was submitted on 21 Jul 2025
641 points (98.8% liked)
