[–] [email protected] 3 points 1 year ago (1 children)

The industry word for it is "hallucination", but I'm not sure that fits either.

[–] [email protected] 2 points 1 year ago (2 children)

"Hallucination" is better than "lying", but it still implies consciousness. It also implies that the model is doing something different from what it normally does.

In reality, it's always just generating plausible words.

[–] [email protected] 1 points 1 year ago (1 children)

It's bullshitting... Faking it till it makes it, if you will.

[–] [email protected] 2 points 1 year ago

No, that implies a goal. It's just spicy autocomplete.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (1 children)

It is certainly more complex than a predictive text machine. It does seem to understand the concept of objective truth and facts versus interpretation and inaccurate information. It never intentionally provides false information, but sometimes it thinks it is giving factual information when really it is drawing on the abundance of inaccurate information it was trained on. I'm honestly surprised at how accurate it usually is, considering it was trained on public data from places like Reddit, where common inaccuracies have reached the level of folklore.

[–] [email protected] 2 points 1 year ago

> It is certainly more complex than a predictive text machine

No, it literally isn't. That's literally all it is.

> It does seem to understand

Because people are easily fooled, but what it seems like isn't what's actually happening.

> but sometimes it thinks it is giving factual information

It's incapable of thinking. All it does is generate a plausible sequence of words.
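
For anyone wondering what "just generating a plausible sequence of words" looks like mechanically, here is a minimal sketch in Python: a toy bigram model that, at each step, samples the next word in proportion to how often it followed the previous word in a made-up training text. The tiny corpus and the one-word context are assumptions for illustration only; a real LLM uses a neural network conditioned on a long context window, but the loop (predict a distribution over next tokens, sample one, repeat) is the same basic idea behind "spicy autocomplete".

```python
import random
from collections import defaultdict

# Toy training text (an assumption for illustration; a real LLM is trained
# on vast amounts of text and conditions on a long context, not one word).
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Count, for each word, how often each possible next word followed it.
follows = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    """Sample a plausible next word, weighted by how often it was observed."""
    candidates = follows[prev]
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return random.choices(words, weights=weights)[0]

# Generate text one word at a time: no goals, no beliefs, just sampling
# whatever plausibly comes next given what came before.
text = ["the"]
for _ in range(10):
    if not follows[text[-1]]:  # no observed continuation; stop
        break
    text.append(next_word(text[-1]))

print(" ".join(text))
```

Nothing in that loop ever checks whether the output is true; it only checks whether each next word is statistically plausible, which is why a "hallucination" is just the same process happening to produce a continuation that is false.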