this post was submitted on 27 Jun 2023
163 points (100.0% liked)

For me, I'd say a truck with a cab longer than its bed is not a truck, but an SUV with an overgrown bumper.

[–] [email protected] 1 points 1 year ago (1 children)

Nope, it's only matching the prompt with the most likely answer from its training set. Remember in the early days, when it was asked slightly tweaked riddles and got them wrong? It would just spew out something that sounded like the original answer but was completely wrong in the new context. Or how it made up nonexistent court cases for that one lawyer who tried to use it without actually checking whether the output was correct?

LLMs are just guessing the answer based on millions of similar answers they have been trained on. It's a language syntax generator; it has no clue what it is actually saying.
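To make the point concrete, here's a toy sketch of "most likely continuation" prediction. Real LLMs use neural networks over tokens, not raw word counts, but the spirit is the same: pick whatever followed most often in the training data, with no model of meaning.

```python
# Toy next-word predictor: a bigram count model.
# Illustrative only -- real LLMs are vastly more sophisticated,
# but both pick continuations by learned statistics, not understanding.
from collections import Counter, defaultdict

training_text = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent continuation seen in training."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" -- purely because it is most frequent
```

Ask it anything slightly outside its training data and it happily produces the statistically nearest answer, right or wrong, which is exactly the tweaked-riddle failure mode.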

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I know this; I've worked on LLMs and other neural networks, so I was wondering what kind of difference you could point to. Humans do the same thing, they just have more neurons and use more sophisticated training methods, activation mechanisms, and propagation patterns.
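The "same fundamental mechanism" claim can be sketched with a single artificial neuron: a weighted sum of inputs pushed through an activation function, a crude analogue of a biological neuron integrating signals and firing. The specific weights and inputs below are made up for illustration.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs passed through
    a sigmoid activation, loosely analogous to a biological neuron firing."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))  # sigmoid squashes output into (0, 1)

# With strongly positive net input, the unit "fires" (output near 1).
out = neuron([1.0, 0.5], [2.0, 1.0], -0.5)
print(round(out, 3))
```

Scale that to billions of units with richer activations and training procedures and you get either an LLM or, arguably, a rough cartoon of a brain; the argument is that the difference is one of scale and sophistication, not kind.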

So what I'm saying is that you can't tie intelligence to the fundamental mechanism, because the mechanism is the same; humans are just more developed. And maturity, on the other hand, is a highly subjective and arbitrary criterion: when is a system mature enough to be considered intelligent?