True (i.postimg.cc)
submitted 2 years ago by [email protected] to c/[email protected]
[-] [email protected] 5 points 2 years ago

PrivateGPT + CS books = ask the books questions while self-learning?
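
For anyone curious what that looks like mechanically: PrivateGPT ingests your documents and answers questions against them with a local model. Here's a minimal sketch of just the retrieval half, using scikit-learn TF-IDF as a stand-in for PrivateGPT's actual embedding pipeline (the `books/` directory and the question are made-up examples):

```python
# Sketch of the retrieval step behind "ask your books questions":
# find the passage most similar to the question, then hand it to an LLM.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical: plain-text chapters extracted from your CS books.
chunks = [p.read_text() for p in Path("books/").glob("*.txt")]

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(chunks)

question = "How does quicksort choose a pivot?"  # example question
scores = cosine_similarity(vectorizer.transform([question]), doc_matrix)[0]
best = scores.argmax()

# A real setup would now prompt a local LLM with chunks[best] as context,
# which keeps the answer grounded in the book instead of the model's memory.
print(chunks[best][:500])
```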

[-] [email protected] 7 points 2 years ago

The issue with that is that LLMs tend to lie when they don't know something. The best tools for that are Stack Overflow, Lemmy, Matrix, etc.

[-] [email protected] 5 points 2 years ago

Yeah, and they don't just lie. They lie extremely convincingly, and with total confidence. If you ask them to write code, they can make up non-existent libraries.

In theory, it may even be possible to use this as an attack vector. You could ask an AI repeatedly to generate code, and whenever it hallucinates a package name, register that name yourself and publish a malicious package under it. Then you just wait for some future victim to get the same hallucination and install your package.
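
A rough sketch of the defensive side, assuming Python/PyPI: before installing anything an LLM suggests, check whether the name is even registered. The pypi.org JSON endpoint is real; `exists_on_pypi` and the package list are made up for illustration.

```python
# Sketch: flag LLM-suggested packages that don't exist on PyPI.
# An unregistered name is exactly what a squatter could claim later.
import requests

def exists_on_pypi(name: str) -> bool:
    """Return True if `name` is a registered PyPI project."""
    resp = requests.get(f"https://pypi.org/pypi/{name}/json", timeout=10)
    return resp.status_code == 200

# Hypothetical output of an LLM code suggestion:
suspect_packages = ["requests", "totally-real-llm-helper"]
for pkg in suspect_packages:
    if exists_on_pypi(pkg):
        print(f"{pkg}: registered (still verify it's the package you meant)")
    else:
        print(f"{pkg}: NOT on PyPI -- likely hallucinated, don't pip install blindly")
```

Of course, a registered name isn't proof of safety either. The attack above works precisely by registering the hallucinated name before you check.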

this post was submitted on 13 Jul 2023
1048 points (98.5% liked)
