this post was submitted on 21 Jul 2023
36 points (100.0% liked)

Technology

 
[–] [email protected] 9 points 1 year ago (2 children)

Whenever I see this I have to chuckle. Humans do this all the time: they whip stuff up as they go, lie, and pretend. The AI can only do what it has learned from us, so it lies, makes stuff up, and pretends. Why is this so surprising?

[–] [email protected] 5 points 1 year ago (1 children)

I prefer the term "confabulation" to "lying", both because it's more accurate and because it's more fun to say. Confabulation is when you don't know that you're lying; it's just your dumb brain coming up with stuff that turns out not to be real. Like if you're asked "are there any red cars parked on the street in the neighborhood where you live?", your brain hears "I want a memory of a red car parked on the street" and it helpfully delivers exactly that.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (1 children)

I confabulate way too much. Hear something, remember thing; but the thing I remember isn't from the thing I remember it being from. Now I am "spreading misinformation." No... I just suffer from the dumb. 😩

[–] [email protected] 1 points 1 year ago

It turns out that it's super easy to provoke the human brain to generate false memories about stuff. I've read about some of the research that Elizabeth Loftus has done and it's eerie.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I think it's just humorous. AI chat models have no capacity to understand the subject matter; their job is simply to regurgitate their findings on request. Naturally, they're bad liars.