this post was submitted on 23 Mar 2025
773 points (97.8% liked)

Technology


A Norwegian man said he was horrified to discover that ChatGPT outputs had falsely accused him of murdering his own children.

According to a complaint filed Thursday by European Union digital rights advocates Noyb, Arve Hjalmar Holmen decided to see what information ChatGPT might provide if a user searched his name. He was shocked when ChatGPT responded with outputs falsely claiming that he was sentenced to 21 years in prison as "a convicted criminal who murdered two of his children and attempted to murder his third son," a Noyb press release said.

top 50 comments
[–] [email protected] 23 points 6 days ago* (last edited 6 days ago) (2 children)

Plot twist: "Dad" isn't even his real name.

[–] [email protected] 2 points 5 days ago* (last edited 5 days ago)

What?? That changes everything! Does that mean my name could be false too?

Best regards,
- Hungry

[–] [email protected] 2 points 6 days ago

Well played

[–] [email protected] 20 points 6 days ago (2 children)

When do we start suing makers of fortune cookies for lucky coincidences?

"Claim".

I mean, the guy is right, because it's advertised as "artificial intelligence".

Were it advertised as a word-salad generator, a Markov chain grown big and scary, something similar in principle to the programs that generated fantasy-language texts, spells, and names for roleplaying (if anyone remembers the good old web of the '00s), then there would be no problem.

But if you lie about what something is in order to sell it better, and that lie has social consequences, you should get sued into a freezing-hot inferno with mustard-greased, giant-cockroach-dildo-covered walls. You should also probably face criminal charges.
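
For anyone who doesn't remember those old name-and-spell generators the comment is alluding to: a word-level Markov chain just records which word follows which in some training text, then strings words together by random walks over those observed pairs. A minimal Python sketch, with the corpus and function names invented purely for illustration (they are not from the article or this thread):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=10):
    """Walk the chain, picking a random observed successor each step."""
    word, output = start, [start]
    for _ in range(length - 1):
        successors = chain.get(word)
        if not successors:
            break
        word = random.choice(successors)
        output.append(word)
    return " ".join(output)

# Tiny made-up corpus; real generators used much larger ones.
corpus = "the old wizard cast a spell and the old dragon slept in the old tower"
chain = build_chain(corpus)
print(generate(chain, "the"))  # e.g. "the old dragon slept in the old wizard cast a"
```

Nothing in such a generator knows or checks facts; it only continues text in a statistically plausible way, which is the commenter's point of comparison.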

[–] [email protected] 13 points 6 days ago (1 children)

Yeah, similar to Tesla "full self driving".

[–] [email protected] 5 points 6 days ago

No, you see, where he grew up it was a common expression that meant you drive it yourself!

It couldn't possibly be expected to mean what any sane person would think.

The fuckin' Pedo Guy.

[–] [email protected] 9 points 6 days ago (1 children)

People thinking a glorified autocorrect is a source of factual information is horrifying.

[–] [email protected] 7 points 6 days ago

That's what was advertised. To most people, computers are actual arcane magic, impossible to understand except by the wizards in IT who can do anything.

Of course people believed it.

[–] [email protected] 12 points 6 days ago

Sorry, I've spent months telling ChatGPT that Arve Hjalmar Holmen killed his kids for a school project.

[–] [email protected] 16 points 6 days ago (1 children)

Are we sure that someone else with that name hasn't committed those crimes? After all if I search my name it says I'm an astronaut, because there is an actual NASA astronaut with my name. It's not saying I'm that person, it's just saying that that name is the same as that person's.

[–] [email protected] 3 points 6 days ago (1 children)

Mine just gives a bunch of accurate information about me.

[–] [email protected] 3 points 6 days ago* (last edited 6 days ago)

Bummer (and/or 'F')

[–] [email protected] 23 points 6 days ago

Or ChatGPT has become a precog and is reporting a precrime. Lock him up!

[–] [email protected] 9 points 6 days ago* (last edited 6 days ago) (1 children)

When I've searched my name with Google over the years, it has said I'm a high school football star, corporate lawyer, Ironman competitor, hotel chef, tech support specialist, janitorial manager, and horse trainer. LIES! ALL LIES!!!

[–] [email protected] 2 points 6 days ago (1 children)

Are you a male adult performer?

[–] [email protected] 2 points 6 days ago

Not as far as you know.

[–] [email protected] 10 points 6 days ago

Context: @[email protected] is an international fugitive wanted for many war crimes, including the mass murder of various AIs.

-Sincerely ChatGPT

[–] [email protected] 3 points 6 days ago (2 children)

When asking ChatGPT about my name, it provided the following:

"...it seems like you may be referring to a private person rather than a widely known public figure. If that's the case, I wouldn't have any specific public information on him unless he has gained some public recognition for a particular achievement."

It shouldn't be used for looking up people who aren't celebrities or at least known for something.

[–] [email protected] 3 points 6 days ago

The problem with that is that a guy who murdered his three kids is known for something.

At the most generous, maybe the professor in the article shares a name with the killer. Articles will include enough information to clear the professor (like maybe the killer has been in jail for a decade). An LLM will weave together real information about the professor with the "fact" that he killed his kids.

ChatGPT shouldn't be used to find any real information, period.

[–] [email protected] 2 points 6 days ago

“…it seems like you may be referring to a private person rather than a widely known public figure. If that’s the case, I wouldn’t have any specific public information on him unless he has gained some public recognition for a particular achievement.”

If you didn't specifically search for "Mr. ", that's quite the sexist assumption that the person is a "him" ;)

PS: please don't use LLMs, they produce nothing of value and contribute to idiots being deceived.
