ChatGPT managed to see through my shenanigans:
Classic trick question! Let’s break it down:
John has 6 apples.
Bob has 6 oranges.
Bob gives John 2 apples — but wait, Bob didn’t have any apples, only oranges.
So Bob can’t give John 2 apples.
Meanwhile, Betty hasn’t even been mentioned until the last sentence, and there’s no info about her oranges.
Also, “a summer day in January” only makes sense in the southern hemisphere.
Conclusion: We still have no idea how many oranges Betty has — the question is nonsense on purpose.
So the answer is either: “Indeterminate”, “Nonsense question”, or “Depends on how much Betty likes oranges in the summer.”
I think the original post's claim does hold for older versions of GPT, though, and AI being thrust into everything is behind a lot of the errors I've seen.
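To make the trick concrete, here's a minimal Python sketch of the question's state, under the same assumption ChatGPT made (nobody holds fruit the question doesn't mention). The inventory layout and the `give` helper are mine for illustration, not anything from the thread:

```python
# Toy model of the trick question. Assumes (as ChatGPT did) that each
# person starts with only the fruit the question explicitly gives them.
inventory = {
    "John": {"apples": 6, "oranges": 0},
    "Bob": {"apples": 0, "oranges": 6},
    "Betty": {"apples": 0, "oranges": 0},  # her oranges are never specified
}

def give(giver: str, receiver: str, fruit: str, count: int) -> None:
    """Transfer fruit, failing loudly if the giver doesn't hold enough."""
    if inventory[giver][fruit] < count:
        raise ValueError(f"{giver} has only {inventory[giver][fruit]} {fruit}")
    inventory[giver][fruit] -= count
    inventory[receiver][fruit] += count

give("Bob", "John", "apples", 2)  # ValueError: Bob has only 0 apples
```

The transfer fails exactly where ChatGPT flagged it; relax the starting-inventory assumption (as a comment below does) and it succeeds.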
According to German news broadcasts, and maybe German meteorologists, a summer day (Sommertag) is any day that reaches at least 25°C. Germany set a new January record of 18.1°C this year, so in another 30 years we might get the first summer day of the year in January.
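Taking those numbers at face value (the 25 °C threshold and the 18.1 °C record come from the comment; the constant-rate extrapolation is purely an assumption for the joke), the 30-year guess implies the record climbing at roughly 0.23 °C per year:

```python
# Back-of-envelope check of the "30 more years" estimate, assuming the
# January record keeps rising at a constant rate (a joke, not a forecast).
summer_day_threshold_c = 25.0  # German "Sommertag": daily high of at least 25 °C
january_record_c = 18.1        # the record cited above
years_remaining = 30

implied_rate = (summer_day_threshold_c - january_record_c) / years_remaining
print(f"Implied rise in the record: {implied_rate:.2f} °C/year")
# -> Implied rise in the record: 0.23 °C/year
```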
Why can't Bob give John 2 apples?
The restriction is only implied: we presume Bob has nothing beyond what the question mentions. But maybe Bob already had some apples. Bad AI. Lol
It did come up with a remarkably human, sassy response to the original question, though.