this post was submitted on 25 Feb 2024
223 points (99.6% liked)

Technology

Judge rebukes law firm using ChatGPT to justify $113,484.62 fee as "utterly and unusually unpersuasive"

[–] [email protected] 25 points 8 months ago* (last edited 8 months ago) (7 children)

I've asked ChatGPT to explain maths to me before. I can't remember exactly what it was, but it was a problem where I knew the answer and was trying to work out the starting value.

It told me the answer and I asked for the explanation. It went something like this (not actual, just a tribute):

  • Step 1: 1 + 1 = 2
  • Step 2: 2 * 2 = 4
  • Step 3: 4 / 8 = 0.5

Me: Uh, the answer is supposed to be 9,000,000.

ChatGPT: Sorry, it seems you are right. Here's the corrected version:

  • Step 1: 1 + 1 = 2
  • Step 2: 2 * 2 = 4
  • Step 3: 4 / 8 = 9,000,000
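That failure mode is easy to catch mechanically, since each step is plain arithmetic. Here's a minimal sketch (my own, not from the chat) that checks each "a op b = claimed" step using only the Python standard library:

```python
# Verify arithmetic steps of the form "<a> <op> <b> = <claimed>".
# This is an illustrative sketch, not anything ChatGPT produced.
import ast
import operator

# Map AST operator nodes to the actual arithmetic functions.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def check_step(step: str) -> bool:
    """Return True if the claimed result of a single binary step is correct."""
    expr, claimed = step.split("=")
    node = ast.parse(expr, mode="eval").body      # a BinOp like `4 / 8`
    result = OPS[type(node.op)](node.left.value, node.right.value)
    return result == float(claimed)

steps = ["1 + 1 = 2", "2 * 2 = 4", "4 / 8 = 9000000"]
print([check_step(s) for s in steps])  # → [True, True, False]
```

The "corrected" last step fails immediately, which is exactly the kind of sanity check the model skipped when it pasted the desired answer onto the old working.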
[–] [email protected] 2 points 8 months ago (4 children)

So it did it correctly but you told it to hallucinate? Or did it just fail from the get-go?

It really isn't great at math, but I've had okay results for equations involving common integrals and trigonometry. It's quite easy to spot the mistakes, and it can lead you to the answer even if the explanation is wrong. Pretty much like how it can hallucinate while programming but still end up being useful.
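One cheap way to spot those mistakes is to check a claimed antiderivative numerically. A minimal sketch (my own example, not from the thread), using only the standard library: compare a midpoint Riemann sum of the integrand against the claimed antiderivative evaluated at the endpoints.

```python
# Numeric spot-check of a claimed antiderivative (illustrative example).
import math

def riemann(f, a, b, n=100_000):
    """Midpoint Riemann sum approximating the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = math.sin                    # integrand
F = lambda x: -math.cos(x)      # claimed antiderivative of sin(x)
a, b = 0.0, math.pi

numeric = riemann(f, a, b)      # ≈ 2.0
claimed = F(b) - F(a)           # exactly 2.0 if the claim is right
print(abs(numeric - claimed) < 1e-6)  # → True
```

If the model had produced the wrong antiderivative, the two numbers would disagree and the check would print False, so you can filter its working without redoing the calculus by hand.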

WolframAlpha is still my go-to though if I'm lazy. But I haven't paid for it in ages.

[–] [email protected] 6 points 8 months ago (3 children)

It was a question where I knew the answer and needed the correct value to put into the equation to get that end result. I wish there were a better way to search previous chats, because it would help if I could remember the context. Anyway, the first time it got the maths right from the starting value to the ending value, but it didn't actually answer the question, because the ending value was not the one I asked for. It was as good as giving me a random answer.

I pointed out that it hadn't answered the question, and that's when it just changed the last step to make it the answer I was looking for. It was supposed to adjust the starting value to make it have the correct outcome.

[–] [email protected] 2 points 8 months ago

Haha yes, eager to please.
