TraschcanOfIdeology@hexbear.net 4 points 22 hours ago (last edited 22 hours ago)

I mean, it could've worked well at the beginning, then gone off the rails for one reason or another.

That's the dumb and scary thing about AI stuff: it might work today, it might work for years (if you're lucky), but every time you execute a prompt, you're rolling the dice on whether the mystery box will decide to just make up some shit from here on out. If you need a person to check the AI's output to make sure it's not hallucinating, you might as well cut the AI out of the loop altogether and use the checker's output from the get-go.
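
Just to spell out what that "person checking the output" workflow actually looks like, here's a minimal Python sketch. `call_model` is a hypothetical stand-in for whatever LLM API you'd use; the point is that every answer still has to pass through a human reviewer, who often ends up writing it anyway:

```python
def call_model(prompt: str) -> str:
    """Hypothetical LLM call -- swap in your actual API client here."""
    return f"(model output for: {prompt})"


def reviewed_answer(prompt: str) -> str:
    """Ask the model, then block until a human accepts or rewrites the output."""
    draft = call_model(prompt)
    print(f"Model draft:\n{draft}\n")
    verdict = input("Accept draft? [y/N] ").strip().lower()
    if verdict == "y":
        return draft
    # The reviewer writes the answer themselves -- which is the point above:
    # the human checker is the actual source of truth.
    return input("Enter corrected answer: ")


if __name__ == "__main__":
    print(reviewed_answer("Summarise the quarterly report"))
```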
