submitted 1 day ago by [email protected] to c/[email protected]

Update: engineers updated the @Grok system prompt, removing a line that encouraged it to be politically incorrect when the evidence in its training data supported it.

[-] [email protected] 7 points 1 day ago* (last edited 1 day ago)

Why is PC even factored in? Shouldn't the LLM just favour evidence from the outset?

[-] [email protected] 9 points 1 day ago

no one understands how these models work, they just throw shit at it and hope it sticks

[-] [email protected] 4 points 1 day ago

Well that's just not true. LLMs really aren't extremely complicated; at the end of the day it's just algorithmic sorting of information

So in practice any given flavor of LLM is basically like a librarian. Your librarian can be a well-adjusted human or an antisemitic nutjob, but as long as they sort information and can point it out to you, technically they're doing their job equally well. The real problem doesn't begin until you've trained the librarian to recommend Mein Kampf when people ask for information about the water cycle or whatever

[-] [email protected] 8 points 1 day ago

I think they meant people don't know how these models work in practice. On a theoretical level they are well understood. But in practice they behave in a chaotic way (chaotic in the math sense of the word): a small change in the input can lead to wild swings in the output. So when people want to change the way the model acts by changing the system prompt, it's basically impossible to say what change should be made to achieve the desired outcome. And often such a change doesn't even exist; only something that's close enough is possible. So they have to resort to trial and error, tweaking things like the system prompt and seeing what happens.
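To illustrate the "small change in, wild swing out" point with a toy analogy (a hash function, not an actual LLM — real models aren't hashes, but the lack of any smooth input-to-output relationship is similar):

```python
import hashlib

def toy_output(system_prompt: str) -> str:
    # Stand-in for a model: deterministic, but with no smooth
    # relationship between the prompt and what comes out.
    return hashlib.sha256(system_prompt.encode()).hexdigest()[:12]

a = toy_output("Be helpful and neutral.")
b = toy_output("Be helpful and neutral!")  # one character changed
print(a, b)  # the two outputs share no obvious relationship
```

You can't work backwards from the output you want to the one-character edit that produces it; all you can do is try edits and look.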

[-] [email protected] 5 points 1 day ago

^-- to my knowledge, this is accurate.

System prompts are the easy but wildly unpredictable way to change LLM output. We really can't back-trace or debug that output; we guess at what impact the system-prompt edits will have.

this post was submitted on 09 Jul 2025
308 points (97.5% liked)
