this post was submitted on 22 Dec 2024
391 points (95.6% liked)

Technology

[–] [email protected] 4 points 15 hours ago* (last edited 11 hours ago) (2 children)

Being able to summarize and answer questions about a specific corpus of text was a use case I was excited for, even knowing that LLMs can't really answer general questions or reason logically.

But if Google search summaries are any indication, they can't even do that. And I'm not just talking about the screenshots people post; this is my own experience with it.

Maybe if you could run the LLM in an entirely different way, such that you enter a question and it tells you which part of the source text statistically correlates most with the words you typed, instead of trying to generate new text. That way, in a worst-case scenario, it just points you to a part of the source text that's irrelevant instead of giving you answers that are subtly wrong or misleading.
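One minimal version of that idea is plain lexical retrieval: score each source passage by bag-of-words cosine similarity against the question and return the best match verbatim, generating nothing. This sketch is illustrative only; the tokenizer, the example `docs`, and the `best_passage` helper are all made up for the demo, and a real system would use something stronger (stemming, TF-IDF weighting, or embeddings).

```python
import math
import re
from collections import Counter

def tokenize(text):
    # Lowercase word tokens; a crude stand-in for real tokenization.
    return re.findall(r"[a-z']+", text.lower())

def cosine(a, b):
    # Cosine similarity between two bag-of-words Counters.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def best_passage(question, passages):
    # Point at the most similar source passage instead of
    # generating any new text.
    q = Counter(tokenize(question))
    scored = [(cosine(q, Counter(tokenize(p))), p) for p in passages]
    return max(scored, key=lambda s: s[0])[1]

docs = [
    "The cache is configured in config/cache.yml.",
    "Database credentials live in the secrets manager.",
    "Logging verbosity is set with the LOG_LEVEL variable.",
]
print(best_passage("where do I set the log level?", docs))
```

The worst failure mode here is exactly the one described above: you get an irrelevant but real passage, never a fabricated answer.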

Even then, I'm not sure the huge computational requirements make it worth it over Ctrl-F or a slightly more sophisticated search algorithm.

[–] [email protected] 3 points 11 hours ago* (last edited 11 hours ago)

Multiple times now, I've seen people post AI summaries of articles on Lemmy that leave out really, really important points.

[–] [email protected] 2 points 14 hours ago* (last edited 11 hours ago)

Well, an example of something I think it could solve would be: "I'm trying to set this application up to run locally. I'm getting this error message. Here are my configuration files. What is not set up correctly, or, if that's not clear, what steps can I take to provide more helpful information?"

ChatGPT is usually okay at that as long as everything is set up according to the most common scenarios, but it also tells you a lot of things that don't apply or are wrong in your specific case. I would like to get answers that are informed by our specific setup instructions, security policies, design standards, etc. I don't want to have to repeat "this is a Java Spring Boot application running on GCP, integrating with Redis on Docker... blah blah blah".
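The repetition part, at least, is easy to factor out on the client side: keep the shared stack description in one place and prepend it to every question as a system message. A quick sketch, where `PROJECT_CONTEXT` and `build_messages` are hypothetical names, not any particular vendor's API:

```python
# Shared context lives in one place instead of being retyped per question.
PROJECT_CONTEXT = (
    "This is a Java Spring Boot application running on GCP, "
    "integrating with Redis on Docker."
)

def build_messages(question):
    # Chat-style message list: the fixed context rides along as a
    # system message so each individual question can stay short.
    return [
        {"role": "system", "content": PROJECT_CONTEXT},
        {"role": "user", "content": question},
    ]

msgs = build_messages("Why does the Redis connection time out on startup?")
print(msgs)
```

Of course this only solves the retyping; getting answers grounded in internal security policies and design standards still means uploading those documents somewhere, which is where the policy problem below comes in.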

I can't say whether it's worth it yet, but I'm hopeful. I might do the same with ChatGPT and custom GPTs, but since I use my personal account for that, uploading company files to it is on very shaky ground, and I couldn't share the result with the team anyway. It's great for questions that don't require company-specific knowledge, but I think I'd be violating company policy if I uploaded anything.

We are encouraged to use NotebookLM, however.