this post was submitted on 27 Jul 2023
240 points (84.1% liked)

Technology

 
[–] [email protected] 3 points 1 year ago (1 children)

Any automated system (cars, LLMs, etc.) only needs to be better than a human doing that job. Your example, for, um, example, ignores that self-driving trucks don’t need to take sleep breaks, or bathroom breaks, or spend time with their families, etc.

I'm not ignoring that, but you're ignoring that these systems have their own associated costs. A self-driving truck might not need bathroom breaks, but it does need regular maintenance, which, given the increased complexity of such a system, is going to be significantly more expensive than for a normal truck and will require more skilled labor. That's why I said it's more expensive, but large companies can make it up in volume. The extra expense only makes sense if you can take advantage of, e.g., the increased transport capacity provided.

Using the assumption that this is the bottom of the curve for this LLM technology and that we still have a lot of expansion in the tech coming in a relatively short amount of time, then I would guess that any job that makes art that is “work for hire” will cease to exist, and I imagine programming is going to take a pretty big hit in available jobs. I don’t think you’ll be able to get rid of human programmers altogether, but you’ll need way fewer of them.

You're assuming that LLMs can ever be made accurate. I think you might be able to make them somewhat more accurate, but you'll never be able to trust their output implicitly. You will always need someone reviewing and fixing what they produce. For something entirely subjective like art that's probably acceptable, but not for anything that requires any amount of accuracy.

As a programmer I am absolutely not worried in the slightest that LLMs are coming for my job. I've seen LLM-produced programs; they're an absolute trash fire, and most of them won't even compile, let alone produce correct output. LLMs might be coming for really, really bad programmers' jobs, but anyone with even a shred of talent has nothing to worry about.

There's a famous saying about programming, often attributed to Tony Hoare, that goes:

You can write a program that's so simple there are obviously no problems, or a program that's so complicated there are no obvious problems.

LLMs are very much an exercise in the latter, not the former. I'm sure there will soon be a bunch of jobs for programmers to check and fix LLM-generated code, but you couldn't pay me to do that job. That's going to be absolutely miserable work, and way harder than just writing the code yourself in the first place. Ultimately, companies are going to figure out it's cheaper to skip the LLM in the first place, and then the whole thing will be dead. One thing's for sure, though: you won't need fewer programmers, you'll need more of them.
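As a hypothetical illustration of that saying (both functions here are invented for the example, not taken from any real LLM output): the first version reads plausibly and runs without errors, but carries the kind of subtle bugs a reviewer can easily miss, while the second is simple enough to be obviously correct.

```python
# Plausible-looking but subtly wrong: "no obvious problems."
def median_buggy(values):
    values.sort()                    # bug: silently mutates the caller's list
    return values[len(values) // 2]  # bug: wrong answer for even-length lists

# Simple enough that there are obviously no problems.
def median(values):
    ordered = sorted(values)         # work on a copy, leave input untouched
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2
```

Both pass a casual read, which is exactly why reviewing someone (or something) else's plausible code is harder than writing it yourself.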

[–] [email protected] 5 points 1 year ago

That’s why I said it’s more expensive, but large companies can make it up in volume. The extra expense only makes sense if you can take advantage of, e.g., the increased transport capacity provided.

Isn't this functionally the same thing? What happens to smaller companies in this hypothetical? Are you not assuming that they get pushed out of the market shortly thereafter?

You’re assuming that LLMs can ever be made accurate. I think you might be able to make them somewhat more accurate, but you’ll never be able to trust their output implicitly.

I am assuming this. I am assuming that we're at the bottom of this technology's sigmoid curve, and that there is going to be a ton of growth in a relatively short amount of time. I guess we'll have to wait to see which one of us has the better prediction.

As a programmer I am absolutely not worried in the slightest that LLMs are coming for my job. I’ve seen LLM-produced programs; they’re an absolute trash fire, and most of them won’t even compile, let alone produce correct output. LLMs might be coming for really, really bad programmers’ jobs, but anyone with even a shred of talent has nothing to worry about.

You have described the state of LLMs right now. Programming languages seem like a perfect fit for an LLM; they're extremely structured and meticulously (well, mostly) defined. The concepts and algorithms used are not overly complex for an LLM, and there doesn't need to be much in the way of novel creativity to create solutions for standard use cases. The biggest difficulty I've seen is just getting the prompting clear enough. I think a majority of the software engineering field is on the chopping block, just like the "art for hire" crowd. People pushing the limits of the field will be safe, but that's a catch-22, isn't it? If low-level entry is impossible, how does one get to be a high-level professional?

And even if we take your [implied] stance that this is the top of the S-curve and LLMs aren't going to get much better, they will still be a useful tool for human programmers, increasing productivity and reducing the number of available jobs.