this post was submitted on 23 Mar 2026
344 points (98.3% liked)
Technology
Me on a beefy Linux distro with a metric ton of garbage installed: 3.2GB
Good luck vibe coding memory efficiency...
Hahahahahah
You no want 6GB LLM running as a system service???
Our Microslop overlords have said that I do, so I guess I do? I don't know. Hang on. I'm asking Copi--- I mean... I'm thinking...
You could download more RAM by buying a Copilot cloud subscription!
Now there's an idea!
16GB RAM. ~3500 packages building up a nightmarish franken-Debian of ~250 apps. 1.5GB RAM usage on boot. optimisation really does wonders
edit: yes, true story btw
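If you want to sanity-check numbers like these on your own Debian-ish box, a rough sketch (assumes a Linux system with `/proc/meminfo`; the dpkg line only applies to dpkg-based distros):

```shell
#!/bin/sh
# Count installed packages (dpkg-based distros only; skipped elsewhere).
command -v dpkg >/dev/null 2>&1 && dpkg -l | grep -c '^ii'

# Estimate RAM actually in use, from /proc/meminfo.
# MemTotal - MemAvailable is a closer match to "real" usage than
# MemTotal - MemFree, since it excludes reclaimable cache.
awk '/^MemTotal/ {t=$2} /^MemAvailable/ {a=$2}
     END {printf "%.1f GB in use\n", (t-a)/1048576}' /proc/meminfo
```

Run it right after boot for a number comparable to the 1.5GB figure above; tools like `free -h` report the same fields.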
is it even possible to vibe code efficient memory use properly?
Possible? Yes. Likely? No.
ok. is it even remotely more cost-effective than doing the old-fashioned optimizing?
Probably not but maybe. It's a gamble that I'm guessing most MBAs would love to make!
i guess a better way is just to make something less RAM-heavy from the get-go.
Good luck with that if it's being vibe coded.
i don't think vibe coding will be that sophisticated any time soon.
As long as they are based on LLMs, I don't think it ever will be. But it will get better at pretending it is and passing tests.
the more i know about LLMs and their numerous issues, the less enthusiastic i am about this whole current AI thing.