this post was submitted on 28 Apr 2025
6 points (80.0% liked)
Programming Circlejerk
161 readers
20 users here now
Community to talk about enlightened programming takes
Rules:
- read and follow the programming.dev code of conduct
- no flamewars
- mark your unjerks
- only programming related content allowed
- link to the original source
- do not mention this community in places like hackernews, lobste.rs, or the general programming communities on Lemmy where we source jerk material from.
founded 2 months ago
MODERATORS
A 70GB code base is not something that’s ever supposed to happen. Pretending that’s okay for any product is the actual madness here.
I guarantee you this is the sort of app that accidentally ships 3 full copies of Google Chrome.
In the comments, they state that it is point cloud data for gaussian splatting. I don't know what that is, but I suspect it is information that may not be best handled by git.
Edit: and high fidelity extended reality content.
Edit3: yeah, it looks like they are putting large amounts of binary data in the repo.
They're putting large amounts of binary data in Git LFS, which is what it was designed for. LFS does have some rough edges though. If I had something really heavy on binary blobs, e.g. a large game or similar, I don't know if I'd be using git either. He extrapolates way too far though; most use cases don't run into any of these problems.
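Unjerk: for anyone unfamiliar, LFS tracking is configured through `.gitattributes` entries (normally written by `git lfs track`), so the big blobs are stored as pointers in the repo and fetched from the LFS store. A minimal sketch — the file extensions here are made up for illustration, not taken from the original post:

```
# .gitattributes — hypothetical patterns for point cloud / splat assets
*.ply filter=lfs diff=lfs merge=lfs -text
*.splat filter=lfs diff=lfs merge=lfs -text
```

With that in place, matching files commit as small pointer files and still revert and branch along with the code.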
And storing all that in a separate DB is insane; it's stuff that should be versioned with the code. It's likely being created at a specific rev for the current code, and it evolves with it. If you git revert or create a PR, it should be part of that. It's not like they committed built binaries or something. There should be tools able to handle this. Git could be one of them if LFS was less rough.
Sounds like he needs to learn about databases (or at least, about the fact that you don't commit the database to version control 😅)