headmetwall

joined 1 year ago
[–] [email protected] 4 points 1 year ago

At least it makes me feel good that I only just got a set for the first time last week and figured all those out within 5 min of actually using them.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

They are, but training models is hard and inference (actually using them) is (relatively) cheap. If you take a GPT-3-size model, you don't always need a full H100 with 80+ GB to run it, when techniques like quantization show you can keep ~99% of its performance at roughly 1/4 the size.

Thus NVIDIA selling this at $3k as an 'AI' card, even though it won't be as fast. If they need top speed for inference though, yeah, the H100 is still the way they would go.
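The size argument is just arithmetic on weight precision. A rough back-of-envelope sketch (assuming a GPT-3-scale 175B-parameter model; the helper name is mine, and this counts weights only, ignoring activations and KV cache):

```python
# Approximate VRAM needed just to hold model weights at a given
# precision. 1 GB is taken as 1e9 bytes for simplicity.

def weight_vram_gb(n_params: float, bits_per_weight: int) -> float:
    """Weight memory in GB for n_params weights at bits_per_weight."""
    return n_params * bits_per_weight / 8 / 1e9

N = 175e9  # GPT-3-scale parameter count

fp16_gb = weight_vram_gb(N, 16)  # 350.0 GB -> multiple 80 GB H100s
int4_gb = weight_vram_gb(N, 4)   # 87.5 GB  -> far more modest hardware
```

Quantizing from fp16 to 4-bit cuts weight memory by 4x, which is why a cheaper card with less VRAM can still serve a big model, just not as fast.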

[–] [email protected] 31 points 1 year ago (8 children)

If it crashes hard, I look forward to all the cheap server hardware that will hit the secondhand market in a few years. One I'm particularly excited about is the RTX 4000 SFF: single slot, 75 W, 20 GB, and ~3070 performance.

[–] [email protected] 3 points 1 year ago

I have them off, but if something comes in that I should respond to, I do send some sort of response back (errands, plans, etc.). I just don't like the feeling that I should respond right now to something just because I saw it, the person on the other end knows I saw it, and they're now wondering why I haven't sent something back yet.

[–] [email protected] 11 points 1 year ago (3 children)

I've been using the app for the past day+ and I only get an ad every 25 posts and none anywhere else, even in comments. I prefer FOSS, and I've been using Linux since 2018, but I don't see why every dev should have to contribute their work for free if they don't want to. At the end of the day, it means more active users on Lemmy. What each person puts on their phone is up to them. (I also might switch to Infinity now that it's out too.)

[–] [email protected] 7 points 1 year ago

Some countries just don't care (even if they have laws on piracy), no vpn needed.
