[–] [email protected] 53 points 1 month ago (2 children)

All of which keep buying his products...

[–] [email protected] 36 points 1 month ago* (last edited 1 month ago) (3 children)

People buy Nvidia no matter what. Even when they aren't the best choice. Then those same people complain about Nvidia doing the anticompetitive things they do.

The best is when people cheer for AMD making something great, only so they can buy an Nvidia card cheaper, as if the only reason AMD exists is to subsidise their Nvidia purchase!

Nvidia's greatest asset is the mindshare they have.

[–] [email protected] 33 points 1 month ago* (last edited 1 month ago) (2 children)

Well, that and CUDA: it still means a load of professionals in various fields are stuck using Nvidia whether they like it or not. That in turn means data centers are incentivised to go with Nvidia if they want those customers, and ultimately, if you're going to work on code/tools that run in those data centers, you want the same architecture on your local machine for development and testing.
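
For anyone who hasn't touched it, the lock-in really is at the source level. Here's a minimal, purely illustrative CUDA kernel (a plain SAXPY, not taken from any real codebase): it builds and runs with nvcc on any Nvidia card, but on anything else it first has to go through a translation layer like HIP, SCALE or ZLUDA.

```
// Illustrative only: the canonical sort of vendor-specific kernel that keeps
// people on Nvidia. Builds with nvcc out of the box; on AMD it needs a
// translation layer (HIP, SCALE, ZLUDA or similar) before it runs at all.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];   // y = a*x + y, one element per thread
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory keeps the demo short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expect 5.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

Multiply that by years of libraries and internal tooling and you get the data-center effect described above.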

It's getting better, but the gap is still real. Hopefully the folks working on SCALE can actually get it working on the CDNA GPUs one day, since data centers are where a lot of the CUDA code actually runs. Or perhaps the UDNA architecture AMD just announced will enable this.

The fact that this all hinges on a third party developing SCALE should highlight that AMD still doesn't seem to be playing the same game as Nvidia, which is why we're still in this position.

[–] [email protected] 17 points 1 month ago* (last edited 1 month ago)

Definitely. CUDA has had a long head start, and Nvidia were very clever in getting it entrenched early on, particularly in universities and such. And it just... generally does the job.

My above comment was purely about the gaming side.

[–] [email protected] 1 points 1 month ago

Wait, didn't ZLUDA legally become a thing?

[–] [email protected] 13 points 1 month ago

100%

"I want change!"

*Doesn't do anything to change*

"Why hasn't anything changed?"

[–] [email protected] 0 points 1 month ago

I would have much preferred giving AMD my money instead, but even with AMD at their best, the lack of DLSS was a meaningful difference back when everyone thought Cyberpunk was the new standard of graphical fidelity, in the 6000/3000-series era.

[–] [email protected] 2 points 1 month ago* (last edited 3 weeks ago)

The linear algebra performed on their GPUs' tensor cores (since the Turing era), combined with their CUDA and cuDNN software stack, delivers the fastest performance for training deep neural networks.

That may not last forever, but it's currently the best in terms of dollars per TOPS that an average DNN developer like me has access to.
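
For the curious, here's a minimal, illustrative sketch of that tensor-core path (not anyone's production code, and real training goes through cuBLAS/cuDNN rather than hand-written kernels): one warp multiplies a single 16x16 half-precision tile and accumulates in float via the WMMA API. Assumes a Volta-or-newer Nvidia GPU (sm_70+), compiled with something like `nvcc -arch=sm_70 wmma_demo.cu`.

```
// Minimal tensor-core sketch: one warp, one 16x16 WMMA tile, FP16 inputs,
// FP32 accumulation. Frameworks hit the same hardware via cuBLAS/cuDNN.
#include <cstdio>
#include <cuda_fp16.h>
#include <mma.h>

using namespace nvcuda;

constexpr int M = 16, N = 16, K = 16;   // one WMMA tile

__global__ void tile_matmul(const half *A, const half *B, float *C) {
    // Declare fragments for A, B and the float accumulator.
    wmma::fragment<wmma::matrix_a, M, N, K, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, M, N, K, half, wmma::row_major> b_frag;
    wmma::fragment<wmma::accumulator, M, N, K, float> c_frag;

    wmma::fill_fragment(c_frag, 0.0f);          // C = 0
    wmma::load_matrix_sync(a_frag, A, K);       // leading dimension = K
    wmma::load_matrix_sync(b_frag, B, N);
    wmma::mma_sync(c_frag, a_frag, b_frag, c_frag);   // C += A * B on tensor cores
    wmma::store_matrix_sync(C, c_frag, N, wmma::mem_row_major);
}

int main() {
    half *A, *B;
    float *C;
    cudaMallocManaged(&A, M * K * sizeof(half));
    cudaMallocManaged(&B, K * N * sizeof(half));
    cudaMallocManaged(&C, M * N * sizeof(float));
    for (int i = 0; i < M * K; ++i) A[i] = __float2half(1.0f);
    for (int i = 0; i < K * N; ++i) B[i] = __float2half(1.0f);

    tile_matmul<<<1, 32>>>(A, B, C);    // exactly one warp drives the tile
    cudaDeviceSynchronize();

    printf("C[0] = %f (expect %d)\n", C[0], K);   // 1*1 summed over K
    cudaFree(A); cudaFree(B); cudaFree(C);
    return 0;
}
```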