this post was submitted on 10 Dec 2024
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
you are viewing a single comment's thread
view the rest of the comments
these are compute GPUs that don't even have graphics ports
Yes, my point is that the compute from those chips can still be used. Maybe for actually useful machine learning tools that get developed later, or for some other technology that can make use of this kind of parallel computing.
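(To make the "other parallel computing" point concrete: these cards run arbitrary CUDA kernels, nothing about them is ML-specific. A minimal sketch, assuming a CUDA-capable card and nvcc; all names here are illustrative, not any particular project's code.)

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// SAXPY: y[i] = a * x[i] + y[i] -- the "hello world" of GPU compute.
// Nothing ML-specific; any embarrassingly parallel job has this shape.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory keeps the sketch short; real code might cudaMalloc + memcpy.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x); cudaFree(y);
    return 0;
}
```

The same launch pattern scales to anything embarrassingly parallel: Monte Carlo runs, image filters, simulations.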
I'm waiting on the A100 fire sale next year
I know of at least one company that uses CUDA for ray tracing, for ground research I believe, so there are definitely already some useful things happening.
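(Ray tracing is a nice example of that: it's one thread per ray doing plain geometry, no graphics ports needed. A toy sketch of the core ray-sphere intersection test; this is the generic textbook formulation, not that company's code, and all names are made up.)

```cuda
#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };

__device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
__device__ Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// One thread per ray: solve |o + t*d - c|^2 = r^2 for t (a quadratic in t).
// Assumes d is normalized (leading coefficient of the quadratic is 1).
// Writes the nearest hit distance, or -1 on a miss.
__global__ void ray_sphere(int n, const Vec3 *orig, const Vec3 *dir,
                           Vec3 center, float radius, float *t_hit) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    Vec3 oc = sub(orig[i], center);
    float b = 2.0f * dot(oc, dir[i]);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * c;
    t_hit[i] = disc < 0.0f ? -1.0f : (-b - sqrtf(disc)) * 0.5f;
}

int main() {
    Vec3 *o, *d; float *t;
    cudaMallocManaged(&o, sizeof(Vec3));
    cudaMallocManaged(&d, sizeof(Vec3));
    cudaMallocManaged(&t, sizeof(float));
    o[0] = {0.0f, 0.0f, 0.0f};           // ray from the origin...
    d[0] = {0.0f, 0.0f, 1.0f};           // ...pointing down +z
    Vec3 center = {0.0f, 0.0f, 5.0f};    // unit sphere 5 units away
    ray_sphere<<<1, 32>>>(1, o, d, center, 1.0f, t);
    cudaDeviceSynchronize();
    printf("t = %f\n", t[0]);            // expect 4.0 (hits the near side)
    cudaFree(o); cudaFree(d); cudaFree(t);
    return 0;
}
```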
I mean, there are a lot of applications for linear algebra, although I admit I don't fully know in what way "AI" uses linear algebra or which other uses overlap with it.
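(The overlap is actually pretty direct: a neural-net layer is roughly y = W·x + b, so "AI" spends most of its time on dense matrix multiplication, and that's the same GEMM primitive that physics sims, statistics, and graphics lean on. A deliberately naive sketch of that shared kernel; in practice you'd call cuBLAS, and everything here is illustrative.)

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Naive GEMM: C = A * B for square n x n matrices, one thread per output cell.
// A neural-net layer, a simulation solve, and a graphics transform all bottom
// out in some heavily tuned version of exactly this loop.
__global__ void matmul(int n, const float *A, const float *B, float *C) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row >= n || col >= n) return;
    float acc = 0.0f;
    for (int k = 0; k < n; ++k)
        acc += A[row * n + k] * B[k * n + col];
    C[row * n + col] = acc;
}

int main() {
    const int n = 256;
    float *A, *B, *C;
    cudaMallocManaged(&A, n * n * sizeof(float));
    cudaMallocManaged(&B, n * n * sizeof(float));
    cudaMallocManaged(&C, n * n * sizeof(float));
    for (int i = 0; i < n * n; ++i) { A[i] = 1.0f; B[i] = 1.0f; }

    dim3 block(16, 16), grid((n + 15) / 16, (n + 15) / 16);
    matmul<<<grid, block>>>(n, A, B, C);
    cudaDeviceSynchronize();

    printf("C[0] = %f\n", C[0]);  // every entry should be n = 256
    cudaFree(A); cudaFree(B); cudaFree(C);
    return 0;
}
```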