RVA23 is a big deal because it lets the big players (e.g. Google, Amazon, Meta, OpenAI, Anthropic, and others) avoid vendor lock-in for their specialty in-house software that's been tuned to hell and back (not just AI stuff). Because RVA23 is a standard baseline profile, they can tune that software against one generic platform to the nth degree and then switch chips later without having to redo that level of tuning.
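Roughly what that looks like in practice (a sketch of mine, assuming a recent GCC/Clang that accepts profile names in -march; the exact flag spelling and minimum toolchain versions are assumptions, check your toolchain docs):

```c
/* saxpy.c - ordinary portable C, built once against the RVA23 baseline.
 *
 * Assumed build command (profile names in -march need a fairly new
 * GCC/Clang; exact spelling and version support may vary):
 *
 *   riscv64-linux-gnu-gcc -O3 -march=rva23u64 -mabi=lp64d -c saxpy.c
 *
 * Any core that implements the RVA23 profile can run the result, so the
 * tuning work doesn't have to be redone when switching chip vendors.
 */
void saxpy(long n, float a, const float *x, float *y)
{
    /* The compiler can auto-vectorize this with RVV (mandatory in RVA23)
     * without the source code caring whose silicon it runs on. */
    for (long i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}
```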
The other big reason RISC-V matters right now is energy efficiency. Cooling is a huge slice of a data center's operating cost (figures around 40% get cited). By using right-sized RISC-V chips in their servers, operators can save a ton of money on cooling. Compare that to, say, an Intel Xeon, where the chip wastes energy on piles of unused extensions and sub-architecture machinery (thank Transmeta for that). Every unused part of a huge, power-hungry chip like a Xeon eats power and generates heat.
Don't forget that the vector extension (RVV) is also mandatory in RVA23. That's just as big a deal as the virtualization/hypervisor stuff, because AI, which leans heavily on vector math, is now the status quo for data center computing.
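For a sense of what that vector mandate means in code, here's a minimal dot-product sketch using the RVV C intrinsics (dot products being the core operation in most AI math). The intrinsic names follow the ratified RVV intrinsics spec, but treat the exact spellings and compiler support as assumptions and check riscv_vector.h on your toolchain:

```c
#include <riscv_vector.h>
#include <stddef.h>

/* Dot product with the RVV C intrinsics. Strip-mining: each pass asks the
 * hardware how many elements it can handle (vl), so the same binary scales
 * from narrow to very wide vector units. */
float dot_rvv(const float *x, const float *y, size_t n)
{
    size_t vlmax = __riscv_vsetvlmax_e32m1();
    /* Vector accumulator, zero-initialized across all vlmax lanes. */
    vfloat32m1_t acc = __riscv_vfmv_v_f_f32m1(0.0f, vlmax);

    for (size_t i = 0; i < n; ) {
        size_t vl = __riscv_vsetvl_e32m1(n - i);
        vfloat32m1_t vx = __riscv_vle32_v_f32m1(x + i, vl);
        vfloat32m1_t vy = __riscv_vle32_v_f32m1(y + i, vl);
        /* Tail-undisturbed (_tu) so lanes past vl keep their partial sums. */
        acc = __riscv_vfmacc_vv_f32m1_tu(acc, vx, vy, vl);
        i += vl;
    }

    /* Reduce the accumulator lanes to a single scalar. */
    vfloat32m1_t zero = __riscv_vfmv_v_f_f32m1(0.0f, vlmax);
    vfloat32m1_t sum  = __riscv_vfredusum_vs_f32m1_f32m1(acc, zero, vlmax);
    return __riscv_vfmv_f_s_f32m1_f32(sum);
}
```

The same loop runs unchanged whether the vector unit is 128 bits or 1024 bits wide, which is a big part of why baking RVV into the profile matters.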
My prediction is that AI workload acceleration will soon become a necessary feature in desktops and laptops too, but not because of anything Microsoft integrates into its OS and Office suite (e.g. Copilot). It'll be because of Internet search and gaming.
Using an AI to search the Internet is such a vastly superior experience that nobody who's tried it will want to go back. And for it to work well, it needs to run queries on the user's behalf locally, not in Google's or Microsoft's cloud.
There's no way end users will pay for an inferior product that only serves search results from a single company. Microsoft's solution, if they ever make one, will surely be Bing-only and would never bother to search multiple engines simultaneously.