this post was submitted on 07 Jan 2025
157 points (95.9% liked)

Games

top 50 comments
[–] [email protected] 27 points 1 day ago

I'll just keep buying AMD thanks.

[–] [email protected] 48 points 1 day ago (7 children)

They're not even pretending to be affordable any more.

[–] [email protected] 8 points 17 hours ago

Nvidia is just doing what every monopoly does, and AMD is playing into it like they did with Intel on CPUs. They'll keep competing on price-to-performance for a few years, then release something that puts them back on top (or at least close to it).

[–] [email protected] 9 points 1 day ago (1 children)

Buy 800 of those or buy a house. Pick.

[–] [email protected] 5 points 1 day ago (1 children)

I can't do either of those. I chose Option C, a hospital visit.

[–] [email protected] 5 points 1 day ago

Look at money bags over here. He can go to the HoSpItAl.

[–] [email protected] 55 points 2 days ago (1 children)

"By rendering only 25% of the frames, we made DLSS 4 100% faster than DLSS 3, which renders only 50% of the frames!" - NVIDIA, unironically

[–] [email protected] 25 points 2 days ago* (last edited 2 days ago)

You're living in the past; rendering 100% of the frames is called Brute Force Rendering, and that's for losers.

With only 2k trump coins our new graphics card can run Cyberpunk 2077, a game from 4 years ago, at 30 fps with RTX ON, but with DLSS and all the other ~~crap~~ magic we can run at 280 FPS!!! Everything is blurry and ugly as fuck, but look at the numbers!!!

[–] [email protected] 68 points 2 days ago (5 children)

Maybe I'm stuck in the last decade, but these prices seem insane. I know we've yet to see what a 5050 (lol) or 5060 will be capable of, or what it will cost. But launching at $549 for your cheapest card means a significant portion of the consumer base won't be able to afford any of these.

[–] [email protected] 1 points 7 hours ago

Don't forget to mention the huge wattage.

To me, more performance means more fps at the same amount of power.

[–] [email protected] 3 points 20 hours ago (1 children)

You have to keep inflation in mind. $550 today is roughly $450 in 2019 dollars.
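For anyone who wants to sanity-check that, here's a quick sketch of the deflation math in Python (the CPI index values below are approximate assumptions for illustration, not figures from this thread):

```python
# Approximate U.S. CPI-U annual index values (assumed for illustration).
CPI_2019 = 255.7
CPI_2024 = 313.7  # implies roughly +23% inflation from 2019 to 2024

def to_2019_dollars(price_today: float) -> float:
    """Deflate a current price back to 2019 purchasing power."""
    return price_today * CPI_2019 / CPI_2024

print(round(to_2019_dollars(550)))  # 448, i.e. roughly $450
```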

[–] [email protected] 1 points 17 hours ago

Yeah, I keep forgetting how much time has passed.

Bought my first GPU, an R9 Fury X, for MSRP when it launched. The R9 300 series and GTX 900 series seemed fairly priced then (aside from the Titan X). Bought another for Crossfire and mining, holding on until I upgraded to a 7800 XT.

Comparing prices, all but the 5090 are within $150 of each other when accounting for inflation. The 5090 is stupid expensive. A $150 increase in price over a 10-year period probably isn’t that bad.

I’m still gonna complain about it and embrace my inner “old man yells at prices” though.

[–] [email protected] 28 points 2 days ago* (last edited 2 days ago) (2 children)

Sadly I think this is the new normal. You could buy a decent GPU, or you could buy an entire game console. Unless you have some other reason to need a strong PC, it just doesn't seem worth the investment.

At least Intel are trying to keep their prices low. Until they either catch on, in which case they'll raise prices to match, or they fade out and leave everyone with unsupported hardware.

[–] [email protected] 2 points 20 hours ago

As always, buying a used previous gen flagship is the best value.

[–] [email protected] 19 points 2 days ago (6 children)

Actually, AMD has said they're ditching their high-end options to focus on budget and midrange cards. AMD has also promised better ray-tracing performance (compared to their older cards), so I don't think this will be the new norm if AMD prices their cards competitively with Intel's. The high-end cards will stay overpriced, since the target audience apparently doesn't care that they're paying a shitton of money. But the budget and midrange segments might slip away from Nvidia and get cheaper, especially if the upscaler crutch breaks and devs have to start actually optimizing their games.

[–] [email protected] 21 points 2 days ago (9 children)

They'll sell out anyway due to the lack of good competition. Intel is getting there but still has driver issues, and AMD hasn't announced their GPU prices yet, but their entire strategy is following Nvidia and lowering the price by 10% or so.

[–] [email protected] 47 points 2 days ago (2 children)

This is absolutely 3dfx levels of screwing over consumers: it's all about faking frames to hit their "performance" numbers.

[–] [email protected] 33 points 2 days ago (5 children)

They aren't making graphics cards anymore, they're making AI processors that happen to do graphics using AI.

[–] [email protected] 5 points 1 day ago

"T-BUFFER! MOTION BLUR! External power supplies! Wait, why isn't anyone buying this?"

[–] [email protected] 72 points 2 days ago* (last edited 2 days ago) (1 children)

The performance-improvement claims are a bit shady: they compare the old frame-generation technique, which creates only one frame for every legit frame, with the next-gen FG, which can generate up to 3.

All the Nvidia performance plots I've seen only mention this at the bottom, making the comparison look very favorable to the 5000-series GPUs.
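To put numbers on that complaint: the share of displayed frames the GPU actually renders drops as more frames are interpolated. A minimal sketch of the ratio (my own framing, not Nvidia's terminology):

```python
def rendered_fraction(generated_per_rendered: int) -> float:
    """Fraction of displayed frames the GPU actually renders,
    given how many frames are generated per rendered frame."""
    return 1 / (1 + generated_per_rendered)

print(rendered_fraction(1))  # 0.5  - old FG: one generated frame per legit frame
print(rendered_fraction(3))  # 0.25 - new FG: up to three generated frames
```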

Edit:

[–] [email protected] 28 points 2 days ago (9 children)

Thanks for the heads up.

I really don't like that new frame-interpolation tech; I think it's useful almost only to marketers, not for actual gaming.

At the very least, I wouldn't touch it in any competitive game.

Hopefully we'll get third-party benchmarks soon, without the bullshit perf numbers from Nvidia.

[–] [email protected] 42 points 2 days ago (9 children)

LOL, their demo shows Cyberpunk running at a mere 27fps on the 5090 with DLSS off. Is that supposed to sell me on this product?

[–] [email protected] 16 points 1 day ago (1 children)

I'm sure these will be great options in 5 years when the dust finally settles on the scalper market and they're about to roll out RTX 6xxx.

[–] [email protected] 14 points 1 day ago* (last edited 1 day ago) (2 children)

Scalpers were basically nonexistent in the 4xxx series. They're not some boogeyman that always raises prices: they operate under certain market conditions, conditions that don't currently exist in the GPU space, and there's no particular reason to think this generation will be much different from the last.

Maybe on the initial release, but not for long after.

[–] [email protected] 1 points 16 hours ago

Scalpers were basically non existent in the 4xxx series.

Bull fucking shit. I was trying to buy a 4090 for like a year. Couldn't find anything even approaching retail. Most were $2.3k+.

[–] [email protected] 10 points 1 day ago* (last edited 1 day ago) (1 children)

The 4090 basically never went for MSRP until Q4 2024... and now it's OOS everywhere.

Nobody scalped the 4080 because its price/perf was shit. It was 75% of the price of a 4090, too... so why not just pay the extra 25% and get the best?

The 4070 Ti (aka the base 4080) was too pricey to scalp: once you start cranking up the price, why not just pay the scalper fee for a 4090 instead?

Things below that aren't worth scalping.

[–] [email protected] 0 points 16 hours ago (1 children)

The 4090 basically never went for MSRP until Q4 2024

This had nothing to do with scalpers though. Just pure corporate greed.

[–] [email protected] 1 points 26 minutes ago

I'm not so sure. Companies were definitely buying many up, but they typically stick to business purchasing channels like CDW/Dell/HP etc.

Consumer boxed cards sold by retailers might have gone to some small businesses/startups and independent developers, but they were largely picked up by scalpers or gamers.

I work in IT and have never gone to a store to buy a video card unless it was an emergency to get a system functional again. It's vastly preferable to buy through a VAR, where warranties and support are much more robust than in consumer channels.

[–] [email protected] 9 points 1 day ago (1 children)

The Far Cry benchmark is the most telling. Looks like it's around a 15% uplift based on that.

[–] [email protected] 3 points 1 day ago (2 children)

About two months ago I upgraded from 3090 to 4090. On my 1440p I basically couldn't tell. I play mostly MMOs and ARPGs.

[–] [email protected] 7 points 1 day ago

Shouldn't have upgraded then…

[–] [email protected] 2 points 1 day ago* (last edited 1 day ago)

Those genres aren't really known for brutal performance requirements. You have to play the bleeding-edge stuff that adds prototype graphics post-processing in its ultra or optional settings.

When you compare non-RT performance, the frame delta is tiny. When you compare RT, it's a lot bigger. I think most of the RT implementations today are very flawed, largely snake oil so far, but some people are obsessed.

I will say you can probably undervolt / underclock / power-throttle that 4090 and get great frames per watt.

[–] [email protected] 13 points 2 days ago (3 children)

Two problems, and they're big ones:

  1. The hardware is expensive for a marginal improvement
  2. The games that best leverage features like ray tracing are also expensive and not good