this post was submitted on 08 Jan 2025
37 points (91.1% liked)

Out of the loop

A community that helps people stay up to date with things going on.

I saw a meme about something called "fake frames", but I don't know what happened.

top 19 comments
[–] [email protected] 25 points 1 day ago* (last edited 1 day ago) (4 children)

Fake frames is "frame generation"; for Nvidia it's called ~~DLSS.~~

Rather than having the graphics card create 120 frames, you can crank the settings up to where you only get 60, then AI "guesses" what the next frame would show, doubling it to 120 while keeping the higher settings.

This can make things blurry because the AI may guess wrong. So every odd frame is real, every even frame is just a guess.

Frame 1: real

Frame 2: guess

Frame 3: real

If the guess for #2 is accurate, everything is cool. If #2 guesses that a target moved left when it actually moved right, then #3 corrects it, and that "blink" is the problem.
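
A toy sketch of that interleaving (heavily simplified: the real thing uses motion vectors and a neural network, while this just averages neighbouring rendered frames, and the frame data is made up):

```python
# Toy model of interpolation-style frame generation.
# The GPU renders half the frames; each in-between frame is "guessed"
# from its two rendered neighbours (here: a naive average).

def guess_between(prev_frame, next_frame):
    return [(a + b) / 2 for a, b in zip(prev_frame, next_frame)]

rendered = [[0.0], [1.0], [2.0], [3.0]]   # "real" frames (1-pixel images)

displayed = []
for prev, nxt in zip(rendered, rendered[1:]):
    displayed.append(prev)                      # real
    displayed.append(guess_between(prev, nxt))  # guess
displayed.append(rendered[-1])                  # final real frame

print(displayed)   # real, guess, real, guess, ... -> roughly 2x the frame count
```

Note that the guessed frame can only be shown once the *next* rendered frame exists, which is where the extra input lag people complain about comes from.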

The bigger issue is developers relying on that tech so they don't have to optimize their code. So rather than DLSS being an extra oomph, it's going to be required for "acceptable" performance.

[–] [email protected] 24 points 1 day ago (2 children)

Not to be nitpicky but DLSS is a different technology than frame generation, though it also involves AI guessing - just in a different way. DLSS (Deep Learning Super Sampling) means rendering the game at a lower resolution than your screen's output, then having it upscaled to the correct resolution via AI. This is much more performance friendly than native rendering and can often lead to a better looking visual end product than turning graphics features off and rendering natively - though it will depend on the game, genre and personal preference.
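
As a rough illustration of why that's cheaper (the 1440p-to-4K "quality" ratio below is just an assumed example, not an official spec):

```python
# Back-of-the-envelope pixel math behind upscaling: shade fewer pixels,
# then let the upscaler fill in the display resolution.

native_4k  = 3840 * 2160    # pixels shaded per frame at native 4K
render_res = 2560 * 1440    # assumed internal resolution for "quality" upscaling

print(f"~{1 - render_res / native_4k:.0%} fewer pixels shaded per frame")  # ~56%
```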

Frame generation is as you described. Worth noting is that DLSS without frame generation doesn't suffer issues like artifacts and input lag in the same manner as FG turned on. Frame generation also works better the higher your base frame rate is, so it's a bit of a "win-more". Using FG to go from 30 to 60 FPS will feel much worse than using it to go from 60 to 120.

The fake frames memes I believe stem from the updated frame generation technology in the 50 series guessing three frames at a time instead of one. So in effect you'll end up with a majority of the frames you see being "fake".
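
Two quick bits of arithmetic behind those last two points (numbers are illustrative only):

```python
# 1) Why base frame rate matters: interpolation needs the next rendered
#    frame before it can show the guessed one, so the rendered frame time
#    is roughly the extra delay you're working with.
for base_fps in (30, 60):
    print(f"{base_fps} fps base -> ~{1000 / base_fps:.0f} ms per rendered frame")

# 2) Why "majority fake": share of displayed frames that were actually rendered.
for generated_per_real in (1, 3):
    share = 1 / (1 + generated_per_real)
    print(f"{generated_per_real} generated per rendered frame -> {share:.0%} real")
```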

[–] [email protected] 6 points 1 day ago* (last edited 1 day ago)

On the other hand, NVIDIA started consolidating all of these technologies under the NVIDIA DLSS suite a few months ago, for some reason.

So it's DLSS Super Resolution, DLSS Frame Generation, DLSS Ray Reconstruction and so on, with the exception of DLAA. Probably because that would get too stupid even for them.

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago)

DLSS is a different technology than frame generation

Thanks! Got them mixed up

Generating multiple fake frames instead of just one I hadn't heard about either. I already leave it off on my 4070 Super because 1:1 is already bad enough.

[–] [email protected] 17 points 1 day ago (1 children)

To add on to this, the 5000 series now generates 3 fake frames per real frame instead of just 1.

[–] [email protected] 1 points 1 day ago (2 children)

Is “fake” being used as a pejorative here?

[–] [email protected] 3 points 21 hours ago (1 children)

I was just using the term that the previous commenter used to keep terms consistent.

[–] [email protected] 2 points 21 hours ago

Yeah not sure if there’s a better word to use without coming across as pedantic.

Fake certainly implies these are worse (which they of course are), but I’m not sure if they’re that much worse. I think in many scenarios the proverbial juice would absolutely be worth the squeeze, but naysayers seem to disagree with that sentiment.

[–] [email protected] 1 points 22 hours ago
[–] [email protected] 8 points 1 day ago (2 children)

Can someone explain how AI can generate a frame faster than the conventional method?

[–] [email protected] 4 points 23 hours ago

It is image processing with statistics rather than traditional rendering; it's a completely separate process. Nvidia GPUs (and the upcoming AMD ones too) have hardware built into the chip specifically for this.
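
A minimal way to see the difference: the guessed frame is produced from pixel data that already exists, not by re-running the whole rendering pipeline. The naive blend below is nowhere near what the dedicated hardware actually does (optical flow plus a neural network); it's only meant to show that it's an image-space operation:

```python
import numpy as np

# Two already-rendered 4K frames (random data standing in for real images).
h, w = 2160, 3840
frame_a = np.random.rand(h, w, 3).astype(np.float32)
frame_b = np.random.rand(h, w, 3).astype(np.float32)

# "Generating" an in-between frame touches only existing pixel buffers:
# no geometry, no lighting, no game logic.
guessed = 0.5 * frame_a + 0.5 * frame_b

print(guessed.shape)   # (2160, 3840, 3)
```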

[–] [email protected] -1 points 1 day ago (1 children)

(that's part of the grift)

[–] [email protected] 3 points 1 day ago (1 children)

Which part? I mean even if it isn't generating the frames well, it's still doing the work. So that capability is there. What's the grift?

[–] [email protected] 2 points 22 hours ago* (last edited 22 hours ago)

That it's reliable. The key point they're selling is that devs don't need to optimize their engines as much, of course obfuscated under a lot of other value-adds.

I'd go further than this and say part of the problem is that optimizing code isn't a focus anymore in general. Apps which merely interface with web APIs are sometimes more than 90 MB. That's embarrassing.

That an AI can step in as a savior for poor coding practices is really a bandage stuck over the root cause.

[–] [email protected] 3 points 1 day ago

I see, thank you

[–] [email protected] 11 points 1 day ago (1 children)

I saw a graphic the other day comparing the number of frames generated between the 40 and 50 series, and people in the comments were saying that the 50 series uses AI frame generation to speed things up.

People in the know would know that AI is largely hype, and the generated frames probably don't look as good as if they had been properly rendered

[–] [email protected] 3 points 22 hours ago (2 children)

Yeah, but if you have a high refresh rate monitor and you want 4K plus 240 Hz, then you probably need this.
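
Rough numbers on that (the ~70 fps native figure is just an assumed example, not a benchmark):

```python
# How many generated frames per rendered frame it takes to fill a 240 Hz
# display if the card only manages ~70 fps natively at 4K (assumed figure).
native_fps = 70
for generated_per_real in (0, 1, 2, 3):
    total = native_fps * (1 + generated_per_real)
    print(f"{generated_per_real} generated per rendered frame -> ~{total} fps")
```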

[–] [email protected] 7 points 21 hours ago

What's the point of having it in 4K if it's slathered to hell and back with vaseline?

[–] [email protected] 2 points 17 hours ago

That's what VRR is for. Plus, bolstering marketing graphs with fake frames is just dishonest.