this post was submitted on 07 Sep 2023
433 points (93.9% liked)
PC Master Race
Correct me if I'm wrong, but don't they limit frametimes so they can reduce TV stuttering? The NTSC standard for TVs is 29.97 or 59.94 fps. I assume they chose 30fps so it can be used more widely, and if it's scaled to 60 it would just increase frametime lag. Again, I'm not sure.
Also, comparing CE2 to CE1 is like comparing UE5 to UE4. Also, I don't remember exactly, but doesn't Starfield use the Havok engine for animations?
Edit: rather than downvote, just tell me where I'm wrong.
Not to put too fine a point on it, but you're wrong because your understanding of frame generation and displays is slightly flawed.
Firstly, most people's displays, whether a TV or a monitor, are at least capable of 60 Hz, which it seems you correctly assumed. That said, most TVs and monitors aren't capable of what's called variable refresh rate. VRR allows the display to match however many frames your graphics card can put out, instead of the graphics card having to match your display's refresh rate. This eliminates screen tearing and gives you the best frame times at your disposal, since each frame is generally created and then immediately displayed.
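To illustrate that difference (a toy sketch, not any real display API: the function names and timings here are made up for the example): on a fixed 60 Hz display a finished frame waits for the next scheduled refresh, while a VRR display can show it the moment it's ready.

```python
import math

# Hypothetical sketch of fixed-refresh vs. VRR presentation timing.
REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh

def display_time_fixed(frame_ready_ms):
    """Fixed refresh: the frame waits for the next scheduled refresh tick."""
    return math.ceil(frame_ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def display_time_vrr(frame_ready_ms):
    """VRR: the display refreshes as soon as the frame is ready."""
    return frame_ready_ms

for ready in (5.0, 20.0, 40.0):
    print(f"frame ready at {ready:4.1f} ms -> "
          f"fixed 60 Hz shows it at {display_time_fixed(ready):5.2f} ms, "
          f"VRR shows it at {display_time_vrr(ready):4.1f} ms")
```

A frame finished at 20 ms sits until the 33.33 ms refresh on a fixed display, but appears at 20 ms with VRR.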
The part you might be mistaken about, from my understanding, is the frame time lag. Frame time is the inverse of FPS: the more frames generated per second, the less time between frames. Now, when there's no VRR and the frame rate doesn't align with the display's native rate, frames can be misaligned. This happens when the display expects a frame that isn't ready yet; it reuses the previous frame (or part of it) until a new frame becomes available. This can result in screen tearing or stuttering, and yes, in some cases it adds extra delay between frames. In general, though, a >30 FPS framerate will feel more responsive on a 60 Hz display than a locked 30 FPS, where every frame is guaranteed to be held for two full refreshes.
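To put numbers on that (a toy illustration, assuming a 60 Hz display without VRR): frame time in milliseconds is just 1000 divided by FPS, and a rate like 45 fps lands unevenly on the 16.7 ms refresh ticks.

```python
import math

def frame_time_ms(fps):
    # frame time is the inverse of frame rate
    return 1000.0 / fps

print(frame_time_ms(30))  # ~33.33 ms between frames
print(frame_time_ms(60))  # ~16.67 ms
print(frame_time_ms(45))  # ~22.22 ms -- doesn't divide evenly into 60 Hz refreshes

# On a 60 Hz display without VRR, each frame appears at the next refresh tick.
REFRESH_MS = 1000 / 60
ready = [5 + i * frame_time_ms(45) for i in range(6)]            # when frames finish
shown = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in ready]  # when they appear
holds = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
print(holds)  # a mix of ~16.7 ms and ~33.3 ms holds: uneven pacing (judder)
```

At a locked 30 fps every frame is held for exactly two refreshes, which is even; at an unlocked 45 fps the holds alternate between one and two refreshes, which is what you perceive as stutter.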
Thanks, I was recently reading about monitor interlacing and I must have jumbled it all up.
Todd said they capped it at 30 for fidelity (i.e. high quality settings). The Series X supports variable refresh rate if your TV can use it (no tearing), and it chooses an applicable refresh rate, which you can also override. All TVs support 60 Hz, many support 120, and VRR is gaining traction too.
Take Remnant II: it has settings for Quality (30), Balanced (60), and Uncapped. Pick what you like.
CE is still CE: the same floaty NPCs, hitting through walls, and bad utilisation of hardware have been there for ages. They can't fix it, so it's likely tech debt. They need to start fresh or jump to an already-working modern engine.
That's for movies. I don't remember why, but films can be fine at 30fps. Games are kinda horrible at 30fps, and every TV I know of has a 60Hz or higher refresh rate for PC signals.
IIRC it's just because we're used to the lower framerate in movies. If you look up some 60 FPS videos on YouTube, you'll notice how much smoother they look.
Personally, I'd wish sports broadcasts would be in 60 FPS by default. Often the action is so fast that 30 FPS just isn't enough to capture it all.
Higher framerates make things look more real.
This is fine if what you're looking at is real, like a football match, but what the likes of The Hobbit showed us is that what you're actually looking at is Martin Freeman with rubber feet on. And that was just 48fps.
24fps cinema hides all those sins. The budget of the effects department is already massive. It's not ready to cover all the gaps left by higher framerates.
Even in scenes with few effects the difference can be staggering. I saw a clip from some Will Smith war movie (Gemini Man, I think), and the 120fps mode makes the same scene look like a bunch of guys playing paintball at the local club.
Movies have some blur in their frames. Lots of directors have justified this on the basis of looking more "dreamy". Whether you buy that or not, the effect tends to let lower fps look like smooth motion to our eyes.
It also helps that they lock in that framerate for the whole movie. Your eyes get used to that. When games suddenly jump from 120fps down to 65fps, you can notice that as stutter. Past a certain point, consistency is better than going higher.
Starfield on PC, btw, is a wildly inconsistent game, even on top tier hardware. Todd can go fuck himself.
24fps in films looks okay because we're used to it. Early Hollywood had to limit framerates because film stock wasn't cheap.
60fps is better for gaming because it allows the game to be more responsive to user input.