this post was submitted on 16 Sep 2024
79 points (92.5% liked)
PC Gaming
Yep. As more people buy GPUs capable of machine-learning upscaling (the band-aid), the more likely developers are to lean on it instead of spending time actually improving performance.
I see it the most in Unreal Engine games. Unreal Engine lets devs build a "realistic"-looking game quickly, but performance is often left in the dirt. UE also ships with some of the worst anti-aliasing out of the box, so DLSS, for example, becomes a catch-all to boost framerates and provide some AA, but what you actually get is a lot of blur and poor graphical fidelity. The issues probably aren't noticeable at higher resolutions like 4K (which is maybe what the devs work at), but the majority of people still play at 1080p.
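For anyone who wants to fight back a bit: many UE4/UE5 games let you override the engine's default TAA through the game's Engine.ini. This is just a rough sketch of the kind of tweak people use; the exact cvar names, the config path, and whether a given game honours them vary by engine version and title, so treat it as an example rather than a guaranteed fix for Satisfactory specifically.

```ini
; Sketch of a user-side Engine.ini override (typically somewhere under
; %LOCALAPPDATA%\<GameName>\Saved\Config\ for packaged Windows builds).
; Cvar names differ between UE4 and UE5, and some games ignore or lock them.
[SystemSettings]
; UE5 anti-aliasing method: 0=off, 1=FXAA, 2=TAA, 3=MSAA, 4=TSR
r.AntiAliasingMethod=1
; Render at native resolution instead of letting an upscaler drop it
r.ScreenPercentage=100
; A little post sharpening to offset any remaining temporal blur
r.Tonemapper.Sharpen=0.5
```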
Oops, sorry for the rant! I just got pissed off with it again recently in Satisfactory!