this post was submitted on 16 Mar 2024
263 points (74.2% liked)

linuxmemes

    It's necessary for my very important hobby of generating anime nudes.

    [–] [email protected] 17 points 7 months ago (4 children)

    Brother of "I need Nvidia for raytracing" while only playing last-decade games.

    [–] [email protected] 9 points 7 months ago

    I completely unironically know people who bought a 4090 exclusively to play League

    [–] [email protected] 7 points 7 months ago

    Not gonna lie, raytracing is cooler on older games than it is on newer ones. Newer games use a lot of smoke and mirrors to simulate raytracing, which means raytracing isn't as obvious an upgrade, or can even be a downgrade depending on the scene. Older games, however, don't have as much smoke and mirrors, so raytracing can offer more of an improvement.

    Also, stylized games with raytracing are 10/10. Idk why, but applying RTX to highly stylized games always looks way cooler than applying it to games with realistic graphics.

    [–] [email protected] 4 points 7 months ago* (last edited 7 months ago)

    Quake 2 does look pretty rad in RTX mode.

    [–] [email protected] 3 points 7 months ago (1 children)

    Playing old games with ray tracing is just as amazing as playing new games with ray tracing. I know Quake RT gets too dark to play halfway through; they should have added light sources in those areas.

    Then again, I played through Cyberpunk 2077 at 27fps before the 2.0 update. Control was pretty good at 50fps, and I couldn't recommend Portal enough at about 40fps on my 2070 Super. I don't know if Teardown leveraged RT cores, but Digital Foundry said it ran better on Nvidia, and I played through that game at 70fps.

    I love playing with new technologies. I wish graphics card prices had stayed down, because RT is too heavy nowadays for my first-gen RT card. I play newer games with RT off and most settings turned down because of it.

    [–] [email protected] 2 points 7 months ago* (last edited 7 months ago) (1 children)

    > I love playing with new technologies. I wish graphics card prices had stayed down, because RT is too heavy nowadays for my first-gen RT card. I play newer games with RT off and most settings turned down because of it.

    I wish they'd stayed down because VR has the potential to bring back CrossFire/SLI. Nvidia's GameWorks already has support for using two GPUs to render different eyes, and supposedly, when properly implemented, it results in nearly a 2x increase in fps. However, GPUs are way too expensive right now for people to buy two of them, so afaik there aren't any VR games that support splitting rendering between two GPUs.

    VR games could be a hell of a lot cooler if having two GPUs were widely affordable and developers built for them, but instead VR is being held back by single-GPU performance.
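
    For anyone curious what "one GPU per eye" looks like at the API level: this isn't Nvidia's actual GameWorks path, just a rough, hypothetical C++ sketch using Vulkan 1.1 device groups (which are a real, core feature). It only enumerates linked GPU groups and notes in comments where per-eye device masks would go; a full renderer is obviously way more than this.

    ```cpp
    // Minimal sketch, assuming Vulkan 1.1 headers and loader are installed.
    // Enumerates physical device groups and shows (in comments) where a
    // per-eye device mask would be set. Illustrative only.
    #include <vulkan/vulkan.h>
    #include <cstdio>
    #include <vector>

    int main() {
        VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
        app.apiVersion = VK_API_VERSION_1_1;  // device groups are core in 1.1

        VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
        ici.pApplicationInfo = &app;

        VkInstance instance;
        if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
            std::fprintf(stderr, "no Vulkan 1.1 instance available\n");
            return 1;
        }

        // A "device group" is a set of physical GPUs that can be driven as one
        // logical device, with each GPU addressed by a bit in a device mask.
        uint32_t groupCount = 0;
        vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
        std::vector<VkPhysicalDeviceGroupProperties> groups(
            groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
        vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

        for (uint32_t g = 0; g < groupCount; ++g) {
            std::printf("group %u: %u GPU(s)\n", g, groups[g].physicalDeviceCount);
            for (uint32_t d = 0; d < groups[g].physicalDeviceCount; ++d) {
                VkPhysicalDeviceProperties props;
                vkGetPhysicalDeviceProperties(groups[g].physicalDevices[d], &props);
                std::printf("  GPU %u: %s\n", d, props.deviceName);
            }
        }

        // In a renderer built on a 2-GPU group, the per-eye split is roughly:
        //   vkCmdSetDeviceMask(cmd, 0b01);  // record left-eye draws on GPU 0
        //   vkCmdSetDeviceMask(cmd, 0b10);  // record right-eye draws on GPU 1
        // then composite both eye images for the headset.

        vkDestroyInstance(instance, nullptr);
        return 0;
    }
    ```

    The appeal is that the two eye views are nearly identical workloads, so splitting them across two cards parallelizes cleanly, which is where the "nearly 2x" claim comes from.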

    [–] [email protected] 1 points 7 months ago

    Wasn't there an issue with memory transfer latency across the connector? I thought they killed it because the latency was too high at higher frame rates, causing consistent stuttering.

    They tried to reuse that higher-throughput enterprise connector, but last I heard support for it was never fully developed because of a lack of interest from devs.