AMD doesn't know I self-host generative AI models.
8GB is barely sufficient for my needs, and I often have to use multi-GPU modifications to offload parts of my workflow to my smaller GPU, which lacks both the CUDA cores and the VRAM to make an appreciable difference. I've been searching for a used K80 in my price range to solve this problem.
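For anyone wondering why 8GB runs out so fast: a quick back-of-envelope sketch (my own rough math, ignoring activations, KV cache, and framework overhead, which only make it worse):

```python
# Rough VRAM needed just to hold model weights (hedged estimate:
# real usage is higher once activations and framework overhead land).
def weights_vram_gib(params_billion: float, bytes_per_param: int) -> float:
    """GiB required to store the raw weights of a model."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# A 7B-parameter model in fp16 (2 bytes/param) already blows past 8 GB:
print(f"{weights_vram_gib(7, 2):.1f} GiB")  # ~13.0 GiB of weights alone
```

That's why splitting across a second card (or quantizing down to 4-bit) ends up being mandatory, not optional.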
I know I'm not, but that doesn't mean gamers wouldn't benefit from more VRAM as well.
Just one example: Nvidia's implementation of MSAA is borked if you've only got 8GB of VRAM, so all those new super-pretty games need their graphics pipelines hijacked and the antialiasing swapped out for older variants.
Like, I'm not gonna go around saying my use case is normal, but I also won't delude myself into thinking that the average gamer wouldn't benefit from more VRAM as well.