Once you guys answer this question, we can discuss further why I made this thread.
Presumably it's about "VRAM future proofing for teh glorious sparkly bits". Whilst I agree that if you want 4K ultra-realz then you need to spend money, when it comes to more budget / mainstream cards I'm going to play devil's advocate here:-
1. Budget gamers who buy 2-3GB cards tend not to buy the worst-optimised AAA games and then deliberately cripple them to run as slowly as possible. They either buy them knowing in advance they'll be playing mostly older / lighter-weight games, or use common sense and run them on Med/High instead of Ultra. The GTX 1050 has half the shaders of the GTX 1060 6GB, whilst the GT 1030 / Raven Ridge are almost half that horsepower again, and all three are 2GB VRAM cards. And yet people buy them for a reason - they know what they won't be playing...
2. The "rat race" of trying to overbuy vs the worst-optimised future games is an unwinnable one, as budgets are finite whilst the laziness of developers is an endless chasm. Ubisoft said it best - "Assassin's Creed producer says optimising for PC not important". "But that's all the more reason to future proof!" Except in doing so, you send the wrong message that further normalises less optimisation effort, which in turn 'encourages' you to overbuy even more, which in turn encourages them to get even lazier, which in turn... It's a never-ending, unwinnable rat race. Some examples come to mind:-
- Everybody's Gone To The Rapture needs 3GB for these visuals to run at about 60fps on a GTX 970.
- Mirror's Edge Catalyst (Hyper) needs 6GB for these visuals to stutter along on the same GTX 970 due to running out of VRAM.
- Bioshock Infinite (2013) runs at roughly half the speed of Bioshock (2007), ie, a GTX 1060 gets 360fps for the original's visuals vs 190fps for Infinite's, so that's only a ~50% drop in fps after 6 years. But many 2018 games run at only 1/3 to 1/4 the speed of 2012-2015 games (the same or a shorter time gap) with almost nothing to show for it. Eg, with a GTX 1060:-
- Dishonored 2 (50fps for these visuals) vs Dishonored 1 (200fps for these visuals)
- DX:MD (55fps for these visuals) vs DX:HR (250fps for these visuals)
- Divinity Original Sin 2 (80fps) vs DOS1 (260fps). Visual comparison here (DOS1 left, DOS2 right).
- Someone above mentioned Bioshock 2 Remastered using 5GB VRAM. The original uses 1.2GB VRAM.
Here's the visual comparison - for half the fps and four times (+300%) the VRAM. "Huge LOL" is all I can say...
Whilst textures have obviously driven VRAM up, it's also clear that for many games a far larger part of the recent VRAM bloat has to do with sheer developer laziness resulting in consolised bad ports. x86 consoles were supposed to make it easy to port / optimise for PC, but have amazingly ended up making it so easy that developers have actually stopped "porting" altogether. They pretty much repackage the raw console code in the PC version, complete with its "designed for a flat unified memory system" assumptions, leading to VRAM hyper-inflation instead of a PC version properly optimised for separate RAM vs VRAM.
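To make the flat-memory point concrete, here's a toy sketch (entirely made-up numbers and names, not from any real engine) of why a console-style "one unified pool" port inflates VRAM on PC compared to splitting assets between RAM and VRAM and streaming textures:

```python
# Hypothetical per-level asset budget in MB (illustrative numbers only).
level_assets_mb = {"textures": 3000, "meshes": 800, "audio": 600, "anim": 400}

# Lazy console-style port: on a flat unified memory system everything lives
# in one pool, so the PC version just demands it all as "VRAM".
flat_footprint = sum(level_assets_mb.values())

# PC-aware port: only GPU-visible assets need VRAM; audio and animation data
# stay in system RAM, and textures are streamed so only a working set
# (assume a third here) is resident at once.
vram_footprint = level_assets_mb["textures"] // 3 + level_assets_mb["meshes"]
ram_footprint = level_assets_mb["audio"] + level_assets_mb["anim"]

print(flat_footprint)  # 4800 MB demanded by the flat-memory mindset
print(vram_footprint)  # 1800 MB actually needed in VRAM
print(ram_footprint)   # 1000 MB happily sitting in cheap system RAM
```

Same content on screen either way - the difference is purely whether anyone bothered to optimise for the PC's split memory.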
What's really needed isn't declaring $1,000 GPUs with 12GB VRAM the new "minimum" (and then $2,000 24GB GTX 3080Ti's when the next-gen consoles come out with their 16GB flat memory), but rather giving game devs an almighty kick up the rear so they start learning how to properly optimise PC ports again...
Edit: So my answer to the question "Who is more happy?" is "Someone who buys what they need when they need it, and just stops worrying about winning an unwinnable rat race".