8GB VRAM not enough (and 10 / 12)

Page 131 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and 4060 Ti 8GB.
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700XT when ray tracing:
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:

10GB / 12GB

Reasons why still shipping 8GB since 2014 isn't NV's fault.
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 
Last edited:

Meteor Late

Senior member
Dec 15, 2023
287
310
96
Like I said, Nvidia and AMD are at fault here, especially Nvidia.
Take two of the most popular GPUs of all time from both vendors, the GTX 1060 and the RX 480/580. These cards had 6GB and 8GB. Guess how much total RAM consoles had? 8GB, of which games could actually allocate less, maybe 6GB at most, possibly even less. So 6GB was perfectly fine overall for a long time.
Now consoles have 16GB. Games get less than that, but obviously much more than 8GB, maybe 12? 13?
So cards like the RX 7600 or RTX 4060 should've always been 12GB or 16GB; maybe a much cheaper version with 8GB would've been OK at $200, but not more than that.
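The rough budget arithmetic above can be sketched as follows (a hedged estimate: the OS-reservation figures are approximate, commonly reported numbers, not official specs):

```python
# Approximate game-visible memory on consoles: total RAM minus the
# slice the OS reserves. Reservation figures below are commonly
# reported estimates, not official specifications.
def game_budget_gb(total_gb, os_reserved_gb):
    return total_gb - os_reserved_gb

# PS4-era: 8GB total, roughly 2.5-3GB reserved -> ~5-5.5GB for games.
ps4_budget = game_budget_gb(8, 2.5)
# Current gen: 16GB total, roughly 3.5GB reserved -> ~12.5GB for games.
ps5_budget = game_budget_gb(16, 3.5)
print(ps4_budget, ps5_budget)  # 5.5 12.5
```

Either way, the game-visible budget roughly doubled between generations, which is the gap an 8GB card has to live inside.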
 

Mopetar

Diamond Member
Jan 31, 2011
8,287
7,268
136
I can accept that not every card or user needs 12 GB and there is a market for people who primarily play games like DoTA or Counter Strike which are older or far less VRAM intensive.

My gripe has been that when the cards are retailing for $400, they should not have only 8 GB of VRAM. A $170 entry-level card doesn't have the same expectations.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,753
8,768
136
I can accept that not every card or user needs 12 GB and there is a market for people who primarily play games like DoTA or Counter Strike which are older or far less VRAM intensive.

My gripe has been that when the cards are retailing for $400, they should not have only 8 GB of VRAM. A $170 entry-level card doesn't have the same expectations.

-This thread is like the dictionary definition of cyclicity.

It always comes back to "8GB is fine if you're paying $150. It's not fine if you're paying $400" and then we get 10 pages of arguments about VRAM allocation and NUH UH and reviewer bias and two dozen videos from non-monetized YouTube channels, and then someone says "Look, 8GB of VRAM is fine if you're paying $150, but not $400" and it starts all over again.
 

NTMBK

Lifer
Nov 14, 2011
10,397
5,624
136
-This thread is like the dictionary definition of cyclicity.

It always comes back to "8GB is fine if you're paying $150. It's not fine if you're paying $400" and then we get 10 pages of arguments about VRAM allocation and NUH UH and reviewer bias and two dozen videos from non-monetized YouTube channels, and then someone says "Look, 8GB of VRAM is fine if you're paying $150, but not $400" and it starts all over again.
Look, the world is a scary and uncertain place right now. This thread is my safe space.
 

Mopetar

Diamond Member
Jan 31, 2011
8,287
7,268
136
-This thread is like the dictionary definition of cyclicity.

It always comes back to "8GB is fine if you're paying $150. It's not fine if you're paying $400" and then we get 10 pages of arguments about VRAM allocation and NUH UH and reviewer bias and two dozen videos from non-monetized YouTube channels, and then someone says "Look, 8GB of VRAM is fine if you're paying $150, but not $400" and it starts all over again.

At least we don't have anyone trying to convince the rest of the thread that 8 GB is fine for all GPUs, so we're making some progress.

I assume that eventually both AMD and Nvidia will stop selling cards with anything below 12 GB and this thread will have no reason to exist any longer.

Of course that'll be just in time for the "12 GB is not enough" thread, where we can go through another iteration of this discussion on a meta level. Think of the times we'll have.
 

mikeymikec

Lifer
May 19, 2011
19,646
13,460
136
Is there much of a call for 32 or 48GB VRAM? Historically (say over the last 20 years), how have flagship GPUs been loaded with VRAM relative to their era?

I just looked up the flagship GPU for 2004 and Google reckoned it was the GeForce 6800 Ultra, which apparently came with 256MB VRAM. I'm not seeing much in the way of VRAM usage figures for games of that era. I think in 2004 I still had a GeForce Ti 4200 (64MB allegedly), then I went up to the 7000 series; I believe I had a passively cooled 7600GS.

Admittedly I'm on the fence about the whole thing. IMO $1,000/£1,000 for a gaming graphics card is utterly absurd (and that's not even the flagship GPU price these days), so based on that opinion I logically end up asking myself, "what's an absurd amount of VRAM to go with that absurd price?"
 

jpiniero

Lifer
Oct 1, 2010
15,913
6,405
136

The 7600 XT can be over 50% faster than the 7600 in some Spider-Man 2 benchmarks. The 7600 is pretty much unplayable with RT.
 
Reactions: Tlh97

dr1337

Senior member
May 25, 2020
447
724
136
There's a ton of demand coming from the AI crowd. Unironically, if Tenstorrent or any other company were to release an AI ASIC card with 32GB+ of memory, I'd legit slam it into my spare PCIe slot as fast as I could.

Also, I really wonder how game devs feel about this topic. Games are getting seriously huge these days; 100GB new releases are pretty common. I can't imagine a PS6/XBOX2000 having any less than 32GB of RAM.
 
Reactions: Mopetar

Mopetar

Diamond Member
Jan 31, 2011
8,287
7,268
136
The next generation consoles will have the advantage of bigger memory chips, but they may also reduce the bus size, limiting the overall capacity. 24 GB is achievable with a 25% reduction in bus size. For what the consoles will be capable of, it should be sufficient.

The reason Nvidia doesn't put more VRAM on their consumer GPUs is that they want the people who would buy those as AI cards to spend even more money on their professional cards instead. It also has the side effect of ensuring we don't see a repeat of the crypto boom, where anything with 8 GB was bought up, even at double the MSRP, because it was enough to mine ETH profitably. If Nvidia put 48 GB on a 5090, some people whose AI models fit in that much would buy up all the gaming cards even at prices above $3,000, because it's still much cheaper than the professional card with similar compute capabilities and a larger VRAM capacity.
 

jpiniero

Lifer
Oct 1, 2010
15,913
6,405
136
The next generation consoles will have the advantage of bigger memory chips, but they may also reduce the bus size, limiting the overall capacity. 24 GB is achievable with a 25% reduction in bus size. For what the consoles will be capable of, it should be sufficient.

Not sure 4 GB chips would be available in time. 3 GB chips of course will be.
 

Mopetar

Diamond Member
Jan 31, 2011
8,287
7,268
136
Not sure 4 GB chips would be available in time. 3 GB chips of course will be.

You're right, bad math on my part. Going from 2 GB to 3 GB chips with 75% of the bus width is 18 GB, not 24 GB as I had indicated in my post.
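The corrected arithmetic follows from the usual GDDR layout of one memory chip per 32-bit channel (a sketch; the 256-bit baseline is an assumed example width):

```python
# VRAM capacity from bus width and per-chip density: one GDDR chip
# per 32-bit channel, so capacity = (bus_bits / 32) * chip_capacity.
def vram_capacity_gb(bus_bits, chip_gb):
    return (bus_bits // 32) * chip_gb

# Assumed 256-bit baseline with 2 GB chips: 8 chips -> 16 GB.
full_bus = vram_capacity_gb(256, 2)
# 75% of the bus (192-bit) with 3 GB chips: 6 chips -> 18 GB, not 24 GB.
cut_bus = vram_capacity_gb(192, 3)
print(full_bus, cut_bus)  # 16 18
```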

I'd like it even more if one of them took advantage of cutting-edge die stacking and packaging technology to build a crazy APU with the memory stacked on top of the die, or something like that. The bandwidth that would allow means the bus could be a lot narrower.

I think that's at least another generation away still, but TSMC has been steadily advancing their capabilities in this area, and we'll get there eventually.
 

poke01

Diamond Member
Mar 8, 2022
3,330
4,583
106

download the benchmark

My result:


Everything set to High, DLSS set to Quality, no stupid FG enabled


This is Ultra.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
30,783
28,240
146
That shader compilation is going to expose more degraded Raptor Lake CPUs.

I ran FSR Q and balanced from 4K with FG, native 4K with their AA combo, everything on ultra every run. Because the camera never moves fast it hid any resulting issues. The environments are such that there is no sizzling or other obvious eyesore issues. It's a damned good looking game. 7800X3D+ Pulse 7900XTX. Native 4K was rough, 59FPS avg. Might be a few fps to be had from beta drivers. Quality with FG was 135, and Balanced was 152. If the game looks as good as it did in the bench with Balanced+FG that is how I would play.
 

poke01

Diamond Member
Mar 8, 2022
3,330
4,583
106
That shader compilation is going to expose more degraded Raptor Lake CPUs.

I ran FSR Q and balanced from 4K with FG, native 4K with their AA combo, everything on ultra every run. Because the camera never moves fast it hid any resulting issues. The environments are such that there is no sizzling or other obvious eyesore issues. It's a damned good looking game. 7800X3D+ Pulse 7900XTX. Native 4K was rough, 59FPS avg. Might be a few fps to be had from beta drivers. Quality with FG was 135, and Balanced was 152. If the game looks as good as it did in the bench with Balanced+FG that is how I would play.
I loved the cutscenes, so nice, especially the part where they eat together. It's not optimised that well, but it's got heart. Also, the character models are top notch.

Edit: even on High it looked awesome. Might play it on High because I saw there were no discernible frametime issues on High.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
30,783
28,240
146
Also thumbs up to Capcom for including all upscaling methods.
Hell yeah, good to see XeSS in the menu. I'll run this on my B580 later today. Shader compilation is going to take a while on a 5600X3D, methinks. If I get motivated, I'll swap in a 5800X3D. It's only a 10-minute job. And I'll probably get a third of that time back from faster shader comp.
 
Reactions: Tlh97 and poke01

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
30,783
28,240
146
Remember how we talked about how the VRAM debate goes way back? Then, when it becomes too obvious to debate any longer, they vanish like ghosts?

Well, I found this chart from a nearly decade-old thread here showing performance in Gears of War 4 maxed out at 4K. RIP cards with less than 4GB. While the last 0.5GB on the 970 was not as fast, Nvidia juju magic worked wonders, it seems.

 

Thunder 57

Diamond Member
Aug 19, 2007
3,405
5,631
136
Remember how we talked about how the VRAM debate goes way back? Then, when it becomes too obvious to debate any longer, they vanish like ghosts?

Well, I found this chart from a nearly decade-old thread here showing performance in Gears of War 4 maxed out at 4K. RIP cards with less than 4GB. While the last 0.5GB on the 970 was not as fast, Nvidia juju magic worked wonders, it seems.

View attachment 116502

I think the debate is over. We'll see, though, once we can compare the 5060/5060 Ti to the B580 and 9060/XT.
 