8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing.
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti.

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,824
21,601
146
No need, I am honest: I don't really like AMD in the video card space. The place where it broke me was three AMD cards in a row failing rather fast. Not saying I'd never buy one again, I am simply tired of their "the performance is less, but so is the cost, and it has a ton of me-too features that sort of work and sort of don't" approach. If they go back to being noticeably less money than the next option, that is fine, I can deal with it.
3 cards in a row? You were either having a PICNIC, or you are the unluckiest DIYer I've ever met. Only bad experience I had with AMD was the launch 5700XT reference model. That is the same number of bad experiences I've had with Nvidia. The last time I had a bad experience with Nvidia, was way back with the FX 5800 leaf blower.

But all this only serves to obfuscate the discussion. Which once again, is about 8GB of vram being the new 4GB. That Nvidia is overcharging for ram is ancillary. That you and others keep somehow deflecting to why you hate AMD, or why being overcharged by Nvidia for ram is okay, is mildly depressing.
 
Feb 4, 2009
34,703
15,951
136
3 cards in a row? You were either having a PICNIC, or you are the unluckiest DIYer I've ever met. Only bad experience I had with AMD was the launch 5700XT reference model. That is the same number of bad experiences I've had with Nvidia. The last time I had a bad experience with Nvidia, was way back with the FX 5800 leaf blower.

But all this only serves to obfuscate the discussion. Which once again, is about 8GB of vram being the new 4GB. That Nvidia is overcharging for ram is ancillary. That you and others keep somehow deflecting to why you hate AMD, or why being overcharged by Nvidia for ram is okay, is mildly depressing.
fair enough
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
Here are some screenshots from the initial part of Ratchet & Clank, over which this discussion started.

These screenshots are at 4K/60/DLSS Balanced/very high raster/medium RT. As you can see, the card sits at ~150W on average, meaning it has another 115W to spare, if need be, on heavier games or heavier scenes. If any other card can do the same at $800, as people here keep mocking, let me know, hmmm....? (don't tell me about the 4070, it cannot go to 265W)

Some games just sip power. Half Life Alyx at 7030x3557 can run under 150W on a 3080 at 1.5 GHz 0.736v at a locked 120fps. Into The Radius at 'just' 5408x2736 can push a 3080 to hit 300W below 0.9v.

I'd love to get my hands on undervolting RTX 4000. I have been able to undervolt the GTX 1060, 1660 Ti, 1070, 1080 Ti, 2070, 2080, 3080, 3080 Ti, R9 290, and 7900 XTX. The default voltage curve for all Nvidia and AMD cards is way too high, with all of them pissing away efficiency over ~0.9v. I like to compare GPUs at their optimal voltage curves. Doesn't matter if you have a dud GPU or a golden one, they all have their own optimal voltage curve.

This is AMD's weak point with 7900 XT/XTX. The efficiency is quite bad in workloads below 250W. In the 300W range, the fps/watt is way more efficient against RTX 3000. Its stock voltage curve is insane, pushing 3 GHz clocks at 500W until it starts to throttle. Capping its clocks down to 2.2-2.7 GHz below 0.8v-0.9v drops power consumption significantly with minimal performance loss.

The games I want to play in VR at 120fps require next generation performance. My VRAM usage in some of these games will be like 6-8GB (Skyrim VR, Fallout 4 VR). But this is why I am not worried about running out of VRAM; my shader performance is so abysmal. I play at 4648x2352 at a locked 120fps in Skyrim. It sounds high, but to make the game look really crisp, I want to play at least 6576x3328; that is 2x more pixels.

This is with a last generation, $400 Quest 2. Next generation VR is coming soon.
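(As a quick sanity check on the "2x more pixels" claim, here's a minimal Python sketch; the resolutions are the ones quoted in this post, nothing else is assumed.)

Code:
# Render resolutions quoted in the post above (width x height).
resolutions = {
    "Half-Life: Alyx":     (7030, 3557),
    "Into The Radius":     (5408, 2736),
    "Skyrim VR (current)": (4648, 2352),
    "Skyrim VR (target)":  (6576, 3328),
}

# Pixel count per frame for each resolution, in megapixels.
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name:22s} {count / 1e6:5.1f} MP")

# The "2x more pixels" claim: target vs current Skyrim VR resolution.
ratio = pixels["Skyrim VR (target)"] / pixels["Skyrim VR (current)"]
print(f"Target vs current: {ratio:.2f}x the pixels per frame")  # ~2.0x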
 
Reactions: psolord and Tlh97

A///

Diamond Member
Feb 24, 2017
4,352
3,155
136
Sorry for double post, but what software is this? I would love to use this.
Radeon Memory Visualizer.


It doesn't look to be an official tool, but if you want one for Nvidia, Nvidia makes Nsight.
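(If you just want a rough live number while a game runs, rather than a full profiler like the Radeon tool or Nsight, a minimal sketch along these lines works; it assumes an NVIDIA card with the standard nvidia-smi utility on the PATH, and it reports allocated VRAM, not what the game strictly needs.)

Code:
import subprocess
import time

def vram_used_mib(gpu_index: int = 0) -> int:
    """VRAM currently allocated on the given GPU, in MiB, as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip())

if __name__ == "__main__":
    # Poll once per second while the game runs; watch whether usage creeps
    # toward the card's physical limit (e.g. 8192 MiB on an 8GB card).
    while True:
        print(f"VRAM used: {vram_used_mib()} MiB")
        time.sleep(1)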
 
Reactions: psolord and ZGR

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
Some games just sip power. Half Life Alyx at 7030x3557 can run under 150W on a 3080 at 1.5 GHz 0.736v at a locked 120fps. Into The Radius at 'just' 5408x2736 can push a 3080 to hit 300W below 0.9v.

I'd love to get my hands on undervolting RTX 4000. I have been able to undervolt the GTX 1060, 1660 Ti, 1070, 1080 Ti, 2070, 2080, 3080, 3080 Ti, R9 290, and 7900 XTX. The default voltage curve for all Nvidia and AMD cards is way too high, with all of them pissing away efficiency over ~0.9v. I like to compare GPUs at their optimal voltage curves. Doesn't matter if you have a dud GPU or a golden one, they all have their own optimal voltage curve.

This is AMD's weak point with 7900 XT/XTX. The efficiency is quite bad in workloads below 250W. In the 300W range, the fps/watt is way more efficient against RTX 3000. Its stock voltage curve is insane, pushing 3 GHz clocks at 500W until it starts to throttle. Capping its clocks down to 2.2-2.7 GHz below 0.8v-0.9v drops power consumption significantly with minimal performance loss.

The games I want to play in VR at 120fps require next generation performance. My VRAM usage in some of these games will be like 6-8GB (Skyrim VR, Fallout 4 VR). But this is why I am not worried about running out of VRAM; my shader performance is so abysmal. I play at 4648x2352 at a locked 120fps in Skyrim. It sounds high, but to make the game look really crisp, I want to play at least 6576x3328; that is 2x more pixels.

This is with a last generation, $400 Quest 2. Next generation VR is coming soon.
Yes, some games do sip power.

However, some others do not. I posted the screenshots of A Plague Tale: Requiem and Ratchet & Clank to highlight how the upscalers do a great job, helping keep power draw down and easing the load on resources.


This is a very on-topic angle for this thread, because one of the hardware resources that gets helped is the video RAM.

One more very recent example is Baldur's Gate 3. Here are some shots from the 4070 Ti. It runs the freaking game at 4K/ultra/60/DLSS at ~80-90W ffs. And while we are at it, yeah, it's WAY below 8GB.




The game climbed into the pantheon of the most-played games on Steam (700k people playing simultaneously a couple of days ago). So why would I pay attention to, for example, Forspoken, which had some problems with 8GB cards and its 2K users playing it? Not to mention it's much better now.
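(To put rough numbers on the earlier point about upscalers easing the load: a minimal Python sketch using the commonly cited DLSS 2 render-scale factors. Games can override these, so treat them as approximations.)

Code:
# Commonly cited DLSS 2 per-axis render scales (approximate; games may override).
DLSS_SCALES = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

out_w, out_h = 3840, 2160  # 4K output
for mode in DLSS_SCALES:
    w, h = internal_resolution(out_w, out_h, mode)
    share = (w * h) / (out_w * out_h)
    print(f"{mode:17s} renders {w}x{h}  (~{share:.0%} of native 4K pixels)")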
 
Reactions: Tlh97 and ZGR

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
Happy? It reads like this -



You do you bro. Calling others toxic while defending a trillion dollar company is a bad look though. And LOL at bringing in the performance per watt to the discussion. This is about you having to pay $800 for 12GB. Maybe if you take off your green tinted glasses you'll see we are angry for you, not at you. Some of you that pay the extortion come off like you have Stockholm syndrome.

Thank you for not being angry at me sir. Your comment was not passive aggressive at all.

I am not defending nvidia at all. I don't give an F about them or AMD or Intel for that matter. What I do give an F though, is the technical advancements they bring. I have seen games looking better at 4k/dlss than native 4k. Whether we like it or not, Nvidia right now, is the better player. And they charge for it.

The 4070 Ti was indeed 200 more than it should(?) be. I have stated this before. Am I going to bitch about 200 coins more, for the 2-3 years the card will be my primary? No. That's like 5-8 coins per month. If that was all the problem I had in my life, I would be the happiest man on earth.

12GB and 8GB are enough for thousands of games, along with proper settings. See Baldur's Gate above. I can use 4K native, but why would I? I shave off 100W, and yes, that's important. It reduces power bills, component stress, noise, PSU requirements and case thermals, while maintaining all of the fun. It's not Stockholm syndrome, it's just the smart thing to do. Same goes for FSR2, and kudos to AMD for keeping it open to all, but if they put some tensor cores in to help with their vectors, yeah, that would be great. Because in motion, DLSS and FSR2 are two different beasts.

Nvidia developed DLSS for a reason: because software is cheaper than hardware. I mean, sure, there are development costs, but after a point it's money out of thin air. They did the smart thing. Less hardware for almost the same visual quality and sometimes even better. People in this thread just find corner cases in a handful of games while dismissing the thousands of games any card can run.

It's not me wearing green tinted glasses sir. It's the AMD vs Nvidia toxicity most people demonstrate in this thread. And yes Nvidia could do better price wise (but so could AMD on their higher end models). I am not arguing this.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
You don't need a 900-1000 USD GPU to play BG3; even at 4K (custom settings) the game is playable with 5-6 year old GPUs.

As for DLSS, NVIDIA developed it in order to sell us the cost of integrating Tensor cores into every GPU they make. It is not free performance: you need Tensor cores to run DLSS. Free performance is FSR, because it runs on plain shaders (that is why it supports older GPUs).
And that's my problem with DLSS: you need dedicated hardware AND game support in order to use it. Instead, we could have had more shader performance, because at the same silicon size they would use more shaders instead of Tensor cores, and that performance would be usable in every game without the need to implement and support proprietary software.

Also, the problem with the new 8GB cards is not the VRAM buffer but the price they are asking for such cards in the middle of 2023. I don't have a problem with 8GB cards at the 200-250 USD price point and below, but at 300 USD and above, 8GB in the middle of 2023 is just laughable and a complete ripoff.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,824
21,601
146
Thank you for not being angry at me sir. Your comment was not passive aggressive at all.

I am not defending nvidia at all. I don't give an F about them or AMD or Intel for that matter. What I do give an F though, is the technical advancements they bring. I have seen games looking better at 4k/dlss than native 4k. Whether we like it or not, Nvidia right now, is the better player. And they charge for it.

The 4070 Ti was indeed 200 more than it should(?) be. I have stated this before. Am I going to bitch about 200 coins more, for the 2-3 years the card will be my primary? No. That's like 5-8 coins per month. If that was all the problem I had in my life, I would be the happiest man on earth.

12GB and 8GB are enough for thousands of games, along with proper settings. See Baldur's Gate above. I can use 4K native, but why would I? I shave off 100W, and yes, that's important. It reduces power bills, component stress, noise, PSU requirements and case thermals, while maintaining all of the fun. It's not Stockholm syndrome, it's just the smart thing to do. Same goes for FSR2, and kudos to AMD for keeping it open to all, but if they put some tensor cores in to help with their vectors, yeah, that would be great. Because in motion, DLSS and FSR2 are two different beasts.

Nvidia developed DLSS for a reason: because software is cheaper than hardware. I mean, sure, there are development costs, but after a point it's money out of thin air. They did the smart thing. Less hardware for almost the same visual quality and sometimes even better. People in this thread just find corner cases in a handful of games while dismissing the thousands of games any card can run.

It's not me wearing green tinted glasses sir. It's the AMD vs Nvidia toxicity most people demonstrate in this thread. And yes Nvidia could do better price wise (but so could AMD on their higher end models). I am not arguing this.
I don't do passive aggressive. After over 40yrs of combat sports I do assertive. All forms of aggression reflect hostility and that's almost always uncalled for. You got my unvarnished opinion. You don't have to like it or agree with it.

As to it being the smart thing to do: $800 for 12GB of VRAM isn't smart as I see it. Obviously we disagree on that.
 

Timorous

Golden Member
Oct 27, 2008
1,727
3,152
136
Less hardware for almost the same visual quality and sometimes even better. People in this thread just find corner cases in a handful of games while dismissing the thousands of games any card can run.

This is just BS when you compare like for like.

It can be true when you compare in-game TAA with DLSS Q at 4K, where the much better NV TAA makes up for the lower input resolution. When you compare DLAA to DLSS Q, though, the difference is night and day, because you are using the same TAA solution and just comparing upscaling quality.

I have said before though that my biggest issue with DLSS / FSR is that publishers will end up using it as an excuse to push stuff out before it is properly ready because you can turn on DLSS / FSR to get playable frame rates. So it goes from a tech that can allow your GPU to punch above its weight or to extend the life of an older GPU to a tech that gets used to make up for a shortfall in game QA time.

In an ideal world, in-game TAA would be close to what NV has, so the native 4K image would look like the DLAA image; if that were true, there would never be a situation where someone thinks DLSS matches the native image.



That is just one example, but the vegetation, the bridge, the background windows and so on look substantially better with DLAA vs DLSS, and it just proves how terrible in-game TAA often is that DLSS Q can sometimes match a native image in IQ overall.
 

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
You don't need a 900-1000 USD GPU to play BG3; even at 4K (custom settings) the game is playable with 5-6 year old GPUs.
I agree, but that was kind of my point. People are easily grabbing their pitchforks when a shred of a vram problem appears in some obscure game, but they stay whisper quiet when a game with hundreds of thousands of users appears that runs fine.

As for DLSS, NVIDIA developed it in order to sell us the cost of integrating Tensor cores into every GPU they make. It is not free performance: you need Tensor cores to run DLSS. Free performance is FSR, because it runs on plain shaders (that is why it supports older GPUs).
And that's my problem with DLSS: you need dedicated hardware AND game support in order to use it. Instead, we could have had more shader performance, because at the same silicon size they would use more shaders instead of Tensor cores, and that performance would be usable in every game without needing to implement and support proprietary software.

Eh? DLSS runs on hardware tensor cores.

It may need special hardware, but it gives better image quality, especially in motion. And that's not to talk down on FSR. It was a great idea too, with limited hardware resources. I played some games with it when DLSS was not available. And it was not available due to AMD's doing. They are forcing devs not to use the better upscaling, and I cannot begin to stress how shameful that is. This is literally the "I hope my neighbour's goat dies" mentality, from the known joke, on AMD's part. I see no one on these forums criticizing this. As long as it's not Nvidia doing something, everything is OK.

And I don't know, has there been a study of what kind of transistor savings we are talking about? Tensor cores take up die space, sure, but they also save on the resources required for TMUs, ROPs, VRAM size, VRAM bandwidth and power. Even the despised DLSS3 with its fake frames, in terms of alleviating some CPU limits, may not be that bad for some corner cases. But no, when AMD brings FSR3 everything will be cotton candy and roses.

The issue is super complicated and only Nvidia knows the truth. They calculated all that and decided to proceed with tensor cores and DLSS. Personally, as a user of all that, I say they did the right thing. Let's bash them for having stronger RT too.
Also, the problem with the new 8GB cards is not the VRAM buffer but the price they are asking for such cards in the middle of 2023. I don't have a problem with 8GB cards at the 200-250 USD price point and below, but at 300 USD and above, 8GB in the middle of 2023 is just laughable and a complete ripoff.
I can partially agree with that.
 
Last edited:

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
This is just BS when you compare like for like.

It can be true when you compare in-game TAA with DLSS Q at 4K, where the much better NV TAA makes up for the lower input resolution. When you compare DLAA to DLSS Q, though, the difference is night and day, because you are using the same TAA solution and just comparing upscaling quality.

I have said before though that my biggest issue with DLSS / FSR is that publishers will end up using it as an excuse to push stuff out before it is properly ready because you can turn on DLSS / FSR to get playable frame rates. So it goes from a tech that can allow your GPU to punch above its weight or to extend the life of an older GPU to a tech that gets used to make up for a shortfall in game QA time.

In an ideal world, in-game TAA would be close to what NV has, so the native 4K image would look like the DLAA image; if that were true, there would never be a situation where someone thinks DLSS matches the native image.



That is just one example, but the vegetation, the bridge, the background windows and so on look substantially better with DLAA vs DLSS, and it just proves how terrible in-game TAA often is that DLSS Q can sometimes match a native image in IQ overall.
Is DLAA available on AMD cards though?

Also, I bet there are some graphs in Nvidia's and AMD's (and Intel's) headquarters that study the psychovisual perception of a user at various settings, resolutions and framerates, according to the distance from the screen and its size, all weighed against what die space is needed and what die savings it can bring, which will translate into extra income in their pockets.

According to the above image, I swear to god, instead of having DLAA at 50fps, I would use the best DLSS setting that would give me 60fps. Please see the screenshots I posted above of R&C. The game is great with DLSS. You cannot possibly see Ratchet's hairs in the midst of all hell breaking loose. That's why I say we are getting extremely nitpicky and losing focus. Missing or lower-res textures, yes, that's an issue, but as TLOU showed us, the dev has more control than we think.

For me it's always framerate > settings > resolution, and that's why proper upscaling is so important in my book. The "resolution" part, which directly affects VRAM, becomes less of a factor.
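(A rough, hypothetical sketch of the part of VRAM that scales directly with render resolution. The per-pixel byte counts below are simplified assumptions for a deferred renderer; real engines carry many more targets, and textures, geometry and RT structures still dominate the total.)

Code:
# Simplified, assumed per-pixel costs for a handful of render targets.
# Real engines add TAA history, motion vectors, post-process chains, RT buffers, etc.
TARGET_BYTES_PER_PIXEL = {
    "G-buffer (4 x RGBA8)": 4 * 4,
    "HDR color (RGBA16F)":  8,
    "Depth/stencil":        4,
}

def render_target_mib(width: int, height: int) -> float:
    """MiB consumed by the targets above at the given render resolution."""
    total_bytes = sum(TARGET_BYTES_PER_PIXEL.values()) * width * height
    return total_bytes / (1024 ** 2)

for label, (w, h) in {
    "1080p":               (1920, 1080),
    "1440p":               (2560, 1440),
    "4K via DLSS Quality": (2560, 1440),   # internal render resolution
    "4K native":           (3840, 2160),
}.items():
    print(f"{label:19s} ~{render_target_mib(w, h):6.1f} MiB in these targets")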
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
People are easily grabbing their pitchforks when a shred of a vram problem appears in some obscure game,...
There are two dozen games in this thread with objective proof showing issues on 8GB, virtually all AAA titles. Which "obscure game" are you referring to?

The issue is super complicated and only Nvidia knows the truth.
Nah, it really isn't. DLSS "solves" the problem NV caused, namely the ridiculous performance hit of ray tracing. You can't sell cards with a feature that's unusably slow, so NV slaps "aye eye" in front of it.

Interestingly ray tracing and DLSS3 destroy 8GB cards, so there's a warped poetic irony about the whole thing.

Even the despised DLSS3 with its fake frames, in terms of alleviating some cpu limits, it may not be that bad for some corner cases.
Were you cheering 15 years ago when motion interpolation TVs arrived? They automatically work in every game and on every GPU (unlike the drop-in-the-ocean DLSS3), so surely such TVs are the holy grail for you, amirite?

But no, when AMD brings FSR3 everything will be cotton candy and roses
Nah, I said right from the start ray tracing and upscaling are optional side extras and should never be used as the primary reason to sell GPUs.

There's only one vendor pushing this as a primary upgrade reason and encouraging deceptive benchmark labels on charts by redefining what a frame and pixel means.

It's the same vendor with a monopoly in desktop GPU space, and the same vendor giving us 14+++++++ nm 8GB for the last nine years.

We're seeing an exact repeat of Intel shipping quad-cores with forced motherboard upgrades and +5% performance every 18 months for seven years. This is what companies look like when they're "competing with themselves".

Did you also defend Intel in the same way? "Oh, find me another quad-core CPU for $1000 that delivers the same performance!" It's funny how immediately and radically things changed for all consumers as soon as Ryzen hit.
 
Last edited:

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
…it just proves how terrible in game TAA often is that DLSS Q can sometimes match a native image in IQ overall.

I used to hate TAA until I discovered the AMD FidelityFX sharpening filter. This is available in ReShade and in VRToolkit, so we can use it all the time. It really does remove the blur. Can't live without it in VR!

But yeah, many games implement TAA horribly and require mods and tweaks to fix. Vanilla Fallout 4 is a great example, since the game engine lacks MSAA support(!) and the grass shimmering is awful.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,814
4,105
136
I used to hate TAA until I discovered the AMD FidelityFX sharpening filter. This is available in ReShade and in VRToolkit, so we can use it all the time. It really does remove the blur. Can't live without it in VR!

But yeah, many games implement TAA horribly and require mods and tweaks to fix. Vanilla Fallout 4 is a great example, since the game engine lacks MSAA support(!) and the grass shimmering is awful.

Unfortunately MSAA is all but dead :/
 
Reactions: Tlh97 and ZGR

Mopetar

Diamond Member
Jan 31, 2011
8,008
6,454
136
Given how long BG3 has been in development it's not surprising that it works fine with smaller amounts of VRAM. 8 GB was still common in the top of the line cards when it was first unveiled and that's what the consoles were sporting as well.

I wouldn't conflate popularity with memory requirements for the average game though. Games like Minecraft or DotA don't require massive quantities of VRAM (or graphical power in general, for that matter) either, and they're among some of the most played games on the planet. Games don't need to have insane levels of graphics to either look good or be enjoyable.
 

CakeMonster

Golden Member
Nov 22, 2012
1,428
535
136
I'm not that excited about TAA (although not as fanatically opposed as some either), but sharpening is something I'm absolutely allergic to. These things are subjective, but I find any amount of sharpening post-processing to look extremely distracting and unappealing. So for now I'm ok with DLAA or DLSS Quality depending on the game (and hoping that there's no forced sharpening, which there has been less of lately). The ultimate goal for me is high enough DPI monitors so that the softness of TAA/DL** implementations become a non-issue.
 
Reactions: ZGR
Feb 4, 2009
34,703
15,951
136
Slightly off topic but certainly related.
What's in store for 2024 regarding AMD & Nvidia?
I am aware Intel's Battlemage should appear midway through the year and it should offer upper-midrange performance.
 

A///

Diamond Member
Feb 24, 2017
4,352
3,155
136
Nothing. @Geddagod mentioned a 32GB rumor to me a while back, but it appears that rumored card, aka a refresh, turned out to be an Ada-based workstation card. It can however imply that the next-generation 5090 is gonna be a 32GB card.

Nvidia doesn't need to release a Ti this year. The 4090 is incredibly good and nothing comes close to it except the 4080 Ti, and even that isn't very close. RDNA 3 is best served on a warm bun with cold lettuce. Nvidia is dedicating their time towards AI/ML workloads and letting the 40 series float. AMD doesn't plan on having any flagship RDNA 4 cards.
 

psolord

Platinum Member
Sep 16, 2009
2,015
1,225
136
There are two dozen games in this thread with objective proof showing issues on 8GB, virtually all AAA titles. Which "obscure game" are you referring to?
24 games amongst thousands is nothing. And these games can probably look and play fine with the correct settings. If you crank everything over 9000, trying to play a AAA game on a non-AAA card, you are doing something wrong (not you personally, the user in general).

I said it before. I have THREE 8GB cards. They are nothing alike. VRAM is important but not the be-all and end-all. And yes, 8GB is a lot. If the dev is capable, he can do marvels, at least for 1080p, which 61.5% of people still use according to Steam.

In any case, we have a very recent example of what more VRAM does: the 4060 Ti 16GB. It's 2% faster than the 8GB model, according to TPU.


In the notorious TLOU, at 4K, the 16GB model gives 27fps and the 8GB model gives 22fps. They are equally useless. One less useless than the other, but both are useless anyway.

So if you want 4K/60, you get a 4070, enable DLSS and call it a day. Or you play with the settings a bit, as I usually say. Personally, instead of giving an extra $100 for the 16GB model, I would hands down be getting a 4070 for $200 extra.

Nah, it really isn't. DLSS "solves" the problem NV caused, namely the ridiculous performance hit of ray tracing. You can't sell cards with a feature that's unusably slow, so NV slaps "aye eye" in front of it.

Interestingly ray tracing and DLSS3 destroy 8GB cards, so there's a warped poetic irony about the whole thing.
I have used DLSS on non-RT games a lot. It's not there just to alleviate RT performance woes, that's for sure. And how did Nvidia cause a problem by implementing RT? We need to go forward, step by step.

DLSS3 + RT on 8GB cards is using AAA features on non-AAA cards, as I said above. This is a non-argument.


Were you cheering 15 years ago when motion interpolation TVs arrived? They automatically work in every game and on every GPU (unlike the drop-in-the-ocean DLSS3), so surely such TVs are the holy grail for you, amirite?
I am not praising DLSS3. I only said that even that can be helpful. I specifically said "corner cases". And no, the framerate interpolation of TVs is far inferior. I have used the interpolation of my 12-year-old Philips TV and DLSS3. They are nothing alike.

Nah, I said right from the start ray tracing and upscaling are optional side extras and should never be used as the primary reason to sell GPUs.

There's only one vendor pushing this as a primary upgrade reason and encouraging deceptive benchmark labels on charts by redefining what a frame and pixel means.
You can safely disregard DLSS3 benchmarks, as we all do. Marketing will do marketing...

I want to see you bitching about FSR3 too, when AMD does the same tho...

It's the same vendor with a monopoly in desktop GPU space, and the same vendor giving us 14+++++++ nm 8GB for the last nine years.
This vendor has choices from 8GB to 24GB, but I guess you mean on the lower end. Yes, this is true. Nvidia is trying to push people to upper tiers. Seriously though, what did you expect them to do, and how is this monopoly getting fueled? They have the best transistor density with Ampere, the best power draw, the best features, they execute like clockwork and have released most of their stack. On the other hand, the competition still struggles... And they are not saints either...


I appreciate the raising of awareness, but what I don't appreciate is the whole e-waste mentality that is thrown towards 8GB cards. They have a place. People just need to do a market survey and see if their needs are met. This whole bitch, bitch, moan, omg, rofl-ing regarding VRAM is counter constructive.
 

Mopetar

Diamond Member
Jan 31, 2011
8,008
6,454
136
24 games amongst thousands is nothing. And these games can probably look and play fine with the correct settings.

Why even get an 8 GB card then? There are thousands of games that will run with less VRAM with the correct settings.

8 GB is on the way out. This is a bit like the start of a new console generation. There are only a handful of games that run on the new PSX, but thousands that will run on the PSX-1.

For some people, older hardware is fine. Anyone who's mostly going to play DotA or other e-sport titles doesn't even need 8 GB. But anyone buying now and expecting to get 6 years out of their card is going to be in for a rough time, just like anyone hoping that new console games get backported would be.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,824
21,601
146
24 games amongst thousands is nothing. And these games can probably look and play fine with the correct settings. If you crank everything over 9000, trying to play a AAA game on a non-AAA card, you are doing something wrong (not you personally, the user in general).
So many poor debating tactics; I'll bite.

The 24 vs 1000s example is a version of the Texas sharpshooter fallacy. I am going to constrain my remarks to AAA games, as you established that criterion.

Most gamers buy a new card because they want to play the latest greatest and upcoming games. Not just the older games they already have in their libraries. The problematic titles are newer, and that list will continue to grow with new releases. All of those preceding games are irrelevant to the premise that 8GB is no longer enough.

Not surprised you immediately invoked one of the list of excuses posted by BFG10K. Correct settings eh? Like having to immediately reduce textures on a $400 8GB card you just bought? Preposterous. That's not correct settings, that's reduced settings. Perhaps the most important graphics fidelity setting.

Just what is a non AAA card? That's a rhetorical question, because this is the no true Scotsman fallacy. AAA games have existed for decades. The fact is 8GB cards were AAA cards, now not so much. You are making his point for him by calling them non AAA cards.
I said it before. I have THREE 8GB cards. They are nothing alike. VRAM is important but not the be-all and end-all. And yes, 8GB is a lot. If the dev is capable, he can do marvels, at least for 1080p, which 61.5% of people still use according to Steam.
More excuses aka The developers fault. No need to dissect this, the OP did it already.
In any case, we have a very recent example of what more VRAM does: the 4060 Ti 16GB. It's 2% faster than the 8GB model, according to TPU.
The division fallacy - What's true of the whole is true of the parts. Except when it isn't. Daniel-San's comparison above demonstrates that. The performance is basically the same until it isn't. Then there can be massive differences in fps, especially lows. Textures not loading or being reduced quality are not represented by that 2% either.

In the notorious TLOU, at 4K, the 16GB model gives 27fps and the 8GB model gives 22fps. They are equally useless. One less useless than the other, but both are useless anyway.
This ignores the fact that no matter what settings you have to reduce to get playable fps, the 16GB card will be able to keep the texture settings much higher. So when we get to that playable experience the 16GB will look better even if it plays the same.
So if you want 4K/60, you get a 4070, enable DLSS and call it a day. Or you play with the settings a bit, as I usually say. Personally, instead of giving an extra $100 for the 16GB model, I would hands down be getting a 4070 for $200 extra.
These are the moving-the-goalposts and red herring fallacies. You moved the goalposts to 4K60 when we have examples of 8GB being insufficient at 1080p and 1440p. The 4070 is a red herring; it has nothing to do with 8GB not being enough. That you assert all of us would be getting a 4070 for 4K/60 is very telling. Revealed yourself you have.
I have used DLSS on non-RT games a lot. It's not there just to alleviate RT performance woes, that's for sure. And how did Nvidia cause a problem by implementing RT? We need to go forward, step by step.
Ray tracing existed before Nvidia added hardware support for it to their RTX cards. It doesn't require their hardware to run. It does use vram though. You know, what this thread is all about.
DLSS3 + RT on 8GB cards is using AAA features on non-AAA cards, as I said above. This is a non-argument.
Strawman fallacy. We have, through ample precedence, established 8GB cards are AAA cards, just not so much for newer titles and moving forward. Again, you make his point for him by calling 8GB cards non AAA. In your attempt to hand wave those pesky facts away, you ended up proving his argument. Well done.
I want to see you bitching about FSR3 too, when AMD does the same tho...
Why? Does it upset you that members here might only pick on the trillion dollar company? You keep revealing yourself with remarks like this.

You continue to either imply, or outright assert, the basis of the negativity is driven by AMD nuthuggers. Rest assured, most of us roasting Nvidia for cheaping out on vram have already said FSR3 will be just as bad and they should feel bad. You don't even have to wait, it is written here in the forum. If you missed those posts, that's on you.
This vendor has choices from 8GB to 24GB, but I guess you mean on the lower end. Yes, this is true. Nvidia is trying to push people to upper tiers. Seriously though, what did you expect them to do, and how is this monopoly getting fueled? They have the best transistor density with Ampere, the best power draw, the best features, they execute like clockwork and have released most of their stack. On the other hand, the competition still struggles... And they are not saints either...
If there was any doubt still in the reader's mind about your motivations, this made it crystal clear.
I appreciate the raising of awareness, but what I don't appreciate is the whole e-waste mentality that is thrown towards 8GB cards. They have a place. People just need to do a market survey and see if their needs are met. This whole bitch, bitch, moan, omg, rofl-ing regarding VRAM is counter constructive.
E-waste remark is the strawman and false attribution fallacies.

Can you explain how roasting Nvidia for overcharging for vram is counter constructive as you called it? Because that's what all of the derision is about. Nvidia charging $300 or more for 8GB. No one is bemoaning the A750 or 6600 being 8GB as they are priced more appropriately for that amount.

Let me break it down this way. There are two primary emphases in this thread. The first is a PSA/FYI that 8GB is aging out, so don't overpay for a card with that much. The second is that Nvidia has overcharged for 8GB for generations now. The third point, which is ancillary, is that Nvidia overcharges for RAM, period.

You may not appreciate it, but you don't speak for me. If we don't hold their feet to the fire who will? The DIY gaming community has to have a voice, a loud one. LTT, GN, HUB, all owe their success to us. They hear us and represent us, so that our voices are heard collectively.

I feel like you want us to stop doing that. Again, counter productive to what?
 