- Feb 16, 2018
- 41
- 4
- 11
Hey there, I have a maybe interesting question for you guys.
We see a lot of video cards out there now with a 2 GB minimum of GDDR5 RAM, then 3 GB, 4 GB, 6 GB, 8 GB normally.
Now, I'm playing Battlefield 1 at Ultra settings at 1080p. I've only got 16 GB of RAM in this machine, and my graphics card has 4 GB of RAM on it. It's a GTX 960.
In MSI Afterburner, it's reporting that at Ultra settings at 1080p, most of the time when I look, I'm using 3.5 GB of the card's video memory, and the game is using 8.6 GB of my DDR3 system RAM. And the game is still one of the best-looking games ever.
This is a recently upgraded machine, however. A week ago, the machine only had 8 GB of RAM, and the graphics card it had was a GTX 680 with only 3 GB of GDDR5, but I was still playing BF1 on Ultra settings at 1080p. When I was looking back at my Afterburner screengrabs from then, it was using 1.8 GB of the graphics card's 2 GB of memory, and the game was only using 6.6 GB of my system's 8 GB of DDR3...
But it looked identical. Same settings, nothing changed: Ultra, 1080p, absolutely the exact same settings. So my question is, what the hell is going on? What's being compromised to allow the game to use so much less RAM when it's simply not available? Like 1.5 GB less video memory being used on average, and nearly 3 GB less system memory being used on average, yet the game looks completely identical and plays almost completely identically. The only difference is the FPS differs on average by about 18 fps... is that all it is? Like the machine isn't cutting any corners with the graphics, and all that extra memory is simply being converted into grunt to churn out more FPS?
I know it'd probably be too much to ask for the programmers of the game at DICE (probably not DICE, but EA) to make a beautifully scalable engine that gradually cuts back on certain graphical features itself depending on your system specs...
I hope you can understand all that! Is all that extra memory just adding more FPS and nothing else?
Thanks, guys