No kidding. It has to render a huge city with ultra-quality textures; of course it's going to require 3GB of memory.
I think if you want to be future-proof you should be buying GPUs with 3GB+ of memory. Most games use 1GB up to a max of 2GB according to Guru3D, but I imagine newer games will slowly but surely start using more and more.
So I'd say a 2GB card is very good for now, and you can always disable AA or reduce resolution, but in the future a 3GB card will be needed for super-quality gaming.
For the reason I just said. Some of these games look like crap yet are demanding and eat up VRAM.
Average RAM count on cards goes up, game devs start using it.
We should be surprised why?
Because developers should only develop for 10 year old hardware, 256MBs of RAM maximum. Only 128KB texture allowed. Hardware should be supported indefinitely.
Learned this watching people bitch about the higher system requirements in Watch Dogs, Wolfenstein, and Galactic Civilizations 3.
I wouldn't want to get caught with insufficient VRAM, but I highly suspect that to see the difference between ultra-quality and high-quality textures in this game, you'll need screenshots blown up and dissected with a magnifying glass.
Anyways, calling it now: a 770 and/or 280X won't have the horsepower to come close to 60fps @ 1080p ultra settings, regardless of how much VRAM either card has equipped.
The recommended GPUs aren't that high, though. This seems to be a CPU-centric game.
Assassin's Creed 4 recommended a gtx470, look at it rape a r290 and gtx780 at 1440p:
http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming_6_GB/6.html
Those cards can barely break 50fps at 1080p with AA on.
Case in point: Watch Dogs will destroy 770's or 280x's if people try to run the game on super high settings.
You linked to a review that did not test at 1440p, then claimed that AC4 "[raped] a r290 and gtx780 at 1440p." :whistle:
The review did test 1600p which is 11.1% more pixels and if resolution:fps scaling were perfect, then the R290/GTX780 would get 39/37 fps, respectively. That's not exactly "rape," and I find it objectionable that you used that term to describe video cards, anyway.
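The pixel math behind that projection can be sketched in a few lines of Python. Note the ~35/33 fps 1600p baselines below are back-solved from the 39/37 projections in the post, not quoted from the review:

```python
# Sketch of the resolution:fps scaling argument above. The ~35/33 fps
# 1600p baselines are assumptions back-solved from the 39/37 figures
# in the post, not numbers quoted from the review.

def scaled_fps(fps, src_pixels, dst_pixels):
    """Project fps to another resolution, assuming perfect pixels:fps scaling."""
    return fps * src_pixels / dst_pixels

p1600 = 2560 * 1600
p1440 = 2560 * 1440

# 1600p renders ~11.1% more pixels than 1440p.
print(f"{p1600 / p1440 - 1:.1%}")  # 11.1%

# Projected 1440p numbers from the assumed 1600p baselines:
print(round(scaled_fps(35, p1600, p1440)))  # 39 (R9 290)
print(round(scaled_fps(33, p1600, p1440)))  # 37 (GTX 780)
```

In practice fps rarely scales perfectly with pixel count (CPU limits, memory bandwidth), so these projected numbers are a best case.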
A pair of 280X 3GB cards would do quite well in any game mentioned in this thread at 2560x1440. They wouldn't be shut out of the highest settings, because they have 3GB of VRAM. A pair of 2GB cards would have more issues.
lol, I like how everyone keeps avoiding my point. Go look at screenshots or videos of the actual games; they are a joke for needing that much VRAM and/or GPU power.
http://forums.anandtech.com/showpost.php?p=36386490&postcount=556
And do I even need to post pics of that graphical joke, Wolfenstein?
A pair of 280X 3GB cards would do quite well in any game mentioned in this thread at 2560x1440. They wouldn't be shut out of the highest settings, because they have 3GB of VRAM.
A pair of 2GB cards might not cut it for highest settings, but for some people second-best is good enough.