thesmokingman
It's two GPUs w/ 8GB total, which in reality means 4GB per GPU.
While there is 4GB per GPU, I wonder if the drivers could use more VRAM and do something along the lines of page swaps between GPUs, kinda like the 970's 3.5GB + 0.5GB allocation, if not in XFire mode?
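For what it's worth, here's a toy sketch of the kind of two-tier placement policy that speculation implies (my own illustration, not anything AMD has announced): fill the GPU's own 4GB first, then spill into the sibling GPU's pool at a bandwidth penalty, the way the 970 spills from its fast 3.5GB segment into the slow 0.5GB one:

```cpp
// Toy illustration (not a real driver): place allocations in the GPU's own
// VRAM first, then spill into the other GPU's pool, which would be slower
// to reach over the on-board bridge.
#include <cstdint>
#include <optional>

enum class Pool { LocalVram, RemoteVram };   // remote = the sibling GPU's memory

struct TwoTierAllocator {
    uint64_t localFree  = 4ull << 30;        // 4 GB on this GPU
    uint64_t remoteFree = 4ull << 30;        // 4 GB borrowed from the other GPU

    std::optional<Pool> allocate(uint64_t bytes) {
        if (bytes <= localFree)  { localFree  -= bytes; return Pool::LocalVram; }
        if (bytes <= remoteFree) { remoteFree -= bytes; return Pool::RemoteVram; }
        return std::nullopt;                 // would fall back to system RAM, not shown
    }
};
```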
This card would be a failure for a 'regular' desktop experience due to only 4GB per GPU.
Call of Duty AW & BO3, Shadow of Mordor and a few other games can break over 4GB.
I wonder why people keep saying that.
CF Fury/Nano/Fury X run all the modern games just fine, once AMD sorts out the drivers, ofc.
4GB has not been shown to be a limitation at all, and I would love some evidence to prove me wrong. And no, not 4K + SSAA or 8x MSAA where it's unplayably slow. I don't care for slideshows.
Call of Duty AW & BO3, Shadow of Mordor and a few other games can break over 4GB.
I would not pay high-end prices for a 4GB card in 2016. Except the <$500 Nano.
Do you put the little stickers on your case? Cute.
I do not, but whoever designed this case obviously does:
If you look at the other photos, you can clearly see it is using an MSI motherboard, HyperX memory, a Samsung SSD, and a Maingear-rebranded PSU, while the CPU is conspicuously covered by a Corsair cooler. No attempt was made to conceal the brand of any part in this marketing system except the CPU.

Considering how important the CPU is to a system like this, intentionally hiding its maker is deceptive at best, and it borders on outright dishonesty given that they placed an AMD logo below the Radeon branding on the front of the case, right where you would expect the logo of the CPU maker. If an Intel CPU is required to maximize the potential of AMD's video card, then an Intel logo should be visible somewhere on the system to give them credit for producing a product that AMD was unable to produce themselves.
O-M-G putting a cooler over the CPU? Scandalous and dastardly!
Great job intentionally trying to distract from the point.
Great job trying to make a mountain out of a molehill. :thumbsup:
Shortest dual-GPU card; it is about as long as an R9 290X, maybe shorter.
Price is high, but that is expected with dual-GPU cards.
The look is pretty neat.
4GB is not enough, but we don't know what is enough for VR.
Great output ports (3x DP).
Probably no HDMI 2.0 (in 2016).
Call of Duty AW & BO3, Shadow of Mordor and a few other games can break over 4GB.
So can Rise of the Tomb Raider. Over 7GB VRAM usage, in fact.
All of those games dynamically cache into VRAM, but it makes no difference to actual gameplay if you have 4GB.
Even [H] themselves, who have been bashing the Fury X for its 4GB, couldn't VRAM-bottleneck it when they tried with RotTR; the Fury X puts out impressive performance, the same as the 980 Ti and Titan X, including minimum FPS.
You could say "for future games", or make it more about general future-proofing, and you may have a case, because when it comes to the unknown there's always a maybe.
However, for 1080p and 1440p, 4GB will be fine for a few more years.
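To make the caching point concrete, here's a minimal sketch (my own illustration, not any shipping engine's code) of an opportunistic texture cache: it happily fills whatever VRAM budget the card has, so a 6-8GB card reports 7GB "used" while the actual working set still fits in 4GB, and the cold, speculative entries can be evicted cheaply when a smaller card is under pressure:

```cpp
// Sketch of a streamer that caches textures it *might* need into spare VRAM.
// Reported "usage" tracks the budget of the card, not the working set.
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>

struct TextureCache {
    uint64_t budget;                              // whatever VRAM the card has
    uint64_t used = 0;
    std::list<std::string> lru;                   // front = most recently touched
    std::unordered_map<std::string, uint64_t> sizes;

    explicit TextureCache(uint64_t vramBytes) : budget(vramBytes) {}

    void touch(const std::string& id, uint64_t bytes) {
        if (!sizes.count(id)) {
            // Evict the coldest speculative entries until the new one fits.
            while (used + bytes > budget && !lru.empty()) {
                const std::string& victim = lru.back();
                used -= sizes[victim];
                sizes.erase(victim);
                lru.pop_back();
            }
            sizes[id] = bytes;
            used += bytes;
        } else {
            lru.remove(id);                       // re-promote on reuse
        }
        lru.push_front(id);
    }
};
```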
So you are going to spend $1.5k on a card for 1080p?
The Fury is not a well-balanced card: they famously stutter until AMD produces specific Fury fixes, I'm guessing to make the dynamic caching smarter. Now we have two Furys, still 4GB of RAM, and the same memory bandwidth. No way, even with the smartest drivers, are they not going to run into serious limitations. Then there's the fact that you are reliant on AMD to keep patching drivers for what is going to be a super rare card. Do you trust them to do that?
Really, I'd be surprised if they sell any.
The DX12 will use both RAMs and stuff.
Except that's exactly what WON'T actually be happening...
DX12 won't do anything... it just opens these options for the devs.
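Right, and here's roughly what "opens the option" means in practice. On a D3D12 linked-node adapter (how a dual-GPU card shows up to the API), heap properties carry per-GPU node masks, so nothing lands in the second GPU's 4GB unless the engine explicitly asks for it. A minimal sketch; the helper name and buffer setup are mine, only the node-mask fields are the real API:

```cpp
// Hedged sketch: placing a buffer in the local VRAM of one specific GPU of a
// linked-node adapter. A title has to do this per node; the runtime won't
// pool the two 4GB stacks on its own.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> CreateBufferOnNode(ID3D12Device* device,
                                          UINT nodeIndex,   // 0 = first GPU, 1 = second
                                          UINT64 sizeBytes) {
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_DEFAULT;          // local VRAM of the chosen GPU
    heap.CreationNodeMask = 1u << nodeIndex;      // which GPU's pool owns the memory
    heap.VisibleNodeMask  = 1u << nodeIndex;      // which GPUs may access it

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width = sizeBytes;
    desc.Height = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR; // required layout for buffers

    ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&buffer));
    return buffer;
}
```

So whether the second 4GB ever gets used is entirely up to the engine devs, which was the point.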