Fury XT and Pro prices


TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
Not to insult, but that is nonsense. Stop it.

AMD has already said, as has Nvidia, that HBM offers no extra storage or capacity over GDDR5 at the same size.

4GB is 4GB, regardless of whether it's GDDR or HBM. Period, end of story. So for those of us at 4k now, who will be running Crossfire setups with ALL settings maxed, 4GB is a massive limiting factor.

For those of us who desperately need higher VRAM counts, the Fury X is a dangerous buy. I'm sure AMD would have preferred to ship 8GB SKUs with the X of course, it just wasn't possible.

The ONLY hope for this particular scenario is that MS isn't bullshitting when it comes to DX12's "stacked vram" capability (2 x 4GB cards would equal a true 8GB of framebuffer). But that's a hell of a gamble to bet on.

I'm sorry but for those of you running 4k now with ALL settings maxed, the ram is the least of your worries right now.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
From TweakTown's unbiased VRAM tests last week:

Far Cry 4 @ 4k with AA: 5.7GB VRAM needed
GTA V @ 4k with AA: 6.3GB VRAM needed
Shadow of Mordor @ 4k with AA: 5.4GB VRAM needed

That's why those of us who own NO stock in these companies, and have owned MULTIPLE cards from both ATi and Nvidia are concerned about the Fury X having just 4GB.

Do you mean this:

http://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html

Because if so, it actually doesn't tell us anything useful. The only way to really see how a game is impacted by VRAM is to take the same GPU with two different amounts of VRAM and run frame-rate tests (minimum, average, maximum). So if you want to see the impact of 4GB as a possible limit, you'd need to test something like a 290 4GB vs. a 290 8GB, since I'm not aware of any Nvidia high-end cards with both a 4GB and a greater-than-4GB version atm.
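
If someone does get both versions in hand, the comparison itself is simple to script. Here's a minimal sketch, assuming two hypothetical per-frame time logs (one value in milliseconds per line, captured with something like FRAPS or PresentMon; the filenames are made up), since a VRAM limit tends to show up in the minimums and slowest frames rather than the averages:

```python
# Minimal sketch: compare frame-time logs from the same GPU with 4GB vs 8GB of VRAM.
# Assumes hypothetical CSV files with one frame time (milliseconds) per line.
import csv

def load_frame_times(path):
    """Return a list of per-frame render times in milliseconds."""
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def summarize(frame_times_ms):
    """Min/avg/max FPS plus the 1%-low, where VRAM-swap stutter shows up first."""
    fps = sorted(1000.0 / t for t in frame_times_ms)
    slowest = fps[: max(1, len(fps) // 100)]   # the worst 1% of frames
    return {
        "min": fps[0],
        "avg": sum(fps) / len(fps),
        "max": fps[-1],
        "1% low": sum(slowest) / len(slowest),
    }

# Hypothetical file names for a 290 4GB vs 290 8GB run of the same benchmark.
for label, path in [("290 4GB", "r9_290_4gb.csv"), ("290 8GB", "r9_290_8gb.csv")]:
    stats = summarize(load_frame_times(path))
    print(label, {k: round(v, 1) for k, v in stats.items()})
```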
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
They're not, for the reasons I've already outlined. Way too little framebuffer, especially for an SLI/Crossfire setup where you have the GPU horsepower to set everything high with AA...which then triggers the VRAM issues.

6GB+ cards only for SLI/Crossfire setups running 4k; otherwise it's a disappointment, as the Tweaktown tests show.

For anything below that (1440p, etc), 4GB is more than enough.

I don't disagree, that's why I stated a 980Ti would be a better choice...

980/970/390/390x are all now decidedly mid-range, and if you want to do 4K gaming, the 980Ti/TX or Fury are the real options there. I just don't see the added cost of 8GB being worth it for other cards if you're planning to run 1440P or less. If you want more, get a better GPU rather than more VRAM.

8GB mid-range doesn't make much sense right now. Maybe when Pascal/next-gen AMD come around next year, but currently it's just not worth the extra $$$.
 
Feb 6, 2007
16,432
1
81
For anything below that (1440p, etc), 4GB is more than enough.

For right now maybe. I had the exact same thinking when I bought a 2 GB 680 three years ago; when would I possibly need more than 2 GB of VRAM when current games weren't hitting anywhere near that at 1080P? Fast forward a couple years and I can't run max texture settings in Shadow Of Mordor or GTA V because despite the card having plenty of headroom in the processing department to maintain high framerates, it lacks the VRAM needed to use those higher resolution textures without stuttering. So if you only care about buying a card to run games that are out now, yeah, 4 GB VRAM is plenty @ 1080P. But for people who are thinking a card might last them a few years, VRAM is actually a pretty valid concern. Increasing texture resolution has a nominal impact on performance while drastically improving IQ, so developers will probably start taking advantage of higher resolution textures. The only thing that really matters there? VRAM. I was hopeful that AMD would be pushing 6-8GB cards out for that reason; looking at 4 GB as "plenty" makes me nervous.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I'm sorry but for those of you running 4k now with ALL settings maxed, the ram is the least of your worries right now.

Not sure this is true anymore. If a single Fury can deliver what AMD is claiming, 2-3 of these could really start to be a potent 4K setup. VRAM definitely IS a concern.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
For right now maybe. I had the exact same thinking when I bought a 2 GB 680 three years ago; when would I possibly need more than 2 GB of VRAM when current games weren't hitting anywhere near that at 1080P? Fast forward a couple years and I can't run max texture settings in Shadow Of Mordor or GTA V because despite the card having plenty of headroom in the processing department to maintain high framerates, it lacks the VRAM needed to use those higher resolution textures without stuttering. So if you only care about buying a card to run games that are out now, yeah, 4 GB VRAM is plenty @ 1080P. But for people who are thinking a card might last them a few years, VRAM is actually a pretty valid concern. Increasing texture resolution has a nominal impact on performance while drastically improving IQ, so developers will probably start taking advantage of higher resolution textures. The only thing that really matters there? VRAM. I was hopeful that AMD would be pushing 6-8GB cards out for that reason; looking at 4 GB as "plenty" makes me nervous.

Agree.

4GB will be low/mid by the end of next year...

Edit: It is kind of a confusing time. We're seeing big resolution increases, new tech (HBM), and the potential for Win10 to help with framebuffer issues. Lots to get our heads around...
 
Last edited:

Mako88

Member
Jan 4, 2009
129
0
0
I'm sorry but for those of you running 4k now with ALL settings maxed, the ram is the least of your worries right now.

As I said, dual-card setups, which have no problem at all running 4k with AA and everything on...provided you're running dual 980 Ti or Titan X cards.

With a total GPU budget of say $1300-$1500, the Fury X in Crossfire is an awesome deal (faster than 980 Ti + $200 cheaper). But the VRAM limitation is the fear.

Do you mean this:

http://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html

Because if so, it actually doesn't tell us anything useful. The only way to really see how a game is impacted by VRAM is to take the same GPU with two different amounts of VRAM and run frame-rate tests (minimum, average, maximum). So if you want to see the impact of 4GB as a possible limit, you'd need to test something like a 290 4GB vs. a 290 8GB, since I'm not aware of any Nvidia high-end cards with both a 4GB and a greater-than-4GB version atm.

That's true, and that's why there's a faint hint of optimism still out there that we can hold on to. This speaks otherwise though (as the rez goes up, Fury X falls farther and farther behind the 6GB+ cards):

http://cdn3.wccftech.com/wp-content/uploads/2015/06/AMD-Radeon-R9-Fury-X-3DMark-Firestrike.png

I don't disagree, that's why I stated a 980Ti would be a better choice...

980/970/390/390x are all now decidedly mid-range, and if you want to do 4K gaming, the 980Ti/TX or Fury are the real options there. I just don't see the added cost of 8GB being worth it for these cards if you're planning to run 1440P or less. If you want more, get a better GPU rather than more VRAM.

8GB mid-range doesn't make much sense right now. Maybe when Pascal/next-gen AMD come around next year, but currently it's just not worth the extra $$$.

Agree, and that's the killer when it comes to Fury X, because in a Crossfire setup you have the horsepower to do 4k with AA no problem, 50+ fps in any game currently out there. But that requires 6GB+, and the 4GB limit is a threat.

But you never know; maybe with the rumored AMD compression combined with the new structure of the VRAM itself, the impact is minimized. Crossing fingers.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Conjecture is hard. Have we already seen most of the RAM size increase from the console generation shift, will it keep going up, or will it decrease now that optimization effort is going toward making the most of VRAM and we're moving to titles built exclusively on newer APIs with better texture compression support?
 

flopper

Senior member
Dec 16, 2005
739
19
76
For right now maybe. I had the exact same thinking when I bought a 2 GB 680 three years ago; when would I possibly need more than 2 GB of VRAM when current games weren't hitting anywhere near that at 1080P? Fast forward a couple years and I can't run max texture settings in Shadow Of Mordor or GTA V because despite the card having plenty of headroom in the processing department to maintain high framerates, it lacks the VRAM needed to use those higher resolution textures without stuttering. So if you only care about buying a card to run games that are out now, yeah, 4 GB VRAM is plenty @ 1080P. But for people who are thinking a card might last them a few years, VRAM is actually pretty valid concern. Increasing texture resolution has a nominal impact on performance while drastically improving IQ, so developers will probably start taking advantage of higher resolution textures. The only thing that really matters there? VRAM. I was hopeful that AMD would be pushing 6-8GB cards out for that reason; looking at 4 GB as "plenty" makes me nervous.

I run a 5040x1050 setup and 4GB is plenty.
As a user you want a mix between resolution and fps.
A 120Hz setup means I want to push fps to 120+, which I can't do with all settings maxed. I don't go the dual-card route etc., as I am highly sensitive to any issues with framerates.
My 290 pushes 115fps in BF4 and I expect the Fury to allow such framerates and then add some higher settings along the way.

4GB won't be and isn't an issue for me for years. Four years from now I'll upgrade again, unless something special happens with cards in the meantime.
 

Mako88

Member
Jan 4, 2009
129
0
0
Conjecture is hard. Have we already seen most of the RAM size increase from the console generation shift, will it keep going up, or will it decrease now that optimization effort is going toward making the most of VRAM and we're moving to titles built exclusively on newer APIs with better texture compression support?

Nothing ever decreases. For 25+ years devs have gobbled up system ram, video ram, and CPU/GPU cycles, and that will never change, ever.

Even with DX12's lofty (and ridiculous frankly) promises, it doesn't matter...in ten years games will be a terabyte in size, using 256GB of ram as a min, and require 128GB of VRAM heheh

I run a 5040x1050 setup and 4GB is plenty.
As a user you want a mix between resolution and fps.
A 120Hz setup means I want to push fps to 120+, which I can't do with all settings maxed. I don't go the dual-card route etc., as I am highly sensitive to any issues with framerates.
My 290 pushes 115fps in BF4 and I expect the Fury to allow such framerates and then add some higher settings along the way.

4GB won't be and isn't an issue for me for years. Four years from now I'll upgrade again, unless something special happens with cards in the meantime.

And we're not talking about that low a resolution; of course 4GB works to push 5.3M pixels. But 4k is 8.3M pixels to throw around, a F-ton more, and that's why we're concerned 4GB isn't nearly enough for those of us at 3840x2160.
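
Quick sanity check on those pixel counts (just multiplying out the resolutions mentioned above):

```python
# Pixel counts for the two resolutions being compared.
eyefinity = 5040 * 1050    # 5,292,000  (~5.3M pixels)
uhd_4k = 3840 * 2160       # 8,294,400  (~8.3M pixels)
print(round(uhd_4k / eyefinity, 2))  # ~1.57x more pixels to feed every frame
```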
 
Last edited:

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
We should all just calm down. Can't even buy these cards yet. How about we wait for a bunch of reviews to test this first and then kick and scream?
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Conjecture is hard. Have we already seen most of the RAM size increase from the console generation shift, will it keep going up, or will it decrease now that optimization effort is going toward making the most of VRAM and we're moving to titles built exclusively on newer APIs with better texture compression support?

Hoping so. This will help keep some of the prices in check too. I think we will stick around 8GB or so for a while. Win10 is also a wild card here...

With the proposed pro/X prices, I am still holding out hope for a 6/8GB version later this year for a small premium (~$100?).
 

Udgnim

Diamond Member
Apr 16, 2008
3,665
112
106
From TweakTown's unbiased VRAM tests last week:

Far Cry 4 @ 4k with AA: 5.7GB VRAM needed
GTA V @ 4k with AA: 6.3GB VRAM needed
Shadow of Mordor @ 4k with AA: 5.4GB VRAM needed

That's why those of us who own NO stock in these companies, and have owned MULTIPLE cards from both ATi and Nvidia are concerned about the Fury X having just 4GB.

It's not bias, fanboy nonsense, or anything else. The Fury X is $200 cheaper than the 980 Ti when buying two cards, and is likely faster as well...but those of us already at 4k seem to be locked out of it as an option because the 4GB framebuffer is way too low for dual cards (all options on).



No amount of magic compression is going to turn 6GB of framebuffer need into 4GB without a massive penalty. But I'm hopeful you're right, maybe they've pulled something off here. But in looking at those leaked benchmarks today at 5k and 8k, the Fury X drops off massively versus both Titan X and 980 Ti, so my optimism isn't real strong right now...

total VRAM usage != VRAM needed to render a scene

we'll find out soon enough how much of an issue 4GB is at 4K and whether GPU or VRAM becomes the limiting factor first as settings get pushed to increase VRAM usage

if you want to game at 5K / 8K and can afford that resolution, then go ahead and buy 4 Titan Xs to drive that
 

Mako88

Member
Jan 4, 2009
129
0
0
With the proposed pro/X prices, I am still holding out hope for a 6/8GB version later this year for a small premium (~$100?).

All the SKUs are set for 2015; any fresh high-end SKUs would be a Q1 2016 target at a minimum.

But who knows, you would think that if Fury X cards are sitting on the shelf due to feedback of "vram is too low", and 980 Ti is outselling it 2:1 despite being slower, it would definitely spur AMD's management to rush/fast-track an 8GB version to market.

Would gladly pay $749 each for two Fury X 8GB cards with AIO hybrid coolers, perfect dream setup really.

total VRAM usage != VRAM needed to render a scene

we'll find out soon enough how much of an issue 4GB is at 4K and whether GPU or VRAM becomes the limiting factor first as settings get pushed to increase VRAM usage

I'd prefer that you're right and the rest of us are wrong; hopefully that's the case.
 
Last edited:

Osjur

Member
Sep 21, 2013
92
19
81
From TweakTown's unbiased VRAM tests last week:

Far Cry 4 @ 4k with AA: 5.7GB VRAM needed
GTA V @ 4k with AA: 6.3GB VRAM needed
Shadow of Mordor @ 4k with AA: 5.4GB VRAM needed

We have no software that can precisely tell how much memory a game actually needs... All the VRAM utilization charts from sites like HardOCP, TweakTown, etc. use Afterburner to measure how many megabytes of memory are being allocated, but that doesn't tell you how much the program actually needs.

I can write a graphics program that allocates the full 12GB of Titan X memory but in reality needs only 10MB, and then come here to claim that 12GB of memory on the TX isn't enough.
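
To make that concrete, here's a toy sketch (assuming a CUDA build of PyTorch; the sizes are purely illustrative) that reserves a Titan X-sized chunk of VRAM but only ever writes to 10MB of it — an Afterburner-style overlay would report the whole reservation as "used":

```python
# Toy sketch: allocate ~11GB of VRAM but only ever touch 10MB of it.
# Assumes a CUDA build of PyTorch and a card with that much memory; sizes are illustrative.
import torch

hoard = torch.empty(11 * 1024**3, dtype=torch.uint8, device="cuda")  # ~11GB reserved
working_set = hoard[: 10 * 1024**2]   # a 10MB view into the allocation
working_set.fill_(1)                  # the only memory the "program" actually writes

# Monitoring tools report the full allocation, not the 10MB that's really needed.
print(f"allocated: {torch.cuda.memory_allocated() / 1024**3:.2f} GB")
```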

Look at the charts:



Dying Light is using 6GB of memory on the TX and it's still only 35% faster than the 980, which is how much faster it should be. Now if the 980 actually hit the VRAM wall, its performance would drop by something like 80%, because that is what happens when you hit the VRAM wall. There are no drops or sudden lag spikes on that chart at all, so the game actually needs less than 4GB but is still allocating more when it can.
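
For what it's worth, the wall is easy to spot in a frame-time log. A minimal sketch, assuming you've captured per-frame times in milliseconds (the traces below are made-up numbers), that flags the huge periodic hitches you get when textures start getting paged over the PCIe bus:

```python
# Minimal sketch: flag VRAM-wall style hitches in a frame-time trace (times in ms).
def vram_wall_suspects(frame_times_ms, spike_factor=5.0):
    """Return (frame index, time) for frames several times slower than the median."""
    ordered = sorted(frame_times_ms)
    median = ordered[len(ordered) // 2]
    return [(i, t) for i, t in enumerate(frame_times_ms) if t > spike_factor * median]

smooth = [16.7] * 200                                                    # steady ~60fps, VRAM fits
thrashing = [16.7] * 95 + [400.0] + [16.7] * 95 + [350.0] + [16.7] * 8   # periodic swap stalls

print(len(vram_wall_suspects(smooth)))      # 0 -- no wall
print(len(vram_wall_suspects(thrashing)))   # 2 -- the tell-tale stalls
```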

I've seen over the years that most people have no frigging idea what really happens when you hit the magical VRAM wall. I've owned a 2560x1600 monitor since 2007, back when most cards had 512MB of memory, so the VRAM wall is something I ran into many times in the past.

EDIT:
It's not my problem if sites like HardOCP etc. use VRAM charts to claim that 4GB is not enough even though their fps charts say otherwise; it just means they have a hidden agenda or are incompetent at their work.
 
Last edited:

Mako88

Member
Jan 4, 2009
129
0
0
We have no software that can precisely tell how much memory a game actually needs... All the VRAM utilization charts from sites like HardOCP, TweakTown, etc. use Afterburner to measure how many megabytes of memory are being allocated, but that doesn't tell you how much the program actually needs.

I can write a graphics program that allocates the full 12GB of Titan X memory but in reality needs only 10MB, and then come here to claim that 12GB of memory on the TX isn't enough.

I've seen over the years that most people have no frigging idea what really happens when you hit the magical VRAM wall. I've owned a 2560x1600 monitor since 2007, back when most cards had 512MB of memory, so the VRAM wall is something I ran into many times in the past.

Same. Gamed at 1920x1200 in 2002 via a ridiculously ghosting/laggy $3995 work LCD (Samsung 240T) and had the very first Dell 30" at 1600p when it hit the market four years later. At no time did VRAM limits come into play really...

You guys are making a good case, optimism growing.
 

flopper

Senior member
Dec 16, 2005
739
19
76
And we're not talking about that low a resolution; of course 4GB works to push 5.3M pixels. But 4k is 8.3M pixels to throw around, a F-ton more, and that's why we're concerned 4GB isn't nearly enough for those of us at 3840x2160.

AMD says it's fine, so I guess that should count for something.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,440
5,429
136
As those of us already running 4K/UHD resolutions know, you're going to hit a wall in GPU performance before you hit a VRAM limitation. And forget about enabling MSAA with a single GPU in newer titles, even with the Titan-X.

I don't anticipate needing more than 4GB VRAM. Will there be corner cases where this is a limitation? Possibly, but the difference between 15fps and 7fps isn't going to make me happy either way...

The 24th can't come soon enough
 

Mako88

Member
Jan 4, 2009
129
0
0
AMD says it's fine, so I guess that should count for something.

Yes, that too. They went out of their way to mention it specifically; I was actually surprised they would even bring it up, given the potential backlash over the VRAM being low.
 

Mako88

Member
Jan 4, 2009
129
0
0
As those of us already running 4K/UHD resolutions know, you're going to hit a wall in GPU performance before you hit a VRAM limitation. And forget about enabling MSAA with a single GPU in newer titles, even with the Titan-X.

Except we're talking about dual card setups, which DO have the power to handle 4k without an issue, and hit VRAM limits BEFORE GPU limits.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,440
5,429
136
Except we're talking about dual card setups, which DO have the power to handle 4k without an issue, and hit VRAM limits BEFORE GPU limits.

The slides in this thread actually don't support that. 290X Crossfire and 980 SLI do just fine.
 

Osjur

Member
Sep 21, 2013
92
19
81
Same. Gamed at 1920x1200 in 2002 via a ridiculously ghosting/laggy $3995 work LCD (Samsung 240T) and had the very first Dell 30" at 1600p when it hit the market four years later. At no time did VRAM limits come into play really...

You guys are making a good case, optimism growing.

Nah, Crysis was a fine example: it would suddenly drop from 15fps to 1fps, then come back to 15fps, and the cycle continued. That is the VRAM wall.

Of course it really wasn't playable in any case @ 1600p, because my graphics cards weren't fast enough even if they'd had enough VRAM.
 

Mako88

Member
Jan 4, 2009
129
0
0
Nah, Crysis was a fine example: it would suddenly drop from 15fps to 1fps, then come back to 15fps, and the cycle continued. That is the VRAM wall.

Of course it really wasn't playable in any case @ 1600p, because my graphics cards weren't fast enough even if they'd had enough VRAM.

It was brutal back then, very early SLI/Crossfire being absolutely mandatory to even contemplate playing at 1920x1200 (2002) or 2560x1600 (2006).

Every GPU upgrade cycle required two cards, and you'd pray just to get to 30fps haha...

Ah the old days of LCD...I don't miss them. Does feel a bit similar though now at 4k, the pressure has returned lol. And back then the cards weren't $650-$750 EACH either...cry.
 