Fury XT and Pro prices


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
When the hell is dual Fury coming? I'm already getting 5 fps from VRAM slowdowns in Arma 3, and I'm not waiting 6 months, as the next generation will then be just around the corner. I don't want to go 980 Ti, but I need more VRAM today, not "sometime soonish". I don't see why they have to launch the dual-GPU card so much later; it can't be that difficult to design, can it?

AMD is probably going to wait and see the reaction/acceptance of Fury to decide on market pricing. That's what I would do anyway.
 

Mako88

Member
Jan 4, 2009
129
0
0
AMD is probably going to wait and see the reaction/acceptance of Fury to decide on market pricing. That's what I would do anyway.

And it's such a niche market, far more niche than even the average SLI/Crossfire buyer... dual GPUs on a single PCB are now in that insane $1500 price stratosphere, which is a tough sell versus the more common "well, I'll just buy one card now and if it gets slow later I'll buy a second card" cop-out that budget-limited gamers fall back on.

You don't want all that TDP running through a single slot anyway... plus the cooling, even with an AIO, will likely not feature dual rads or be as efficient as a two-card AIO solution.

Not worth waiting for, IMO; better to Crossfire now instead.
 

chimaxi83

Diamond Member
May 18, 2003
5,456
61
101
And it's such a niche market, far more niche than even the average SLI/Crossfire buyer... dual GPUs on a single PCB are now in that insane $1500 price stratosphere, which is a tough sell versus the more common "well, I'll just buy one card now and if it gets slow later I'll buy a second card" cop-out that budget-limited gamers fall back on.

You don't want all that TDP running through a single slot anyway... plus the cooling, even with an AIO, will likely not feature dual rads or be as efficient as a two-card AIO solution.

Not worth waiting for, IMO; better to Crossfire now instead.

The slot is limited in power; "all that TDP" comes from the power connectors on the card itself.

So what does any of this "is 4GB enough?" talk have to do with the thread? I mean, don't worry, a select group of individuals here has already started the not-enough-VRAM crusade.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Not to insult, but that is nonsense. Stop it.

AMD has already said, as has Nvidia, that HBM offers no extra storage capacity at the same size.

4GB is 4GB, regardless of whether it's GDDR or HBM. Period, end of story. So for those of us at 4K now, who will be running Crossfire setups with ALL settings maxed, 4GB is a massive limiting factor.

For those of us who desperately need higher VRAM counts, the Fury X is a dangerous buy. I'm sure AMD would have preferred to ship 8GB SKUs with the X, of course; it just wasn't possible.

The ONLY hope for this particular scenario is that MS isn't bullshitting when it comes to DX12's "stacked VRAM" capability (2 x 4GB cards would equal a true 8GB of framebuffer). But that's a hell of a gamble.

Nonsense is judging before we see the results. I'm not saying it's more than 4GB.

AMD said that will be enough, and we will see. Microsoft and AMD have said DX12 (and Mantle and Vulkan) can pool VRAM from multi-GPU setups. We'll see. I'm not going to assume that all of these entities are lying.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It isn't nonsense to say 4GB is 4GB. It actually makes a lot of sense; so much so that you'd be hard pressed to find a statement that makes more sense. HBM has a packaging, speed, and power advantage over GDDR5. It's still 4GB, though. If a game uses 2GB of GDDR5, it will use 2GB of HBM. If it uses 6GB of GDDR5, it will use 6GB of HBM, unless of course you don't have 6GB; then it will use 4GB, but you will have 2GB of assets that are not there ready to go. It won't magically store 6GB worth of assets in 4GB of space.
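The argument above boils down to a one-liner: capacity caps residency no matter the memory technology. A toy sketch, with all sizes purely hypothetical:

```python
# Toy model: VRAM capacity caps resident assets regardless of memory type.
# All sizes in GB are hypothetical, purely for illustration.

def residency(working_set_gb, vram_gb):
    """Return (resident_gb, spilled_gb) for a working set and a capacity."""
    resident = min(working_set_gb, vram_gb)
    spilled = max(0.0, working_set_gb - vram_gb)
    return resident, spilled

# Capacity is identical whether the 4 GB is GDDR5 or HBM.
for mem_type in ("GDDR5", "HBM"):
    resident, spilled = residency(working_set_gb=6.0, vram_gb=4.0)
    print(f"{mem_type}: {resident} GB resident, {spilled} GB must stream in on demand")
```

Either way, the 2 GB overflow has to come from system RAM over the bus when it's needed.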
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
A couple of unknowns on my part; can someone here clarify?

Can you load new data into GDDR5 at the same time as reading the RAM for game use?
Will HBM's Dual CMD ability allow it to load new data while still using the RAM as normal?

Could this allow continuous updating of game data, preventing stutter, versus traditional use where you need to stop reading in order to update, causing a stutter?
 

96Firebird

Diamond Member
Nov 8, 2010
5,714
316
126
According to a poster at the TechSpot forums, GDDR5 allows simultaneous read/write:

• Physically, a GDDR5 controller/IC doubles the I/O of DDR3. With DDR3, I/O handles an input (write to memory) or an output (read from memory), but not both on the same cycle; GDDR5 handles input and output on the same cycle.

Source

I'll look for a better source, if I get time.
 

werepossum

Elite Member
Jul 10, 2006
29,873
463
126
For right now, maybe. I had the exact same thinking when I bought a 2 GB 680 three years ago: when would I possibly need more than 2 GB of VRAM, when current games weren't hitting anywhere near that at 1080p? Fast forward a couple of years and I can't run max texture settings in Shadow of Mordor or GTA V, because despite the card having plenty of headroom in the processing department to maintain high framerates, it lacks the VRAM needed to use those higher-resolution textures without stuttering. So if you only care about buying a card to run games that are out now, yeah, 4 GB of VRAM is plenty @ 1080p. But for people who are thinking a card might last them a few years, VRAM is actually a pretty valid concern. Increasing texture resolution has a nominal impact on performance while drastically improving IQ, so developers will probably start taking advantage of higher-resolution textures. The only thing that really matters there? VRAM. I was hopeful that AMD would be pushing 6-8GB cards out for that reason; looking at 4 GB as "plenty" makes me nervous.

I think that you are certainly correct in theory. The consoles have 4GB of VRAM, so if you are running higher resolutions than the consoles can drive, or using modded textures, surely you'll run low with only 4GB of VRAM on any port designed to take maximum advantage of the consoles. Whether that has any impact in practice depends on how fast the card can move data in and out of VRAM, though, right? As long as needing more VRAM doesn't actually slow the game by an unacceptable amount, 4GB might well be enough, considering that people who buy $500+ video cards probably don't keep them all that long.
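The "depends how fast the card can move data" point can be put in back-of-envelope numbers. The frame times and spill sizes below are hypothetical; ~15.75 GB/s is the practical ceiling of a PCIe 3.0 x16 link:

```python
# Back-of-envelope: fetching spilled (non-resident) assets over PCIe adds to
# frame time. Frame times and spill sizes are hypothetical illustrations.

PCIE3_X16_GBPS = 15.75   # practical PCIe 3.0 x16 bandwidth, GB/s

def frame_time_ms(base_ms, spill_gb_per_frame):
    """Frame time once the fetch of non-resident assets is accounted for."""
    fetch_ms = spill_gb_per_frame / PCIE3_X16_GBPS * 1000.0
    return base_ms + fetch_ms

print(frame_time_ms(16.7, 0.0))   # everything resident: steady ~60 fps pacing
print(frame_time_ms(16.7, 0.1))   # 100 MB of spill in one frame: a visible hitch
```

Even a modest 100 MB fetch adds several milliseconds, which is why running past the VRAM budget shows up as stutter rather than a uniform slowdown.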

Sheesh, I can remember boasting that the first computer I built had the "full 640KB" of "fast 120ns RAM".
 

garagisti

Senior member
Aug 7, 2007
592
7
81
All the SKUs are set for 2015; any fresh high-end SKUs would be a Q1 2016 target at a minimum.

But who knows; you would think that if Fury X cards were sitting on the shelf due to "VRAM is too low" feedback, and the 980 Ti were outselling it 2:1 despite being slower, it would definitely spur AMD's management to rush/fast-track an 8GB version to market.

I would gladly pay $749 each for two Fury X 8GB cards with AIO hybrid coolers; a perfect dream setup, really.



I'd prefer that you are right and the rest of us are wrong; hopefully that's the case.

VRAM is too low, like 3.5GB? That's something people here are still recommending, and they were still recommending the 980, which also has only 4GB. For what it's worth, AMD cards since the HD 6xxx series (IIRC even before) have fared better at higher resolutions. They have been gearing this card toward 4K, and if that screenshot of FC4 at 4K with ultra settings getting a 40+ fps minimum is anything to go by, the card will not be lacking.
 
Feb 19, 2009
10,457
10
76
Frustrated that the Fury X is only 4GB; it's almost a non-starter for those of us at 4K right now.

You need to educate yourself and stop spouting stuff that is blatantly wrong, proven wrong with cold hard facts:

Read & learn:
http://forums.anandtech.com/showpost.php?p=37488175&postcount=21

Measuring with AB (Afterburner) to show VRAM usage does nothing to indicate whether the game actually needs that VRAM or just allocates it.

And no, Titan X SLI isn't enough to max 4K with MSAA; it isn't even enough to max settings with FXAA:

This doesn't even have HairWorks on, forget about maxing! The AA used in Witcher 3? A derivative of FXAA/TXAA, NOT VRAM-intensive or performance-crippling MSAA.
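The allocated-versus-needed distinction raised above can be pictured with a toy model (class name and all numbers are hypothetical): monitoring tools report what the driver has allocated, which can sit far above what a frame actually touches.

```python
# Toy illustration: allocated VRAM (what a tool like Afterburner reports)
# versus the working set a frame actually touches. All numbers hypothetical.

class VramPool:
    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.allocated_gb = 0.0   # driver caches assets opportunistically
        self.touched_gb = 0.0     # what rendering actually reads this frame

    def load_assets(self, gb):
        # Drivers tend to cache aggressively, up to capacity.
        self.allocated_gb = min(self.capacity_gb, self.allocated_gb + gb)

    def render_frame(self, working_set_gb):
        self.touched_gb = working_set_gb

pool = VramPool(capacity_gb=12.0)
pool.load_assets(10.0)     # a monitoring tool would show ~10 GB "used"
pool.render_frame(3.5)     # but this frame only needed 3.5 GB
print(f"allocated: {pool.allocated_gb} GB, actually touched: {pool.touched_gb} GB")
```

On a 4 GB card the same game would simply cache less, with no penalty unless the touched set itself exceeds 4 GB.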
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
You need to educate yourself and stop spouting stuff that is blatantly wrong, proven wrong with cold hard facts:

Read & learn:
http://forums.anandtech.com/showpost.php?p=37488175&postcount=21

Measuring with AB (Afterburner) to show VRAM usage does nothing to indicate whether the game actually needs that VRAM or just allocates it.

And no, Titan X SLI isn't enough to max 4K with MSAA; it isn't even enough to max settings with FXAA:

This doesn't even have HairWorks on, forget about maxing! The AA used in Witcher 3? A derivative of FXAA/TXAA, NOT VRAM-intensive or performance-crippling MSAA.

You need to rethink your logic.

Proving that 4GB is perfectly sufficient at the current time does not mean that 4GB will be fine a few years down the road.

Buying a Fury now means keeping it until at least midrange GPUs appear on 16nm, something that will probably take ~12-18 months (mid to late 2016). And these midrange GPUs will likely perform similarly to Fury (maybe 10-20% faster), meaning that for a meaningful upgrade one will wait even longer.

The question is whether 4GB will be sufficient for games in 6-12 months, and possibly 18 months.

Will games be playable? Sure, certain settings can be turned down. This is, however, a $649 card, and a lot more is expected from it.
 

Mako88

Member
Jan 4, 2009
129
0
0
You need to rethink your logic.

Proving that 4GB is perfectly sufficient at the current time does not mean that 4GB will be fine a few years down the road.

Buying a Fury now means keeping it until at least midrange GPUs appear on 16nm, something that will probably take ~12-18 months (mid to late 2016). And these midrange GPUs will likely perform similarly to Fury (maybe 10-20% faster), meaning that for a meaningful upgrade one will wait even longer.

The question is whether 4GB will be sufficient for games in 6-12 months, and possibly 18 months.

Will games be playable? Sure, certain settings can be turned down. This is, however, a $649 card, and a lot more is expected from it.

Exactly.

Hopefully we'll get some guidance from the first batch of tests; it should be easy to see. If the Fury X outpaces the 980 Ti at 1080p and 1440p but suddenly falls a bit behind at 2160p in certain VRAM-intensive games like GTA V and Shadow of Mordor, we'll know the reason.

Hopefully that's not the case, but as you correctly pointed out, 4GB is 4GB... now and two years from now.
 
Feb 19, 2009
10,457
10
76
You need to rethink your logic.

Proving that 4GB is perfectly sufficient at the current time does not mean that 4GB will be fine a few years down the road.

Buying a Fury now means keeping it until at least midrange GPUs appear on 16nm, something that will probably take ~12-18 months (mid to late 2016). And these midrange GPUs will likely perform similarly to Fury (maybe 10-20% faster), meaning that for a meaningful upgrade one will wait even longer.

The question is whether 4GB will be sufficient for games in 6-12 months, and possibly 18 months.

Will games be playable? Sure, certain settings can be turned down. This is, however, a $649 card, and a lot more is expected from it.

Well then, why didn't these shills come out and say that 4GB may not be enough down the road?! They have been harping all this time on how it's not enough NOW. Even [H] said the minimum at 4K is 6GB of VRAM (when their own bench results show otherwise!!) NOW.

4GB will be enough for 1 and 2 GPUs at 4K in the next 2 years. You can quote me on this. You wanna know the secret?

Games will get more demanding; we're already seeing SLI Titan X fall on its face at 4K in GTA V, Witcher 3, Dying Light, etc. It simply cannot run them maxed. Settings need to be turned down, and MSAA needs to be OFF as a start. As soon as you do that, you are not pushing VRAM requirements, as MSAA is one of the biggest VRAM/performance killers.

CF Fury X (assuming Titan X + 10-20% perf) won't be able to max 4K now, and it won't be able to max 4K in the next 6-18 months; it has to turn down settings. At which point the bottleneck isn't VRAM, it's just processing power. The idea that one or two GPUs are capable of maxing 4K with AA, where VRAM actually matters, is a fallacy, because they are unplayably slow due to lack of processing power.

Capiche?

IF you guys want to rephrase that and say "4GB of VRAM may not be enough for QuadFire or Quad SLI", then I will fully agree with you. Only then will you have enough processing power to max 4K.
 
Last edited:

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
Can someone post 980s in SLI and a 295X2 hitting the VRAM wall in an article or test, versus a 980 Ti and/or Titan X?

Certainly there are texture mods and such that can demonstrate this scenario, right?

I think that's what's missing here: the examples/data.
 

Mako88

Member
Jan 4, 2009
129
0
0
Capiche?

IF you guys want to rephrase that and say "4GB vram may not be enough for QuadFire or Quad SLI", then I will fully agree with you. Only then will you have enough processing power to max 4K.

No offense, "Silverforce", but you sound a bit too worked up to have the adult discussion we're having today. We're all in the market for this card and actually want it to succeed, as it's one hell of a bargain. We're working out the issues and determining if there's merit to the 4GB limitation fear.

Absolutely nothing wrong with that, and if you can't sit at the adult table without letting some Nvidia "shills" get you upset, then I'll have to send you to the corner for a timeout at the kiddie table.

Caprice?


Personal attacks are not allowed here.
Markfw900
 
Last edited by a moderator:

Mako88

Member
Jan 4, 2009
129
0
0
Can someone post 980s in SLI and a 295X2 hitting the VRAM wall in an article or test, versus a 980 Ti and/or Titan X?

Certainly there are texture mods and such that can demonstrate this scenario, right?

I think that's what's missing here: the examples/data.

When you look at various benchmarks around the web, you really don't see VRAM limits affecting the numbers, even at 4K.

Meaning a 980 versus a 980 Ti and Titan X doesn't show different scaling at 4K than it does at lower resolutions... they're about 35% apart regardless of resolution, even in GTA V and Shadow of Mordor.

So far that suggests 4GB may be enough for 4K with AA, for now. Optimistic for the Fury X, in other words.

Even the 3.5GB 970 doesn't drop off much at 4K versus lower resolutions, which was a similar debate when that controversy hit.
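A rough way to check this from published numbers: compute the relative gap between a 4GB card and a larger-VRAM card at each resolution and see whether it widens at 4K. The fps figures below are made up purely for illustration:

```python
# Rough check of the scaling argument: if the relative gap between a 4 GB
# card and a larger-VRAM card stays roughly constant as resolution rises,
# VRAM is not the limiter. The fps figures below are hypothetical.

benchmarks = {                       # (gtx980_fps, titan_x_fps), made up
    "1080p": (90.0, 121.0),
    "1440p": (60.0, 81.0),
    "2160p": (30.0, 40.5),
}

gaps = {res: tx / g980 - 1.0 for res, (g980, tx) in benchmarks.items()}
for res, gap in gaps.items():
    print(f"{res}: Titan X leads by {gap:.0%}")

# A VRAM wall would show up as the gap widening sharply at 2160p.
vram_limited = gaps["2160p"] > 1.5 * gaps["1080p"]
print("VRAM wall at 4K?", vram_limited)
```

With the hypothetical numbers above the gap holds at roughly 35% at every resolution, which is the pattern the real benchmarks have been showing so far.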
 

garagisti

Senior member
Aug 7, 2007
592
7
81
You need to rethink your logic.

Proving that 4GB is perfectly sufficient at the current time does not mean that 4GB will be fine a few years down the road.

Buying a Fury now means keeping it until at least midrange GPUs appear on 16nm, something that will probably take ~12-18 months (mid to late 2016). And these midrange GPUs will likely perform similarly to Fury (maybe 10-20% faster), meaning that for a meaningful upgrade one will wait even longer.

The question is whether 4GB will be sufficient for games in 6-12 months, and possibly 18 months.

Will games be playable? Sure, certain settings can be turned down. This is, however, a $649 card, and a lot more is expected from it.

What you need to understand is that newer standards are coming in. Those with good cards (290s on the AMD side) should wait this one out unless you game at UHD or have a multi-monitor setup. That goes for cards from BOTH Nvidia and AMD at present. Having a card that meets standards like the ones for UHD will be quite useful, but that will happen with HDMI 2.0+ and HDCP 2.2, etc. That will quite likely happen with the cards coming with the node shrink, which would be either Arctic Islands or Pascal.

Ideally this is a fairly good/great upgrade for those with older cards, and yes, 4GB of HBM with the new APIs will be enough even for 4K. Maybe not for 8K, which one of the charts included, but 8K isn't going to be a standard for 3-4 years yet.

For standard 980 Ti money, you get a Fiji which reportedly (well, AMD claimed in its presentation) does a 40+ fps minimum, averaging 50+, while running Far Cry 4 with ultra settings at 4K resolution. Even if that was with the "game doesn't work or barely works" settings turned off, it is nothing to scoff at. Considering that a Titan X doesn't do that... Fiji, which comes with a water block and runs as cool and quiet as it does, is a seriously good buy at the present time. That is without considering whether it will overclock, and by how much.

You can't have future-proofing when a node change is about to hit, as imminent as it is, with HBM2 for that matter around the corner. If you think you will have something you won't want to be rid of in 2 years or so, then IMHO you either don't game much, or running at the highest settings with the best equipment isn't for you. With the node shrink, I'd expect performance to move up by another 40-50 percent, if not more, in that time frame. If you're running a UHD display, like I do, you will change your card(s). Personally, I'm going to get this, while knowing full well that I will be buying something new in less than 2 years. That would be true with cards from both AMD and Nvidia. If you think 6GB/12GB of VRAM is all that matters, you're simply mistaken.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
4GB will be enough for 1 and 2 GPU at 4K in the next 2 years

Why would you buy 2 GPUs for 4K? You said it yourself: "even Titan X falls on its face @ 4K."
You will buy 3 GTX 980 Tis or 3 Furys, and then when you have 50/60 fps you crank up the settings and the Fury will fall on its face.
I think that's the point. Why would you buy two $700 cards and an $800 4K monitor to play games at lower settings because you only have a 4GB card?

How about next year? Are VRAM requirements going to go down?
It's just not very smart, Silverforce.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
When you look at various benchmarks around the web, you really don't see VRAM limits affecting the numbers, even at 4K.

Meaning a 980 versus a 980 Ti and Titan X doesn't show different scaling at 4K than it does at lower resolutions... they're about 35% apart regardless of resolution, even in GTA V and Shadow of Mordor.

So far that suggests 4GB may be enough for 4K with AA, for now. Optimistic for the Fury X, in other words.

Even the 3.5GB 970 doesn't drop off much at 4K versus lower resolutions, which was a similar debate when that controversy hit.

The most you can do is look at trends and project from the past. The problem with this is that it won't allow for disruptive changes. The history of tech is that new solutions break the limits of the past.

Is this the case now? None of us knows enough yet to make an educated guess. We will have to wait for the tests to see if a new paradigm has arisen.
 

Mako88

Member
Jan 4, 2009
129
0
0
The most you can do is look at trends and project from the past. The problem with this is that it won't allow for disruptive changes. The history of tech is that new solutions break the limits of the past.

Is this the case now? None of us knows enough yet to make an educated guess. We will have to wait for the tests to see if a new paradigm has arisen.

That's well said. Basically you either see disruption in the Fury X (and DX12), or you don't.

I wouldn't trust MS to deliver stacked VRAM and mixed GPUs in 100% perfect working order as far as I could throw them. VERY skeptical of their hype.

For the Fury X, it's going to come down to what we believe we'll need over the next 18-24 months that we own the card. At 4K, that need is more serious in terms of VRAM than for those running 1440p or less.

But even then, as others have said, devs aren't suddenly going to waste resources on added texture packs or similar improvements that only a small percentage of the market can utilize.

The more I think about what we've said today, the more I lean toward 4GB, even in an SLI/Crossfire environment at 4K with AA, likely being enough 90% of the time over the next two years.
 
Feb 19, 2009
10,457
10
76
Why would you buy 2 gpu's for 4k? You said it yourself "even Titan x falls on its face @ 4k"
You will buy 3 gtx980ti's or 3 Fury's and then when you have 50/60 fps you crank up the settings and the Fury will fall on its face.
I think that's the point. Why would you buy 2 700$ cards and a 800$ 4k monitor to play games at lower setings because you only have a 4gb card?

How about next year ? are vram requirements going to go down?
Its just not very smart Silverforce.

I can run 4K on my R9 290s; it doesn't have to be maxed. High at 4K is more enjoyable than Ultra at 1080p.

4K is a trade-off with settings unless you have quad GPUs. However, many games look great on High and not much different from Ultra.

Witcher 3 at 1080p on Ultra requires 2x R9 290s, but I can run it on High with Ultra textures at 45 fps on my 7950. The game looks very close in quality; one would have to examine screenshots to tell the difference.

As such, this talk of maxing 4K needing more than 4GB of VRAM... sure, I agree with that. Except you can't do it on two GPUs in newer titles.
 
Feb 19, 2009
10,457
10
76
No offense, "Silverforce", but you sound a bit too worked up to have the adult discussion we're having today. We're all in the market for this card and actually want it to succeed, as it's one hell of a bargain. We're working out the issues and determining if there's merit to the 4GB limitation fear.

Absolutely nothing wrong with that, and if you can't sit at the adult table without letting some Nvidia "shills" get you upset, then I'll have to send you to the corner for a timeout at the kiddie table.

Caprice?

So you're the one quoting that TweakTown VRAM test where they run 4x and 8x MSAA, and in some cases even SSAA, at 4K, showing huge VRAM usage as if it's meaningful... it just demonstrates that you either misunderstand what the result means or are just shilling.

If it's a misunderstanding, then you can correct it.
Did you read this? http://forums.anandtech.com/showpost.php?p=37488175&postcount=21

That's reality: 4GB vs 12GB of VRAM doesn't matter squat for single- and dual-GPU setups at 4K, because they lack the GPU processing power to max settings and play with the AA where VRAM matters.

It's a very simple concept.
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
According to a poster at the TechSpot forums, GDDR5 allows simultaneous read/write:



Source

I'll look for a better source, if I get time.

Thanks, I had forgotten that property of GDDR memory.

The reason I mentioned the Dual CMD interface was a Hynix paper at Hot Chips 2014, where a slide on CMD input listed single CMD for GDDR5 and DDR3 but dual CMD for HBM.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
4K is a trade off with settings unless you have Quad GPUs

I think I've seen a Witcher 3 benchmark, roughly maxed @ 4K, with 2 overclocked GTX 980 Tis doing a 40 fps average. I would think 3 GTX 980 Tis can handle any game with the right CPU pushing them.
I think it's about upcoming games, though; with today's games you might get away with it.
People buy $1500 in video cards to play tomorrow's games, not just today's.

There is no CPU that can properly push 4 GTX 980 Tis. I've seen that video; I think RS showed me.
 

Mako88

Member
Jan 4, 2009
129
0
0
I think I've seen a Witcher 3 benchmark, roughly maxed @ 4K, with 2 overclocked GTX 980 Tis doing a 40 fps average. I would think 3 GTX 980 Tis can handle any game with the right CPU pushing them.
I think it's about upcoming games, though; with today's games you might get away with it.
People buy $1500 in video cards to play tomorrow's games, not just today's.

There is no CPU that can properly push 4 GTX 980 Tis. I've seen that video; I think RS showed me.

Correct.

2x Titan X OC or 980 Ti OC, most of which easily overclock +250MHz on the boost clock on air, are more than enough at 4K in most titles.

The Fury X looks set to improve on that a bit, which is where the excitement is coming from (better fps, lower price).
 