[TT] Pascal rumored to use GDDR5X..

Page 3 - AnandTech community forums

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm with Headfoot... They could reuse DDR3 for all I care, if the card performs how I want it to and there isn't a complete bottleneck due to the memory, call it whatever you want.

I have to laugh at anyone who cares just because it isn't the newest tech out there (which, as it happens, it looks like it is, set for release in 2016)...

Why design cards that will be obsolete next year when we can make them obsolete on release? Guarantees an upgrade when real SOTA cards come out.

To each his own, but I actually like new tech. GDDR5X is definitely a step back from HBM, which we already have. I prefer we move forward if possible. I'm concerned that might be the issue: it may not yet be possible to move forward with HBM.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Why design cards that will be obsolete next year when we can make them obsolete on release? Guarantees an upgrade when real SOTA cards come out.

To each his own, but I actually like new tech. GDDR5X is definitely a step back from HBM, which we already have. I prefer we move forward if possible. I'm concerned that might be the issue: it may not yet be possible to move forward with HBM.

If you want HBM just for the sake of HBM, then it's just silly. The native quad core was a classic example of a pure tech product that failed completely.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Maybe it's not issues with HBM itself, but issues with Nvidia getting HBM working. I am sure a lot of Fury's delays came from AMD getting HBM working correctly.

Cost, supply and availability are the biggest issues for HBM. GDDR5X is faster than HBM1, so the new bar is now HBM2.
 

96Firebird

Diamond Member
Nov 8, 2010
5,714
316
126
Why design cards that will be obsolete next year when we can make them obsolete on release? Guarantees an upgrade when real SOTA cards come out.

To each his own, but I actually like new tech. GDDR5X is definitely a step back from HBM, which we already have. I prefer we move forward if possible. I'm concerned that might be the issue: it may not yet be possible to move forward with HBM.

I don't understand your response to my post, care to elaborate?
 
Mar 10, 2006
11,715
2,012
126
HBM2 will probably show up on NVIDIA's highest-end Pascal cards (i.e. the $650+ price point). I wouldn't be surprised to see GDDR5X used in cards at lower price points given the immaturity of, and cost adder associated with, HBM2.
 
Last edited:

zlatan

Senior member
Mar 15, 2011
580
291
136
I don't think that GDDR5X is a viable option here. A standard 4 GB, 1.5 GHz (6 Gbps effective) GDDR5 memory configuration consumes 30 watts at peak; 8 GB consumes 60 watts. GDDR5X is even more power hungry, so a high-performance 8 GB option will consume 90 watts, and that's just the memory for a mainstream card. HBM2 is the only option from here, if performance per watt matters.
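A quick back-of-the-envelope sketch of the figures in the post above. The per-GB wattage and the GDDR5X penalty here are that post's estimates, not measured or published numbers:

```python
# Memory power estimates implied by the post above (assumptions, not specs):
# ~7.5 W per GB of GDDR5 at peak, and GDDR5X drawing ~1.5x that.
GDDR5_W_PER_GB = 30 / 4   # 4 GB config quoted at ~30 W peak
GDDR5X_FACTOR = 90 / 60   # 8 GB GDDR5X quoted at 90 W vs. 60 W for GDDR5

def mem_power_watts(capacity_gb: int, gddr5x: bool = False) -> float:
    """Estimated peak memory power for a GDDR5 or GDDR5X configuration."""
    watts = capacity_gb * GDDR5_W_PER_GB
    return watts * GDDR5X_FACTOR if gddr5x else watts

print(mem_power_watts(8))               # 60.0 W for 8 GB GDDR5
print(mem_power_watts(8, gddr5x=True))  # 90.0 W for 8 GB GDDR5X
```

Whether GDDR5X really costs 1.5x the power of GDDR5 per GB is exactly the point contested later in the thread.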
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
162
106
It's the native quad core all over again. Technology before performance.

The question remains whether HBM is even viable outside the $650 cards due to cost. And even at that price, HBM is questionable for the time being.

GDDR5X looks to be the cheap and easy solution for now. Stacked DRAM is the future, but the future may be some time away.
That remains to be seen; no one knows whether the GTX 970's successor will be cheaper to produce than the current incumbent, putting 16nm TSMC costs and other issues aside. Power consumption is also yet to be factored in: GDDR5X may well eat a large part of that 120~150W TDP you're pointing at, which may well decide whether it's better to go HBM2 or GDDR5X for something like a next-gen Nano.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I don't think that GDDR5X is a viable option here. A standard 4 GB, 1.5 GHz (6 Gbps effective) GDDR5 memory configuration consumes 30 watts at peak; 8 GB consumes 60 watts. GDDR5X is even more power hungry, so a high-performance 8 GB option will consume 90 watts, and that's just the memory for a mainstream card. HBM2 is the only option from here, if performance per watt matters.

What is the power consumption difference between a 6GB 980TI and a 12GB Titan X?

And in terms of power consumption:
http://www.anandtech.com/show/9266/amd-hbm-deep-dive/4
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
That remains to be seen; no one knows whether the GTX 970's successor will be cheaper to produce than the current incumbent, putting 16nm TSMC costs and other issues aside. Power consumption is also yet to be factored in: GDDR5X may well eat a large part of that 120~150W TDP you're pointing at, which may well decide whether it's better to go HBM2 or GDDR5X for something like a next-gen Nano.

HBM2 will increase power consumption too.

And it doesn't seem the difference is that bad.
http://www.anandtech.com/show/9266/amd-hbm-deep-dive/4

Also explains why the Fury and Fury X were 275W despite all the HBM savings.

HBM still looks like premature tech that needs to evolve further to find solid footing, while GDDR5X is a bit of a hotfix: it solves the current short-term issues and provides something HBM can't yet.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
I don't think that GDDR5X is a viable option here. A standard 4 GB, 1.5 GHz (6 Gbps effective) GDDR5 memory configuration consumes 30 watts at peak; 8 GB consumes 60 watts. GDDR5X is even more power hungry, so a high-performance 8 GB option will consume 90 watts, and that's just the memory for a mainstream card. HBM2 is the only option from here, if performance per watt matters.
Why is GDDR5X more power hungry than GDDR5?
 

Goatsecks

Senior member
May 7, 2012
210
7
76
Perhaps not for the desktop, but I'm sure in laptops a GDDR5X Pascal will lose big time vs. Radeon HBM on BoM, thermals, power consumption, etc.

Couldn't agree more. :awe:

Nvidia have a lot to learn from AMD when it comes to mobile GPUs.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Could be a problem for AMD if there is an HBM2 delay. Going with HBM1 again, even if just for the first round of cards (7970/680 "mid-range") will really sour people if they are at 4GB and Nvidia is on 8GB GDDR5X. Alternatively if AMD used GDDR5/GDDR5X they will lose much of the power efficiency gains that Fiji had and unless Arctic Islands is a massive improvement in PPW then AMD will have no answer to the top Nvidia card.

You hit the nail on the head.

Even if Arctic Islands is 2x more efficient than the Fury cards, it will still be competing with Pascal, which is 2x more efficient than Maxwell - which in turn is far more efficient than the Fury cards.

So while both are improving rapidly, AMD needs Arctic Islands to be at least 3x as efficient as the Fury GPUs, and without HBM that will be very difficult. But a 4 GB card in 2016 will be a hard sell for enthusiasts; Fury already got a lot of heat for it.

The big question is whether all the rumors about AMD getting preferential access to SK Hynix's HBM production actually bear fruit. If they do, even just for the high end, that would be a big selling point for AMD. Then again, a lot of people thought HBM would be huge for this generation, yet the 980 Ti crushed Fury in everything except 4K benchmarks, which is a niche even for enthusiasts.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
So you are condemning GP104 to 4GB of memory. Nice

You don't need more for 1080p. Actually, you probably don't need more than that even for gaming at 4K today, per The Tech Report. And HBM could still be an option for the GTX 1080.

The primary benefit of HBM isn't more VRAM; it's power, size, and bandwidth. 4 GB GPUs are, by and large, not constrained at high resolutions today. 8 GB is totally overkill.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
You hit the nail on the head.

Even if Arctic Islands is 2x more efficient than the Fury cards, it will still be competing with Pascal, which is 2x more efficient than Maxwell - which in turn is far more efficient than the Fury cards.

Perhaps Maxwell is more efficient than Fiji in DX-11, but it sure won't be in new DX-12 games.

So while both are improving rapidly, AMD needs Arctic Islands to be at least 3x as efficient as the Fury GPUs, and without HBM that will be very difficult. But a 4 GB card in 2016 will be a hard sell for enthusiasts; Fury already got a lot of heat for it.

this is 4GB of HBM vs 6GB of GDDR-5 at 3840x2160



The big question is whether all the rumors about AMD getting preferential access to SK Hynix's HBM production actually bear fruit. If they do, even just for the high end, that would be a big selling point for AMD. Then again, a lot of people thought HBM would be huge for this generation, yet the 980 Ti crushed Fury in everything except 4K benchmarks, which is a niche even for enthusiasts.

I'm sure Fury X will be faster than the GTX 980 Ti in the majority of 2016 DX-12 games. Fiji will see a tremendous performance increase with Windows 10 and DX-12.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
GDDR5X is most likely not an ideal alternative when it comes to power consumption ...

This just straight-up doubles the width of GDDR5's memory access, which will increase power consumption linearly; considering how much power video memory consumes on the highest-end GPUs, that could double to 60 watts ...

With HBM2, HMC, or other similar technologies you can get better than linear scaling on power consumption for increasing both bit density and bandwidth ...
 
Mar 10, 2006
11,715
2,012
126
Couldn't agree more. :awe:

Nvidia have a lot to learn from AMD when it comes to mobile GPUs.

Agreed. NVIDIA's essentially 100% market share of the gaming laptop market really shows that these folks need to be more like AMD :thumbsup:
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Perhaps Maxwell is more efficient than Fiji in DX-11, but it sure won't be in new DX-12 games.



this is 4GB of HBM vs 6GB of GDDR-5 at 3840x2160





I'm sure Fury X will be faster than the GTX 980 Ti in the majority of 2016 DX-12 games. Fiji will see a tremendous performance increase with Windows 10 and DX-12.


Three points:

1. The SW:BF beta isn't DX12. It's DX11. The full game (probably) will be DX12.

2. BF3 and BF4 were NV-tilted games, especially at lower resolutions. Now DICE seems to have swung the other way: AMD cards have done systematically better than NV cards in that game. Fury vs. the 980 Ti isn't relevant here, since it's a top-to-bottom pattern. If you think the VRAM has anything to do with performance in that chart, you're misguided.

3. In the two DX12 benchmarks we've seen, the 980 Ti has done as well as, and sometimes better than, the Fury. That was true in AotS and in Fable Legends. It's really the GM204 cards and below that have struggled most.

Finally, we're not talking about Maxwell here. The discussion is about Pascal.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Not too worried about power given fiji vs maxwell. Seems nvidia is ok on that front.

Space wise, this is perfect for nano. It reigns supreme in fps per inch (lol) and its successor has the potential to do the same.

However, without more mitx style cards like nano being mainstream, we won't get cases specifically designed for these builds.

Sucks, but Nvidia is the GPU industry's Apple.

That depends heavily on what they do with Pascal. I expect they will rebrand or refresh Maxwell chips, though. If the cards using GDDR5X were Pascal cards that don't have the silicon cut out the way Kepler and Maxwell did, Nvidia would be well behind AMD on power consumption (if AMD used HBM).

Again, just look at the high end of each generation. Nvidia's power consumption is typically similar to AMD's at those levels, even with the removed hardware scheduler and reduced DP. In reality, Nvidia is LESS efficient than AMD with their GPU design. Going with what looks like faster but more power-hungry GDDR5 would not help.

I expect mid-range refreshed cards to get it, e.g. the 980 refresh/rebrand.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Exactly. As if GDDR power consumption was ever an issue before? Now it will be treated as if it never existed!

The sad part is that the usual suspects are ready to crap on HBM because it supports their previous positions that HBM was a waste of time. The reality is that, just as Maxwell/Fiji would have benefited tremendously from a 20/22nm node, Pascal would be way better off in perf/watt if all cards top-to-bottom were HBM2. HBM2 is simply superior to GDDR5X: it's impossible to compete on perf/watt and PCB space when HBM2 needs clocks of just 1-1.25GHz to reach 1TB/sec. Who knows, if NV can execute even better than AMD, they may even manage to reduce latency in the process.

People keep regurgitating that HBM brought nothing to the table over GDDR5 without realizing they are not comparing apples-to-apples architectures. This is akin to claiming, in the early era of hybrid cars, that they bring little over the best diesel/gasoline cars; over time, the three best hypercars in the world all ended up with hybrid powertrains, using the benefits of electric power to improve performance. In the same sense, we do not have the full picture if we just compare GM200 to Fiji and conclude that HBM is barely better than GDDR5, because GM200 is a far more efficient GPU architecture than GCN 1.2 to start with.

To truly understand the incredible impact HBM has achieved in its 1st iteration, we need to compare GCN to GCN in GPU limited scenarios to see what's what:

Perf/watt

1080P
Fury X = 100% (+33% vs. 290X)
290X = 75%
285 = 72%

1440P
Fury X = 100% (+39% vs. 290X)
290X = 72%
285 = 67%

4K
Fury X = 100% (+43% vs. 290X)
290X = 70%

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/32.html
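The deltas quoted above follow from the normalized percentages; a small sketch recomputing them (the percentages themselves are taken from the linked TechPowerUp review, with Fury X normalized to 100%):

```python
# Recompute the perf/watt gains quoted above from the normalized figures
# (Fury X = 100%; the other cards' percentages come from the TPU review).
def gain_pct(new_pct: float, old_pct: float) -> int:
    """Relative perf/watt improvement of the new part over the old, in %."""
    return round((new_pct / old_pct - 1) * 100)

print(gain_pct(100, 75))  # 33 -> Fury X vs. 290X at 1080p
print(gain_pct(100, 72))  # 39 -> at 1440p
print(gain_pct(100, 70))  # 43 -> at 4K
```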

While it's fair to say that not all of the perf/watt improvement is fully attributable to HBM1 over GDDR5, anyone who says HBM brought nothing or little to the table over GDDR5 is spreading FUD. If engineers could have integrated HBM1 on GM200 Maxwell, it may have gained another 20-30% in perf/watt that could have gone toward a faster GPU or lower power consumption. Not having HBM2 on all Pascal cards top-to-bottom doesn't mean Pascal won't be good, but it would mean that not all Pascal chips are as great as they could have been, assuming this rumor is true.

More worrying, the write-up implies that GP104 may have a 256-bit bus with 448GB/sec GDDR5X, while the flagship GP100 will at first be positioned as a professional Titan X/Tesla/Quadro product with 1TB/sec HBM2. That implies a 3rd consecutive generation of milking the mid-range chip and making it a Marketing Flagship. D: If NV follows through with this strategy for a 3rd generation in a row, we might as well forget decades of how GPUs were released; it would mean marketing has won, and bifurcating a generation - with the mid-range acting as flagship for the first part of the milking process of a new architecture - would officially become the norm. :twisted:

With HBM2, HMC, or other similar technologies you can get better than linear scaling on power consumption for increasing both bit density and bandwidth ...

If Big Daddy GP100, the true flagship Pascal, has HBM2 then that in itself would be 100% confirmation from NV themselves that GDDR5X is technologically inferior in all key aspects other than price and time to mass market production compared to HBM2. Despite the detractors here, HBM is the future for the next 5 years even if it has a bumpy road along the way.

This reminds me of all the doom and gloom I read on DDR4 when X99 first launched last summer. In barely more than a year, it's now possible to buy 16GB of DDR4 3200 for about HALF the price of 16GB DDR4 2133 when X99 just launched. In fact, it took just over a year and we already have consumer DDR4 4000 on Newegg, nearly double the speed of the 1st generation DDR4.

HBM might have a rough ramp-up but there is no doubt that GDDR5 or any of its derivative is just a short-term band-aid to what is outdated and budget technology. The amount of benefits HBM brings over GDDR is just too immense to ignore, from perf/watt, to total bandwidth, to ability to stack so much more memory in compact space, to reduced PCB size and the memory itself is cooled directly by the GPU heatsink. With time, like DDR4, HBM2 will obsolete GDDR5 just like GDDR5 obsoleted DDR3 on GPUs, etc.

GDDR5X is faster than HBM1.

Even at 16Gbps on a 384-bit bus, that's 768GB/sec, well short of the 1TB/sec of HBM2, and that's just the first revision of HBM2.

What matters is bandwidth per watt and absolute memory bandwidth. It seems that in your dreamworld GDDR5/X will scale linearly with memory bus width regardless of the required complexity, but in the real world it's probably not that simple: going from a 256-bit memory controller with 16Gbps GDDR5X to 384-512-bit buses at the same memory speeds could be an engineering roadblock. Also, let's not ignore that a 256-512-bit memory controller could take up much more transistor die space than an HBM2 one.
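The bandwidth arithmetic in this exchange is easy to verify; a quick sketch (the 4096-bit HBM2 figure assumes four 1024-bit stacks, which is an assumption about the configuration, not a quote from the thread):

```python
# Peak theoretical bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gb_s(16, 384))  # 768.0 GB/s: 16 Gbps GDDR5X on a 384-bit bus
print(bandwidth_gb_s(2, 4096))  # 1024.0 GB/s: HBM2 at 2 Gbps/pin (~1 GHz DDR)
                                # over four 1024-bit stacks, i.e. ~1 TB/s
```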
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
That depends heavily on what they do with Pascal. I expect they will rebrand or refresh Maxwell chips, though. If the cards using GDDR5X were Pascal cards that don't have the silicon cut out the way Kepler and Maxwell did, Nvidia would be well behind AMD on power consumption (if AMD used HBM).

Again, just look at the high end of each generation. Nvidia's power consumption is typically similar to AMD's at those levels, even with the removed hardware scheduler and reduced DP. In reality, Nvidia is LESS efficient than AMD with their GPU design. Going with what looks like faster but more power-hungry GDDR5 would not help.

I expect mid-range refreshed cards to get it, e.g. the 980 refresh/rebrand.
I'll take time to address other points, but I couldn't care less that Nvidia has lower power consumption. They do with Maxwell.
Whether AMD's architecture is more advanced or not means nothing to me as a consumer. I need to use the architecture, not marvel at its amazingness while never being able to utilize it.
Aka my 7950 and R9 290 that have whole parts of the GPU sitting unused until a DX12 game launches. Unless I'm supposed to be excited that at some point in the future I'll have great DX12 performance relative to the competition.
But as always recently, the story is "just wait, AMD has it coming soon!"

Edit:
As for Nvidia's lineup? I expect them to price-gouge the high end, of course. They launch first, so why not? AMD needs to launch first sometimes, otherwise people will buy the first big increase available to them.
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I'll take time to address other points, but I couldn't care less that Nvidia has lower power consumption. They do with Maxwell.
Whether AMD's architecture is more advanced or not means nothing to me as a consumer. I need to use the architecture, not marvel at its amazingness while never being able to utilize it.
Aka my 7950 and R9 290 that have whole parts of the GPU sitting unused until a DX12 game launches. Unless I'm supposed to be excited that at some point in the future I'll have great DX12 performance relative to the competition.
But as always recently, the story is "just wait, AMD has it coming soon!"

Edit:
As for Nvidia's lineup? I expect them to price-gouge the high end, of course. They launch first, so why not? AMD needs to launch first sometimes, otherwise people will buy the first big increase available to them.

You're speculating on future hardware. Why they have lower power consumption now, and why they may not in the future, matters. Nvidia seems to be more focused on compute with Pascal, so right there you are going to see an increase in power consumption.
 
Last edited:

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
INB4 AMD counters with HBM1 in lower tiers and HBM2 in higher tiers.

GDDR5X is not good enough against HBM, and if nVIDIA is going down that path... it won't end well for them.

Also, there is the Intel menace behind them; going with slower tech is not a good idea after all.
 