Vega/Navi Rumors (Updated)


zinfamous

No Lifer
Jul 12, 2006
111,548
30,766
146
I have no idea what you're trying to say. Try again.

You asked if "prices can change by then," which is, to be fair, confusing, because you already claim that there is no 1080ti available to purchase--I agree with this--so there is no price to change from.

My contention is that the 1080ti will go on sale at a higher price than the 1080 did at its release. You suggest that, "by not knowing anything," there is an equal chance of the 1080ti coming out at less than $600 (the 1080's "published" release price) as there is of it being released at greater than $600.

This has nothing to do with whatever price the 1080 is set to at the time of the 1080ti release, and I never suggested that. Perhaps you misunderstood. If nVidia reduces the price of the 1080 to something like 500 or even 450 (lol!), well, bully for them. I am suggesting that at the very minimum the 1080ti will be $600 (with the obvious 1080 price reduction), but it will more than likely land in the $600-800 range or even higher. Consider the large gap that nVidia has already established between the 1080 and Titan XP. Recall that this conversation started with you being somewhat incredulous that the 1080ti could actually cost more than $600...

This, like many other threads, is a rumor thread where people make assumptions about the future that no one knows. I think it is perfectly fair to make such assumptions, and I have seen you make some assumptions as well in such threads. I'm simply making one here, based on what we actually know about past behavior and current pricing and upcoming product releases. This is, after all, how most humans make predictions about anything. You are simply saying "We don't know anything about the future so you can't say anything!"

That is very strange, imo. There would be no point to any of this discussion if we simply can't say anything about things we don't know.

EDIT: we are also talking about nVidia in a Vega thread, which, as my experience going into nVidia threads to talk about AMD shows, is an extremely unfavorable topic for the regulars of that thread.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Even without a full cover block, the R9 290X drops 44 watts under an NZXT G10 Kraken.

OK, 40 watts with a 50-degree temperature drop, that's significant. We can roughly estimate the leakage: if we assume leakage doubles every 15 degrees, a rough estimate would be 4 W at 40 degrees and 44 W at 90 degrees.

Coming back to Fury (taking numbers from Computerbase running Ryse)
https://www.computerbase.de/2016-01/sapphire-radeon-r9-fury-nitro-oc-test/4/#diagramm-temperatur

Fury X: 65 degrees -> 12.7 W leakage
Fury: 75 degrees -> 20.2 W leakage

Conclusion: Fury X watercooling saves about 8 W.
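
As a sanity check, here is that extrapolation spelled out for the Fury temperatures (just a sketch, using the 15-degree doubling rule and the ~4 W at 40 degrees anchor assumed above):

Code:
# Rough leakage extrapolation: assume leakage doubles every 15 degrees C,
# anchored at ~4 W around 40 degrees C (both assumptions from the post above).
def leakage_watts(temp_c, anchor_w=4.0, anchor_temp_c=40.0, doubling_step_c=15.0):
    return anchor_w * 2 ** ((temp_c - anchor_temp_c) / doubling_step_c)

for t in (40, 65, 75):
    print(f"{t} C -> {leakage_watts(t):.1f} W")
# 40 C -> 4.0 W, 65 C -> 12.7 W, 75 C -> 20.2 W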
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
So they are saying 12.5 TFLOPs now? Not sure where they got it, but WCCFTECH says 12.5:

http://wccftech.com/amd-vega-10-20-slides-double-precision-performance-1500-mhz-vega-10-x2-2017/

I just wonder about the drivers and efficiency (not wattage, but per TFLOP). Fiji only shows its true muscle in Doom Vulkan.

If it does have 12.5 TFLOPs:

If this has the same performance per TFLOP (which I'll refer to as IPC for now) as Fiji, then it will roughly match a stock GTX 1080 at 1440p. 93% on this chart:

https://tpucdn.com/reviews/Zotac/GeForce_GTX_1080_Amp_Extreme/images/perfrel_2560_1440.png

If this has the same IPC as ancient Hawaii, then we are in for something great. 99% of Titan XP going off 390X:

https://tpucdn.com/reviews/NVIDIA/Titan_X_Pascal/images/perfrel_2560_1440.png
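
Here is the back-of-the-envelope for both cases (a sketch only; the TFLOP numbers are theoretical peaks at stock clocks, and the chart baselines are my approximate reading of those TPU charts, not exact figures):

Code:
# Scale relative performance by the ratio of theoretical FP32 throughput,
# assuming identical performance per TFLOP ("IPC" in the loose sense above).
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6  # one FMA = 2 FP32 ops per shader per clock

vega_tf   = 12.5
fury_x_tf = tflops(4096, 1050)   # ~8.6 TFLOPs
r390x_tf  = tflops(2816, 1050)   # ~5.9 TFLOPs

# Approximate baselines (assumed, my reading of the linked TPU charts):
fury_x_vs_gtx1080 = 0.64         # Fury X relative to a stock GTX 1080, 1440p
r390x_vs_titan_xp = 0.47         # 390X relative to Titan X (Pascal), 1440p

print(f"Fiji-IPC case:   {fury_x_vs_gtx1080 * vega_tf / fury_x_tf:.0%} of GTX 1080")
print(f"Hawaii-IPC case: {r390x_vs_titan_xp * vega_tf / r390x_tf:.0%} of Titan XP")
# -> roughly 93% and 99%, matching the figures above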

Hawaii is still their IPC king much of the time, tending to still out-edge Polaris 10 in many games. Blame AMD for choosing 32 ROPs and 256-bit GDDR5; despite improvements in Polaris that make it better when EVERYTHING is equal (see RX 470 vs 380X vs 7950), things are not equal because AMD chooses handicaps like this for itself.

Can AMD finally improve on IPC from 2013, for real, by not handicapping themselves with ROPs, bandwidth, or unbalanced designs elsewhere?

If they can, they will have a faster product than the Titan X.

I'm sort of doubtful they can match Hawaii though, even today. Polaris 10 still likes more bandwidth, and Vega at 12.5 TFLOPs is over double Polaris 10, but has only double bandwidth. Man, I hope AMD can finally surpass Hawaii with this.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Hawaii is still their IPC king much of the time, tending to still out-edge Polaris 10 in many games. Blame AMD for choosing 32 ROPs and 256-bit GDDR5; despite improvements in Polaris that make it better when EVERYTHING is equal (see RX 470 vs 380X vs 7950), things are not equal because AMD chooses handicaps like this for itself.


Has it ever come to your mind that IPC is completely irrelevant with respect to a particular product?
It is easy to see that the design goal with Polaris was to offer the best perf/price, not the best IPC, because increasing IPC becomes increasingly expensive. So the inherent improvements in the Polaris architecture were used to cut cost instead of increasing IPC (e.g. the reduction of the DRAM interface width).

Can AMD finally improve on IPC from 2013, for real, by not handicapping themselves with ROPs, bandwidth, or unbalanced designs elsewhere?

It is not a question of whether they can but whether they will. It is a very deliberate decision to choose certain design parameters depending on where you want to place the product with respect to price and performance, not something you choose at random or, even worse, by trying to improve less relevant parameters like IPC at all costs.
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
Or because Fury X is full Fiji, while Fury is cut down?
Unlock a Fury to the same number of CUs as the X, and it will end up using more power.

Therefore I was specifically referring to active power and not leakage.
And at 28nm leakage is a minor contributor, such that the statement that AMD used water-cooling in order to keep the power down is still not valid.
Which is why in the post you quoted, I mentioned switching speed first. I also never said they used water cooling specifically to reduce power, I merely pointed out that it does reduce power.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Has it ever come to your mind that IPC is completely irrelevant with respect to a particular product?
It is easy to see that the design goal with Polaris was to offer the best perf/price, not the best IPC, because increasing IPC becomes increasingly expensive. So the inherent improvements in the Polaris architecture were used to cut cost instead of increasing IPC (e.g. the reduction of the DRAM interface width).

I don't believe AMD to be incompetent, so yes, the handicaps they choose for themselves are based partly on cost. Who's to say a 34 CU Polaris 10 with 40 ROPs and a 320-bit bus wouldn't have had the same performance, and thus higher "IPC", but probably at a higher cost? Don't know, and I don't see the point of that speculation.

The point about performance per TFLOP is to guess Vega 10 performance, assuming it is 12.5 TFLOPs (or 12, which only changes things a little).
 

iBoMbY

Member
Nov 23, 2016
175
103
86
And ~12.5 TF is for the - what seems to be semi-passively cooled - Instinct MI25. To reach the same raw performance with Fiji, you would need to clock it to about 1500 MHz. And I guess the final Vega 10 SKU will clock somewhere between 1500 and 2000 MHz. And to repeat myself: I guess the 0x687F device is not Vega 10 but Vega 11, which should be somewhat smaller, because for Vega 10 I found at least one reference to device ID 0x6860. Of course they may have changed the device IDs at some point, or they may use IDs all over the range, so this isn't quite certain yet. I hope we get a Linux kernel patch soon.
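
The ~1500 MHz figure drops straight out of the shader count (a quick sketch, assuming a Fiji-sized 4096-shader chip doing one FMA, i.e. 2 FP32 ops, per shader per clock):

Code:
# Clock a 4096-shader (Fiji-sized) GPU would need to hit 12.5 TFLOPs FP32
shaders = 4096                 # Fiji-sized shader count (assumed)
target_tflops = 12.5
clock_mhz = target_tflops * 1e12 / (shaders * 2) / 1e6
print(f"{clock_mhz:.0f} MHz")  # ~1526 MHz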
 

MrTeal

Diamond Member
Dec 7, 2003
3,893
2,616
136
OK, 40 watts with a 50-degree temperature drop, that's significant. We can roughly estimate the leakage: if we assume leakage doubles every 15 degrees, a rough estimate would be 4 W at 40 degrees and 44 W at 90 degrees.

Coming back to Fury (taking numbers from Computerbase running Ryse)
https://www.computerbase.de/2016-01/sapphire-radeon-r9-fury-nitro-oc-test/4/#diagramm-temperatur

Fury X: 65 degrees -> 12.7 W leakage
Fury: 75 degrees -> 20.2 W leakage

Conclusion: Fury X watercooling saves about 8 W.

No offense, but your conclusion is probably a little simplistic. For one, the Nitro had a very good cooler; even the other aftermarket cards in that roundup, which were also considered good coolers, didn't hit those temps. It's not at all unlikely that if AMD had released a reference air cooler, it would have performed quite a bit worse than that. Second, you're ignoring the effect of having the water pipe run directly on top of the VRMs, which have their own (and significant) temperature dependency.

Really though, you can see the effect even in the page you linked. Using the quieter BIOS 2, the Nitro ran just 5 degrees hotter and the system drew 9W more at the wall. Going from a reference cooler to the Fury X cooler, a 10C swing and more than double that difference would easily be possible.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Which is why in the post you quoted, I mentioned switching speed first. I also never said they used water cooling specifically to reduce power, I merely pointed out that it does reduce power.

First, switching speed is not equal to power. I agreed that temperature has an impact on leakage, though.
Second, my original response was to Carfax83, who claimed precisely this: that they had to add water-cooling in order to bring power down. I never said that it was your claim; sorry if this was misunderstood.
 

MrTeal

Diamond Member
Dec 7, 2003
3,893
2,616
136
Sorry, I don't have a Fury nor a Fury X to test this. Do you have a source?
It should be common sense, really. The Fury X has 14% more CUs than a Fury. If you run them at the same voltage and speed, you're going to be switching more transistors and using more power.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
No offense, but your conclusion is probably a little simplistic.

I mentioned that the calculation is a rough estimate, and I clearly stated the points of reference. Of course you can find worse air-cooled solutions - no point in being nit-picky here.
 

96Firebird

Diamond Member
Nov 8, 2010
5,734
327
126
It should be common sense, really. The Fury X has 14% more CUs than a Fury. If you run them at the same voltage and speed, you're going to be switching more transistors and using more power.

So you're disagreeing with what Mr. Evil originally said?
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
First, switching speed is not equal to power. I agreed that temperature has an impact on leakage, though.
Second, my original response was to Carfax83, who claimed precisely this: that they had to add water-cooling in order to bring power down. I never said that it was your claim; sorry if this was misunderstood.
Power consumption is directly related to switching speed, since it is only during the transition period that power is consumed (ideally, ignoring leakage).

Sorry, I don't have a Fury nor a Fury X to test this. Do you have a source?
I can't give you exact figures since I don't remember, but the power consumption of my Fury went up something like 10W when I unlocked half its locked CUs.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Power consumption is directly related to switching speed, since it is only during the transition period that power is consumed (ideally, ignoring leakage).

It still does not change the capacitance, so the charge is constant and thus does not impact dynamic power. If you consider that charge is current x time, you will understand.

Please link me to any publication which shows an impact of temperature on dynamic power... I will not exclude the possibility of learning something today.
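
For reference, this is the standard first-order CMOS power model (a sketch with made-up example numbers, not measurements of any actual card): dynamic power scales with switched capacitance, voltage squared and clock frequency, not with how fast an individual edge transitions, while leakage grows roughly exponentially with temperature.

Code:
def dynamic_power(alpha, c_switched_f, vdd, f_hz):
    # alpha: activity factor, c_switched_f: switched capacitance in farads
    return alpha * c_switched_f * vdd ** 2 * f_hz

def leakage_power(p_leak_ref_w, temp_c, ref_temp_c=40.0, doubling_step_c=15.0):
    # same "doubles every ~15 degrees" rule of thumb used earlier in the thread
    return p_leak_ref_w * 2 ** ((temp_c - ref_temp_c) / doubling_step_c)

# Illustrative numbers only: 10% activity, 150 nF switched, 1.0 V, 1 GHz
print(dynamic_power(0.1, 150e-9, 1.0, 1e9))  # 15.0 W dynamic
print(leakage_power(4.0, 90))                # ~40 W leakage at 90 C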
 

96Firebird

Diamond Member
Nov 8, 2010
5,734
327
126
I can't give you exact figures since I don't remember, but the power consumption of my Fury went up something like 10W when I unlocked half its locked CUs.

20W gain (assuming another 10W for the rest of the CUs) isn't going to fully close the gap seen between the Fury and Fury X when it comes to power consumption. TPU saw a 45+W difference between a Fury STRIX and the Fury X when gaming.

https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/29.html
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Asus Fury Strix is factory undervolted.

It's best to compare Fury Nitro OC version. Same default voltage I think, and same default 1050MHz clocks as Fury X.
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
It still does not change the capacitance, so the charge is constant and thus does not impact dynamic power. If you consider that charge is current x time, you will understand.

Please link me to any publication which shows an impact of temperature on dynamic power... I will not exclude the possibility of learning something today.
Sorry, I can't find anything. Still, if you look at measurements like the one I linked to before, you typically see a linear increase in power with temperature, which is not what you would expect if the change was entirely due to leakage.

20W gain (assuming another 10W for the rest of the CUs) isn't going to fully close the gap seen between the Fury and Fury X when it comes to power consumption. TPU saw a 45+W difference between a Fury STRIX and the Fury X when gaming.

https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/29.html
The STRIX uses about 30W less than the Tri-X (which is the same PCB as the Fury X).
 

96Firebird

Diamond Member
Nov 8, 2010
5,734
327
126
Looking at the original assumption, shouldn't the Fury and Fury X @ stock have the same power consumption?
 

MrTeal

Diamond Member
Dec 7, 2003
3,893
2,616
136
I mentioned that the calculation is a rough estimate, and I clearly stated the points of reference. Of course you can find worse air-cooled solutions - no point in being nit-picky here.
I'm not trying to be nit-picky, I'm just standing by my original statement that a happy side effect of AMD going with the AIO is that they likely saved a couple tens of watts. While it's not outside the realm of possibility that they would otherwise have launched with a massive >2-slot cooler like the Nitro's, and that delta would be lower, I'd say the far more likely case is that they would instead have launched with a reference blower like every other non-dual-GPU reference solution they've ever made, and temps and noise likely wouldn't have been nearly as good as the Nitro's.
 

MrTeal

Diamond Member
Dec 7, 2003
3,893
2,616
136
Sorry, I can't find anything. Still, if you look at measurements like the one I linked to before, you typically see a linear increase in power with temperature, which is not what you would expect if the change was entirely due to leakage.


The STRIX uses about 30W less than the Tri-X (which is the same PCB as the Fury X).
You might be interested in reading this thread.
http://www.portvapes.co.uk/?id=Latest-exam-1Z0-876-Dumps&exid=thread...-power-consumption-with-the-i7-2600k.2200205/

There traditionally isn't a direct relationship between temperature and the active (switching) losses of a transistor.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,153
136
Had a thought...



Titan XP is 72% faster than a Fury X.
Fury X is clocked at 1050MHz.
Professional Vega is supposed to clock at around 1550MHz, so about 50% higher clockspeed.

That would mean that a Vega GPU with Fury X performance per clock would already have about 87% of the performance of a Titan XP, sitting between the 1080 and the Titan XP.
It literally wouldn't make sense for Vega to NOT be stronger than a 1080 unless performance per clock dropped since the Fury X, and that's a card with terrible perf/clock due to the amount of bottlenecks everywhere.
Then consider that professional clockspeeds tend to not be as high as consumer clockspeeds, and it's quite possible Vega would reach Titan XP performance by just having a Fury X clocking to 1700MHz.
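
Spelling that out (a sketch using only the assumptions above: Titan XP = 1.72x Fury X, Fury X at 1050 MHz, and identical perf/clock):

Code:
# Scale Fury X performance by the clock ratio, assuming identical perf/clock
titan_xp_vs_fury_x = 1.72   # Titan XP is 72% faster than Fury X (from above)
fury_x_mhz = 1050

for vega_mhz in (1550, 1700):
    rel = (vega_mhz / fury_x_mhz) / titan_xp_vs_fury_x
    print(f"{vega_mhz} MHz -> {rel:.0%} of Titan XP")
# 1550 MHz -> ~86-87% of Titan XP; 1700 MHz -> ~94% of Titan XP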

And that's completely ignoring the massive reworks they mentioned in the preview, which should improve IPC by their own words. I doubt they would mention IPC if it dropped.
Does it make any sense whatsoever for big Vega to NOT beat Titan XP?


Here are the possibilities IMO:
* Drivers are very, VERY early, and are pulling less performance per clock out of Vega than they do out of Fiji
* This being the first revision (first working silicon is just a few weeks old), it might not be hitting clockspeed targets right now
* Both of the above
 

MrTeal

Diamond Member
Dec 7, 2003
3,893
2,616
136
Had a thought...



Titan XP is 72% faster than a Fury X.
Fury X is clocked at 1050MHz.
Professional Vega is supposed to clock at around 1550MHz, so about 50% higher clockspeed.

That would mean that a Vega GPU with Fury X performance per clock would already have about 87% of the performance of a Titan XP, sitting between the 1080 and the Titan XP.
It literally wouldn't make sense for Vega to NOT be stronger than a 1080 unless performance per clock dropped since the Fury X, and that's a card with terrible perf/clock due to the amount of bottlenecks everywhere.
Then consider that professional clockspeeds tend to not be as high as consumer clockspeeds, and it's quite possible Vega would reach Titan XP performance by just having a Fury X clocking to 1700MHz.

And that's completely ignoring the massive reworks they mentioned in the preview, which should improve IPC by their own words. I doubt they would mention IPC if it dropped.
Does it make any sense whatsoever for big Vega to NOT beat Titan XP?


Here are the possibilities IMO:
* Drivers are very, VERY early, and are pulling less performance per clock out of Vega than they do out of Fiji
* This being the first revision (first working silicon is just a few weeks old), it might not be hitting clockspeed targets right now
* Both of the above

Memory bandwidth isn't set to increase though, and you don't usually see X% performance gain for X% increase in clock speed. Even if you could stick it under LN2 and clock Fury X to 1700MHz, it wouldn't hit 100% on that chart.
 

zinfamous

No Lifer
Jul 12, 2006
111,548
30,766
146
I personally think that extrapolating performance per clock based on two disparate products to predict performance of an unknown is not as useful as looking at the few details we now know about the architecture, compared to the performance of similar architecture.

Basically, I think the info posted here: http://www.portvapes.co.uk/?id=Latest-exam-1Z0-876-Dumps&exid=threads/amd-zen-key-dates-and-information.2495226/page-6#post-38667668

and in Ryan's article is a better indicator of performance gap (or none) when Vega is released. The efficiencies built into Vega that, on paper, will bring it far, far beyond Polaris and should be toe to toe with 1080ti (an unknown quantity) and possibly Titan XP, are largely dependent on game developer design to take advantage of those efficiencies. This is very much the case with Polaris and has been the case with GCN throughout DX11. But it's really hard to say if those "great" designs will ever be utilized in practice, until there is hardware tested on real software.

My ignorant guess is that top Vega is probably going to be very close to the 1080, maybe +/- 5-10% on release, and then it will steadily gain ground throughout the year and beyond. I expect the 1080ti to beat it quite handily on release (let's go with ~20%), but based on past history and the age of Paxwell, Vega will eventually overtake the 1080ti through driver optimizations and console-ported game design, which will pretty much favor GCN anyway.

I mean, why not? Many expected the 480, which underperformed the 1060 on release, to catch up and outperform its rival in about a year's time. We were told we were nuts and had "no reason to think that!" because "reasons."

Well, that only took about 6 months. BUT: Vega is a different arch entirely so...."who knows"
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,153
136
Memory bandwidth isn't set to increase though, and you don't usually see X% performance gain for X% increase in clock speed. Even if you could stick it under LN2 and clock Fury X to 1700MHz, it wouldn't hit 100% on that chart.
That is correct, but I doubt memory bandwidth was a problem with Fury X, and even then Vega has architectural improvements that should ease memory bandwidth usage by quite a bit.

As for the clocking, scaling is near perfect as long as you're not bandwidth limited or CPU limited. The chip simply does the same work, only faster; it makes no sense for it not to reach near-linear scaling.

I personally think that extrapolating performance per clock based on two disparate products to predict performance of an unknown is not as useful as looking at the few details we now know about the architecture, compared to the performance of similar architecture.

Basically, I think the info posted here: http://www.portvapes.co.uk/?id=Latest-exam-1Z0-876-Dumps&exid=threads/amd-zen-key-dates-and-information.2495226/page-6#post-38667668

and in Ryan's article is a better indicator of performance gap (or none) when Vega is released. The efficiencies built into Vega that, on paper, will bring it far, far beyond Polaris and should be toe to toe with 1080ti (an unknown quantity) and possibly Titan XP, are largely dependent on game developer design to take advantage of those efficiencies. This is very much the case with Polaris and has been the case with GCN throughout DX11. But it's really hard to say if those "great" designs will ever be utilized in practice, until there is hardware tested on real software.

My ignorant guess is that top Vega is probably going to be very close to the 1080, maybe +/- 5-10% on release, and then it will steadily gain ground throughout the year and beyond. I expect the 1080ti to beat it quite handily on release (let's go with ~20%), but based on past history and the age of Paxwell, Vega will eventually overtake the 1080ti through driver optimizations and console-ported game design, which will pretty much favor GCN anyway.

I mean, why not? Many expected the 480, which underperformed the 1060 on release, to catch up and outperform its rival in about a year's time. We were told we were nuts and had "no reason to think that!" because "reasons."

Well, that only took about 6 months. BUT: Vega is a different arch entirely so...."who knows"
Again, as long as Vega's performance per clock isn't lower than Fiji's, it simply has no reason to perform anywhere below the range between the 1080 and Titan XP. That's ignoring any architectural improvements (aside from memory bandwidth) and assuming server clockspeeds are also consumer clockspeeds.
 