[vrworld] Pascal Secrets: What makes Nvidia Geforce GTX 1080 so fast?

Page 3

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
We haven't seen Pascal reviews, we haven't seen Polaris reviews, and AFAIK AMD has yet to go on record with Vega being a 2016 release, yet you reckon the 1080 is a bad buy. You built so much credibility on this forum by patiently explaining how GPUs work, how (DX12) features are implemented and how they affect performance, and now choose to spend it lavishly on unsubstantiated claims.

I was disgusted by the way media outlets welcomed the 1080 and embraced the fallacy of comparing it to Titan X when aftermarket 980Ti models offered more performance for (relatively) less dough, I literally felt the need to cover my face in shame when "hardware enthusiasts" considered the jump in performance much better than what Intel is offering, but I wouldn't even dream of talking bad about this product until it's properly compared at least with Maxwell.

Think about it again: you're advising people not to buy a product that has yet to be launched (and reviewed) in favor of a product that will launch even later (with no current specs available).
I guess this is how the Dinosaurs felt. The newcomers to the hobby are displacing us older, more experienced folks and bringing new values. A new age is dawning.
 

xpea

Senior member
Feb 14, 2014
449
150
116
Basically, what NVIDIA have done with Pascal is overclock Maxwell and split the SMs in two to boost compute efficiency. This will help push Pascal past the FuryX with ease under DX12 scenarios.

The question is, what about Vega? Vega will be quite a boost over Fiji.

We know that Polaris is only a mainstream GPU. It likely won't be competing in the same bracket as the 1080. Heck it might be closer to a 1070. Vega, on the other hand, will have two variants. Vega 10 and 11.

Seems to me that buying a 1080 is a huge mistake, as AMD have pushed Vega forward to a Sept/Oct launch. If we assume mass availability of the 1080 by July, then two months later we're going to have a GPU that will bury the 1080 with relative ease. All Vega needs to do is bump up performance by 5 FPS over a FuryX in AotS to beat a 1080.

Of course there are other games, but I don't see NVIDIA retaining the performance crown for long, even if they release a 1080Ti.

If Polaris 10 can deliver better than 390x performance despite using GDDR5 on a 256-bit bus then we're looking at Vega being quite powerful.
Wow, Vega in September and GP104 in July is a nice AMD Santa Claus wish list.
But c'mon, be serious just for a minute, and let's get back to reality with facts:
GP104 will be available from May 27th, with a full bunch of custom AIB boards introduced at Computex in June.
Vega? We know nothing, just a late rumor saying it may come sooner because Nvidia caught AMD with its pants down with the insane 2+GHz Pascal clocks, sending the red team into full panic mode.

On a personal note, I would like to see Vega ASAP to push Nvidia to release their big Pascal. More performance, more choice, better prices: we all win :sneaky:
 

coercitiv

Diamond Member
Jan 24, 2014
6,403
12,864
136
Nvidia caught AMD with its pants down with the insane 2+GHz Pascal clocks, sending the red team into full panic mode.
If you think AMD did not expect at least a 20-30% clock increase from Nvidia you're deluding yourself. What do you think they were guessing for 16nm FinFET? 10%?!
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
If you think AMD did not expect at least a 20-30% clock increase from Nvidia you're deluding yourself. What do you think they were guessing for 16nm FinFET? 10%?!

ARM SoCs didn't increase much in clock. You need a new design to get these clocks.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I was disgusted by the way media outlets welcomed the 1080...

Agreed. So far they are only sticking to the NVIDIA PR talking points. Maybe it's just all they can do until the NDA lifts. The hype just doesn't seem worth it.

We already know from multiple sources that the 1080 is slower than the 980Ti, clock for clock. No one is really talking about that fact. Then there are the comparisons to the Titan X, when we ALL know that the Titan X is slower than an aftermarket 980Ti. Oh, the new SLI bridge has double the bandwidth? Funny, no one mentioned it being an issue before, unlike the SLI drivers. Most of the time, people complain about SLI not working correctly anyway, so without some new commitment from NVIDIA to improve SLI, why is it exciting? Only one 8-pin power connector? OMG, stop the presses! Performance per watt matters a lot, once again. 200fps Doom at 1080p is cool, but what about 4K?
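For what it's worth, "clock for clock" just means normalizing throughput by the clock it was achieved at. A quick sketch with invented numbers (none of these are real benchmark figures):

Code:
# "Clock for clock" = throughput normalized by core clock.
# The fps and clock values below are invented purely to show the arithmetic.
cards = {
    "GTX 980 Ti": {"fps": 60.0, "clock_mhz": 1200},
    "GTX 1080":   {"fps": 78.0, "clock_mhz": 1750},
}
for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['clock_mhz'] * 1000:.1f} fps per GHz")

On numbers like these the 1080 wins outright while still being slower per clock, which is exactly the distinction being glossed over.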

The pricing model is also something that we have seen before. The 1070 will stay locked at its price while the 1080 has room to drop down to $499 after the 1080Ti comes out at $699. The 1080 will be every bit the waste that the 980 was. Depending upon how cut down the 1070 is, its value remains to be seen... in real benchmarks, not just PR-approved GW titles.

I think it would be smart for anyone in the market for a new GPU to wait until after both NVIDIA and AMD launch their cards. NVIDIA hasn't shown anything exciting for these prices, IMO.
 

Mopetar

Diamond Member
Jan 31, 2011
8,024
6,480
136
ARM SoCs didn't increase much in clock. You need a new design to get these clocks.

Apple got ~50% clock increases going from the 28 nm to the 16/14 nm nodes. Others didn't get as much because their designs were already tuned for high clock speeds, whereas Apple was using a wider design and so had more to gain.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Posters are taking some liberties with the known info. Suddenly there are only $700 1080s in June, and Vega 10 is now a September product?

As for NV leaving performance on the table, maybe they picked up a trick from AMD.

GTX 1080 in June
GTX 1080 2GHz Edition in September/October (whenever Polaris/Vega is expected to come out now) just to shake the pricing boat.

/shrug Since we're all just making wild claims now, I figured I'd add mine.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
That's not how it works.
The 1080 is the new top dog, and it is launching at a $700 MSRP, which is higher than the 980Ti's MSRP.

Either way, I agree with Arachnotronic - I bet that the 1080's stock performance will be just a bit better than an aftermarket 980Ti at 4K (i.e. 25% better than the stock 980Ti, 5%-10% better than aftermarket), which easily beats the FuryX. If it OCs well, it'll be much better.
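Those percentages hang together; a back-of-envelope check using only the numbers above (nothing here is measured data):

Code:
# If the 1080 is ~25% above a stock 980 Ti and only 5-10% above aftermarket
# cards, the implied aftermarket factory OC is roughly 14-19% over stock.
gtx1080_vs_stock = 1.25
for vs_aftermarket in (1.05, 1.10):
    aftermarket_vs_stock = gtx1080_vs_stock / vs_aftermarket
    print(f"aftermarket 980 Ti ~ {aftermarket_vs_stock:.2f}x stock")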

The FuryX isn't even close to after-market 980TIs.
https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/23.html

NV compares the 1080 to the 980, not the 980 ti.

http://www.geforce.com/hardware/10series/geforce-gtx-1080#performance
http://www.geforce.com/hardware/10series/geforce-gtx-1080#specs

I don't think it's a stretch to believe that the 1080 won't stay the top dog, and that a 1080Ti will end up being the top dog for NV. At least for single GPU cards.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
We already know that a GP100 exists. No reason to believe this would not be released as a consumer GPU.
 
SoulWager

Member
Jan 23, 2013
155
0
71
We already know that a GP100 exists. No reason to believe this would not be released as a consumer GPU.

There were rumors (not particularly reliable rumors) of a large-die, gaming-optimized part (GP102). If that's true, there would be zero reason to sell GP100 as a consumer part.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Honestly, does anyone care about this? If so, why? D:

Well, the IPC of a given architecture can be interesting to discuss from a purely academic perspective, even if it doesn't matter to the end user (or at least not in isolation).

Of course at this stage, we have so little info about the Pascal architecture as it exists in GP104, that discussing it is largely an exercise in futility.

There were rumors (not particularly reliable rumors) of a large-die, gaming-optimized part (GP102).

I'm not sure I would call the existence of GP102 a rumor. We know for a fact that the codename has shown up in Nvidia Geforce drivers.

Of course the exact nature of GP102 (and whether it will actually be released) is at this stage nothing but rumors.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
There were rumors (not particularly reliable rumors) of a large-die, gaming-optimized part (GP102). If that's true, there would be zero reason to sell GP100 as a consumer part.

Depends on cost efficiency.

It cost them a certain amount of money to design GP100, and there is a lot of cost involved in getting the design fabricated, etc. If they only sell GP100 as a Quadro and not as a GeForce, it lowers the amount of money that they can make from it. So there would have to be a very compelling reason not to sell it to consumers.

Put it this way: for a GPU design like GP100, there are the fixed costs, which are invested once off (basically the R&D), and then there are the variable costs, which occur every time a GP100 is manufactured (the material and labour costs).

Given that Nvidia has already paid the fixed costs, they will want to sell as many units as possible to make back that investment.
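The amortization argument in numbers; a minimal sketch with entirely invented cost figures:

Code:
# Per-unit cost = fixed R&D amortized over volume + variable manufacturing cost.
# All figures below are invented purely for illustration.
FIXED_RND = 500_000_000   # one-off design/R&D cost, assumed
VARIABLE = 200            # per-die manufacturing cost, assumed

for units in (50_000, 500_000, 5_000_000):
    per_unit = FIXED_RND / units + VARIABLE
    print(f"{units:>9,} units -> ${per_unit:,.0f} per unit")

Every extra market the chip ships into dilutes the fixed costs further, which is why leaving GeForce off the table would need a compelling reason.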
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
ARM SoCs didn't increase much in clock. You need a new design to get these clocks.

Apple A7 (28nm): 1.3GHz
Apple A9 (16nm): 1.85GHz

Apple achieved this while also lowering power usage, on a VERY early 16nm process from TSMC.
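For reference, that jump works out to roughly a 42% clock gain:

Code:
# Clock gain from Apple A7 (28 nm) to A9 (16 nm), using the figures above.
a7_ghz, a9_ghz = 1.3, 1.85
print(f"A7 -> A9 clock gain: {a9_ghz / a7_ghz - 1:.0%}")  # ~42%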
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
We already know that a GP100 exists. No reason to believe this would not be released as a consumer GPU.

Eh, all that FP64 compute tells me the GP100 in its current state won't be released as a consumer card. A GP100 card will show up, but I think it will look quite a bit different from what we saw in April.
 

Mopetar

Diamond Member
Jan 31, 2011
8,024
6,480
136
Eh, all that FP64 compute tells me the GP100 in its current state won't be released as a consumer card. A GP100 card will show up, but I think it will look quite a bit different from what we saw in April.

On the other hand, it means that Nvidia could harvest chips that have defects in the FP64 hardware, assuming they can disable it independently of other parts of the SM. GP100 has 60 SMs to GP104's 40, so even though it's packing extra hardware that wouldn't be used, it still has more power at hand. The same goes if the chip is fine except for defects in the NVLink hardware.
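A rough way to see the headroom, using the SM counts from the post (the fused-off counts below are hypothetical):

Code:
# GP100 has 60 SMs vs GP104's 40, so even a heavily cut-down salvage part
# can still match or exceed the full GP104 in raw SM count.
GP100_SMS, GP104_SMS = 60, 40
for disabled in (0, 4, 8, 12, 16, 20):
    active = GP100_SMS - disabled
    verdict = ">" if active > GP104_SMS else "="
    print(f"GP100 with {disabled:2d} SMs fused off: {active} SMs ({verdict} GP104's {GP104_SMS})")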
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Apple A7 (28nm): 1.3GHz
Apple A9 (16nm): 1.85GHz

Apple achieved this while also lowering power usage, on a VERY early 16nm process from TSMC.

Yes, with the R&D behind it and a design made for it, it's not an issue. Most ARM SoCs, however, didn't gain much.
 

coercitiv

Diamond Member
Jan 24, 2014
6,403
12,864
136
Yes, with the R&D behind it and a design made for it, it's not an issue. Most ARM SoCs, however, didn't gain much.
Indeed, considering Nvidia's struggling R&D budget and poor Maxwell maximum clocks (GM204 could barely hit 1500MHz), we had no reason to believe Pascal would clock high. It's truly a miracle they achieved higher clocks at all, despite FinFETs. /s
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Yes, with the R&D behind it and a design made for it, it's not an issue. Most ARM SoCs, however, didn't gain much.

But which ARM SoCs didn't have a design change with the jump to 14/16 nm?

Apple went from the Typhoon to Twister
Exynos went from A15 to A57 (and subsequently to Mongoose)
Snapdragon went from Krait to A57 (and subsequently to Kryo)

As far as I can tell, there hasn't really been any major ARM SoC transition from 28 nm to 14/16 nm that didn't also involve a design change, so it's pretty much impossible to attribute any frequency improvement specifically to the 14/16 nm node jump.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Seems way, waaaaaaay too early to be publishing laudatory puff pieces like "Pascal Secrets: What makes Nvidia Geforce GTX 1080 so fast?" when we still don't even know how fast it really is since no 3rd party reviews are out...
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
On the other hand, it means that Nvidia could harvest chips that have defects in the FP64 hardware, assuming they can disable it independently of other parts of the SM. GP100 has 60 SMs to GP104's 40, so even though it's packing extra hardware that wouldn't be used, it still has more power at hand. The same goes if the chip is fine except for defects in the NVLink hardware.

That is true. However, I would imagine this would be a low-volume, one-off product. I think it would be a tough sell as a legitimate product run. They could do more with the die space if they yanked out the FP64 hardware.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
One thing with the reduced clock-for-clock performance would be lower OC scaling than Maxwell. I think.

The other thing would be: try finding any Maxwell that could reach Pascal clocks. To exaggerate the point for easier understanding: clock-for-clock performance means nil when one chip runs at 1GHz and the other at 500GHz.
Again, an exaggeration, but just trying to make the point.
Anyone comparing clock for clock doesn't really know why they're doing it.
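What matters to the end user is delivered performance, which is per-clock throughput times clock. A toy illustration with invented numbers (nothing below is a measured figure):

Code:
# Delivered performance = per-clock throughput ("IPC") x clock speed.
# Invented numbers: a chip with slightly lower IPC but much higher clocks
# still wins outright, which is why clock-for-clock alone says little.
chips = {
    "Maxwell-like": {"ipc": 1.00, "clock_ghz": 1.4},
    "Pascal-like":  {"ipc": 0.95, "clock_ghz": 2.0},
}
for name, c in chips.items():
    print(f"{name}: relative performance = {c['ipc'] * c['clock_ghz']:.2f}")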
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
The other thing would be: try finding any Maxwell that could reach Pascal clocks. To exaggerate the point for easier understanding: clock-for-clock performance means nil when one chip runs at 1GHz and the other at 500GHz.
Again, an exaggeration, but just trying to make the point.
Anyone comparing clock for clock doesn't really know why they're doing it.

It really does matter, even then. It depends on how big the difference is: 1GHz could actually equal 500MHz performance (in a case that bad, not this one). A 2.5GHz Pascal could end up performing barely better than one at 2.1-2.2GHz, and you fly up to 300W for barely any gains.
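That diminishing-returns point can be sketched with a toy dynamic-power model, where power scales roughly with V^2 x f and voltage has to rise as clocks climb. Every coefficient here is invented; it only illustrates the shape of the curve:

Code:
# Toy model: dynamic power P ~ V^2 * f, with an assumed voltage/frequency curve.
# Shows why the last few hundred MHz can cost far more watts than they return.
points = [  # (clock_ghz, voltage_v) - entirely hypothetical V/F pairs
    (1.6, 0.90), (1.9, 1.00), (2.1, 1.10), (2.5, 1.25),
]
base_f, base_v = points[0]
for f, v in points:
    perf = f / base_f                          # performance ~ clock
    power = (v / base_v) ** 2 * (f / base_f)   # P proportional to V^2 * f
    print(f"{f:.1f} GHz: perf x{perf:.2f}, power x{power:.2f}")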
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Yes, with the R&D behind it and a design made for it, it's not an issue. Most ARM SoCs, however, didn't gain much.

What other ARM SoCs are you talking about? Snapdragon 820 doesn't count, because it had to solve the massive power throttling issue of the 810/801. So it increased clocks while dramatically decreasing power consumption. Had they kept power consumption constant, they would have seen a large increase in frequency.
 