I'd be curious to know how well they OC. If these scores are real, AMD clearly set the clocks to match the Titan X and 980Ti. Did they leave performance on the table, or did they stretch it to reach those scores?
Good point. NV's 980Ti easily overclocks 20%, and it does so with minimal increases in GPU voltage, so in an overclocked state it doesn't use much more power than the reference model. This could be different for after-market 980Tis, and unfortunately I haven't seen peak power usage on an overclocked after-market card. Comparing a reference 980Ti's overclocking vs. power usage is a no-win situation since it sounds like a jet engine, which rules it out for overclocking to begin with. We'd need Fiji XT vs. after-market 980Ti OCing comparisons.
I agree. If it isn't 2x 290x then AMD can go to sleep.
That's not realistic at all. Neither AMD nor NV has doubled the specs of its previous-gen flagship cards in 1.5 years. The Titan X is not equal to 2x the 780Ti in specs. It took NV 3 years to double the performance of a 580 with a 780Ti, and it has now taken 3.5 years to double the performance of a 7970. The 290X came out in November 2013 and it's now June 2015. How do you expect a card with 5632 shaders, 128 ROPs and 352 TMUs (double the 290X) on the same 28nm node? Even a 50% improvement over the 290X is impressive. For a true leap, wait for the 14nm/16nm HBM2 GPUs.
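For what it's worth, the "2x 290X" spec being dismissed above works out like this. A quick sketch, assuming the commonly cited Hawaii (290X) unit counts as the baseline:

```python
# Sanity check: what "double the 290X" would actually look like.
# Baseline figures are the commonly cited R9 290X specs (assumed here).
r9_290x = {"shaders": 2816, "rops": 64, "tmus": 176}

doubled = {name: 2 * count for name, count in r9_290x.items()}
print(doubled)  # {'shaders': 5632, 'rops': 128, 'tmus': 352}
```

On the same 28nm node, that unit count would imply roughly double the die area of Hawaii, which is exactly why the post above calls it unrealistic.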
This chart can be misleading for 3 reasons:
1) Were launch drivers used to test the new cards?
2) Was the flagship WCE card used or was it an air cooled Fiji card?
3) Arguably most important - 3DMark is NOT a reflection of gaming performance and that chart shows that issue.
The 980 has a 19% lead over the 290X at 1440p in 3DMark, but real-world gaming benchmarks show an 11-12% lead. At 4K the 980 leads by 13% in 3DMark but only 6-8% in games. By the same logic, if Fiji XT matches the Titan X in 3DMark, we would need to add a 5-7% increase in performance for games, which theoretically means a Fiji card that ties the Titan X in 3DMark would actually be faster in real-world gaming, at least when comparing this generation's Maxwell vs. GCN products.

Honestly though, I don't put much stock in 3DMark scores. For the last 5 years they have been hit and miss, often failing to predict where GPUs stand between brands. For example, 3DMark never predicted how much worse Fermi and Kepler would perform than GCN in future titles, but that's exactly what happened. Sometimes 3DMark can be used as a very rough gauge to compare NV vs. NV or AMD vs. AMD, but it usually fails to depict an accurate representation of NV vs. AMD in DX11 games. This of course makes sense, since no DX11 game uses the 3DMark engine, which is why sites like TPU, AnandTech and so on stopped using 3DMark; it's just a synthetic bench.
I would much rather see a professional review site running real world gaming benchmarks on final release date drivers.
So true. Time and time again, AMD would lose in Firestrike or Heaven but still outperform competing Nvidia cards, or at least be more competitive than those benchmarks would indicate.
That's exactly it. It's ironic that so many people online keep using Unigine Valley and 3DMark to compare GPU performance. Those benchmarks are good for gauging your card's noise levels and temperatures, and for getting a starting base for GPU stability testing since you can loop them. For predicting how AMD and NV fare against each other in games, though, these 2 synthetic benchmarks are not accurate at all, especially Unigine Valley. One can also argue that a firm with more resources could optimize specifically for those synthetic benches, but that doesn't mean Star Wars Battlefront will run faster on its products. If we go back to the 3DMark03 days, NV/ATI were known to optimize for, and even use IQ-reducing/cheating drivers to inflate, scores in those synthetic benches.
If you want to help AMD out with a leak, why would you post FS or Heaven numbers if they are not representative of actual performance?
Are the leakers stupid?
Are you seriously trying to present this argument? Most people on these boards who have used synthetic GPU benchmarks like Unigine and 3DMark know they are worthless for predicting gaming performance. Just because someone ran the card in 3DMark doesn't mean the result is representative of actual gaming scores. What part of "no game in the world uses the 3DMark engine" do some gamers not understand? I can't believe synthetic GPU benches and synthetic power viruses are still considered relevant in 2015. It's insane how stubborn some PC gamers are, clinging to outdated 15+ year old methods from when these tests used to be relevant. One must understand what these tests are designed to do: they simulate things. Key word: simulate.
A 3DMark score tells me absolutely nothing about how my card will perform in The Witcher 3, Star Wars Battlefront, Batman AK, GTA V or Shadow of Mordor. Can you tell me how a 980Ti performs in those games against a Fiji card based on a 3DMark score? No, you cannot. Using 3DMark to gauge where one GCN card sits against another GCN card is borderline as it is, and even those comparisons are often wrong: if a game is geometry-limited (a 280X suffers more than a 285/290), a 3DMark score won't predict that scenario, and it also can't predict performance in GameWorks titles.
That's not the point. It's more impressive from an engineering perspective to design a dual-purpose compute card that's still top of its class in gaming; whether gamers want DP or not is not what some of the members here alluded to. It's about the idea that a firm with fewer resources could potentially beat a company that's more or less focused on graphics (NV). Their point was that if AMD's chip is only 550mm2 but is as fast as the Titan X in games while still having DP performance, then AMD's engineers have outperformed NV's, because they would have managed to make a dual-purpose product that succeeds in more than one area on a smaller die. The penalty for this is likely that 300W TDP, though. Still, NV's engineers needed a 601mm2 die just for gaming. You also may have forgotten how double precision was used to justify the brand/premium for the OG Titan, yet now the same posters no longer care about it.
Considering that for months all we heard was that Fiji is either a re-badge of the 290X with HBM, or a dual Tonga XT card (2x2048 shaders on 2 dies), or that it has no chance of catching up to the Titan X (this coming from Titan X owners), if AMD's card is even 1% faster than the Titan X at 4K, a lot of people on these forums will have been proven wrong.