So why is a non-reference 980 Ti 26% faster than the Fury X at 1080p?
GPUs are not the bottleneck at 1080p; the Fury X is just that bad at 1080p.
DX11 driver bottleneck. Also, it's 11%, not 26%: you are comparing a 1.4GHz+ 980 Ti to a stock Fury X. My point is that even if you compare NV GPUs to each other, their true raw GPU power comes out at 1440p and 4K. If it makes some people feel better that the 980 Ti is 26% faster than the Fury X at 1080p, they can knock themselves out, but at least compare apples to apples (stock vs. stock or OC vs. OC). If the 980 Ti were 30% faster than the Fury X at 1080p but lost by 15% at 1440p/4K, that only matters if you play at 1080p. I don't. When I buy a $300 GPU now, I don't look at 1920x1200-and-below benches. If those matter to you, go NV until more games are DX12. Also, I tend to keep my GPUs longer than 12 months, so long-term performance, often gauged by a card's ability to hold up at higher resolutions, matters far more to me. Ever since I started gauging a card's longevity by high-resolution gaming benches, that approach hasn't once let me down.
If you care about 1080p benches and you already have a 970 OC, I wouldn't waste $ upgrading to 2016 cards. A 970 OC ~ 980/390X OC is probably good enough for another 1.5 years at 1080p, maybe longer.
BTW, stock vs. stock
900p
980Ti beats 980 by 14%
980Ti beats 970 by 27%
1080p
980Ti beats 980 by 19%
980Ti beats 970 by 35%
1440p
980Ti beats 980 by 24%
980Ti beats 970 by 43%
4K
980Ti beats 980 by 27%
980Ti beats 970 by 47%
http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/23.html
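The widening gaps in that list are exactly what a CPU cap produces. A minimal sketch of the effect, using made-up FPS numbers (not figures from the TechPowerUp review): once both cards hit the same CPU-limited ceiling at a low resolution, the measured gap between them shrinks even though the raw GPU gap hasn't changed.

```python
# Sketch: how a CPU-limited ceiling compresses GPU-vs-GPU deltas at low res.
# All FPS numbers below are hypothetical, for illustration only.

def pct_faster(a, b):
    """Percent by which card a beats card b."""
    return (a / b - 1) * 100

# Hypothetical uncapped GPU throughput (frames per second)
gpu_fps = {"980Ti": 140, "970": 98}

cpu_cap = 120  # hypothetical CPU-limited frame rate at 1080p

# GPU-bound scenario (4K-style): the full gap shows
print(f"{pct_faster(gpu_fps['980Ti'], gpu_fps['970']):.0f}%")  # 43%

# CPU-bound scenario (1080p-style): each card is clamped by the CPU
capped = {k: min(v, cpu_cap) for k, v in gpu_fps.items()}
print(f"{pct_faster(capped['980Ti'], capped['970']):.0f}%")  # 22%
```

Same two cards, same raw power gap, but the low-res bench reports half the advantage, which is why those numbers say more about the CPU than the GPUs.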
The point I'm illustrating should be pretty clear: if someone buys a high-end card and tests it at CPU-limited resolutions, the GPU ends up waiting on the CPU. Those benches aren't showing a true GPU vs. GPU comparison, since many of them are really measuring CPU-limited gaming scenarios.
The reverse of that is showing 900p CPU benchmarks of an i7 6700K OC vs. an i7 4790K OC when the gamer actually plays at 1440p/4K. Ya, not really relevant to the CPU upgrade path for the Devil's Canyon user, is it?