PCGamesHardware uses only aftermarket cards. That's why the RX 480 (overclocked and not throttling) is so fast vs. the Fury X, while the rest use reference crap cards. PCGamesHardware also has the 980Ti at 1350MHz vs. the rest using a 980Ti at 1000-1100MHz (that's 30% more performance). So it's 35% faster than the Fury X.
There is really no such thing as a GTX980Ti that operates at 1000-1100mhz unless it's full of dust or the test bed is in a very hot environment (i.e., near the equator, etc.).
The GTX980Ti reference boosts to 1200MHz. NV's base clocks have been rather pointless to discuss since the Kepler generation; what matters for NV are boost clocks. It's also somewhat misleading to test a 1350-1500MHz 980Ti while leaving the Fury X at stock speeds. In any case, looking at the benchmarks for this game, the performance difference between a 980Ti and Fury X cannot simply be explained by clock speeds alone.
PCGamesHardware also has the best benchmarks because we actually see all GPU frequencies. They are simply the best.
Going back 5-6 generations of GPU testing, as far back as I can remember, PCGamesHardware has always favoured NV cards more than AMD, no matter the generation or the games. The most objective European site is Computerbase.de.
Yeah, funny. Why does the 3GB 780Ti in SLI do so well then?
What's your point, that 780Ti SLI 3GB does poorly? What does that have to do with Fury X CF outperforming the GTX1080 8GB? The entire discussion was on 4GB VRAM being a bottleneck. No 4GB VRAM bottleneck found.
Can you or other people claiming 4GB VRAM bottleneck on the Fury X provide evidence backing this up with hard facts?
God... all of my feels summed up.
2016 sucked. So much.
Yup, thank AAA console ports and GameWorks features. With GWs features, the GTX1080 drops to 22-30 fps at 1080p. That's right.
https://www.youtube.com/watch?v=AIKm3882Zbk
Other than HFTS shadows, the game is 99% a console port to PC. You need a magnifying glass to tell the difference in graphics if playing the PC version on a living room TV/projector:
We really shouldn't use GameGPU's tests at all; they've been one of the worst tech sites for benchmarks since forever. Too little info and too lazy. That German site is really nice though! So thanks for the share.
100% BS.
Guru3D got 45 fps on an RX 480
Computerbase.de got 52 fps on an RX 480
GameGPU shows 45 fps for the RX 480. GameGPU often picks some of the most demanding scenes, and as soon as AMD's latest drivers came out, they updated the test, even splitting the results into GameWorks On and Off. It is true that GameGPU often rushes the review out before the latest drivers and patches are released, but they then typically update the review with the latest drivers/patches, and they also do year-end testing with all of the popular games released in 2016.
GameGPU has also not shown much bias when testing NV or AMD cards, unlike PCGamesHardware, PCLabs, etc. that almost always have NV cards leading. Computerbase and TechSpot also have an excellent track record of reliable and objective GPU testing.
Wow, 1060 3GB faster than Fury X with High Textures @ 1080p? Not a bad little card there.
Obviously not from a reputable site. Check 5-6 tests for the game before drawing such a flawed conclusion. The Fury X is faster than the GTX1060 6GB in this title. Not that it matters, since that's not even an accomplishment. Either way, this game is an unoptimized Ubisoft pile.
The game barely runs better on GTX1080 SLI than it does on PS4 Pro.
Other than NV's HFTS shadows, a $399 PS4 Pro and a $2500 PC provide 97% the same IQ in this console port.
Stunts like these are the biggest FAIL for the PC Master Race, unless of course Ubisoft is pushing hard to sell more copies of WD2 on XB1 and PS4/Pro. Ubisoft is clearly treating its games primarily as console games and then porting them to PC. There is absolutely 0 doubt about that, and I wish an ex-Ubisoft programmer would come on here to confirm the truth.
The level of graphics in this title vs. the performance is atrocious. I truly hope the sales of this title reflect it, or otherwise Ubisoft will continue releasing turd after turd.
TPU's and GameGPU's screenshot captures of the PC game, as well as DudeRandom84's videos, highlight how outdated and primitive this game's graphics are. The textures, polygon detail, and geometric complexity are often at late PS3/Xbox 360 level, or 2008-2010 PC game level at best. BF1 and SW:BF look like PS5/XB2 games in comparison to this:
These graphics are pathetic when one considers it takes a GTX980Ti/1080/1070 to just hit 60 fps at 1080p! Shockingly bad optimization. Shockingly outdated graphics.
Poor Crytek got so much hate for "unoptimized" Crysis games. Ubisoft wishes it could make games as well optimized as the Crysis series. Even if this game had come out in 2012, its graphics would be nothing special, and in one month it's the year 2017!