Watch Dogs 2 benchmarks


caswow

Senior member
Sep 18, 2013
525
136
116
Are there any tests without the crippling GameWorks effects cranked up? All I see is "omg 1060 is faster than 480" while the game is unplayable at 40 fps.
 
Reactions: Bacon1

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I know you guys have a problem when AMD CPUs are faster, so let's take the Core i7-2600K from back in 2011 that completely destroys the four-years-younger Skylake Core i5 from Q3 2015 at the same clocks

Don't worry, Intel will give us an unlocked Core i3 at $180 in early 2017. So much for CPU progression

Well, it's a game that is well threaded, so I would certainly expect the higher-clocked 8-thread i7 to do well against the lower-clocked 4-thread i5.

I also note the launch price of the 2600K back in 2011, which puts it $100 more expensive than the i5-6600 in today's dollars.

I also note that there is an i5-6600K.

I also note that AMD's FX chips have always shone in certain areas; it's overall that they have been unimpressive.

Further, I'm happy with the FX-6300 system that I built myself for the second office here, one of several FX systems I have built.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Are there any tests without the crippling GameWorks effects cranked up? All I see is "omg 1060 is faster than 480" while the game is unplayable at 40 fps.

What settings would you like to see disabled? As far as I can tell, PCGH didn't use any GameWorks effects. PCSS/HFTS shadows weren't used; Ultra shadows were. HBAO+ wasn't used; HMSSAO was. TXAA wasn't used (AMD cards can't even use this); Temporal-AA w/SMAA was used.

So no, GameWorks settings weren't used in their testing.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
We really shouldn't use GameGPU tests at all; they've been one of the worst tech sites for benchmarks since forever. Too little info and too lazy. That German site is really nice though! So thanks for the share.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
We really shouldn't use GameGPU tests at all; they've been one of the worst tech sites for benchmarks since forever.

https://www.techpowerup.com/reviews/Performance_Analysis/Watch_Dogs_2/

I like TechPowerUp...

Some highlights worth mentioning are:
  • Field of View: Adjustable from 70° to 110°
  • Pixel Density: Lets you adjust the game's rendering resolution
  • Extra Details: Even when running at Ultra, you can push quality further with this slider, which controls the game engine's level-of-detail scaling with distance
  • Texture Resolution: Ultra is available only after downloading the free HD Texture Pack DLC
  • Temporal Filtering: Boosts performance greatly by rendering at half resolution while the player is moving (see the rough sketch below)
  • MSAA: Available and can be combined with the other post-process AA options, but comes with a huge performance hit
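To make the Pixel Density and Temporal Filtering items concrete, here is a minimal back-of-the-envelope sketch. It assumes the density slider scales both axes like a typical render-scale option and that temporal filtering halves the shaded pixel count while the camera moves; the engine's real heuristics are Ubisoft's and not public.

```python
# Back-of-the-envelope estimate of pixels shaded per frame.
# Assumptions (not from the game's actual code): Pixel Density scales
# both axes like a render-scale slider, and Temporal Filtering halves
# the shaded pixel count while the camera is in motion.

def effective_pixels(width, height, pixel_density=1.0,
                     temporal_filtering=False, moving=False):
    pixels = width * height * pixel_density ** 2
    if temporal_filtering and moving:
        pixels *= 0.5  # half-resolution rendering while moving
    return int(pixels)

full = effective_pixels(1920, 1080)
tf = effective_pixels(1920, 1080, temporal_filtering=True, moving=True)
print(f"{full:,} vs {tf:,} pixels -> {full / tf:.1f}x less shading work while moving")
```

Under those assumptions, the feature halves the shading work in motion, which matches the "boosts performance greatly" description for a mostly on-the-move open-world game.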
 
Last edited:

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Sad that so many sites are testing at settings no gamer would actually want to game at unless they had FreeSync/G-Sync. Under 50 fps average for an action game? No thanks!
 

[DHT]Osiris

Lifer
Dec 15, 2015
14,639
12,767
146
Sad that so many sites are testing at settings no gamer would actually want to game at unless they had FreeSync/G-Sync. Under 50 fps average for an action game? No thanks!

I think the idea is to test at 'appropriate levels': generally you're not gonna run Watch Dogs 2 cranked on a GTX 960, nor at lowest settings on a 1080, so you test sort-of-in-the-middle, grab a few of the outliers (like cranked settings on the 6GB 1060, just to see how bad it is), then compare/contrast them all, find the weaknesses in the cards/settings, and form opinions from there.

Doesn't help when people post a single benchmark/chart and say x is better than y based on that chart. But you should never exclude options from your testing environment based on what gamers would actually do. I mean the quality of the game has never come up in this thread, it's just numbers based on specific driver versions.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
I think the idea is to test at 'appropriate levels': generally you're not gonna run Watch Dogs 2 cranked on a GTX 960, nor at lowest settings on a 1080, so you test sort-of-in-the-middle, grab a few of the outliers (like cranked settings on the 6GB 1060, just to see how bad it is), then compare/contrast them all, find the weaknesses in the cards/settings, and form opinions from there.

Doesn't help when people post a single benchmark/chart and say x is better than y based on that chart. But you should never exclude options from your testing environment based on what gamers would actually do. I mean the quality of the game has never come up in this thread, it's just numbers based on specific driver versions.

Oh sure, I just wish sites would use at least 2 settings: one for the $600+ cards and one for the cards people actually buy, which is what 90% of their reader base would use.

The guides from Nvidia are actually nice because of this: they go into detail about which settings are huge perf hits while not improving IQ much, so you can see what you can turn off to get much better FPS without degrading IQ.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
PCGH has lower settings on a separate tab, but only a few cards are tested...

HardOCP does playable-settings testing, but people like to berate them here.
 
Last edited:
Mar 10, 2006
11,715
2,012
126
The guides from Nvidia are actually nice because of this: they go into detail about which settings are huge perf hits while not improving IQ much, so you can see what you can turn off to get much better FPS without degrading IQ.

Yes, they are. I remember googling for some info on a setting in BLOPS3 and I wound up reading the NVIDIA guide on the game.

This is a very valuable service they provide for gamers.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
For those of you who have grabbed a copy of the game: how do you feel about the game's graphics/fps ratio on current PC hardware? Personally I think it is one of the better Ubisoft releases; that being said, max settings do turn it into an absolute turd.

Isn't that how it always is? Max settings are always extra. In fact, that's what they should be called, especially if their cost is out of proportion to the visual benefit.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
PCGH has lower settings on a separate tab, but only a few cards are tested...

Sadly they only lowered Textures, which only really helped the 1060 3GB since it was out of VRAM. Other settings are still Ultra.
HardOCP does playable-settings testing, but people like to berate them here.

The issue with their testing is that they'd often use different settings per card, so you couldn't compare cards directly; playability was left completely up to their judgement instead of hard data, and they'd turn down settings that made a 1-2 fps difference.
 

[DHT]Osiris

Lifer
Dec 15, 2015
14,639
12,767
146
The guides from Nvidia are actually nice because of this: they go into detail about which settings are huge perf hits while not improving IQ much, so you can see what you can turn off to get much better FPS without degrading IQ.

Agreed, I love that part. I bet they're deriving the data from their GameWorks information, since they have to know that info anyhow to create those 'apply best settings!' profiles in the GeForce Experience application. A ton of legwork on NV's end to do this for any given game, I imagine, but once the info is gathered they can probably extrapolate it for all their cards easily.
 
Reactions: Headfoot

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
PCGamesHardware uses only aftermarket cards. That's why the RX 480 (overclocked and not throttling) is so fast vs. the Fury X, while the rest use reference cards. PCGamesHardware also has the 980 Ti at 1350 MHz vs. the rest testing the 980 Ti at 1000-1100 MHz (that's 30% more performance). So it's 35% faster than the Fury X.

There is really no such thing as a GTX 980 Ti that operates at 1000-1100 MHz unless it's full of dust or the test bed is in a very hot environment (i.e., near the equator, etc.). The reference GTX 980 Ti boosts to 1200 MHz. NV's base clocks have been rather pointless to discuss since the Kepler generation; what matters for NV is boost clocks. It's also somewhat misleading to test a 1350-1500 MHz 980 Ti while leaving the Fury X at stock speeds. In any case, looking at the benchmarks for this game, the performance difference between a 980 Ti and Fury X cannot be simply explained by

PCGamesHardware also has the best benchmarks because we actually see all GPU frequencies. They are just the best.

Going back 5-6 generations of GPU testing, as far back as I can remember, PCGamesHardware has always favoured NV cards more than AMD, no matter the generation or the games. The most objective European site is Computerbase.de.

Yeah, funny, why does the 3GB 780 Ti in SLI do so well also?

What's your point, that 780 Ti SLI 3GB does poorly? What does that have to do with Fury X CF outperforming the GTX 1080 8GB? The entire discussion was on 4GB VRAM being a bottleneck. 4GB VRAM bottleneck not found.

Can you or other people claiming 4GB VRAM bottleneck on the Fury X provide evidence backing this up with hard facts?

God... all of my feels summed up.
2016 sucked. So much.

Yup, thank AAA console ports and GameWorks features. With GWs features, the GTX 1080 drops to 22-30 fps at 1080p. That's right.

https://www.youtube.com/watch?v=AIKm3882Zbk

Other than HFTS shadows, the game is 99% a console port to PC. You need a magnifying glass to tell the difference in graphics if playing the PC version on a living-room TV/projector.


We really shouldn't use GameGPU tests at all; they've been one of the worst tech sites for benchmarks since forever. Too little info and too lazy. That German site is really nice though! So thanks for the share.

100% BS.

Guru3D got 45 fps on an RX 480.

Computerbase.de got 52 fps on an RX 480.

GameGPU shows 45 fps for the RX 480. GameGPU often picks some of the most demanding scenes, and as soon as AMD's latest drivers came out they updated the test, even splitting the results into GameWorks Off and GameWorks On. It is true that GameGPU often rushes the review out before the latest drivers and patches are released, but they then update the review with the latest drivers/patches, and they also do year-end testing with all of the popular games released in 2016.

GameGPU has also not shown much bias when testing NV or AMD cards, unlike PCGamesHardware, PCLabs, etc. that almost always have NV cards leading. Computerbase and TechSpot also have an excellent track record of reliable and objective GPU testing.

Wow, 1060 3GB faster than Fury X with High Textures @ 1080p? Not a bad little card there.

Obviously not from a reputable site. Check 5-6 tests for the game before making such a flawed conclusion. Fury X is faster than GTX1060 6GB in this title. Not that it matters since that's not even an accomplishment. Either way, this game is an unoptimized Ubisoft pile.

The game barely runs better on GTX1080 SLI than it does on PS4 Pro.

Other than NV's HFTS shadows, a $399 PS4 Pro and a $2,500 PC provide 97% the same IQ in this console port.


Stunts like these are the biggest FAIL for the PC Master Race, unless of course Ubisoft is pushing hard to sell more copies of WD2 on XB1 and PS4/Pro. Ubisoft is clearly treating its games primarily as console games and then porting them to PC. There is absolutely zero doubt about that, and I wish an ex-Ubisoft programmer would come on here to confirm the truth.

The level of graphics in this title vs. the performance is atrocious. I truly hope the sales of this title reflect it, or otherwise Ubisoft will continue releasing turd after turd.

TPU's and GameGPU's screenshot captures of the PC game, as well as DudeRandom84's videos, highlight how outdated and primitive this game's graphics are. The textures, polygon detail, and geometric complexity are often at late PS3/Xbox 360 level, or 2008-2010 PC game level at best. BF1 and SW:BF look like PS5/XB2 games in comparison.

These graphics are pathetic when one considers that it takes a GTX 980 Ti/1080/1070 just to hit 60 fps at 1080p! Shockingly bad optimization. Shockingly outdated graphics.

Poor Crytek got so much hate for "unoptimized" Crysis games. Ubisoft wishes it could make games as well optimized as the Crysis series. Even if this game had come out in 2012, its graphics would be nothing special, yet in one month it's 2017!
 
Reactions: Bacon1

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Obviously not from a reputable site. Check 5-6 tests for the game before making such a flawed conclusion. Fury X is faster than GTX1060 6GB in this title. Not that it matters since that's not even an accomplishment. Either way, this game is an unoptimized Ubisoft pile.

Oh, right, I forgot, if you don't like the results blame the review. Or the game. Or the day of the week.


Trolling is not allowed
Markfw
Anandtech Moderator
 
Last edited by a moderator:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This is completely off topic, but I just want to say this.

I think the important point being made is that Crossfire/SLI is a gamble. There's no guarantee that 50% of your investment is going to be worth it for the games you want to play in the future. Additionally, that 50% won't be usable in most non-AAA games. Sure, they might not need the horsepower, but if you had a 1070 you'd probably be running them at 4K DSR or maybe with driver-forced SGSSAA, HBAO+, idk. A 1070 is always a 1070. 2x 480s (or any multi-GPU setup) is only two cards in AAA games, and definitely not in all of them.

On topic though: benches for this game don't look too bad. I didn't expect that big of a gap between the Fury cards and the 980 Ti (is that normal?), but it seems like this might be pretty geometry-heavy considering the 480's performance.

Just because you don't understand the value of CF/SLI, doesn't mean it's not there.

GTX1070 SLI > GTX1080



SLI/Xfire has been on life support for a while now. AMD's landscape is such that the only option for really 'high end performance' with modern arch is to xfire it. For anyone with NV, or not tied to one arch (via preference or freesync/gsync), it makes 0 sense to SLI. You get better avg perf/$ by just buying the next card up.

100% wrong. Please tell me what I could have bought in 2016 that was better than GTX 1070 SLI for $650 if I want to play AAA games spanning 2010-2016 at max settings. I'll be waiting. Let's just ignore all the popular AA games where CF/SLI works well. Let's ignore how 290 CF that cost $500 wipes the floor with a $550 GTX 980 in BF1, or how GTX 1070/1080 SLI would achieve higher minimum FPS at 4K in BF1 than a single GTX 1080 gets in averages.

The SLI/CF haters often haven't used SLI/CF in their lives, or at least not in the last 3 years. SLI/CF was never designed to provide a 90-100% benefit from the second video card. If you expect that in all games, you clearly don't understand why SLI/CF still has its benefits even in 2016. If someone spends $1200-2400 on SLI/CF and expects support in almost all AAA games on launch day, they have only created those unrealistic expectations for themselves and will always be disappointed.
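For what it's worth, the perf-per-dollar argument on both sides here is easy to sanity-check. Below is a toy sketch; every performance number and scaling factor in it is hypothetical, since real SLI scaling varies per game, which is the whole disagreement:

```python
# Toy perf/$ comparison of one fast card vs. two slower cards,
# using made-up performance units and prices. The SLI scaling factor
# is the variable this thread is actually arguing about.

def perf_per_dollar(perf, price):
    return perf / price

PRICE = 650  # assume both configurations cost the same total (hypothetical)

single_card = perf_per_dollar(100, PRICE)           # one card, works in every game
sli_scaled = perf_per_dollar(2 * 75 * 0.85, PRICE)  # two slower cards, 85% scaling
sli_unsupported = perf_per_dollar(75, PRICE)        # same pair, game with no SLI profile

print(f"single card:      {single_card:.3f} perf/$")
print(f"SLI, 85% scaling: {sli_scaled:.3f} perf/$")
print(f"SLI, no support:  {sli_unsupported:.3f} perf/$")
```

With these made-up numbers the SLI pair wins when scaling is good and loses badly when a game has no profile, which is exactly the gamble both posters are describing.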

Oh, right, I forgot, if you don't like the results blame the review. Or the game. Or the day of the week.

Nope, I look at as many reviews as possible to see what the general trends are. There are always testers that have been shown to be biased towards NV over several generations of GPU testing, or testers that have issues with their GPUs. PCGamesHardware has always favoured NV GPUs, and it's amusing you aren't admitting to this fact. Everyone on the Russian GPU forums knows this too.

Same reason knowledgeable Russian gamers don't use Overclockers.ru GPU tests: they are 99% worthless.

Interesting how none of the reputable, proven-over-time sites such as Computerbase, Guru3D, GameGPU, and TPU show the GTX 1060 3GB outperforming the Fury X, and yet you ignored all their testing in favour of a site that has leaned towards NV for years. Expected nothing less from you, though.

It's also hilarious to see people defending another gimped GWs title when we know GameWorks means NV worked closely with the developer, which means their cards should perform better, especially since NV has a history of poorly optimized GWs games that also, "as if by pure magic," cripple performance on older NV cards and GCN. It's been their strategy for the last 4 years.

Either way, Ubifail shares most of the blame in my eyes. The game's graphics are terrible and the optimization for the PC is MIA. Even when the game runs without the performance-killing HFTS, it still performs horribly given the level of graphics on offer and how little differentiates the console and PC versions of this title. Another typical Ubisoft game many of us expect like clockwork.
 
Last edited:

dogen1

Senior member
Oct 14, 2014
739
40
91
Just because you don't understand the value of CF/SLI, doesn't mean it's not there.

I understand that there's little value in it for me. I mostly play old games and non-AAA games. Sometimes I use emulators, which never support SLI. And assuming CF/SLI still don't work well in windowed mode, I also like to use windowed mode without losing 50% of my GPU lol.

What am I missing? That 2 1070s beat a 1080 in games that support SLI? Cool. If those are your games go for it. Don't pretend it's the more reliable option.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
What's your point, that 780 Ti SLI 3GB does poorly? What does that have to do with Fury X CF outperforming the GTX 1080 8GB? The entire discussion was on 4GB VRAM being a bottleneck. 4GB VRAM bottleneck not found.

Can you or other people claiming 4GB VRAM bottleneck on the Fury X provide evidence backing this up with hard facts?

I was pointing out that if a 4GB bottleneck was not found, then a 3GB bottleneck was also not found. The GTX 780 Ti would fall on its face too if it ran out of VRAM.
 

Head1985

Golden Member
Jul 8, 2014
1,866
699
136
There is really no such thing as a GTX 980 Ti that operates at 1000-1100 MHz unless it's full of dust or the test bed is in a very hot environment (i.e., near the equator, etc.). The reference GTX 980 Ti boosts to 1200 MHz. NV's base clocks have been rather pointless to discuss since the Kepler generation; what matters for NV is boost clocks. It's also somewhat misleading to test a 1350-1500 MHz 980 Ti while leaving the Fury X at stock speeds. In any case, looking at the benchmarks for this game, the performance difference between a 980 Ti and Fury X cannot be simply explained by


Going back 5-6 generations of GPU testing, as far back as I can remember, PCGamesHardware has always favoured NV cards more than AMD, no matter the generation or the games. The most objective European site is Computerbase.de.
1200 MHz is the max boost clock, not the average. The average is 1000-1100 MHz. You have to max out the power limit in Afterburner, and maybe then it will boost to 1200 MHz the whole time.

Also, PCGamesHardware is not favouring NV cards. Their 7970 vs. 680 review was excellent:
http://www.pcgameshardware.de/Grafi...sts/Test-Geforce-GTX-680-Kepler-GK104-873907/
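As a quick sanity check on the clock numbers being thrown around: assuming performance scales roughly linearly with core clock in GPU-bound scenes (a first-order approximation only), the claimed gap brackets the "30% more performance" figure from earlier in the thread.

```python
# First-order check: how much faster is a 1350 MHz 980 Ti than one
# averaging 1000-1100 MHz, assuming linear scaling with core clock?
pcgh_clock = 1350  # PCGH's aftermarket 980 Ti (MHz)
for avg_boost in (1000, 1100):
    gain = pcgh_clock / avg_boost - 1
    print(f"vs {avg_boost} MHz: {gain:.0%} clock advantage")
# Prints 35% and 23% -- bracketing the ~30% figure claimed earlier.
```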
 
Last edited: