Gears of War 4 Benchmark


master_shake_

Diamond Member
May 22, 2012
6,430
291
121
It wouldn't be a VC&G benchmark thread without throwing "let's rank the review sites" into the mix!

And DAMN, GTX 770 4GB > $1000 HD7990!

Of course it is.

This engine apparently does no multi-GPU scaling at all, so it's using what amounts to a single 7970 vs. a 680 (edit: the 7990 runs the same clocks as a 7970).

Why is that a surprise?
 
Reactions: Headfoot

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Are we looking at the same benchmarks? FX8370 at ComputerBase.de is bottlenecking NV's cards compared to the same GPUs running on 6700K.

i7 6700K
GTX970 = 62.9 fps
R9 390 = 67.9 fps

FX 8370
GTX 970 = 48.1 fps
R9 390 = 61.9 fps

GameGPU shows the 2600K and i7 6700 demolishing the 2500K and i5 6600. It takes at least an i5 6600 to maintain 60 fps minimums.
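For what it's worth, here's a minimal sketch of the gap those ComputerBase numbers imply; the FPS values are the ones quoted above, and the percentages simply fall out of them:

```python
# CPU-bottleneck check using the ComputerBase figures quoted above.
results = {
    "GTX 970": {"i7 6700K": 62.9, "FX 8370": 48.1},
    "R9 390":  {"i7 6700K": 67.9, "FX 8370": 61.9},
}

for gpu, fps in results.items():
    loss = (fps["i7 6700K"] - fps["FX 8370"]) / fps["i7 6700K"] * 100
    print(f"{gpu}: {loss:.1f}% slower on the FX 8370")

# GTX 970: 23.5% slower on the FX 8370
# R9 390: 8.8% slower on the FX 8370
```

The NV card loses close to three times as much relative performance as the AMD card when dropping from the 6700K to the FX 8370, which is the bottleneck being pointed out.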

Eh, who really cares? The FX series are crap CPUs for gaming, everyone knows that. Nobody in their right mind would pair one of them with a GTX 1080, as it would just be a waste..

It's not impressive when the game looks worse than Crysis 2, and by far worse than Crysis Warhead or Metro 2033/Last Light/Rise of the Tomb Raider/Deus Ex Mankind Divided/Crysis 3, etc. This is a straight up console port, with mainly shadows and SSR superior on the PC. The graphics are average at best for a 2016 game.

You took the worst screenshots you could find, which is typical I suppose. Anyone else that wants to see a screenshot comparison that's actually worth a damn, look here..

Anyway, I reserve final judgment for when I actually get to play the game, and I will make my own determination, rather than just relying on compressed images and video footage which is what you always like to do..
Zooming in on the screenshots or watching a Candyland video reveals low-polygon character models, mediocre textures, and unremarkable lighting, particle, and smoke effects. Up close, the vegetation, rocks, and terrain look terrible for a 2016 DX12 PC game. Ironic how you dismissed the gorgeous Forza Horizon 3 (even though it was poorly optimized), but are calling this an exceptionally optimized DX12 title!

The PC version is hitting triple digit frame rates, has 4K textures, uses advanced DX12 features, and has more settings than you can shake a stick at, and yet only you would consider it poorly optimized.

Could the PC version look a lot better? Of course it could, but the game was developed around the Xbox One first and foremost, so obviously this was a limiting factor..

Candyland: Gears of War 4 – PC Ultra vs. Xbox One: https://www.youtube.com/watch?v=1LXEBeTezh8

Graphics are nothing special, I'd even call them mediocre for 2016. No wonder R9 290 is hitting 70 fps at 1080p UQ at GameGPU.

Well, I'm sure your followers will like this YouTube compressed comparison, but for me who has bought and will actually play the game when it unlocks, I reserve the right to make my own determination rather than relying on compressed footage
 
Reactions: Sweepr

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
Looks like my 980ti @ 1465mhz should still survive this game pretty well. Too damn bad I can't use both of them. I am SOOO done with SLI. What a sad waste of money. So sad.
Also, those CPU results are good. People are looking at those numbers, but come on, any of those chips would gain a full GHz with an OC. Any decent CPU has this game covered, although I would have liked to see some 6-core results. They only tested 4- and 8-core CPUs.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
As for your comparison of the 1080 to the Fury X, that's stating the obvious, because it's akin to comparing an HD7970/GTX680 (the 1080) to a GTX580 (the Fury X). A generation-newer Pascal SKU that costs $620+ and is made on a far superior 16nm node should obliterate the 1.5-year-old 28nm Fury X.

Oh is it now? I thought that since NVidia didn't have proper DX12 hardware or some such, that even Pascal would be slaughtered by the mighty AMD Fury X, with its DX12 certified hardware ACEs

Today the GTX 1080 starts at $620 on the low end and the Fury X is $350 on Newegg, so the constant comparisons of the GTX 1080 to the Fury X, with conclusions that "AMD is getting destroyed in DX12, blah blah blah," are getting tiresome to read. Let's see how the 1080 does against its true generational competitor, Vega 10. At the very least, given the current pricing landscape, the Fury X should be compared to the GTX 1070.

Well, if you don't like the comparison, you should blame your buddies for spreading the false rumor that NVidia sucks at DX12, and that AMD's ACEs add 50% extra performance

It seems after AMD HD7000/R9 290 series wiped the floor with Kepler, the new thing is to compare GPUs while completely ignoring their prices. Interesting new strategy from NV fans.

The price doesn't really matter in this regard, because the FuryX is AMD's flagship GPU. So it will be compared to NVidia's high end GPUs no matter what..
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
As expected ComputerBase lowered settings to avoid a VRAM bottleneck with Fury X, especially at 4K.

Still, according to them GTX 1060 is 11.4% faster than RX 480 @ maximum details at 1080p (48.7 FPS vs 43.7 FPS @ stock). One has to wonder why they didn't test more VGAs at these settings.
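For reference, that 11.4% lead follows straight from the two stock FPS figures quoted; a trivial check, nothing more:

```python
# GTX 1060 vs RX 480 at ComputerBase's maximum details, 1080p, stock clocks.
gtx_1060_fps, rx_480_fps = 48.7, 43.7
lead_pct = (gtx_1060_fps / rx_480_fps - 1) * 100
print(f"GTX 1060 lead over RX 480: {lead_pct:.1f}%")  # 11.4%
```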
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
As expected ComputerBase lowered settings to avoid a VRAM bottleneck with Fury X, especially at 4K.

You mean only at 4K, right? And they did it to get playable framerates on all cards, not just the Fury X.

Or do you think 28-31 fps on an OC'd 980 Ti, as shown by PCGH, is enjoyable?

What about 18-22 fps on a 1060? Sure sounds enjoyable to me.
 
Reactions: crisium

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Still, according to them GTX 1060 is 11.4% faster than RX 480 @ maximum details at 1080p (48.7 FPS vs 43.7 FPS @ stock). One has to wonder why they didn't test more VGAs at these settings.

Probably because a performance loss of over 50% versus Ultra isn't worth it for an action game.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Oh is it now? I thought that since NVidia didn't have proper DX12 hardware or some such, that even Pascal would be slaughtered by the mighty AMD Fury X, with its DX12 certified hardware ACEs



Well, if you don't like the comparison, you should blame your buddies for spreading the false rumor that NVidia sucks at DX12, and that AMD's ACEs add 50% extra performance



The price doesn't really matter in this regard, because the FuryX is AMD's flagship GPU. So it will be compared to NVidia's high end GPUs no matter what..

No one has said any of that. Stop trolling.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Where did you see this?

Here. 4K was lowered to High, and 1080p/1440p were run at Ultra (not maximum). It would be nice to compare the performance difference between their cards at High vs. Ultra vs. Insane settings, but they omitted this (and not for the first time).

It's clearly not because of performance limitations: if a GTX 1060 can push nearly 50 FPS at Insane, then GP104/GP102 would have no problem at 1440p. Fury X's limited VRAM might be a problem, though.
 
Last edited:
Reactions: Carfax83

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
but they omitted this.

I already answered why they did so in my other posts: those aren't playable settings that people would actually use, so why waste time benchmarking them?

Do you think 28-31 fps on an OC'd 980 Ti, as shown by PCGH, is enjoyable?

What about 18-22 fps on a 1060?
 
Aug 11, 2008
10,451
642
126
Conflicting benchmarks? What are you on about? The benchmarks so far are fairly uniform if anything.. Gears of War 4 is an exceptionally optimized DX12 title. It even makes use of tiled resources, a highly useful feature that came out with DX11.2 but that no game developer has used to my knowledge.
Or that the consoles being AMD-based means an automatic win for AMD.

Interesting that a Microsoft title runs better on the "other" company's hardware, don't you think? Some of the more vehement AMD supporters have suddenly gone very quiet.

Actually they are just starting a crusade to discredit the results because it is an "nVidia game". Kind of ironic since they have no qualms about regurgitating the Doom results over and over again to "prove" how much better AMD cards are going to age.
 
Reactions: Sweepr

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It wouldn't be a VC&G benchmark thread without throwing "let's rank the review sites" into the mix!

For as long as I've followed this forum, Gamers Nexus GPU reviews have never been viewed as objective or reliable.

And DAMN, $450 GTX 770 4GB > $0 HD7990!

Fixed. The HD7990 was one of the best bitcoin mining cards ever made; that you never utilized the mining potential of AMD cards during that era was solely your own decision. I never compared single-GPU to multi-GPU cards in my comparison, but rather single AMD GPUs vs. single NV GPUs. It's interesting how nearly every NV GPU you've purchased, except in the GeForce 8 era, got completely demolished by its AMD equivalent over the period you owned it. What did you own before the 970, a 760/670/770? Yup, destroyed in modern games by the 7970/GHz/R9 280X, and the R9 290X/R9 390 are also > 970. I have to wonder how you keep buying worse videocards from the same company over and over. That's a remarkable track record of buying inferior tech gen after gen. It's hard to believe that in 6-8 years you never came across a deal, or waited for one, when an AMD card was easily the better choice.

Eh, who really cares?

You said in this very thread that the game is not very CPU dependent. Various posters, including myself, provided data that shows the opposite. Now it's "who cares?"

The FX series are crap CPUs for gaming, everyone knows that. Nobody in their right mind would pair one of them with a GTX 1080, as it would just be a waste..

No one was suggesting to pair a GTX1080 with an FX processor. The Computerbase charts show a GTX970/R9 390 with an FX8370. It shows a severe CPU bottleneck for NV GPUs, which you want to dismiss after bashing AMD's CPU-overhead for years. Now it doesn't matter since NV's CPU overhead in this game is huge?

The PC version is hitting triple digit frame rates, has 4K textures, uses advanced DX12 features, and has more settings than you can shake a stick at, and yet only you would consider it poorly optimized.

I never said the game is poorly optimized. I said it's easy to have a well-optimized game when the graphics are average and aren't pushing the envelope. Sorry, but compression is not responsible for inadequate level of polygons, primitive levels of detail on objects, lack of tessellated edges, lack of proper hair, mediocre vegetation complexity, etc.

Had you watched the Candyland video in detail, you would have seen the game was designed for Xbox One. It's not a true next generation PC title, despite you hyping it as the first true DX12 game on the PC. Not even close. The graphics are nowhere near as good as Deus Ex Mankind Divided or Rise of the Tomb Raider.







Could the PC version look a lot better? Of course it could, but the game was developed around the Xbox One first and foremost, so obviously this was a limiting factor..

Right, so it shouldn't be surprising that a GTX970/R9 290/290X/R9 390/480/1060 can play this game at 1080p Ultra without much effort. What does that have to do with being an "exceptionally optimized DX12 title"? This is the bare minimum expected, given that the graphics on the PC and on the Xbox One with its HD7790-class GPU don't deviate all that much. I would sure hope that a PC with CPUs and GPUs 3-4X more powerful than the Xbox One would be able to play a console game at 1080p 60 fps on Ultra in 2016.

The PC version is hitting triple digit frame rates, has 4K textures, uses advanced DX12 features, and has more settings than you can shake a stick at, and yet only you would consider it poorly optimized.

Too bad that none of it comes together to create a graphically mind-blowing next-gen game. As I said, this game runs well, but running well where other Windows 10 Store PC ports have failed miserably out of the gate doesn't make it a true next-gen DX12 PC game -- and that's what you tried claiming before it launched. How can it be a true DX12 game from the ground up and yet look worse than not only some of the best DX11 games, but even the half-baked DX12 implementations in Deus Ex MD and Rise of the Tomb Raider?

Looks like my 980ti @ 1465mhz should still survive this game pretty well. Too damn bad I can't use both of them. I am SOOO done with SLI.

It's been true for a while that if you play games right at launch or even in the first 1-2 months from launch, expect poor SLI/CF support. For those who play older games, or buy games later on in their cycle, SLI/CF still have their place imho. For example, right now I'd pick GTX1070 SLI over GTX1080 all day, any day. If someone wants to max settings or play at 1440p 144Hz or 4k on a budget, 1070 SLI is way better than a GTX1080 and $1200 Titan XP is close to 2x the price of 1070 SLI. I understand that you are disappointed that modern titles don't support SLI/CF out of the box but that's partly because DX12 shifts the burden onto the developers, not AMD/NV only.

Oh is it now? I thought that since NVidia didn't have proper DX12 hardware or some such, that even Pascal would be slaughtered by the mighty AMD Fury X, with its DX12 certified hardware ACEs

Who said that? It's reasonable to have stated that Fury X's shader underutilization and DX11 CPU overhead would be improved under a proper DX12 implementation with heavy use of Async Compute but I doubt anyone was claiming that Fury X would gain over 40% of extra performance, which is what's required to beat a 1080.

Well, if you don't like the comparison, you should blame your buddies for spreading the false rumor that NVidia sucks at DX12, and that AMD's ACEs add 50% extra performance

Who claimed that ACEs would add 50% extra performance to videocards? Benchmarks for this title show that GCN gains more from Async than NV does, so it's consistent with nearly every DX12 game out before. What difference does it make to you whether Pascal is garbage under DX12 or not, since we all know you'll buy a 1080Ti, and then a 2080, and then a 2080Ti, etc.? Why can't AMD GPUs benefit from Async Compute more than NV's GPUs while, at the same time, AMD's GCN GPUs have their own inherent bottlenecks that ensure NV's GPUs are just as fast and/or faster? You are so stuck on defending NV's DX12 architecture as if it matters. Whether Pascal had horrible, average, or amazing DX12 architecture, you would still have purchased at least 2 GPUs in the same generation. You are G-Sync locked and will never buy an AMD GPU. Moving on now....

The price doesn't really matter in this regard, because the FuryX is AMD's flagship GPU. So it will be compared to NVidia's high end GPUs no matter what..

Interesting, so price doesn't matter as long as the cards are the fastest. Let me guess: if the Fury X were discontinued, you'd compare a $650 GTX1080 to a $250 RX 480 and cite the RX 480 as AMD's current flagship? Keep telling yourself that. I guess it makes you feel better after paying $550 for a mid-range 980 and $700 for a mid-range 1080. Your posts have gotten too biased as of late when you start blatantly comparing the GTX1080 to the Fury X as if they were meant to be competitors, while also ignoring that one of them costs nearly half as much. It's just funny to read coming from a guy who had GTX580 SLI while it got owned by HD7970GHz CF, and then proceeded to have 980 SLI that was barely faster than a $650 R9 295X2. You realize not everyone loves throwing $ into the toilet on overpriced NV cards like you do, right? I know that if a $650 Vega 10 came out first and NV delayed the GTX1080, I wouldn't be comparing a $350 980Ti to a $650 videocard and claiming the 980Ti got owned hard.

You keep trying so hard to crap on the Fury X, but almost no one on this forum purchased that card for $650. I bet the gamers on this forum who bought a Fury or Fury X paid $300-400 for it, not $650. Those who paid $650 for it were likely mining Ethereum, which means the card has already paid for itself. So your constant praise for the overpriced 1080 and negativity targeted at the Fury X keeps coming off as justification for overpaying for the 1080 and wanting to feel superior to Fury X owners. You do realize that if a Fury/Fury X owner paid just $300-400 for their card, they expect the $650-700 1080 to be 50-70% faster in almost all AAA games. That's why they didn't buy the Fury X for $650 in the first place -- because that level of performance wasn't justifiable for the price. What makes you think these gamers, who criticized the Fury X at launch, view the 1080 any differently?

The Fury X actually plays this game well, so there is no controversy or story here despite you trying hard to claim that Pascal is WAY ahead. Getting more than 120 fps from a $620 GPU is irrelevant in a third-person shooter when the $350 AMD card is getting 70+.

Why don't you instead talk about how the 1080 is barely faster than the GTX1070? I guess it's better to discuss how much of a "failure" a $350 Fury X is and ignore that it's possible to pick up an open-box Asus Strix 1070/OC for $325 or $340 (after a $25 MasterPass code) and get a nearly identical gaming experience to the overpriced $600+ 1080. But then that isn't interesting enough for you, since "price is irrelevant."
 
Last edited:
Aug 11, 2008
10,451
642
126
You keep trying so hard to crap on the Fury X, but almost no one on this forum purchased that card for $650. I bet the gamers on this forum who bought a Fury or Fury X paid $300-400 for it, not $650.
So you think it is a plus that the Fury X has fallen to half of what AMD was trying to sell it for? All that means is that it wasn't competitive when it came out, and it is even less competitive now.
 

tg2708

Senior member
May 23, 2013
687
20
81
@rs thanks for that post, I just ordered that 1070. All I have to do now is return my Zotac 1080 to Microcenter once it comes, since I'll have to exchange my G-Sync monitor for a less messed-up one anyway; it has atrocious BLB (backlight bleed). It's weird, since the monitor is the Asus PG279Q.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
So you think it is a plus that the Fury X has fallen to half of what AMD was trying to sell it for? All that means is that it wasn't competitive when it came out, and it is even less competitive now.

All last-gen cards have had their prices cut roughly in half; that's what happens every single time a new generation comes out. Or are you going to say the 980 Ti is still a $600+ card? In that case it has terrible price/performance.

The air-cooled Fury was always at price/perf parity with the 980 Ti. The Fury X wasn't, but it had a water cooler; the cheapest water-cooled 980 Ti cost an extra $100 and sat at the same price/perf as the Fury X.

His point was that comparing a 1080 to the Fury X is a terrible comparison, because they aren't in the same league price-wise now. Or do you think the 980 Ti should also keep up with the 1080?
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
GamersNexus updated their results: http://www.gamersnexus.net/game-bench/2630-gears-4-pc-benchmark-updated-with-ultra-high-settings

They were using higher settings on AMD cards than on Nvidia cards in the previous graphs; those have been fixed.





They didn't retest, just re-graphed, so AMD is still on the 16.9.2 driver rather than the 16.10.1 "Game Ready" driver, while Nvidia is on 373.02.

Their minimums seem super low for AMD though; neither PCGH nor GameGPU saw such low minimums. I wonder if they have some odd bottleneck, as all cards show a massive (50%-ish) gap between minimum and average.
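To make that 50%-ish figure concrete, here's a minimal sketch of the metric being described; the 45/90 fps pair is a purely hypothetical example, not GamersNexus data:

```python
def min_to_avg_gap_pct(minimum_fps: float, average_fps: float) -> float:
    """Percent by which the minimum framerate falls below the average."""
    return (average_fps - minimum_fps) / average_fps * 100

# Hypothetical numbers illustrating a ~50% min-to-average gap.
print(f"{min_to_avg_gap_pct(45.0, 90.0):.0f}%")  # 50%
```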
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
I'll get this game. I enjoyed gears the first time around. I'm not buying no $100 pre order though. Hell to the NO.
 

Unreal123

Senior member
Jul 27, 2016
223
71
101
I have never seen a person so biased that he posts GameGPU benchmarks, where they clearly state they never used the Game Ready driver for Gears of War 4, while going on about GPU prices.
 