It wouldn't be a VC&G benchmark thread without throwing "let's rank review sites" into the mix!
For as long as I've followed this forum, Gamer's Nexus GPU reviews have never been viewed as objective or reliable.
And DAMN, $450 GTX 770 4GB > $0 HD7990!
Fixed. The HD 7990 was one of the best bitcoin mining cards ever made; that you never utilized the mining potential of AMD cards during that era was solely your own decision. I never compared single-GPU to multi-GPU cards, only single AMD GPUs vs. single NV GPUs. It's interesting how nearly every NV GPU you've purchased, except for the GeForce 8 era, got completely demolished by its AMD equivalent over the period you owned it. What did you own before the 970, a 760/670/770? Yup, destroyed in modern games by the 7970/GHz/R9 280X, and the R9 290X/R9 390 also > 970. I have to wonder how you keep buying worse videocards from the same company over and over. That's a remarkable track record of buying inferior tech gen after gen. It's hard to believe that in 6-8 years you never came across a deal, or waited for one, when an AMD card was easily the better choice.
You said in this very thread that the game is not very CPU dependent. Various posters, including myself, provided data that shows the opposite. Now it's "who cares?"
The FX series are crap CPUs for gaming, everyone knows that. Nobody in their right mind would pair one of them with a GTX 1080, as it would just be a waste.
No one suggested pairing a GTX 1080 with an FX processor. The Computerbase charts show a GTX 970/R9 390 with an FX-8370, and they show a severe CPU bottleneck for NV GPUs, which you want to dismiss after bashing AMD's CPU overhead for years. Now it doesn't matter since NV's CPU overhead in this game is huge?
The PC version hits triple-digit frame rates, has 4K textures, advanced DX12 features, and more settings than you can shake a stick at, and yet only you would consider it poorly optimized.
I never said the game is poorly optimized. I said it's easy to have a well-optimized game when the graphics are average and aren't pushing the envelope. Sorry, but compression is not responsible for an inadequate polygon count, primitive levels of detail on objects, a lack of tessellated edges, a lack of proper hair rendering, mediocre vegetation complexity, etc.
Had you watched the Candyland video in detail, you would have seen the game was designed for Xbox One. It's not a true next generation PC title, despite you hyping it as the first true DX12 game on the PC. Not even close. The graphics are nowhere near as good as Deus Ex Mankind Divided or Rise of the Tomb Raider.
Could the PC version look a lot better? Of course it could, but the game was developed around the Xbox One first and foremost, so obviously this was a limiting factor.
Right, so it shouldn't be surprising that a GTX 970/R9 290/290X/R9 390/480/1060 can play this game at 1080p Ultra without much effort. What does that have to do with an "exceptionally optimized DX12 title"? This is the bare minimum expected given that the graphics on the PC don't deviate all that much from the Xbox One version running on its HD 7790-class GPU. I would sure hope that a PC with a CPU and GPU 3-4X more powerful than the Xbox One's could play a console game at 1080p 60 fps on Ultra in 2016.
The PC version hits triple-digit frame rates, has 4K textures, advanced DX12 features, and more settings than you can shake a stick at, and yet only you would consider it poorly optimized.
Too bad none of it comes together to create a graphically mind-blowing next-gen game. As I said, this game runs well, but running well while other Windows 10 Store PC ports have failed miserably out of the gate doesn't make it a true next-gen DX12 PC game -- and that's what you tried claiming before it launched. How can it be a ground-up DX12 game and yet look worse than not only some of the best DX11 games, but even the half-baked DX12 implementations in Deus Ex MD and Rise of the Tomb Raider?
Looks like my 980 Ti @ 1465MHz should still survive this game pretty well. Too damn bad I can't use both of them. I am SOOO done with SLI.
It's been true for a while that if you play games right at launch, or even in the first 1-2 months after launch, you should expect poor SLI/CF support. For those who play older games, or buy games later in their cycle, SLI/CF still have their place imho. For example, right now I'd pick GTX 1070 SLI over a GTX 1080 all day, any day. If someone wants to max settings or play at 1440p 144Hz or 4K on a budget, 1070 SLI is way better than a GTX 1080, and the $1200 Titan XP is close to 2x the price of 1070 SLI. I understand that you're disappointed modern titles don't support SLI/CF out of the box, but that's partly because DX12 shifts the multi-GPU burden onto developers rather than leaving it solely to AMD/NV, as the sketch below shows.
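To make that "burden shifts to developers" point concrete: under explicit DX12 multi-GPU there is no driver-side SLI/CF profile to lean on; the engine itself has to enumerate every GPU, create a device per adapter, and split the work between them. Here's a minimal sketch of just the first step, assuming a Windows/D3D12 toolchain (the structure is illustrative, not from any shipping engine):

```cpp
// Enumerate physical GPUs and create a D3D12 device for each one.
// Under explicit multi-GPU, everything past this point (splitting work,
// copying results between adapters) is the application's job.
// Link against: dxgi.lib d3d12.lib
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        // One device per physical GPU; no implicit SLI/CF behind the scenes.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            wprintf(L"Usable GPU %u: %s\n", i, desc.Description);
    }
    return 0;
}
```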
Oh, is it now? I thought that since NVidia didn't have proper DX12 hardware or some such, even Pascal would be slaughtered by the mighty AMD Fury X, with its DX12-certified hardware ACEs.
Who said that? It's reasonable to have stated that Fury X's shader underutilization and DX11 CPU overhead would improve under a proper DX12 implementation with heavy use of Async Compute, but I doubt anyone claimed that Fury X would gain over 40% extra performance, which is what's required to beat a 1080.
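And that "over 40%" figure is just arithmetic, not hand-waving: if card B benches ~43% higher than card A, then card A needs that same ~43% uplift just to tie. A toy check with illustrative frame rates (mine, not actual benchmark results):

```cpp
// If the 1080 is ~43% faster, the Fury X needs a ~43% gain just to match it.
// The 70/100 fps figures below are illustrative placeholders only.
#include <cstdio>

int main()
{
    double fury_x_fps  = 70.0;  // hypothetical Fury X result
    double gtx1080_fps = 100.0; // hypothetical GTX 1080 result

    double required_gain_pct = (gtx1080_fps / fury_x_fps - 1.0) * 100.0;
    printf("Required Fury X uplift to tie the 1080: %.0f%%\n", required_gain_pct);
    // prints ~43%, i.e. "over 40%"
}
```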
Well, if you don't like the comparison, you should blame your buddies for spreading the false rumor that NVidia sucks at DX12, and that AMD's ACEs add 50% extra performance.
Who claimed that ACEs would add 50% extra performance to videocards? Benchmarks for this title show that GCN gains more from Async than NV does, which is consistent with nearly every DX12 game released so far. What difference does it make to you whether Pascal is garbage under DX12, since we all know you'll buy a 1080 Ti, then a 2080, then a 2080 Ti, etc.? Why can't AMD GPUs benefit from Async Compute more than NV's GPUs while at the same time having their own inherent bottlenecks that ensure NV's GPUs are just as fast or faster? You are so stuck on defending NV's DX12 architecture as if it matters. Whether Pascal had a horrible, average, or amazing DX12 architecture, you would still have purchased at least 2 GPUs in the same generation. You are G-Sync locked and will never buy an AMD GPU. Moving on now...
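Since "Async Compute" keeps getting thrown around loosely in this thread, here's what it actually is at the API level: a second, compute-only queue submitted alongside the graphics queue. The API call is identical on every vendor; whether the hardware genuinely overlaps the two workloads (GCN's ACEs do this in hardware) is up to the GPU and driver, which is exactly why the measured gains differ between AMD and NV. A minimal D3D12 sketch, assuming a device created elsewhere (the function name is mine):

```cpp
// Create a graphics queue plus a separate compute-only queue. Work submitted
// to the compute queue *may* run concurrently with graphics work; D3D12 only
// exposes the queues, the overlap itself depends on the hardware scheduler.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

HRESULT CreateQueues(ID3D12Device* device,
                     ComPtr<ID3D12CommandQueue>& gfxQueue,
                     ComPtr<ID3D12CommandQueue>& asyncComputeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics + compute + copy

    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only

    HRESULT hr = device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
    if (FAILED(hr))
        return hr;
    return device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&asyncComputeQueue));
}
```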
The price doesn't really matter in this regard, because the Fury X is AMD's flagship GPU. So it will be compared to NVidia's high-end GPUs no matter what.
Interesting, so price doesn't matter as long as the cards being compared are each vendor's fastest. Let me guess: if the Fury X were discontinued, you'd compare a $650 GTX 1080 to a $250 RX 480 and cite the RX 480 as AMD's current flagship? Keep telling yourself that. I guess it makes you feel better after paying $550 for a mid-range 980 and $700 for a mid-range 1080. Your posts have gotten too biased as of late when you blatantly compare the GTX 1080 to the Fury X as if they were meant to be competitors, while ignoring that one of them costs nearly half as much. It's just funny to read coming from a guy who had GTX 580 SLI while it got owned by HD 7970 GHz CF, and then proceeded to have 980 SLI that was barely faster than a $650 R9 295X2. You realize not everyone loves throwing $ into the toilet on overpriced NV cards like you do, right? I know if a $650 Vega 10 came out first and NV delayed the GTX 1080, I wouldn't be comparing a $350 980 Ti to a $650 videocard and claiming the 980 Ti got owned hard.
You keep trying so hard to crap on the Fury X, but almost no one on this forum purchased that card for $650. I bet the gamers here who bought a Fury or Fury X paid $300-400 for it, not $650. Those who did pay $650 were likely mining ethereum, which means the card has already paid for itself. So your constant praise for the overpriced 1080 and negativity toward the Fury X keeps coming off as justification for overpaying for the 1080 and wanting to feel superior to Fury X owners. You do realize that if a Fury/Fury X owner paid just $300-400 for their card, they naturally expect the $650-700 1080 to be 50-70% faster in almost all AAA games. That's why they didn't buy the Fury X for $650 in the first place -- because that level of performance wasn't justifiable at that price. What makes you think those gamers who criticized the Fury X at launch view the 1080 any differently?
The Fury X actually plays this game well, so there is no controversy or story here despite you trying hard to claim that Pascal is WAY ahead. 120+ fps from a $620 GPU is irrelevant in a third-person shooter when the $350 AMD card is getting 70+.
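Run the perf-per-dollar math on those same figures (the thread's numbers, not fresh benchmarks) and the "WAY ahead" story gets even thinner:

```cpp
// fps-per-dollar using the figures quoted above: 120 fps / $620 vs. 70 fps / $350.
#include <cstdio>

int main()
{
    double gtx1080_fps = 120.0, gtx1080_price = 620.0;
    double fury_x_fps  =  70.0, fury_x_price  = 350.0;

    printf("GTX 1080: %.3f fps per dollar\n", gtx1080_fps / gtx1080_price); // ~0.194
    printf("Fury X:   %.3f fps per dollar\n", fury_x_fps  / fury_x_price);  // ~0.200
}
```

Roughly 0.19 fps per dollar for the 1080 vs. 0.20 for the Fury X; the $350 card actually comes out slightly ahead per dollar spent.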
Why don't you instead talk about how the 1080 is barely faster than the GTX 1070? I guess it's better to discuss how much of a "failure" a $350 Fury X is and ignore that it's possible to pick up an open-box Asus Strix 1070 OC for $325, or $340 after a $25 MasterPass code, and get a nearly identical gaming experience to the overpriced $600+ 1080. But then that isn't interesting enough for you since "price is irrelevant."