Yes, apparently 90% of this forum is too daft or willfully blind to understand that ASICs OVERCLOCK BETTER WITH MORE VOLTAGE.
Apparently it's VERY difficult to understand that the card will overclock further once voltage-raising utilities come out.
This forum is 100% full of double standards, especially when it comes to favouring NV products. I was about to start a new thread but didn't even bother because I'd just get called out for "damage control" or some other nonsense.
Here are the facts:
HD7970GHz = $499 with a 3-game bundle, 50% more VRAM, huge overclocking headroom/scaling, and 9% faster at TPU at 1440p on launch day than the $499 GTX680 2GB. I recommended the HD7970GHz back then and today I am recommending the 980Ti. That consistency definitely does not apply to the 980Ti fanclub, which back then downplayed every key advantage of the HD7970GHz from launch.
#1. The amount of bashing the GTX680 2GB received for having a third less VRAM (2GB vs. 3GB) from the same people talking about 4GB vs. 6GB today on the NV side - practically nil.
#2. The amount of bashing the GTX680 2GB received for trailing by 9% at the same price, i.e. worse price/performance, from the same people today bashing Fury X for not being fast enough at 1440p - practically nil.
#3. How many people during the HD7970GHz vs. 680 era discussed how one could buy after-market HD7970GHz cards? Almost none of the people who today only want to compare factory pre-overclocked 980Ti cards vs. a reference Fury X. That's right, because they don't want to discuss how the reference 980Ti is a failure in terms of noise levels and temperatures. Back then the theme was to ignore after-market HD7970GHz cards' noise levels, temperatures and overclocking at all costs, and to repeat the mantra that all HD7970GHz cards run hot and loud, despite the fact that not one reference HD7970GHz was ever sold in retail channels like Newegg or Amazon. It's sad to see these insane double standards being applied today, but the 680 was somehow excused from all this? Right....
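The price/performance claim in point #2 is simple arithmetic. A minimal sketch using only the launch figures above (both cards at $499, the HD7970GHz ~9% faster at 1440p per TPU); the `perf_per_dollar` helper is mine, not from any benchmark tool:

```python
# Rough price/performance math using the launch numbers from the post:
# both cards were $499, HD7970GHz ~9% faster at 1440p per TPU.
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance units per dollar spent (hypothetical helper)."""
    return relative_perf / price_usd

gtx680 = perf_per_dollar(100.0, 499)   # GTX680 2GB as the 100% baseline
hd7970 = perf_per_dollar(109.0, 499)   # HD7970GHz: 9% faster, same price

advantage_pct = (hd7970 / gtx680 - 1) * 100
print(f"HD7970GHz price/perf advantage: {advantage_pct:.1f}%")
# → HD7970GHz price/perf advantage: 9.0%
```

Same price, 9% more performance, so the price/performance gap is the full 9% - exactly the situation people bash Fury X for today, just with the vendors swapped.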
While objective gamers have every right to criticize Fury X for not delivering enough (until we see its full overclocking potential with voltage control), nearly every single person who recommended a 680 over an HD7970GHz back then is a true hypocrite. I'll never forget how certain posters on here would constantly, and I mean constantly, mislead with reference HD7970GHz noise and temperature levels, but now they are eerily quiet about how the reference 980Ti's cooler is a giant failure, especially when it comes to overclocking. :sneaky:
Did Fury X live up to the hype? No, for now it didn't, but the amount of bashing happening and the claims that it's a total failure are ludicrous considering the HD7970GHz had 50% more VRAM, cost the same and outperformed the 680 by the same 9%.
Yet how many of those same people called the 680 a failure and recommended the 7970GHz over it? Practically none of the NV owners/Fury X bashers. It's pretty easy to spot blind fans on this forum, to be honest...
Fury X should have been priced at $549, but considering we heard the most insane nonsense like Fury X = R9 290X with HBM or Fury X = dual Tonga XTs, we still got a card that at stock is nearly as fast as a reference GTX980Ti but runs way cooler and quieter. Maybe with full voltage control it can overclock to 1250MHz? We can't say yet, but either way there seem to be driver issues, because the scaling from the R9 390X is simply too low at the moment to make sense unless the card is almost 100% ROP-bottlenecked.
At 4K it basically trades blows with the GTX980Ti/Titan X, which means Fury X CF should be fairly competitive against 980Ti SLI.
Still, it's hilarious that the clueless/biased review sites are trying to say the 980 is a true competitor to Fury X in terms of gaming feel, when objective sites testing more than 5 games (or which have a spread of games and not just mostly GameWorks titles) find Fury X beating a 980 by 25-30% at high-resolution gaming and more or less trading blows with the reference 980Ti. Where Fury X starts to suffer is against after-market 980Ti cards, but those are factory pre-overclocked 15-18%, so it makes sense that they win, since AMD's AIBs haven't yet released any factory pre-overclocked Fury cards.
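To put that after-market gap in numbers, here's a minimal sketch assuming (as above) rough parity between a stock Fury X and a reference 980Ti, with after-market 980Ti cards shipping 15-18% above reference; the variable names and the exact parity assumption are mine:

```python
# Why factory pre-overclocked 980Ti cards pull ahead, assuming a stock
# Fury X roughly ties a reference 980Ti (figures from the post above).
ref_980ti = 100.0                 # reference 980Ti baseline
fury_x = 100.0                    # Fury X trades blows at stock
oc_low, oc_high = 1.15, 1.18      # 15-18% factory overclock range

lead_low = ref_980ti * oc_low - fury_x
lead_high = ref_980ti * oc_high - fury_x
print(f"After-market 980Ti lead over stock Fury X: "
      f"{lead_low:.0f}-{lead_high:.0f}%")
# → After-market 980Ti lead over stock Fury X: 15-18%
```

In other words, the entire after-market lead is the factory overclock; a Fury X with comparable headroom (or factory-overclocked AIB Fury cards) would close it.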
Right now the logical choice is an after-market 980Ti, but it remains to be seen whether overclocking will improve with voltage control, whether AMD can get another 5-10% performance increase with better drivers over time, and whether AMD will allow AIBs to release faster-clocked versions of the Fury X.
Generally speaking, the biggest problem with this card right now is the price. I am not sure why there is so much bashing over 1080p performance, since we know AMD has DX11 driver overhead. Also, I would personally never buy a $650 GPU in late June 2015 to play at 1080p. I think the key resolution for comparison is 1440p (or ultrawide 3440x1440) for single-chip cards, and 1440p/1600p/3440x1440/4K for CF/SLI.
I agree. That's why I'm not going to waste money upgrading this gen; it's a total non-starter to pay so much for so little. I'd rather take the family out for a few extra nice dinners.
I am sticking to my point that this is a stop-gap generation. None of the cards out are fast enough for 4K on their own, and with next gen we should get close to 980Ti SLI performance in a single chip. Also, NV will most likely drop SLI bridges, and we will get at least 8GB of HBM2 on flagship cards. This is a good time for HD7950/HD7970/680/770 owners to upgrade, but for 970/290/290X owners running SLI/CF, the only way to truly 'upgrade' is to buy 2x 980Ti or 2x Fury X. That's a LOT of money for what smells like a stop-gap generation. I will not be surprised if the 970's successor comes very close to the 980Ti in performance in 2H 2016. For anyone who isn't in urgent need of a GPU upgrade, I would say there is no rush right now. On the NV side, GM200 lacks 4K hardware decode/encode, which means it'll be outdated for 4K anyway when next-gen GPUs launch.
This is a double whammy: Fury X is slower than an after-market 980Ti, but once Pascal comes out NV is likely to focus all of its driver efforts on that architecture, and performance for Maxwell cards may suffer the way it did for Kepler and Fermi cards. That means spending $1300+ on two of these cards is going to look silly come the 14nm/16nm HBM2 generation.