Meanwhile, 3 GBs of VRAM is really meh these days (4 GBs is next) and I really can't find any justification for the 3 GB 1060. It's just not right. $200+ cards should never be VRAM-limited so soon after release, imo.
Overall takeaway: The Fury X is erratically performing garbage (as objective people and Nvidia fans alike have known for a year and a half now), likely due to its weak front end and/or VRAM (and at other times to CPU overhead). Contrary to the former rallying cry of a few particular defenders, it is not aging better than the real 980 Ti (i.e. aftermarket). It's even experiencing a bit of a Kepler effect relative to the 480 where its geometry throughput is stressed.
Meanwhile actual AMD competitors to GM200/GP104 are still no-shows, let alone GP102. And GP104 masquerading as the flagship GPU, in combination with its high prices, is really getting boring. After all the 16/14nm hype, the 980 Ti provided a bigger performance jump from the 980 while still on 28nm than the 1080 did over the 980 Ti. And Nvidia gets rewarded for it with record profits while AMD are clearly outclassed and no longer a threat, so why would they even bother giving us impressive gains for reasonable prices again?
2016 sucked for high-performance microprocessors (including CPUs).
Every time I hear this, it sounds like an excuse. If it all comes down to AMD being slow because of Gameworks, then why is the 8 GB RX-480 right on par with the 6 GB GTX 1060? The Fury X falling so short of the 980 Ti seems more because of its 4 GB memory, to me.
I need Volta to get here ASAP. Titan X isn't enough for 144hz 1440p games of the future it seems and I'm not going to deal with crap multi GPU scaling in major titles just to get the performance I want in titles like Watch Dogs 2.
An aftermarket 1080 Ti would probably look nice, but we won't see it in a timely enough fashion to matter. Well, not when I know Volta is right around the corner. Navi is 2018? So is Vega supposed to compete with high-end Pascal/1080 Ti and Volta?
"The recommended volume of use of VRAM for the resolution of 1920 x 1080 will become 6144 MB VRAM for 2560h1440-6144 MB VRAM and to resolve 3840h2160 about 8192 MB video memory."
RIP 3G 1060! That performance drop between the 6 and 3 gig versions is insane. I have a friend who is looking to upgrade, and now I'm glad I told him to get 6 or 8 gigs of VRAM.
Those are 1440p results, hardly a resolution one would be using with a 1060 in this game. At 1080p, the 3gb is only 20 to 25% slower, pretty much in line with the price difference.
Performing a solid 20-25% slower at 1080p, presumably because of VRAM limitations, is a pretty big deal since that difference is only going to grow as games use more VRAM. Unless games magically use less memory, 3gigs just isn't going to cut it. I have a friend looking to get a card for under $300, and while I'm happily recommending a 1060, I'm making sure he stays the hell away from the 3gig versions.
I'm happy I held out. Either way, if you're with AMD, your life sucks at the high end. At least you shouldn't put yourself through the insane Fiji performance only to get gimped by 4 GB of VRAM. Was that even addressed in this article? How is Fiji managing when it should need more VRAM for textures?
If 4GB of VRAM is an actual physical limitation in this title, why is Fury X CF beating GTX1080 8GB or RX 480 8GB at 1440p?
This forum loves to bash the Fury X's VRAM limitations but this^ benchmark isn't showing it. It's possible different AA settings that PCGamesHardware implemented is what's causing the discrepancy.
The Fury X bashing is amusing indeed since anyone who bought it for $600-650 has long paid for it with mining.
I see no one in this thread discussing how a console port made for PS4/XB1 is wiping the floor with high-end video cards that are many times faster than the console GPUs. The graphics are nothing special either.
Volta won't be enough either if 2015-2016 AAA console ports are anything to go by. I love how you guys try to justify reasons why the next generation's $700-1200 video card is worth buying instead of discussing the current abysmal state of optimization of many AAA PC console ports.
These graphics require a GTX1070/980Ti/1080 to run at 60 fps at 1080p? I am sure GPU manufacturers such as NV are loving it as gamers lap up this level of fail and keep throwing $$$ at the problem.
Ubisoft is the king of GIMP. Horrible performance and average graphics in the vast majority of their games outside of Far Cry series. Ubisoft has shown time and time again they don't know how to optimize their open world games. The last 2-3 Assassin's Creed games were horribly optimized.
PCGamesHardware uses only aftermarket cards. That's why the RX 480 (overclocked and not throttling) is so fast versus the Fury X, while everyone else uses reference crap cards. PCGamesHardware also has a 980 Ti at 1350 MHz versus the others' 980 Tis at 1000-1100 MHz (that's 30% more performance), so it's 35% faster than the Fury X.
PCGamesHardware also has the best benchmarks because we actually get to see all the GPU frequencies. They're just the best.
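That clock-speed argument is easy to sanity-check with back-of-the-envelope math. A minimal sketch, assuming performance scales roughly linearly with core clock (it usually doesn't quite, since memory bandwidth stays fixed); the clock figures are the ones from the post:

```python
# Rough sanity check of the claim that a 1350 MHz 980 Ti is ~30% faster
# than a reference 980 Ti running at 1000-1100 MHz. Assumes performance
# scales linearly with core clock, which ignores memory bandwidth.
aftermarket_clock = 1350  # MHz, per the post
reference_clocks = (1000, 1100)  # MHz range quoted for reference cards

for ref in reference_clocks:
    gain = (aftermarket_clock / ref - 1) * 100
    print(f"{ref} MHz -> {aftermarket_clock} MHz: ~{gain:.0f}% more clock")
```

The result lands between about 23% and 35% depending on which reference clock you assume, so the quoted "30% more performance" is at least plausible as an upper bound.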
Seems everyone loves to bash the 3gb card and expect it to perform up to the 6gb card despite the fact that it is cheaper and has less cuda cores as well. No one is arguing that the 6gb card isn't better, but in a price/performance analysis, which seems the gold standard for every other card, the 3gb card holds its own, even in this game.
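That price/performance argument can be put in numbers. A rough sketch assuming launch MSRPs of about $199 (3 GB) and $249 (6 GB) and the 20-25% deficit quoted earlier; the 60 fps baseline is arbitrary:

```python
# Rough perf-per-dollar comparison of the 1060 3GB vs 6GB using the
# ~20-25% deficit at 1080p quoted in the thread. The $199/$249 prices
# are approximate launch MSRPs; substitute street prices as needed.
price_3gb, price_6gb = 199.0, 249.0

for deficit in (0.20, 0.25):
    fps_6gb = 60.0                      # arbitrary baseline frame rate
    fps_3gb = fps_6gb * (1 - deficit)   # 3GB card is `deficit` slower
    print(f"{deficit:.0%} slower: 3GB = {fps_3gb / price_3gb:.3f} fps/$, "
          f"6GB = {fps_6gb / price_6gb:.3f} fps/$")
```

At a 20% deficit the two cards are essentially tied in fps per dollar; at 25% the 6 GB card pulls ahead, which is why both sides of this argument have a point.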
What are you talking about? 780Ti is not doing well in SLI. It has the worst SLI scaling of all cards - 40% SLI scaling if you look at the minimums.
All other cards show 60% scaling in minimums.
In 1440p the best dual card performance is coming from R9 290 with 94% scaling in minimums and averages
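For anyone unsure how these scaling percentages are derived, they are presumably just dual-GPU frame rate over single-GPU frame rate, minus one. A minimal sketch with made-up minimum-fps numbers (not taken from any review):

```python
def sli_scaling(single_fps: float, dual_fps: float) -> float:
    """Percent frame-rate gain from adding a second GPU."""
    return (dual_fps / single_fps - 1) * 100

# Hypothetical minimum-fps numbers, chosen only to illustrate the formula:
# 94% scaling means the second card nearly doubles the frame rate.
print(sli_scaling(40.0, 56.0))   # 40% scaling, like the quoted 780 Ti case
print(sli_scaling(40.0, 77.6))   # 94% scaling, like the quoted R9 290 case
```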
780 Ti SLI looks like an abomination. It used to be within 10% of the 980. You'll need two of them to achieve that level in this game. One of them is almost as bad. Eaten by the 290 and getting tortured from a few feet behind by the 380X. Also 780 = 960.
Well, once again, we were talking about VRAM limitations, not SLI or the GTX 780 Ti's performance.
On a side note...
Who cares about a 780 Ti, a card from three generations ago?
The only reason the 290 gets optimized is because AMD has not released a card much faster or with any better features in 3 years. 290/290x/390/390x/470/480, wow what choices.
The Fury line sucked; everybody with a brain bought a GTX 980 Ti or just hung on for the mighty Vega. In fact, they are still hanging on and will be for another four months.
Checked this out on my system with someone's copy. Performance was better than indicated here, not sure why, especially at 1080p. Playing with two monitors without exclusive fullscreen resulted in poor performance, so maybe that's what's going on for pcgameshardware. gamegpu looks more accurate.
Needing a GTX 1080 to play this game at 1080p validates all the jokes about GTX 1080 being a 1080p card. $800 for that card and you get to play 1080p@60fps. Nice one. Very clever of them. But is it the card's fault? Nope. See truth statements below.
This game is an absolute PILE of steaming, hot, rotting barrel trash, period. End of story. This game doesn't look better than GTA V, yet GTA V gets around 90-100 fps at 1440p on two 980 Tis, and by the looks of it this game will get around 65-80 fps on the same rig but at TEN EIGHTY PEE. How is it even possible that every single gamer on this big pearly blue planet isn't trashing the hell out of this game right now? It runs like an absolute joke.
We have:
Forza Horizon 3 - runs like crap
Mafia 3 - runs like crap
Watch Dogs 2 - runs like crap
All these games suck at 1080p on immensely powerful PCs, yet swank along just fine on a console that has the hardware equivalent of a smartphone? Send this trash of a game back to the sticky bin that it slimed its way out of. Do not want!
? My system runs Forza Horizon 3 great at 1080p, high settings, over 50 fps. No stuttering, and the game is beautiful.
Here is an overclocked GTX 1060 running Watch Dogs 2 at 1080p, 1440p and 4K.
Looks fine and runs great, except for a few hiccups at 4K. https://www.youtube.com/watch?v=w74TUgQWNsQ
Mafia 3? I have no idea.
FUNNY HOW PEOPLE THOUGHT IT WAS GREAT THAT AMD GPU'S WERE IN CONSOLES.
NOW PEOPLE ARE COMPLAINING ABOUT CONSOLIZED TRASH.
Funny, huh?
The 1060 3GB does have its price advantage, but perhaps we're losing sight of the essentials. By the barrel principle, if one plank is very short, the height of the other planks no longer matters. Recent AAA games seem to be proof of this.
I was surprised that both NVIDIA and AMD have released new drivers that optimize frame rates and effects for Watch Dogs 2.
Surprisingly, so far both review outlets are using only the latest NVIDIA driver, without AMD's latest driver.
I am a hardware enthusiast from China. Hardware media over here routinely accept bribes from manufacturers, which leads to fraudulent review scores, so I come to international sites to seek real information.
Is this the same?
That's Watch Dogs 1. And the RX 480 is literally 3 frames behind at 1080p according to pcgameshardware, and a whopping 1 frame behind the 1060 in the gamegpu test. Seems fine. The Fury X is all over the place. The fact that you need a 1080 to game above 60 fps at 1080p is ridiculous.