I have both a GTX 1060 6GB and a RX 470 4GB in my household (one for me, one for the wife).
- Ignore the "AMD is DX12 futureproof" hype
The reality is that NV has less room for improvement in DX12 than the AMD RX 480, but the GTX 1060 is the faster card to begin with. So if AMD = 100 and NV = 110 in DX11, maybe AMD = 115 and NV = 120 in DX12. It's basically a tie, but AMD still hasn't beaten NV. Anandtech had a good writeup on why this is... ignore the lunatic claims that Pascal doesn't have DX12. It does. It just implements it differently than AMD.

Also, most of the games with DX12 aren't natively DX12, and thus are not useful indicators of what native DX12 games will be like. DX12 takes more programming skill to develop for, and trying to bolt it onto DX11 games doesn't work out very well. One of the few natively DX12 games is Ashes, and if you look at what happens in that title, AMD Polaris cards do gain a little more performance than Pascal, but that merely allows AMD to catch up, not surpass. Similarly, Futuremark programmed Time Spy's DX12 path in a vendor-agnostic way, and you see the GTX 1060 beating the RX 480 there in DX12.

So unless a game company takes AMD or NV money to optimize just for one vendor, we'll probably end up seeing a lot of ties and near-ties between the RX 480 and GTX 1060 in native DX12 games. And if it ever boils down to game companies taking money to optimize for one arch over another, guess what: NV has way deeper pockets.
Most sites are really lazy about testing DX12. E.g., many sites breathlessly reported that AMD beat NV in the Deus Ex: Mankind Divided DX12 beta on the basis of the built-in benchmark. In reality, they didn't test real gameplay, and they didn't do frametime analysis. Under realistic conditions, i.e., actual gameplay with frametime analysis, AMD lost to NV in Deus Ex: Mankind Divided DX12. Badly. AMD did fine in DX11, but in DX12, AMD got killed by NV.
http://techreport.com/review/30639/...x-12-performance-in-deus-ex-mankind-divided/3 (Frametime matters more than fps because the human brain/eye hates microstutter. High fps with bad frametimes looks worse than okay fps with great frametimes.)
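To see why average fps can hide microstutter, here's a minimal sketch with made-up numbers (not real benchmark data): one hypothetical card renders most frames fast but stutters periodically, the other renders every frame at a steady pace. The stuttery card "wins" on average fps yet has a far worse 99th-percentile frametime.

```python
def avg_fps(frametimes_ms):
    # average fps = frames rendered / total seconds elapsed
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

def percentile_99(frametimes_ms):
    # simple nearest-rank 99th-percentile frametime (higher = worse stutter)
    s = sorted(frametimes_ms)
    idx = max(0, int(round(0.99 * len(s))) - 1)
    return s[idx]

# Card A: 100 frames, mostly fast (8 ms) but with periodic 50 ms stutters
card_a = [8.0] * 90 + [50.0] * 10
# Card B: 80 frames, every frame a steady 12.5 ms
card_b = [12.5] * 80

print(avg_fps(card_a), percentile_99(card_a))  # ~82 fps, but 50 ms spikes
print(avg_fps(card_b), percentile_99(card_b))  # 80 fps, steady 12.5 ms
```

Card A posts the higher fps number, but those 50 ms spikes are exactly the microstutter that makes it feel worse in practice, which is why frametime plots catch problems that fps bar charts miss.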
Also, don't expect a lot of native DX12 games for a long time--long enough that DX12 shouldn't be a big consideration when deciding between Polaris and Pascal. The Frankenstein-monster DX11 games that got DX12 bolted on don't seem representative. RotTR was NV-sponsored and was broken on AMD hardware; Deus Ex was AMD-sponsored and somehow wound up broken in DX12 for both NV and AMD.
- GCN is used in PS4/XBONE and AMD cards for now. I think it's unlikely AMD does a drastic overhaul of GCN until the next console gen comes out (the one after Scorpio/PS4 Pro). So although I wouldn't characterize GCN as automatically futureproof, I'd say that it's unlikely that game engines will run horribly on GCN until the next console gen. NV is less certain but they have a LOT more resources than AMD, and I really doubt NV would allow its products to falter as a result.
- However, NV Pascal has something to counter that potential GCN-bias: efficiency and more OC headroom.
Polaris 470/480 doesn't come with as much OC headroom as Pascal. OC vs OC, the GTX 1060 pulls ahead by even more. Even situations like broken DX12 implementations that favor AMD can be partially offset simply by OC'ing the GTX 1060.
- No real diff between 4, 6, and 8 GB VRAM right now for 1080p. Maybe for higher rez.
4GB vs 6GB doesn't matter at 1080p for practically all games right now, and might only start to matter above that resolution in some corner cases; even then, the difference is apparently minimal. 8GB is overkill at 1080p. A lot of newbie gamers think that buying more VRAM will somehow futureproof them... uh, not exactly. The GPU has to be strong enough to make the VRAM worthwhile, or else the extra VRAM is wasted. As a thought experiment, think of what happens if you pair 8GB of VRAM with an RX 460. Yeah, exactly. So by the time games really "need" 8GB+, you'd need a far stronger GPU than a GTX 1060/RX 480 anyway.
Overall, at the same price, the GTX 1060 is the bang for buck winner unless there is some other restriction, like if you must have FreeSync or something.
Furthermore, the GTX 1060 is more power efficient, particularly in multimonitor and media playback, but also under gaming load and even at idle. For a typical gamer, this adds up to at least a few bucks per year, so comparing a hypothetical RX 480 vs GTX 1060 at the same price still ignores the lower cost of ownership.
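The math behind "a few bucks per year" is just watts times hours times your electric rate. Here's a back-of-the-envelope sketch; the wattage deltas, daily hours, and the $0.12/kWh rate below are all assumptions for illustration, so plug in figures from reviews and your own power bill.

```python
def annual_cost_usd(watts, hours_per_day, rate_per_kwh):
    # (kW) * (hours/year) * ($/kWh) = $/year
    return watts / 1000.0 * hours_per_day * 365 * rate_per_kwh

# Assumed extra draw of an RX 480 over a GTX 1060 (illustrative numbers):
gaming_delta_w = 40    # extra watts under gaming load, 2 h/day
multimon_delta_w = 30  # extra watts in multimonitor/media playback, 4 h/day

rate = 0.12  # $/kWh, roughly a US-average residential rate
extra = (annual_cost_usd(gaming_delta_w, 2, rate) +
         annual_cost_usd(multimon_delta_w, 4, rate))
print(round(extra, 2))  # lands in the single digits of dollars per year
```

Double the rate to Hawaii-like levels, or double the hours, and the gap scales linearly, which is how you get to $15+/year in high-rate areas.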
So why then did I buy a RX 470 for my wife's computer and a GTX 1060 for my own?
1. My wife's PC runs to a 1080p HDTV, so multimonitor is not an issue. I also have no restrictions on the upgrade timetable for that PC. So the idea is to run the RX 470 4GB (a better deal than the RX 480 4GB or 8GB imho) for 1-2 years and then swap to a better card down the road. I think it's clear to everybody closely watching the industry that NEITHER Polaris NOR Pascal is the bee's knees of architectures, so I'm not willing to pay the premium for an RX 480 or GTX 1060 when an RX 470 is good enough for now.
2. I can't swap the GPU in my PC for another few years, for reasons I won't get into here. Thus I don't have the option of buying an RX 470 and upgrading in 1-2 years. Even if the GTX 1060 were not the bang-for-buck leader, I wanted the most power-efficient GPU, all else equal. My PC has 3 monitors, and Polaris guzzles energy when driving 2+ monitors. It's been like 6 years since Eyefinity, and AMD still hasn't fully solved its multimonitor/media playback wattage woes. NV solved that years ago.
TL;DR:
GTX 1060 is the better card overall, even in native DX12 games, because what would otherwise be ties go in favor of the GTX 1060 via its higher OC headroom. The RX 480 isn't that far behind, though, and realistically you can find them for a little cheaper than GTX 1060 6GB cards. However, it shamefully guzzles energy in multimonitor/media playback/gaming load, which costs you a little extra in electricity, probably around $5-10/year for typical gamer usage patterns and electric rates.
If you're on higher than 1080p, I'd get the GTX 1060 6GB or RX 480 8GB, probably the GTX 1060 6GB if they are the same price, but the GTX 1060 is usually more expensive. Also, just get the RX 480 if you have a FreeSync monitor.
If you are on 1080p and plan to swap cards in 2 years or less, get an RX 470 4GB and wait for the next-gen cards. Trade in your RX 470 4GB at that point.
If you plan to swap cards in 3-4 years, it's too hard to say. Too many things can change in the interim. But if you pay higher electricity rates than average, I'd recommend the GTX 1060 6GB if you can get it for the same price as the RX 480 8GB, because you'll save at least a few dollars each year... perhaps more like $15/year if you live somewhere with high electricity rates like Hawaii.
Edited: to correct typos and have more accurate power cost saving calculations.