I am merely stating a fact. If you think stating plain and simple facts is bad advice, then you're going about things the wrong way. It is important to be able to separate raw facts from conjecture. It is a fact that you will save $40 over the course of two years by using a GTX 1060 in place of an RX 480. What is not a fact is the following statement: you need 6GB, 3GB is not enough. That is conjecture. The facts as of today clearly show that you get more value per dollar with the 3GB card. This could change in a year; it could change in two. But it is not a statement of fact.
The first claim of yours is even less of a fact than the claim that you call conjecture.
The cost of running a GPU, strictly on power, is wholly determined by the price per kWh each individual user pays, which is highly variable. Yes, the 1060 draws less power, so with all else being equal, a direct comparison says the 1060 will save money. But it isn't the nonsensical $20/year you are claiming. It is more like $1-3 per year, strictly from power costs, for the vast majority of users. The only people pushing measurable power costs through their GPUs are users primarily mining cryptocurrency, and they are primarily using AMD cards to do it. Why? Because AMD cards are simply more efficient for that compute workload, despite the difference in power consumption.
The fact here is that you are constructing a nonsensical argument: gamers playing an average of 5 hours/day for an entire year, expecting certain power costs on their bill, and the 1060 being better for those people for that reason. The actual fact is that the users pushing those kinds of hours are doing it for another purpose, and with a 480 they are making more money doing it than they would with a 1060.
Until you come back with data on gamers' hours/year and some reasonable average costs per kWh across the country (and various other countries), your argument is pure FUD.
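The back-of-the-envelope math here is easy to check yourself. A minimal sketch, where the ~45 W draw gap between the two cards and the per-kWh prices are my own assumptions (roughly the published board-power difference and typical residential rates), not measured figures:

```python
def annual_power_savings(watts_delta, hours_per_day, price_per_kwh):
    """Yearly electricity savings, in dollars, from a card that draws
    `watts_delta` fewer watts while gaming `hours_per_day` hours a day."""
    kwh_per_year = watts_delta * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# Typical gamer: ~1.5 h/day at a ~$0.12/kWh rate (assumed values)
print(round(annual_power_savings(45, 1.5, 0.12), 2))  # ~2.96

# Hitting ~$20/year requires ~5 h/day, every day, at ~$0.24/kWh
print(round(annual_power_savings(45, 5, 0.24), 2))  # ~19.71
```

Under those assumptions, the $20/year figure only appears for someone gaming five hours every single day at double the typical electricity rate; for ordinary usage it lands in the $1-3 range.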
The actual data we have for arguing that 3GB is not enough compared to 6GB is actual history -- our very best predictor in this field, and our only real one. That is simply a fact based on what we know of developer habits over the last several years: generations of low-VRAM GPU offerings getting thrashed by their contemporaries within a year's time, and the simple acceptance that AAA games going forward -- the games that actually need a GPU, and so the only reason we are having this discussion -- are designed for consoles first, with much higher dedicated VRAM allotments, then ported to PC. It is simply an indisputable reality that VRAM limitations will always be an issue. Always. Continuing down this path of denying a known reality is really just pathetic at this point. 4GB will be too little, 6GB will be too little, and yes, 8GB (pretty much what we should be aiming at right now for decent future-proofing, imho) will be too little. FPS benchmarks simply do not tell the story the low-VRAM argument relies on: they already compensate for the concession of lower quality settings, so the failure of this card is already baked into the numbers.
That is simply a fact. The crux of this entire charge against the woeful existence of a 3GB "1060" (again, this thing really isn't a 1060) is that it forces the buyer to make real-time concessions on quality in a world where the rest of its field will be leaving it in the dust within months, at no real significant cost difference. You know very well this is a card that will be wholly unrecommendable (by anyone with any self-respect) in about 5 or 6 months' time, and that the 6GB 1060 and the 4GB/8GB 480s will be the only cards in that class worth considering (and honestly, that is true of the situation today -- only stubborn fanboys refuse to accept that their favorite silicon company can put out such a woefully ill-considered piece of hardware. Companies make mistakes; just accept it and move on.)