I find it frustrating that many of the people in here who are claiming that NVIDIA is "ripping off we the gamers" seem to be arguing from false premises.
First of all, using die size to compare whether one chip is a worthy successor to another is ridiculous. In this day and age, moving to a new node drives capital intensity substantially higher because of the need for double patterning.
This means that even though the foundries can still achieve improved cost/transistor, cost/mm^2 is actually rising. Intel estimates that going from 22nm (single patterning) to 14nm (double patterning) led to roughly a 30% increase in wafer cost. That's NOT small.
If we assume something similar for TSMC in going from 28nm to 16nm, then 300mm^2 of 16FF+ silicon costs about as much as ~390mm^2 of TSMC 28nm silicon (300mm^2 x 1.3 = 390mm^2), assuming normalized yields. TSMC 16nm yields are said to be crazy good right now, so let's assume similar yields on both nodes.
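To make that back-of-the-envelope math concrete, here's a tiny Python sketch. The 1.3x multiplier is the assumption carried over from Intel's 22nm -> 14nm estimate above, not a published TSMC figure:

# Back-of-the-envelope: how much 28nm area does a 16FF+ die "cost"?
# ASSUMPTION: ~30% higher cost per mm^2 on the double-patterned node,
# borrowed from Intel's 22nm -> 14nm wafer-cost estimate cited above.
WAFER_COST_MULTIPLIER = 1.30  # 16FF+ cost/mm^2 relative to 28nm (assumed)

def equivalent_28nm_area(die_area_16ff_mm2: float) -> float:
    """Return the 28nm die area you could buy for the same silicon cost."""
    return die_area_16ff_mm2 * WAFER_COST_MULTIPLIER

die = 300.0  # mm^2 of 16FF+ silicon
print(f"{die:.0f} mm^2 of 16FF+ costs about as much as "
      f"{equivalent_28nm_area(die):.0f} mm^2 of 28nm silicon")
# -> 300 mm^2 of 16FF+ costs about as much as 390 mm^2 of 28nm silicon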
In addition, performance/transistor has gone way up, so while some people will sit there and gnash their teeth about not getting the "right amount of silicon die area," it's ultimately delivered performance/$ that matters.
If 300mm^2 of 16FF+ silicon gives you way better performance than 600mm^2 of 28nm silicon, and it is being sold to you for the same price as the 600mm^2 part, then why complain? Why whine that your "rights" as "we the gamers" are being trampled upon by evil, greedy NVIDIA?
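If you want to put that argument in numbers, the only comparison a buyer actually experiences is performance per dollar. A trivial sketch, with made-up placeholder figures (the prices and performance scores below are purely illustrative, not real parts):

def perf_per_dollar(performance: float, price_usd: float) -> float:
    """Delivered performance per dollar; the metric that actually matters."""
    return performance / price_usd

# HYPOTHETICAL parts, purely illustrative numbers:
old_part = perf_per_dollar(performance=100.0, price_usd=650.0)  # big 28nm die
new_part = perf_per_dollar(performance=170.0, price_usd=650.0)  # smaller 16FF+ die

print(f"28nm part:  {old_part:.3f} perf/$")
print(f"16FF+ part: {new_part:.3f} perf/$")
# At the same price, the smaller die still wins on perf/$ despite less silicon area.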
If you don't like it, buy an AMD alternative. And if AMD does the same thing (and they appear to be doing just that), then the best I can suggest is to use the "free" Intel/AMD iGPU on your CPU of choice.