We still have a long way to go before real-time graphics reach movie-quality CG. As long as discrete graphics provide a noticeable difference, I suspect people will continue to buy them. As mentioned above, there are fundamental limitations to an iGPU, which suggests the gap between iGPU and dGPU will remain unbridgeable for some time to come.
True. However, if the rumors about next-gen consoles prove correct, then we'll get an entire generation of console titles and their PC ports that utilize high-end discrete cards very poorly. I agree that in a perfect world, where software is released that takes advantage of discrete performance, discrete would continue to reign.
But we don't live in that perfect world. We live in the world that Activision, EA, Ubisoft, Bethesda, and the indies make for us. They know that limiting games to the highest of the high end is foolish. And as Intel slowly raises the bar to the level where its iGPUs can play console ports, targeting those iGPUs will become ever more tempting.
In a way, this is great for PC gamers, because it will lead to more gaming PCs and the slow death by a thousand cuts of consoles, as general-purpose performance devices (i.e., smartphones, tablets, PCs, and to a lesser extent HTPCs) make specialized performance devices (i.e., consoles, and to a lesser extent Blu-ray players and streaming boxes) less relevant.
But it will also lead to the discrete market becoming about increasing amounts of AA and other gimmicks that most gamers won't even miss. As the discrete market contracts (the high end becomes irrelevant and the low end becomes pointless because it's replaced by iGPUs), prices will increase as profits shrink. Eventually, nVidia will be selling high-priced GPUs to a select few, and eventually even that will dry up as the pricing gets too high.
In a perfect world, we'd see nVidia switch to a "GPU boosting" model: a part that augments the iGPU's performance without duplicating everything the iGPU already has built in. That would shrink their chip, cut manufacturing costs, and let them focus mainly on performance. But nVidia and Intel seem unlikely to work together on that, and Microsoft doesn't seem to care about the PC gaming market beyond having an app store.
As for AMD, I think they're a sinking ship, tossing overboard everything they can, hoping against hope that the next thing they throw off will help them retain some buoyancy. They're throwing people, products, upgrades, new chips, good ideas, and more off the ship.
And that ship just keeps on sinking. I think they're hoping to sell the idea that they can become profitable again long enough to get an infusion of cash and/or a buyout offer.
So when we talk about discrete GPUs, what we're really discussing is nVidia. I don't think AMD will be around in ten years, and I'm not confident they'll be around in five. I'm not alone in that perception, and that, more than anything else, is going to make what they're trying to do in the short term very, very difficult.