Don't worry, IDC, Intel's misdirection-marketing mavens have already succeeded.
Pages and pages of this thread have seen (former?) overclocking enthusiasts debate the average-to-magnificent performance gains of Intel's stunning 14nm graphics technology. Performance so impressive that entry-level 28nm dGPUs will be demolished. Again.
The brilliance of releasing the up-sized Broadwell mobile CPU on the desktop globally (in a local sort of way), with disappointing overclocking but a way better iGPU, just a few months before the next 14nm power-sipping powerhouse, is that it softens up the resistance to the next generation of overclocking disappointment and 3-7% IPC gains.
The masters of die manipulation have already sold us on the vital importance of perf/watt... for CPUs. For graphics processing... well... that's what 10nm is for. Moar iGPU, because today's desktop enthusiasts are obsessed with iGPU performance and the frame-spitting magic of eDRAM.
Seven versions of Iris!!! This is the stuff enthusiast... uh... shareholder dreams are made of.
This new era of dismal IPC gains and moar iGPUs is so exciting my decoder ring finger is getting twitchy.
At this point, this is the new normal.
The Sandy Bridge K series seems to be an anomaly compared to what Intel has pushed out post Sandy Bridge.
Intel gave us a bone to chew on with the 1155 K-series chips; the integrated graphics were lol-tastic, but it didn't matter, since most people buying K-series chips were going to pair them with a fancy dedicated GPU anyway.
But now, integrated on-chip graphics are a BFD (thanks, Apple, and maybe a little thanks to AMD too).
The X79 / X99 platforms and chips are where we are supposed to go now (ever since SB K was replaced with IB K), with "real" TIM between the die and the heat-spreader, for the OC enthusiasts.
For a hot minute, we had enthusiast-level chips on the mainstream platform, but Intel changed that: it's server-lite or bust if you want chips without integrated graphics and with good TIM.