Ultrabooks are the future of portable form factors; a full-size laptop is not portable. Have fun carrying a 17-pound hunk of junk around. If you want a full-size gaming laptop that can't be carried, gets 1 hour of battery on a charge, and weighs 17 pounds, then yeah, a dGPU is the way to go. The ultrabook is a different breed of machine, and it's the future.
Ultrabooks aren't designed specifically for gaming either, although Iris Pro is proving to perform VERY well even in the ultrabook form factor - clearly, if you're a gamer and only a gamer, a laptop with a dGPU is the best purchase. Anyway, your last point is not correct, since even a mid-range dGPU adds 40W of TDP on its own. So it's 87W of TDP (47W quad-core mobile + 40W dGPU) versus 47W for Iris Pro, which already includes the quad-core. Iris Pro literally halves the TDP.
Bit of an exaggeration there.
Even under heavy battery-life testing you can get 3 hours of use out of a monster 10-pounder.
And under gaming load, TDP != power consumption. Given the inherent nature of Iris and all iGPUs, power consumption under gaming load will sit around TDP levels (similar to how ULV CPUs will consume TDP levels of power), because any available thermal headroom is shunted toward GPU boost clocks. As seen in the AT review, there was a performance gain when the CPU's TDP was raised to 55 watts, indicating that turbo on the iGPU is not fully realized at 47 watts. Running a game on a CPU + dGPU, on the other hand, consumes nothing like the two TDPs added together. That's because the CPU's TDP covers the cores plus its (idle) iGPU, while the dGPU's TDP is rated for the GPU under high-stress loads. So under a really heavy game such as Crysis 3, the CPU cores might be consuming 25-30 watts out of the 45-watt TDP, with the 45-watt dGPU using around 30-35 watts.
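To put rough numbers on that accounting, here's a toy sketch; all the wattages are hypothetical game-load draws in the same ballpark as above, not measurements:

```c
#include <stdio.h>

/* Toy comparison of the two power-accounting models described above.
 * All wattages are hypothetical game-load draws, not measurements. */
int main(void) {
    /* Iris Pro: cores and iGPU share a single 47 W package budget;
     * whatever the cores don't use, iGPU turbo soaks up. */
    double package_tdp = 47.0;
    double core_draw   = 17.0;
    double igp_draw    = package_tdp - core_draw;

    /* CPU + dGPU: naive "add the TDPs" vs. realistic game-load draws. */
    double naive_sum = 45.0 + 45.0;   /* both parts at rated TDP */
    double real_sum  = 28.0 + 32.0;   /* ~25-30 W cores + ~30-35 W dGPU */

    printf("Iris Pro package: %.0f W cores + %.0f W iGPU = %.0f W\n",
           core_draw, igp_draw, core_draw + igp_draw);
    printf("CPU + dGPU: naive %.0f W, realistic ~%.0f W\n",
           naive_sum, real_sum);
    return 0;
}
```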
Under FurMark, the 660M in my laptop consumes more than 45 watts. That's primarily because FurMark raises the voltage from 0.9375 V to 1.0875 V (+0.15 V), resulting in a massive increase in power consumption. In games the 660M runs at 0.9375 V and generally consumes only about 30-40 watts, depending on the game.
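That voltage bump alone explains most of the gap, since dynamic power scales roughly with the square of voltage at a given clock. A quick sanity check (a toy sketch; it assumes comparable clocks and ignores static leakage):

```c
#include <stdio.h>

/* Dynamic power scales roughly as P ~ C * V^2 * f, so at comparable
 * clocks the FurMark voltage bump alone predicts the jump in draw. */
int main(void) {
    const double v_game    = 0.9375;   /* volts, typical in-game */
    const double v_furmark = 1.0875;   /* volts, observed under FurMark */
    double scale = (v_furmark * v_furmark) / (v_game * v_game);

    printf("V^2 scaling factor: %.2fx\n", scale);                    /* ~1.35x */
    printf("35 W in-game -> ~%.0f W under FurMark\n", 35.0 * scale); /* ~47 W  */
    return 0;
}
```

So a ~35 W in-game load lands right around the 45+ watts I see under FurMark, from the voltage increase alone.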
The benchmarks from a few months ago were run on old drivers, which Intel has since updated. The Iris Pro now beats the GT 650M in most benchmarks while having HALF the TDP.
The GT 650M alone is a 45W TDP part. The Iris Pro offers substantial power benefits.
Even with accelerated browser functions (which will hit the dGPU), an Iris Pro machine would completely destroy any dGPU setup on battery life during light usage like browsing or media consumption. Again, it's no surprise that Apple ditched discrete altogether. This will only get worse in the future - AMD and Nvidia will get shut out of high-end ultrabooks (which are the future of form factors) and will be relegated to full-size, 17-pound laptops. Impressive feat by Intel, IMHO.
You are completely wrong. The dGPU rarely comes into play under light loads. If it does and you want to stop it, there's one really easy fix: open the Nvidia Control Panel, set the global preference to the integrated GPU, then go to the per-application profiles and set your game/program .exe files to 'high performance' (see the sketch below). Takes a minute (unless you have a massive Steam library) and lasts forever.
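If you'd rather script it than click through the Control Panel, the same knobs are exposed through Nvidia's NVAPI driver-settings (DRS) interface. A rough C sketch, assuming the NVAPI SDK headers are available; the SHIM_* setting IDs come from NvApiDriverSettings.h, "game.exe" is a placeholder, and error handling is trimmed for brevity:

```c
#include <wchar.h>
#include "nvapi.h"
#include "NvApiDriverSettings.h"

int main(void) {
    NvDRSSessionHandle session;
    NvDRSProfileHandle profile;

    NvAPI_Initialize();
    NvAPI_DRS_CreateSession(&session);
    NvAPI_DRS_LoadSettings(session);

    /* 1. Global preference -> integrated GPU (lives in the base profile). */
    NvAPI_DRS_GetBaseProfile(session, &profile);
    NVDRS_SETTING mode = {0};
    mode.version         = NVDRS_SETTING_VER;
    mode.settingId       = SHIM_RENDERING_MODE_ID;
    mode.settingType     = NVDRS_DWORD_TYPE;
    mode.u32CurrentValue = SHIM_RENDERING_MODE_INTEGRATED;
    NvAPI_DRS_SetSetting(session, profile, &mode);

    /* 2. Per-application profile -> high performance (dGPU). */
    NVDRS_PROFILE info = {0};
    info.version = NVDRS_PROFILE_VER;
    wcscpy((wchar_t *)info.profileName, L"My Game");
    NvAPI_DRS_CreateProfile(session, &info, &profile);

    NVDRS_APPLICATION app = {0};
    app.version = NVDRS_APPLICATION_VER;
    wcscpy((wchar_t *)app.appName, L"game.exe");
    NvAPI_DRS_CreateApplication(session, profile, &app);

    mode.u32CurrentValue = SHIM_RENDERING_MODE_ENABLE;  /* force the dGPU */
    NvAPI_DRS_SetSetting(session, profile, &mode);

    NvAPI_DRS_SaveSettings(session);
    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
```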
No, that's not true. The dGPU is used during light usage for media consumption and accelerated browser functions; it just won't be at 100% GPU load, obviously.
The dGPU isn't "inactive" 100% of the time outside of gaming. I don't know where you got this idea. If that were true, dGPUs would be useless, since most ultrabooks and MacBooks are NOT designed for gaming. The fact of the matter is, dGPUs are used at low clock speeds for accelerated functions even in browsers. Chrome and Firefox use your dGPU, even with Optimus, during light usage. It won't match gaming GPU loads, but nonetheless - even these types of applications rely on the dGPU for accelerated functions.
The notion that the dGPU is completely turned off 100% of the time outside of gaming is the silliest thing I have ever heard. I hope you understand that MacBooks aren't used for gaming. The fact of the matter is that Iris Pro matches the GT 650M while having half the TDP. Same performance, 47W TDP versus 92W TDP (47W quad-core + 45W GT 650M). I hope you see the implications of this.
Optimus does not generally select the dGPU for basic browser acceleration. Apple does do that for Chrome, but that's a driver problem Apple needs to fix. And yes, the dGPU is turned off 100% of the time when not gaming under Optimus, unless certain conditions are met (though certain traces on the mobo remain active, which consumes minuscule amounts of power).
http://www.nvidia.ca/object/optimus_technology.html
As per the Optimus whitepaper, there are three primary classes of calls that trigger the dGPU (a minimal example follows the list):
DX calls (3D game applications or DirectX programs)
DXVA calls (video playback)
CUDA calls (GPU compute)
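For illustration, here's the third trigger in its most minimal form: a hypothetical C program using the CUDA runtime API. Merely creating a CUDA context wakes the dGPU, and it stays powered until the context is torn down:

```c
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    /* Any runtime call that creates a CUDA context wakes the dGPU
     * under Optimus, even with no kernel launched. */
    void *buf = NULL;
    cudaError_t err = cudaMalloc(&buf, 1 << 20);  /* 1 MiB on the dGPU */
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaMalloc failed: %s\n", cudaGetErrorString(err));
        return 1;
    }

    /* ... the dGPU stays powered while this context is alive ... */

    cudaFree(buf);
    cudaDeviceReset();  /* destroy the context so Optimus can power-gate the dGPU */
    return 0;
}
```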
Optimus is also quite intelligent. Play an older game and it may stay on the iGPU. Video has to be high enough quality to trigger dGPU usage. Set the profile to the high-performance GPU and run Office, and it will still run on the iGPU. AT's heavy battery-life tests include 1080p H.264 12 Mbit/sec video playback that doesn't trigger the dGPU.