First off, that paper is rubbish. Whoever wrote it either wanted a specific outcome or was very ignorant.
The hardware they're comparing is a high-performance 2013 system set next to a very unbalanced 2014 system. The 2013 system is something of a worst-case scenario: the CPU (i7-4820K) and GPU (GTX 780) are lower-tier binned parts that typically have worse power efficiency than their more expensive counterparts (e.g. the 4960X and GTX 780 Ti), since they left the factory with defective units disabled and/or worse power characteristics than a fully yielding chip.
Even the RAM is a poor choice, being a 1.65V kit rather than the standard 1.5V for DDR3. To top it off, they used a 550W PSU, which is undersized for a system with that kind of power consumption. That forces it to run close to its limits, and PSU efficiency drops off past roughly 85% load.
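To put rough numbers on that (all figures here are illustrative assumptions, not readings from the paper): a PSU's wattage rating is DC output, the wall draw is output divided by efficiency, and efficiency typically sags once load climbs past ~85% of the rating. A quick sketch:

```python
# Back-of-envelope PSU sizing check. The DC load and efficiency figures
# below are assumptions for illustration, not measurements from the paper.

def wall_draw(dc_load_w, efficiency):
    """AC power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

psu_rating = 550   # the 550W unit in the 2013 build
dc_load = 480      # assumed peak DC draw for a 4820K + GTX 780 system

load_fraction = dc_load / psu_rating
print(f"load: {load_fraction:.0%} of rating")   # load: 87% -> past the efficiency knee

# Illustrative efficiency figures for a mid-range unit:
print(f"at 87% eff: {wall_draw(dc_load, 0.87):.0f}W from the wall")  # ~552W
print(f"at 80% eff: {wall_draw(dc_load, 0.80):.0f}W from the wall")  # ~600W
```

Same DC load, but running the unit near its limit can cost tens of watts extra at the wall, which is exactly where these comparisons get measured.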
The 2014 system is an odd collection of parts that seems picked specifically to minimize power consumption under very narrow circumstances. It combines a GTX 970 (a genuinely efficient card) with a low-end Pentium G3258, and then, oddly, an oversized 760W PSU that a build this modest will never come close to loading.
The problem with this whole test is that they're clearly using an extremely GPU-limited test metric, which is why performance doesn't drop despite the major CPU downgrade. A GTX 970 paired with that Pentium is going to be CPU-limited in many games, which is why "balance" is a concern when building such a system.
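A crude way to see why a GPU-bound benchmark hides a CPU downgrade: per frame, whichever of the CPU and GPU stages is slower sets the frame rate. This little model and its timings are my own sketch, not anything from the paper:

```python
# Simple bottleneck model: frame rate is set by the slower pipeline stage.
# All per-frame timings below are made up for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound benchmark: the GPU stage dominates, so the CPU swap is invisible.
print(fps(cpu_ms=6.0, gpu_ms=20.0))   # 50 fps with the i7
print(fps(cpu_ms=12.0, gpu_ms=20.0))  # still 50 fps with the Pentium

# CPU-bound game: the same downgrade roughly halves the frame rate.
print(fps(cpu_ms=6.0, gpu_ms=5.0))    # ~167 fps
print(fps(cpu_ms=12.0, gpu_ms=5.0))   # ~83 fps
```

Pick only workloads from the first pair and the Pentium looks like a free upgrade; pick from the second and the whole comparison falls apart.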
But perhaps the most screwball part is the monitor choice. They used a 2004-era 23" Apple Cinema HD Display for the 2013 system, a CCFL-backlit IPS panel, while the 2014 system swaps it out for a 24" LED-backlit TN monitor. Even setting the age difference aside for a moment (backlight technology makes a real difference here), users rarely swap between IPS and TN, given the distinct tradeoffs between response time and color quality.
I don't want to outright accuse the author, but the only way these tests and configurations make much sense is if the systems were built to produce a specific power outcome, with the benchmark chosen to be GPU-bound so the downgrade of every other component stays hidden.
It's the monitor. The thing has a max power rating of 90W, so it's nearly half the load (and I don't doubt for a second that these monitors are at max brightness).
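Back-of-envelope, with the system-side figure assumed for illustration since I don't have the paper's exact readings:

```python
# Rough share of total wall draw attributable to the monitor.
# The system figure is an assumption, not a number from the paper.
monitor_w = 90    # max power rating of the 23" Cinema Display
system_w = 110    # assumed wall draw of the rest of the 2013 system
print(f"monitor share: {monitor_w / (monitor_w + system_w):.0%}")  # 45%, "nearly half"
```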
Desktop dGPUs already idle at sub-10W. I don't want to say that power is "free," but when you're that low on the power curve, I'm not sure the savings from switching over to the iGPU would be felt anywhere except through the PSU's efficiency losses.
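To put a rough number on it, even letting low-load PSU inefficiency amplify the saving (the efficiency figure here is assumed):

```python
# Wall-side saving from shutting off an idling dGPU.
# Low-load PSU efficiency is assumed; it's typically worse than peak.
dgpu_idle_w = 10        # approximate idle draw of a modern dGPU
low_load_eff = 0.78     # assumed PSU efficiency at light load
print(f"wall-side saving: {dgpu_idle_w / low_load_eff:.1f}W")  # ~12.8W at most
```

Call it a dozen watts at the wall, best case. Against a 90W monitor, that's noise.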