1. As Phy said, there is no need to pair an AMD GPU with an AMD CPU.
2. After accounting for RAM differences, the GTX 980 is almost as efficient as the 750 Ti. But NV charges a HUGE price premium for that efficiency, way more than Intel charges, and unlike with CPUs, the idle wattage differences between GPUs are small, so it would take far longer to recoup the premium via lower power bills. But I was not talking about GPUs, I was talking about CPUs.
3. Greek electric rates are around 20-21 cents/kWh and rising, which is nearly double the US average price:
http://greece.greekreporter.com/2014/10/31/greece-champion-of-electricity-hikes/
Even a 10 watt difference at the wall, if left on all day, costs about 5 cents per day in Greece. That's roughly $20 per year, from just a 10 watt difference (rough math in the sketch below). Computers are not always on, sure, but they aren't always idle, either, and when under load Intel CPUs are WAY more than 10 watts more efficient for equivalent gaming performance.
Since Intel CPUs don't cost THAT much more than AMD CPUs, it's possible to recoup the Intel price premium via lower power bills. In fact an Intel build might even end up cheaper overall than AMD if your electricity price or usage is high enough.
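If anyone wants to sanity-check that, here's a quick back-of-the-envelope sketch. The 10 watt delta and the ~21 cent/kWh Greek rate are the numbers from above; the $60 price premium, the 60 W load-time difference, and the hours-per-day figures are made-up example values, not real part prices or measurements, so plug in your own.

```python
# Back-of-the-envelope check of the electricity math above.
# Known from the post: ~$0.21/kWh Greek rate, 10 W idle-type difference.
# Made-up example values: $60 price premium, 60 W load difference, hours of use.

PRICE_PER_KWH = 0.21  # rough Greek rate quoted above

def yearly_cost(delta_watts, hours_per_day):
    """Yearly electricity cost of a constant wattage difference."""
    return delta_watts / 1000 * hours_per_day * 365 * PRICE_PER_KWH

# A 10 W difference left on 24 h/day: about 5 cents/day, ~$18/year.
print(f"10 W always on:  {yearly_cost(10, 24):6.2f} per year")

# Hypothetical mixed use: 4 h/day under load at a 60 W difference,
# plus 8 h/day idle at a 10 W difference (example numbers only).
savings = yearly_cost(60, 4) + yearly_cost(10, 8)
print(f"Mixed-use delta: {savings:6.2f} per year")

# Break-even on a hypothetical $60 Intel price premium.
premium = 60.0
print(f"Payback:         {premium / savings:6.1f} years")
```

With those example numbers the premium pays for itself in about two and a half years, which is well within the useful life of a CPU; a higher electricity rate or heavier use shortens that further.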