Power efficiency of gaming PCs is bad


alecmg

Junior Member
Aug 13, 2015
11
0
0
I want to draw an analogy.

Rockets are awfully inefficient at flying people somewhere compared to jetliners, yet you need a rocket to get to orbit, and no jet can even come close.

A gaming rig is built for max fps. Yeah, you could get 20 fps with fairly low power draw, maybe from an integrated GPU in a NUC, but 20 fps doesn't get you there. You need 60-120 to enjoy a game, at high quality if possible, even if that means 10-20 times more power for 3 times more fps (see the quick fps-per-watt sketch after the parts list below).

If I wanted an energy-efficient gaming rig today, I would build something like:
Broadwell i7-5775c
GTX 980 (not Ti)
Big SSD
mATX motherboard
400W PSU
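
To put rough numbers on that tradeoff, here's a minimal Python sketch; the wattages and frame rates are illustrative assumptions in the spirit of the point above, not measurements of any real system:

```python
# Rough fps-per-watt comparison; all numbers are illustrative
# assumptions, not measurements of real hardware.

def fps_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of system power."""
    return fps / watts

nuc = fps_per_watt(fps=20, watts=30)    # hypothetical low-power NUC
rig = fps_per_watt(fps=60, watts=350)   # hypothetical gaming rig

print(f"NUC: {nuc:.2f} fps/W, rig: {rig:.2f} fps/W")
# The NUC is roughly 4x more efficient per frame, but only the rig
# clears the 60 fps playability bar -- the rocket vs. jetliner point.
```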
 
Last edited:

JustMe21

Senior member
Sep 8, 2011
324
49
91
You build your gaming rig for what you play. I used to go for high-end video cards, but I realized I wasn't playing graphically intensive games, or that all the extra eye candy didn't warrant a faster video card to see it. I play MechWarrior Online with an i7-3770, a Seasonic X-750 PSU, and a 650 Ti, and I pull 116W max according to my Kill-A-Watt meter. Non-3D usage is about 60W. Also, doesn't having the fastest video card for a multiplayer FPS seem unnecessary, since people are going to dial down all the visual settings to get the best FPS possible?
 

john5220

Senior member
Mar 27, 2014
551
0
0
This is a serious concern I have: why isn't the integrated GPU in an Intel CPU used when the PC is idle?

It's like you're completely wasting money on the iGPU part of an Intel chip, since the majority of gamers never even use it.

Why doesn't Intel just sell a cheaper i3 or i5 without the GPU, like AMD does?
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,414
401
126
When not running a game, most modern video cards only add 10-15W to the total system power consumption. Most modern components don't draw much power at idle. I imagine the big offenders at low load are probably desktop motherboards and power supplies. Moving from a full ATX board to ITX and from a 650W PSU to a 380W cut my idle power consumption in half.
Nope. Why on earth would downsizing the PSU change how much is drawn at the wall? (assuming identical efficiency at a given load)

If you have a big honking 2kW PSU that's more efficient at a given load than a 500W PSU, it's going to draw less from the wall.
 

zir_blazer

Golden Member
Jun 6, 2013
1,184
459
136
Why doesn't Intel just sell a cheaper i3 or i5 without the GPU, like AMD does?
They did have such a part: the Core i5-2550K, Sandy Bridge. Do you remember it? Obviously no one does, since everyone purchased the 2500K. And today, those who think that way would purchase a Xeon E3-1231 v3 instead. But I don't like it, because the IGP represents about half of the physical die and Intel values it at around 20-30 USD. 15% of the processor price for being able to use 50% of the die sounds like a good deal.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,881
3,230
126
I have a 370W+ idle.... :X
I probably have an 800W full load.

Gaming systems at the high end can get silly in power draw....

The worst setup I think I had was when I was running an old quadfire setup.
I had to use dual PSUs for that, which meant ending up running 1kW + 750W.
And yes, I kept tripping the 1kW under full load (both CPU + GPU), hence the second PSU.

Nope. Why on earth would downsizing the PSU change how much is drawn at the wall? (assuming identical efficiency at a given load)
If you have a big honking 2kW PSU that's more efficient at a given load than a 500W PSU, it's going to draw less from the wall.

You just answered your own question... lol:
(assuming identical efficiency at a given load)

Typically the ITX PSUs which use power bricks are way more efficient than a standard PSU at sub-60W, where most "green systems" tend to float.
Roughly anywhere from ~30-50% more efficient at wattages below 60W.

This is a serious concern I have: why isn't the integrated GPU in an Intel CPU used when the PC is idle?

It's like you're completely wasting money on the iGPU part of an Intel chip, since the majority of gamers never even use it.

Why doesn't Intel just sell a cheaper i3 or i5 without the GPU, like AMD does?

Because Intel can sell them to enthusiasts at a premium and call them -E instead, like the 5930K.

If I wanted an energy-efficient gaming rig today, I would build something like:
Broadwell i7-5775c
GTX 980 (not Ti)
Big SSD
mATX motherboard
400W PSU

A GTX 970 Mini might be a better choice, as it opens you up to a TON of tiny cases it can fit in, with 75-80% of the GPU prowess of that giant 980 at half the footprint.
 
Last edited:

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
This is a serious concern I have: why isn't the integrated GPU in an Intel CPU used when the PC is idle?

It's like you're completely wasting money on the iGPU part of an Intel chip, since the majority of gamers never even use it.

Why doesn't Intel just sell a cheaper i3 or i5 without the GPU, like AMD does?

There are some Haswell Xeons floating around that have the iGPU disabled but come with Hyper-Threading, priced at $250. Kinda wish I'd grabbed one, actually, but my i5 is plenty, and the iGPU actually gets used when my dGPU is hammering away at a render and I still need to use my system.
 

know of fence

Senior member
May 28, 2009
555
2
71
Nope. Why on earth would downsizing the PSU change how much is drawn at the wall? (assuming identical efficiency at a given load)

If you have a big honking 2kW PSU that's more efficient at a given load than a 500W PSU, it's going to draw less from the wall.

You can't assume identical efficiency; rather, PSU efficiency certification (80 PLUS) is relative to max wattage. An 80 PLUS Gold rated power supply has to hit 90% efficiency at 20% load (assuming 230 VAC EU non-redundant, because those are round numbers). A 1000W Gold PSU will likely have much worse efficiency at 100W (10% load) than a 500W unit, which, as guaranteed by the certification, will be at least 90% efficient at the same 100W (20% load) output.
A 1000W PSU that could match the 500W PSU's 90% efficiency at 100W would also qualify for Platinum certification and would probably be sold as such.

This is also the reason you can't find Platinum-rated 300W or 400W PSUs: it's hard to get things working above 90% efficiency while providing just 30 to 40W.

Also, it's probably more important to match your components' combined TDP to half of your PSU wattage than to spend big on the certification label. Ideally you do both, of course.

Another thing I noticed: Platinum power supplies are still pretty efficient at low loads, but their power factor drops significantly according to my watt-meter, which is a different sort of tradeoff; a kind of trickery, actually.
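
To make the load-percentage point concrete, here's a minimal Python sketch. Only the 90%-at-20%-load Gold floor comes from the certification; the 10%-load efficiency and the power-factor numbers are illustrative assumptions:

```python
# Wall draw for the same 100W DC load on two differently sized PSUs.
# 0.90 is the 80 PLUS Gold floor at 20% load (230 VAC); 0.80 at 10%
# load is an assumed value, since the certification says nothing
# below 20% load.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

print(f"500W Gold at 100W load:  {wall_draw(100, 0.90):.0f}W at the wall")
print(f"1000W Gold at 100W load: {wall_draw(100, 0.80):.0f}W at the wall")

# Power factor is a separate number: a meter reading 110W real power
# at PF 0.7 implies about 157 VA of apparent power (VA = W / PF).
print(f"Apparent power: {110 / 0.7:.0f} VA")
```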
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,425
8,388
126
Nope. Why on earth would downsizing the PSU change how much is drawn at the wall? (assuming identical efficiency at a given load)

If you have a big honking 2kW PSU that's more efficient at a given load than a 500W PSU, it's going to draw less from the wall.

Because your assumption is wrong. Efficiencies might be the same or even better at a given percentage load for the more powerful supply, but at the same absolute load the lower-wattage unit could still be more efficient. The efficiency curve for most supplies starts dropping like a rock once you get below ~15% load.
 
Last edited:

tortillasoup

Golden Member
Jan 12, 2011
1,977
3
81
You can't assume identical efficiency; rather, PSU efficiency certification (80 PLUS) is relative to max wattage. An 80 PLUS Gold rated power supply has to hit 90% efficiency at 20% load (assuming 230 VAC EU non-redundant, because those are round numbers). A 1000W Gold PSU will likely have much worse efficiency at 100W (10% load) than a 500W unit, which, as guaranteed by the certification, will be at least 90% efficient at the same 100W (20% load) output.
A 1000W PSU that could match the 500W PSU's 90% efficiency at 100W would also qualify for Platinum certification and would probably be sold as such.

This is also the reason you can't find Platinum-rated 300W or 400W PSUs: it's hard to get things working above 90% efficiency while providing just 30 to 40W.

Also, it's probably more important to match your components' combined TDP to half of your PSU wattage than to spend big on the certification label. Ideally you do both, of course.

Another thing I noticed: Platinum power supplies are still pretty efficient at low loads, but their power factor drops significantly according to my watt-meter, which is a different sort of tradeoff; a kind of trickery, actually.

Spot on.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
Everyone should upgrade to Intel 14nm CPUs (except IVB users, since they're already on FinFETs and the gains are smaller), because FinFET transistors are much more power efficient than planar transistors. But most people are more concerned about their 5GHz overclocks instead, which waste more power.
 

lyssword

Diamond Member
Dec 15, 2005
5,761
25
91
Today's gaming systems have vastly improved idle wattage, and are about equal under load vs. 10 years ago.
 

MagickMan

Diamond Member
Aug 11, 2008
7,537
3
76
"You don't need that power-hungry gaming PC w/ a 6-core CPU, SLI, and 1200W PSU. It's irresponsible."

Go to hell, it's none of your business.
 
May 11, 2008
20,055
1,290
126
Nowadays a modern PC, in combination with an efficient PSU (such as mine), draws between 35 and 40 watts in normal use, like browsing and playing some music in the background. When the Fury Nano comes out (and similar cards from the competition), a lot of gaming horsepower will be available in combination with low idle power consumption. Wait another 6 months, and then you can assemble a very power-efficient PC.

Then again, if you only need to browse the internet, a tablet does wonders...
When I am just browsing or reading, I use my tablet. The PC remains off.

EDIT: For my PC specs:

http://forums.anandtech.com/showpost.php?p=37514293&postcount=38
 
Last edited:

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
Then again, if you only need to browse the internet, a tablet does wonders...
When I am just browsing or reading, I use my tablet. The PC remains off.
I like to surf on my tablets too, but the virtual keyboard is a PITA sometimes (the iPad keyboard occasionally disappears while typing). Wish I had gotten one of those Asus T100s with the keyboard dock or something; gonna get my wife one for Christmas, I think.
 
May 11, 2008
20,055
1,290
126
I like to surf on my tablets too, but the virtual keyboard is a PITA sometimes (the iPad keyboard occasionally disappears while typing). Wish I had gotten one of those Asus T100s with the keyboard dock or something; gonna get my wife one for Christmas, I think.

I agree, the virtual keyboard does indeed build a high level of patience and self-control.
But maybe try a small Bluetooth keyboard, or one of those nice HP laptop-like tablets with a keyboard on it: it looks like a laptop, but it is a tablet with a detachable keyboard.

 
May 11, 2008
20,055
1,290
126
I should mention that HP also has laptop-like tablet models lower in price, very comparable to modern normal laptops, with a good CPU and GPU. Models around 400 to 500 euro exist. I expect that prices in North America are even lower.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,227
153
106
Just think of how much MORE power we'd be using if we all still used our old CRT monitors!

That said, I love that much of our new hardware is more efficient than ever and is using less power while still offering more and more performance.

If you can accept mid-grade performance, there's some great gaming to be had at only ~100 watts, like the Alienware Alpha and its 860M video, or only a 'handful' more watts with a desktop GTX 750 Ti or even a 960. Even the 970 is efficient for its high performance.

AMD has been lagging in this area but is making a few comebacks lately as well.
 

Ranulf

Platinum Member
Jul 18, 2001
2,409
1,309
136
Bah, the conclusion says it all. It's ultimately a BS paper to push power regulations for computer parts.

From the conclusion (dated December last year, per the title page):
"The mainstream gaming computer industry does not emphasize energy use or efficiency..."

Apparently the author missed Nvidia's 970/980 release bonanza of info and bragging about power efficiency last September.

My Wii and cable TV box cost me $20+ a year sitting idle most of the time. You save power by unplugging the things you don't really need or want to have instant-on capability. Even better, swap out your lights for LEDs.
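
That $20+ figure checks out with simple arithmetic; here's a quick sketch, assuming ~20W of combined idle draw and a $0.12/kWh rate (both assumptions, your numbers will differ):

```python
# Annual cost of leaving a device idling 24/7. The wattage and
# electricity rate are assumptions for illustration.

def yearly_cost(idle_watts: float, rate_per_kwh: float = 0.12) -> float:
    """Dollars per year to keep idle_watts drawn around the clock."""
    kwh_per_year = idle_watts * 24 * 365 / 1000
    return kwh_per_year * rate_per_kwh

# A Wii plus a cable box idling at a combined ~20W:
print(f"${yearly_cost(20):.2f} per year")   # ~$21 at $0.12/kWh
```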
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
First off, that paper is rubbish. Whoever wrote it either wanted a specific outcome or was very ignorant.

The hardware they are comparing is a high-performance 2013 system set next to a very unbalanced 2014 system. The 2013 system is something of a worst-case scenario: the CPU (i7-4820K) and GPU (GTX 780) are lower-tier binned products that typically have worse power efficiency than their more expensive counterparts (e.g. the 4960X and GTX 780 Ti), since they came out of the factory with defective units and/or worse power characteristics than a fully yielding chip.

Even the RAM is a poor choice, being a 1.65V kit rather than the standard 1.5V for DDR3. To top it off, they used a 550W PSU, which for a system with that kind of power consumption is undersized. This pushes it closer to its limits, and PSU efficiency drops off above roughly 85% load.

The 2014 system is an odd collection of parts that seems to have been picked specifically to minimize power consumption under very narrow circumstances: it combines a high-performance GTX 970 (a good card for efficiency) with a low-end Pentium G3258, and then goes with an even larger 760W PSU.

The problem with this whole test is that they're clearly using an extremely GPU-limited test metric, which is why performance doesn't drop despite the major downgrade in CPUs. A GTX 970 is going to be CPU-limited in many games when paired with that CPU, which is why "balance" is a concern when building such a system.

But perhaps the most screwball part is the monitor choice. They used an old 2004 Apple HD Cinema Display 23 for the 2013 system, a 23" CCFL-backlit IPS monitor, while the 2014 system swaps that out for a 24" LED-backlit TN monitor. Even ignoring the age difference for a moment (backlighting tech makes a difference here), you generally don't see users swap between IPS and TN, due to the distinct tradeoffs between response time and color quality.

I don't want to accuse the author, but the only way these tests and configurations make much sense is if the systems were built to get a specific power outcome, while focusing exclusively on GPU performance to hide the downgrade of the other components.

The measured power consumption in the paper looks off. On pp. 13-14, the peak power consumption was 512W in gaming mode. The graph on the following pages shows a little above 200W in web browsing/video streaming mode. Even for a 2011 platform, that seems way too high to be correct.
It's the monitor. The thing has a max power rating of 90W, so it's nearly half the load (and I don't doubt for a second that those monitors were at max brightness).
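
A quick sanity check of that claim against the paper's own numbers (treating the monitor's 90W rated maximum as its actual draw is an assumption):

```python
# Monitor's share of the reported ~200W browsing/streaming figure,
# assuming the 90W rated maximum is what it actually pulls.
total_wall_w = 200   # wall draw reported in the paper
monitor_w = 90       # monitor's max power rating

print(f"Monitor share: {monitor_w / total_wall_w:.0%}")   # 45% -- nearly half
```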

The best solution would be Optimus on the desktop, for the dGPU problem.
dGPUs are already sub-10W at idle. I don't want to say that power is "free," but when you're that low on the efficiency curve, I'm not sure the savings from firing up the iGPU would be felt beyond the PSU.
 
Last edited:

MongGrel

Lifer
Dec 3, 2013
38,751
3,068
121
First off, that paper is rubbish. Whoever wrote it either wanted a specific outcome or was very ignorant.

The hardware they are comparing is a high-performance 2013 system set next to a very unbalanced 2014 system. The 2013 system is something of a worst-case scenario: the CPU (i7-4820K) and GPU (GTX 780) are lower-tier binned products that typically have worse power efficiency than their more expensive counterparts (e.g. the 4960X and GTX 780 Ti), since they came out of the factory with defective units and/or worse power characteristics than a fully yielding chip.

Even the RAM is a poor choice, being a 1.65V kit rather than the standard 1.5V for DDR3. To top it off, they used a 550W PSU, which for a system with that kind of power consumption is undersized. This pushes it closer to its limits, and PSU efficiency drops off above roughly 85% load.

The 2014 system is an odd collection of parts that seems to have been picked specifically to minimize power consumption under very narrow circumstances: it combines a high-performance GTX 970 (a good card for efficiency) with a low-end Pentium G3258, and then goes with an even larger 760W PSU.

The problem with this whole test is that they're clearly using an extremely GPU-limited test metric, which is why performance doesn't drop despite the major downgrade in CPUs. A GTX 970 is going to be CPU-limited in many games when paired with that CPU, which is why "balance" is a concern when building such a system.

But perhaps the most screwball part is the monitor choice. They used an old 2004 Apple HD Cinema Display 23 for the 2013 system, a 23" CCFL-backlit IPS monitor, while the 2014 system swaps that out for a 24" LED-backlit TN monitor. Even ignoring the age difference for a moment (backlighting tech makes a difference here), you generally don't see users swap between IPS and TN, due to the distinct tradeoffs between response time and color quality.

I don't want to accuse the author, but the only way these tests and configurations make much sense is if the systems were built to get a specific power outcome, while focusing exclusively on GPU performance to hide the downgrade of the other components.

It's the monitor. The thing has a max power rating of 90W, so it's nearly half the load (and I don't doubt for a second that those monitors were at max brightness).

dGPUs are already sub-10W at idle. I don't want to say that power is "free," but when you're that low on the efficiency curve, I'm not sure the savings from firing up the iGPU would be felt beyond the PSU.

This, all over the place.

But that pretty much goes without saying.

:thumbsup:
 
Last edited:

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
I think power usage went up right around the Athlon 64/Phenom and Pentium 4 days. Before that, Pentium IIIs were 15W; then Pentium 4s were 84W. But back then we used CRTs, which were close to 75-100W, whereas now we have LCD/LEDs that do 30W at best. Video cards varied too: the HD 5770 I still have is about the same as my GTX 750 Ti, both around 65W, while my 10-year-old Nvidia FX 5700 was 130W (IIRC).
 

Pwndenburg

Member
Mar 2, 2012
172
0
76
In honor of the author, I'm going to run some kind of stress test at stock to burn power for 3 days. No justification, except that I'm paying for it, and "bite me." I'm sorry that I have the normal i7-4770; I just can't kick the power use into overdrive. After reading his paper in full, at least I can say I'm more honest than he is. I do have a 980; guess I'll kick it to max power draw.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
dGPUs are already sub-10W at idle. I don't want to say that power is "free," but when you're that low on the efficiency curve, I'm not sure the savings from firing up the iGPU would be felt beyond the PSU.
It's another 10-15W to be saved at the wall. If comparing against non-dGPU systems, that would do the trick.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I think power usage went up right around the Athlon 64/Phenom and Pentium 4 days. Before that, Pentium IIIs were 15W; then Pentium 4s were 84W. But back then we used CRTs, which were close to 75-100W, whereas now we have LCD/LEDs that do 30W at best. Video cards varied too: the HD 5770 I still have is about the same as my GTX 750 Ti, both around 65W, while my 10-year-old Nvidia FX 5700 was 130W (IIRC).

Even a complete PC with a 4-year-old 2600K and GTX 980 idles below 50W.

http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/5

The greatest stealth improvement in PCs over the last decade is the introduction of low-power idle modes; the next is doing the same thing while overclocked, possible since Sandy Bridge... idling at full clocks and volts is simply a massive waste of energy.
 
Last edited: