End Power Supply Myths!


jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: mpilchfamily
Well what about overall efficiency? Isn't the lower wattage unit going to offer better efficiency than the 1200W unit? Not because one has a better efficiency rating than the other but because of the overall loads put on each unit.

Wha?

The lower wattage unit IS NOT going to offer better efficiency.

1st off, and I know you already know this, a computer is only going to use however much power it needs.

Just want to get that statement out of the way.

Whether or not one power supply is more efficient at lower loads than another completely depends on the individual power supply. Not how much power they put out.

Yes, a power supply typically has a sort of "bell curve" to its efficiency. It starts at a relatively low number, peaks in the middle and then drops at the end... typically. So typically that would mean that a PC that consumes less power (say 20% of the PSU's capability) would be less efficient than a PC that uses more power (say 50% of the PSU's capability) on the same power supply...

BUT....

I've seen 1200W power supplies that are still 80% efficient or better at loads as low as 150W and I've seen other, lower wattage power supplies, touted as "super efficient", just hitting 80% at 150W and certainly never getting up around 85 or 87% efficient at ANY load.

So... you CANNOT make blanket statements like "this lower wattage power supply is more efficient than a bigger power supply" simply because EVERY power supply is COMPLETELY different from the next.
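To put toy numbers on that (every efficiency figure below is invented purely for illustration, not a measurement of any real unit), here is a minimal Python sketch of two hypothetical units at the same load:

```python
# Hypothetical efficiency curves: fraction efficient at a given DC load in watts.
# All numbers are made up to illustrate the point above.
curves = {
    "1200W unit A": {150: 0.80, 300: 0.84, 600: 0.87},
    "600W unit B":  {150: 0.78, 300: 0.80, 600: 0.76},
}

load = 300  # watts the PC actually draws, regardless of the PSU's rating

for name, curve in curves.items():
    eff = curve[load]
    wall = load / eff  # AC watts pulled from the outlet
    print(f"{name}: {eff:.0%} efficient at {load}W -> {wall:.0f}W at the wall")
```

In this made-up pair the bigger unit "wins" even though the PC never comes close to its rating; swap in a different pair of curves and the result flips, which is exactly why the blanket statement fails.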

Even in the case of the Thermaltake Toughpowers. I don't NEED a 1kW power supply. But if I had to choose between a Toughpower 750W and a Toughpower 1200W, I would still choose the 1200W (if the monetary budget allowed it) because the 1200W has better voltage regulation and the efficiency is better at THE SAME LOADS than the 750W.

Originally posted by: Phlargo
So to follow up, Jonny - how much power do we actually need? We have some power supply calculators giving a rough estimate (even if purportedly too high) but, as you pointed out earlier, even if those were correct they assume a single point in time. Do we have histogram information on power supplies expressing their efficiency and output over time?

It's hard to say. Calculators tend to calculate all hard drives spinning, all opticals spinning, CPU and GPU all under load at the same time... How often does that really happen? I think the best thing to do is to buy a Kill-A-Watt, or at least a clamping ammeter, and see what your PC is currently using.
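A sketch of why the calculators overshoot (the component wattages below are rough assumed figures, not taken from any actual calculator):

```python
# Calculator-style estimate: assume every component is at max draw at once.
# These wattages are rough assumptions for illustration only.
max_draw = {
    "CPU": 95,
    "GPU": 145,
    "2x HDD": 2 * 25,
    "optical": 25,
    "board, RAM, fans": 60,
}

print(f"Calculator-style worst case: {sum(max_draw.values())}W")
# A Kill-A-Watt at the wall shows what the PC really pulls; drives,
# opticals, CPU and GPU are almost never all maxed at the same instant.
```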

If you're building a new PC from the ground up, I'm going to say just buy the best power supply you can buy that fits your needs. I just can't comprehend the mentality of "what's the least I can get by with" when it comes to power supplies. Why am I going to spend $300 each on two video cards, but try to keep my power supply cost under $100? I mean, there's some darn good 400W power supplies out there, but if I've only got $60 to spend on a PSU and expect the PSU to last two years, I would also have to consider... do I really need SLI? Would I be Ok with the E6600 over the E6700?

Originally posted by: Phlargo
What is the average "rate of decay" for power supplies? If I need 400 Watts for my system and I buy a power supply now that I want to last for 2 years, what kind of overshoot is necessary to account for that decay?

Depends. Different PSU's have different derating curves. And everyone's PC has different loads. Different operating temperatures. Different configurations that may require the PSU to suck up more heat than someone else's. There is no set answer to that question. It's not like when they tell you a particular car has a life expectancy of 300,000 miles, because that car is an assembled product, manufactured to operate within certain parameters. Take that car and put it in a demolition derby. Your 300,000 miles just turned into 30 minutes tops. Let's say you take the engine out of that car and drop it into a race car, guess what? You probably cut that engine's life expectancy to 1/20th.
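For what a derating curve means in practice, here is a minimal sketch with a hypothetical linear slope (the 0.8% per degree figure is an assumption for illustration; real curves vary by unit and are published, if at all, by the manufacturer):

```python
# Hypothetical derating: full rated output up to 25C, then capacity
# falls off linearly. The 0.8%/C slope is an assumption for illustration.
def derated_capacity(rated_watts: float, temp_c: float) -> float:
    if temp_c <= 25:
        return rated_watts
    return rated_watts * max(0.0, 1 - 0.008 * (temp_c - 25))

for t in (25, 40, 50):
    print(f"{t}C: {derated_capacity(600, t):.0f}W usable from a '600W' unit")
# -> 600W, 528W, 480W: the hotter the PSU runs, the less it can deliver.
```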

Originally posted by: Phlargo
On another front, did anyone hear about the recent EPA regulation proposal requiring higher efficiency from power supplies (computer included) - how do you think that'll affect the PC power supply industry?

I think it's good. But I'm also growing tired of all of the talk about Google's "super efficient" computers. The ignorance is unrivaled.

One Google representative is quoted as saying that a PC only uses half of the power delivered to it. Half. That's 50%. Even a crappy $20 Allied power supply is at least 75% efficient. In fact, for a PSU to be "ATX12V", one of the requirements is that the PSU is at least 75% efficient. Not 50% efficient. Unless of course Google has been powering their servers with wall-wart power packs all of these years.

They also boast that their new power supplies are 90 to 95% efficient. What they fail to mention is that these are DC to DC power supplies. DC to DC power supplies tend to be 90 to 95% efficient. But how efficient is the AC to DC conversion that's feeding all of these power supplies? THAT is where most of your power loss occurs! Not in DC distribution, but in AC to DC conversion. Assuming that their AC to DC conversion is as high as 90%, 90 to 95% of 90% is 81% to 85.5% efficient overall! It would have been cheaper and lower maintenance to just install a standard, 80 Plus certified, AC to DC power supply in each of their servers! But you don't read that in any of the press, do you?
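The arithmetic is just multiplying the conversion stages together; a quick check of the figures above:

```python
# Overall efficiency of chained conversions is the product of the stages.
ac_dc = 0.90  # assumed AC-to-DC stage, as in the post above
for dc_dc in (0.90, 0.95):  # the claimed DC-to-DC range
    print(f"{ac_dc:.0%} x {dc_dc:.0%} = {ac_dc * dc_dc:.1%} overall")
# -> 81.0% and 85.5%, the same range quoted above
```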

 

Conky

Lifer
May 9, 2001
10,709
0
0
Originally posted by: jonnyGURU
I think it's good. But I'm also growing tired of all of the talk about Google's "super efficient" computers. The ignorance is unrivaled.

One Google representative is quoted as saying that a PC only uses half of the power delivered to it. Half. That's 50%. Even a crappy $20 Allied power supply is at least 75% efficient. In fact, for a PSU to be "ATX12V", one of the requirements is that the PSU is at least 75% efficient. Not 50% efficient. Unless of course Google has been powering their servers with wall-wart power packs all of these years.

They also boast that their new power supplies are 90 to 95% efficient. What they fail to mention is that these are DC to DC power supplies. DC to DC power supplies tend to be 90 to 95% efficient. But how efficient is the AC to DC conversion that's feeding all of these power supplies? THAT is where most of your power loss occurs! Not in DC distribution, but in AC to DC conversion. Assuming that their AC to DC conversion is as high as 90%, 90 to 95% of 90% is 81% to 85.5% efficient overall! It would have been cheaper and lower maintenance to just install a standard, 80 Plus certified, AC to DC power supply in each of their servers! But you don't read that in any of the press, do you?
This might make sense if you consider the power supply as the only place where inefficiency factors into a computer system.

But I think you are forgetting that all the other system parts are not 100% efficient. If they were 100% efficient there would be no need for heatsinks on cpu's or videocards. Or fans of any kind. Even RAM puts out heat. HD's and DVD/CD burners can put out a lot of heat.

I think 50% is a realistic figure.

 

CrystalBay

Platinum Member
Apr 2, 2002
2,175
1
0
Originally posted by: jonnyGURU
Originally posted by: mpilchfamily
Well what about overall efficiency? Isn't the lower wattage unit going to offer better efficiency than the 1200W unit? Not because one has a better efficiency rating than the other but because of the overall loads put on each unit.

Wha?

The lower wattage unit IS NOT going to offer better efficiency.

1st off, and I know you already know this, a computer is only going to use however much power it needs.

Just want to get that statement out of the way.

Whether or not one power supply is more efficient at lower loads than another completely depends on the individual power supply. Not how much power they put out.

Yes, a power supply typically has a sort of "bell curve" to its efficiency. It starts at a relatively low number, peaks in the middle and then drops at the end... typically. So typically that would mean that a PC that consumes less power (say 20% of the PSU's capability) would be less efficient than a PC that uses more power (say 50% of the PSU's capability) on the same power supply...

BUT....

I've seen 1200W power supplies that are still 80% efficient or better at loads as low as 150W and I've seen other, lower wattage power supplies, touted as "super efficient", just hitting 80% at 150W and certainly never getting up around 85 or 87% efficient at ANY load.

So... you CANNOT make blanket statements like "this lower wattage power supply is more efficient than a bigger power supply" simply because EVERY power supply is COMPLETELY different from the next.

Even in the case of the Thermaltake Toughpowers. I don't NEED a 1kW power supply. But if I had to choose between a Toughpower 750W and a Toughpower 1200W, I would still choose the 1200W (if the monetary budget allowed it) because the 1200W has better voltage regulation and the efficiency is better at THE SAME LOADS than the 750W.

Originally posted by: Phlargo
So to follow up, Jonny - how much power do we actually need? We have some power supply calculators giving a rough estimate (even if purportedly too high) but, as you pointed out earlier, even if those were correct they assume a single point in time. Do we have histogram information on power supplies expressing their efficiency and output over time?

It's hard to say. Calculators tend to calculate all hard drives spinning, all opticals spinning, CPU and GPU all under load at the same time... How often does that really happen? I think the best thing to do is to buy a Kill-A-Watt, or at least a clamping ammeter, and see what your PC is currently using.

If you're building a new PC from the ground up, I'm going to say just buy the best power supply you can buy that fits your needs. I just can't comprehend the mentality of "what's the least I can get by with" when it comes to power supplies. Why am I going to spend $300 each on two video cards, but try to keep my power supply cost under $100? I mean, there's some darn good 400W power supplies out there, but if I've only got $60 to spend on a PSU and expect the PSU to last two years, I would also have to consider... do I really need SLI? Would I be Ok with the E6600 over the E6700?

Originally posted by: Phlargo
What is the average "rate of decay" for power supplies? If I need 400 Watts for my system and I buy a power supply now that I want to last for 2 years, what kind of overshoot is necessary to account for that decay?

Depends. Different PSU's have different derating curves. And everyone's PC has different loads. Different operating temperatures. Different configurations that may require the PSU to suck up more heat than someone else's. There is no set answer to that question. It's not like when they tell you a particular car has a life expectancy of 300,000 miles, because that car is an assembled product, manufactured to operate within certain parameters. Take that car and put it in a demolition derby. Your 300,000 miles just turned into 30 minutes tops. Let's say you take the engine out of that car and drop it into a race car, guess what? You probably cut that engine's life expectancy to 1/20th.

Originally posted by: Phlargo
On another front, did anyone hear about the recent EPA regulation proposal requiring higher efficiency from power supplies (computer included) - how do you think that'll affect the PC power supply industry?

I think it's good. But I'm also growing tired of all of the talk about Google's "super efficient" computers. The ignorance is unrivaled.

One Google representative is quoted as saying that a PC only uses half of the power delivered to it. Half. That's 50%. Even a crappy $20 Allied power supply is at least 75% efficient. In fact, for a PSU to be "ATX12V", one of the requirements is that the PSU is at least 75% efficient. Not 50% efficient. Unless of course Google has been powering their servers with wall-wart power packs all of these years.

They also boast that their new power supplies are 90 to 95% efficient. What they fail to mention is that these are DC to DC power supplies. DC to DC power supplies tend to be 90 to 95% efficient. But how efficient is the AC to DC conversion that's feeding all of these power supplies? THAT is where most of your power loss occurs! Not in DC distribution, but in AC to DC conversion. Assuming that their AC to DC conversion is as high as 90%, 90 to 95% of 90% is 81% to 85.5% efficient overall! It would have been cheaper and lower maintenance to just install a standard, 80 Plus certified, AC to DC power supply in each of their servers! But you don't read that in any of the press, do you?





This is an excellent post that should be stickied somewhere in a PSU FAQ here in this forum, along with other FAQs.
 

jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: Conky
Originally posted by: jonnyGURU
I think it's good. But I'm also growing tired of all of the talk about Google's "super efficient" computers. The ignorance is unrivaled.

One Google representative is quoted as saying that a PC only uses half of the power delivered to it. Half. That's 50%. Even a crappy $20 Allied power supply is at least 75% efficient. In fact, for a PSU to be "ATX12V", one of the requirements is that the PSU is at least 75% efficient. Not 50% efficient. Unless of course Google has been powering their servers with wall-wart power packs all of these years.

They also boast that their new power supplies are 90 to 95% efficient. What they fail to mention is that these are DC to DC power supplies. DC to DC power supplies tend to be 90 to 95% efficient. But how efficient is the AC to DC conversion that's feeding all of these power supplies? THAT is where most of your power loss occurs! Not in DC distribution, but in AC to DC conversion. Assuming that their AC to DC conversion is as high as 90%, 90 to 95% of 90% is 81% to 85.5% efficient overall! It would have been cheaper and lower maintenance to just install a standard, 80 Plus certified, AC to DC power supply in each of their servers! But you don't read that in any of the press, do you?
This might make sense if you consider the power supply as the only place where inefficiency factors into a computer system.

But I think you are forgetting that all the other system parts are not 100% efficient. If they were 100% efficient there would be no need for heatsinks on cpu's or videocards. Or fans of any kind. Even RAM puts out heat. HD's and DVD/CD burners can put out a lot of heat.

I think 50% is a realistic figure.

Your thinking is all wrong on that. The components CONSUME this energy. The waste of this energy is heat. That's a given. The power supply CONSUMES power from the mains. There is energy wasted as heat in the conversion.

For your statement to be correct, the components would have to actually consume more than what the power supply is putting out.

If a power supply draws 800W AC and puts out 600W DC, the efficiency is 75%. No matter how inefficient the components are on the DC side of the equation, it doesn't change this. If they are very inefficient components, they will certainly consume more power, but the power supply will also consume more power. This doesn't change your efficiency equation. This merely causes your computer to consume more power.

50% is NOT a realistic figure at all.
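The definition at work here, as a one-liner (the 800W/600W pair is the example above; the second pair is invented to show the point about hungrier components):

```python
# PSU efficiency is DC watts out divided by AC watts in. What the
# components do with the DC afterwards never enters the equation.
def psu_efficiency(ac_in_watts: float, dc_out_watts: float) -> float:
    return dc_out_watts / ac_in_watts

print(f"{psu_efficiency(800, 600):.1%}")  # the example above: 75.0%
# Less efficient components just raise both numbers together:
print(f"{psu_efficiency(933, 700):.1%}")  # still ~75%, total draw is just higher
```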
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
JonnyGURU,

In your opinion, what is the best bang for the buck power supply for someone who plans on building a Quad Core rig + a single next gen graphics card (G90 or R700?)? I am guessing that we don't know for a fact the power requirements of the next gen video cards, but it seems you have a good idea, judging from your website reviews.

I have a hard time swallowing $200 for a 700 Watt unit when I might not need it... I have pretty much gone over all the reviews on your website. But when it comes to power, I am pretty ignorant. I just know that my 500W cheapy Dynamax does the trick just fine for my current system and has been for almost two years now. Actually, that is the first time I admitted that in public (that I am using a cheapy Dynamax).

Thanks!
 

Aluvus

Platinum Member
Apr 27, 2006
2,913
1
0
Originally posted by: Conky
This might make sense if you consider the power supply as the only place where inefficiency factors into a computer system.

But I think you are forgetting that all the other system parts are not 100% efficient. If they were 100% efficient there would be no need for heatsinks on cpu's or videocards. Or fans of any kind. Even RAM puts out heat. HD's and DVD/CD burners can put out a lot of heat.

I think 50% is a realistic figure.

Essentially all energy consumed by system components ultimately ends up as heat. The power efficiency of a computer in that respect is effectively 0%, unless you count it as a space heater (in which case it's ~100%, huzzah!).

If you wanted to treat fans and other "support" components as waste power and then compare that to the power consumed by components doing actual useful work, I suppose that would be a sort of efficiency. But producing consistent numbers that way would be a bear, particularly as you get into the somewhat philosophical aspects (power lost in the power supply's leads is wasted, but what about power lost in motherboard traces - and how would you measure, anyway?), and I don't know that the results would be especially meaningful anyway.

Power supply efficiency is the most meaningful measure of efficiency for a computer.
 

jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: ArchAngel777
JonnyGURU,

In your opinion, what is the best bang for the buck power supply for someone who plans on building a Quad Core rig + a single next gen graphics card (G90 or R700?)? I am guessing that we don't know for a fact the power requirements of the next gen video cards, but it seems you have a good idea, judging from your website reviews.

I'm going to tell you the Ultra X3 600W when they land in the next couple months... but I'll also tell you right up front that I'm biased. Not only am I using one right now to power my 4000+ and two 7800GTX's (all water cooled) but I'd also like to think that I'm the one who convinced Ultra to migrate their high end product over to Andyson's factory.

Originally posted by: Aluvus
Essentially all energy consumed by system components ultimately ends up as heat. The power efficiency of a computer in that respect is effectively 0%, unless you count it as a space heater (in which case it's ~100%, huzzah!).

Exactly.
 

Conky

Lifer
May 9, 2001
10,709
0
0
Originally posted by: jonnyGURU
Originally posted by: Conky
Originally posted by: jonnyGURU
I think it's good. But I'm also growing tired of all of the talk about Google's "super efficient" computers. The ignorance is unrivaled.

One Google representative is quoted as saying that a PC only uses half of the power delivered to it. Half. That's 50%. Even a crappy $20 Allied power supply is at least 75% efficient. In fact, for a PSU to be "ATX12V", one of the requirements is that the PSU is at least 75% efficient. Not 50% efficient. Unless of course Google has been powering their servers with wall-wart power packs all of these years.

They also boast that their new power supplies are 90 to 95% efficient. What they fail to mention is that these are DC to DC power supplies. DC to DC power supplies tend to be 90 to 95% efficient. But how efficient is the AC to DC conversion that's feeding all of these power supplies? THAT is where most of your power loss occurs! Not in DC distribution, but in AC to DC conversion. Assuming that their AC to DC conversion is as high as 90%, 90 to 95% of 90% is 81% to 85.5% efficient overall! It would have been cheaper and lower maintenance to just install a standard, 80 Plus certified, AC to DC power supply in each of their servers! But you don't read that in any of the press, do you?
This might make sense if you consider the power supply as the only place where inefficiency factors into a computer system.

But I think you are forgetting that all the other system parts are not 100% efficient. If they were 100% efficient there would be no need for heatsinks on cpu's or videocards. Or fans of any kind. Even RAM puts out heat. HD's and DVD/CD burners can put out a lot of heat.

I think 50% is a realistic figure.

Your thinking is all wrong on that. The components CONSUME this energy. The waste of this energy is heat. That's a given. The power supply CONSUMES power from the mains. There is energy wasted as heat in the conversion.

For your statement to be correct, the components would have to actually consume more than what the power supply is putting out.

If a power supply draws 800W AC and puts out 600W DC, the efficiency is 75%. No matter how inefficient the components are on the DC side of the equation, it doesn't change this. If they are very inefficient components, they will certainly consume more power, but the power supply will also consume more power. This doesn't change your efficiency equation. This merely causes your computer to consume more power.

50% is NOT a realistic figure at all.
If the power supply consumed all of the energy there would be none left for the other components.

What I am saying is that there is still a loss of efficiency beyond the power supply as not all components are 100% efficient.

Why bother using lower wattage cpu's and other more efficient components if the only factor of a computer system's power efficiency is how efficient the power supply is?

I'm sure the Google person quoted wasn't talking only about PSU's.
 

Nathelion

Senior member
Jan 30, 2006
697
1
0
Originally posted by: Conky
Originally posted by: jonnyGURU
Originally posted by: Conky
Originally posted by: jonnyGURU
I think it's good. But I'm also growing tired of all of the talk about Google's "super efficient" computers. The ignorance is unrivaled.

One Google representative is quoted as saying that a PC only uses half of the power delivered to it. Half. That's 50%. Even a crappy $20 Allied power supply is at least 75% efficient. In fact, for a PSU to be "ATX12V", one of the requirements is that the PSU is at least 75% efficient. Not 50% efficient. Unless of course Google has been powering their servers with wall-wart power packs all of these years.

They also boast that their new power supplies are 90 to 95% efficient. What they fail to mention is that these are DC to DC power supplies. DC to DC power supplies tend to be 90 to 95% efficient. But how efficient is the AC to DC conversion that's feeding all of these power supplies? THAT is where most of your power loss occurs! Not in DC distribution, but in AC to DC conversion. Assuming that their AC to DC conversion is as high as 90%, 90 to 95% of 90% is 81% to 85.5% efficient overall! It would have been cheaper and lower maintenance to just install a standard, 80 Plus certified, AC to DC power supply in each of their servers! But you don't read that in any of the press, do you?
This might make sense if you consider the power supply as the only place where inefficiency factors into a computer system.

But I think you are forgetting that all the other system parts are not 100% efficient. If they were 100% efficient there would be no need for heatsinks on cpu's or videocards. Or fans of any kind. Even RAM puts out heat. HD's and DVD/CD burners can put out a lot of heat.

I think 50% is a realistic figure.

Your thinking is all wrong on that. The components CONSUME this energy. The waste of this energy is heat. That's a given. The power supply CONSUMES power from the mains. There is energy wasted as heat in the conversion.

For your statement to be correct, the components would have to actually consume more than what the power supply is putting out.

If a power supply draws 800W AC and puts out 600W DC, the efficiency is 75%. No matter how inefficient the components are on the DC side of the equation, it doesn't change this. If they are very inefficient components, they will certainly consume more power, but the power supply will also consume more power. This doesn't change your efficiency equation. This merely causes your computer to consume more power.

50% is NOT a realistic figure at all.
If the power supply consumed all of the energy there would be none left for the other components.

What I am saying is that there is still a loss of efficiency beyond the power supply as not all components are 100% efficient.

Why bother using lower wattage cpu's and other more efficient components if the only factor of a computer system's power efficiency is how efficient the power supply is?

I'm sure the Google person quoted wasn't talking only about PSU's.

It's not that you aren't right, per se. It's just that the point you're making is, to be honest, completely devoid of any real-world usefulness. Any electrical device is going to use energy. That energy then turns into heat (or radiation). That's the way it is. Inefficiency is a concept we use to denote the discrepancy between the heat generated in order to specifically achieve the results we want and the total heat generated by the system (one could substitute energy for heat in the previous sentence, if desired. Or entropy, if you want).

What you're talking about is the fact that all devices, not just the power supply, come with a certain degree of inefficiency. This is true. However, it is also not particularly useful. To compare the inefficiencies generated by all the myriad components in the computer, you'd have to go through extreme levels of trouble, not to mention that I don't know how you'd even define inefficiency in many cases. Like mentioned previously, is energy lost in the traces on the motherboard considered inefficiency? Is energy lost in the microscopic traces inside the CPU itself inefficiency? Is the continual energy required to keep the hard drives and fans spinning inefficiency, or a part of what'd be considered useful work? The whole concept more or less breaks down when attempting to de-abstract the inner workings of the computer.

In the case of the PSU, however, it is actually easy to define a useful measure of efficiency: you can measure what goes in and you can measure what comes out the other side. So it's not that PSU efficiency is the measurement to end all measurements, it's just that it's the best practical measurement we have. To the extent that efficiency can realistically be discussed regarding the actual computational components, thermal envelopes are really the only practical measurement that can be made. Although note that there is no stringent definition of efficiency in that case, since we don't really even know what efficiency means in terms of a microprocessor. So in the absence of other useful measurements, PSU efficiency is the only way computer system efficiency can be strictly compared.
Besides, in the case of the Google systems, they're using the same components as everyone else, so the "internal efficiency" of their computers would be no different from that of yours or mine.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: jonnyGURU
Originally posted by: ArchAngel777
JonnyGURU,

In your opinion, what is the best bang for the buck power supply for someone who plans on building a Quad Core rig + a single next gen graphics card (G90 or R700?)? I am guessing that we don't know for a fact the power requirements of the next gen video cards, but it seems you have a good idea, judging from your website reviews.

I'm going to tell you the Ultra X3 600W when they land in the next couple months... but I'll also tell you right up front that I'm biased. Not only am I using one right now to power my 4000+ and two 7800GTX's (all water cooled) but I'd also like to think that I'm the one who convinced Ultra to migrate their high end product over to Andyson's factory.

Nothing wrong with bias unless it gets in the way of good advice. I'll keep an eye out for that PSU.

Thanks!
 

bamacre

Lifer
Jul 1, 2004
21,029
2
61
First off, this is a wonderful thread. I cringe every time I see someone recommend an OCZ 700W power supply for systems that would never hit 400W under full load.

Anandtech did a test here in this article...
http://www.dailytech.com/article.aspx?newsid=4812

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I also don't get the "future considerations" when it comes to buying a power supply. Sure many people may add a hard drive, or a second optical drive, to their system. But the two main components that draw the most power are the cpu and video card. We've already seen what Intel and AMD can do with cpu efficiency, especially with Intel's Core technology. But Nvidia and ATI (or AMD now) are headed down the same path. I honestly think that we are seeing a peak in regards to how much power video cards use, and it will not be long before we see video cards using a lot less power.

OEM's don't just use el cheapo cards. Dell has always offered the top-dog video card with their higher-end systems, like the current XPS 410. They are offering the 8800 GTX card with that system and its 375W (dual 12V @ 18A) power supply. So there is not just a push by consumers for more efficient gpu's, but also from OEM's who would like to offer the highest-end video cards without having to run them with higher-wattage power supplies.
 

Phlargo

Senior member
Jul 21, 2004
865
0
0
Originally posted by: bamacre
First off, this is a wonderful thread. I cringe every time I see someone recommend an OCZ 700W power supply for systems that would never hit 400W under full load.

Anandtech did a test here in this article...
http://www.dailytech.com/article.aspx?newsid=4812

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I also don't get the "future considerations" when it comes to buying a power supply. Sure many people may add a hard drive, or a second optical drive, to their system. But the two main components that draw the most power are the cpu and video card. We've already seen what Intel and AMD can do with cpu efficiency, especially with Intel's Core technology. But Nvidia and ATI (or AMD now) are headed down the same path. I honestly think that we are seeing a peak in regards to how much power video cards use, and it will not be long before we see video cards using a lot less power.

OEM's don't just use el cheapo cards. Dell has always offered the top-dog video card with their higher-end systems, like the current XPS 410. They are offering the 8800 GTX card with that system and its 375W (dual 12V @ 18A) power supply. So there is not just a push by consumers for more efficient gpu's, but also from OEM's who would like to offer the highest-end video cards without having to run them with higher-wattage power supplies.

While I tend to be in complete agreement with you, bamacre - buy a power supply that will support your CPU (with overclock) and GPU - I had a friend who just hit that limit. He wanted to add a second hard drive to an SLI'd 7900GS AthlonX2-3000 system and the PSU couldn't handle any more - he actually got a BIOS message (that's what he said).

As far as video cards starting to use less power - the sooner the better. I can't believe my 8800 GTS gets so hot and uses so much power. Even though it's freakin' fast, it's in many ways a step backwards in terms of power efficiency. I mean, performance per watt might have improved... but not much - it's a power hungry beast.
 

jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: bamacre

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I would hardly call those PC's under "full load" using those benchmark programs.
 

bamacre

Lifer
Jul 1, 2004
21,029
2
61
Originally posted by: jonnyGURU
Originally posted by: bamacre

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I would hardly call those PC's under "full load" using those benchmark programs.

Really? Well I certainly cannot disagree, not my numbers.

What would you estimate that system would hit under full load?
 

jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: bamacre
Originally posted by: jonnyGURU
Originally posted by: bamacre

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I would hardly call those PC's under "full load" using those benchmark programs.

Really? Well I certainly cannot disagree, not my numbers.

What would you estimate that system would hit under full load?

Probably about 400 or 450W if you had both cores working hard as well as the GPU optimized with proper DX10 drivers, etc.

Also, the test system is just one HDD, no optical, not TEC or water cooled, so those variables are going to add onto that number.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: jonnyGURU
Originally posted by: bamacre
Originally posted by: jonnyGURU
Originally posted by: bamacre

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I would hardly call those PC's under "full load" using those benchmark programs.

Really? Well I certainly cannot disagree, not my numbers.

What would you estimate that system would hit under full load?

Probably about 400 or 450W if you had both cores working hard as well as the GPU optimized with proper DX10 drivers, etc.

Also, the test system is just one HDD, no optical, not TEC or water cooled, so those variables are going to add onto that number.

Watercooling usually uses external power sources so that would actually slightly reduce the draw. Unless you think a radiator fan would consume more power than a CPU fan. The difference in either case is inconsequential.

TEC cooling is "exotic" and completely not in the norm.

Having more than one HD is again against the norm, but becoming more common as HD prices have been fantastic lately. HDs don't draw a lot of power either.
 

rise

Diamond Member
Dec 13, 2004
9,116
46
91
I'll hook up my Kill-A-Watt later and load up my rig with an M12 500W supplying a Swiftech H2O-120, 8800GTS, and 6400 @ 3.0.

 

jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: Acanthus

Watercooling usually uses external power sources so that would actually slightly reduce the draw. Unless you think a radiator fan would consume more power than a CPU fan. The difference in either case is inconsequential.

TEC cooling is "exotic" and completely not in the norm.

Having more than one HD is again against the norm, but becoming more common as HD prices have been fantastic lately, HDs dont draw a lot of power either.

Well, any time I've done water cooling, I've used DC pumps hooked up to the main PSU. And I have a TEC assisted water cooled system that, again, uses the main power supply. Also, ChillTEC, Monsoon, CoolIT.. these are considered TEC systems, use a lot of power and... again.... use power from the main power supply.

Certainly, all of the aforementioned options are not "the norm" but I wouldn't say they are "exotic" and I wouldn't say that pumps typically use external power sources. I just came back from Showdown LAN in San Jose and I must say that about half of those guys were running water cooling and almost all of those guys were using their main PSU for their pumps. So I know I'm not in the minority there.

And having more than one hard drive is not against the norm for most enthusiasts. And we have to assume we're addressing enthusiasts when replying to posts at Anandtech. I would think that two hard drives minimum would be the norm. Either a boot and storage solution or a RAID0 or maybe even a RAID1. Most of the machines I build, I do a mirror drive. Saves my butt when one of the drives dies, that's for sure.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
hello, just wanna post my experiences upgrading from socket A to a high-end socket 939 system that "needed a true" 24-pin 480W PSU. I have a 4yr old "Whisper-Quiet" Enermax 460W, ATX ver 1.2, 20-pin PSU (model name EG465P-VE; it was $120 when I bought it, which 4 years ago was a $hitload for a PSU). It has 33A on the +12V line. I'm using it w/ the system in my sig w/ NO PROBS (DFI Ultra-D & Opty 165 @ 2800MHz & X1900XT 512MB). According to the calculator at the beginning of this thread I need 580W to power this rig, which is absolute BS. I bought a 24-pin to 20-pin converter and everything is dandy. In SpeedFan my +12V line reads 11.71V, which is still well within what it needs.

This Saturday I'm replacing it w/ a DFI Expert, which has an 8-pin workstation connector, but I'm just buying an 8-pin to 4-pin converter cause my PSU only has the older 4-pin connector. Everywhere I read that this is not gonna work cause the DFI Expert "needs" at least 480W and an 8-pin connector on the PSU. We'll see how it turns out, but my DFI Ultra is working absolutely fine despite many posts that it wouldn't work. If it doesn't, I'll upgrade, but I won't be doing that until I'm sure it needs it.

anybody willing to bet whether it will/wont work?

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Phlargo

Yeah.. these are great resources - I especially found the Legion Hardware article useful. Thanks - I'll add them to primary post
"How much power is enough" is a good one to read. It makes me feel better about buying the 520 w corsair vs the 620 w for a Q6600 build.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Phlargo
Originally posted by: bamacre
First off, this is a wonderful thread. I cringe every time I see someone recommend an OCZ 700W power supply for systems that would never hit 400W under full load.

Anandtech did a test here in this article...
http://www.dailytech.com/article.aspx?newsid=4812

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I also don't get the "future considerations" when it comes to buying a power supply. Sure many people may add a hard drive, or a second optical drive, to their system. But the two main components that draw the most power are the cpu and video card. We've already seen what Intel and AMD can do with cpu efficiency, especially with Intel's Core technology. But Nvidia and ATI (or AMD now) are headed down the same path. I honestly think that we are seeing a peak in regards to how much power video cards use, and it will not be long before we see video cards using a lot less power.

OEM's don't just use el cheapo cards. Dell has always offered the top-dog video card with their higher-end systems, like the current XPS 410. They are offering the 8800 GTX card with that system and its 375W (dual 12V @ 18A) power supply. So there is not just a push by consumers for more efficient gpu's, but also from OEM's who would like to offer the highest-end video cards without having to run them with higher-wattage power supplies.

While I tend to be in complete agreement with you, bamacre - buy a power supply that will support your CPU (with overclock) and GPU - I had a friend who just hit that limit. He wanted to add a second hard drive to an SLI'd 7900GS AthlonX2-3000 system and the PSU couldn't handle any more - he actually got a BIOS message (that's what he said).

As far as video cards starting to use less power - the sooner the better. I can't believe my 8800 GTS gets so hot and uses so much power. Even though it's freakin' fast, it's in many ways a step backwards in terms of power efficiency. I mean, performance per watt might have improved... but not much - it's a power hungry beast.
yeah, I got a BIOS message with my 350W cheapie PSU with an opteron 180/7600gt/2xSATA system, too. I would get to the BIOS and the computer would shut down. I'd call that a message...
 

Phlargo

Senior member
Jul 21, 2004
865
0
0
Originally posted by: TheWatcher1955
Another thread that has strayed 100% off topic.

I wouldn't worry about it too much - as long as we're keeping up some solid PSU dialectic and not dabbling in smoke and mirrors. I was just so tired of thread after thread of "The problem is totally your power supply - to run that 8800 gts, you need at least a $200 PSU". But feel free to bring up a new topic in this discussion thread to qualify your own post as on topic! I fully encourage it. This is an open thread and I want people to speak their minds and get good constructive feedback.
 

Phlargo

Senior member
Jul 21, 2004
865
0
0
Originally posted by: bryanW1995
Originally posted by: Phlargo
Originally posted by: bamacre
First off, this is a wonderful thread. I cringe every time I see someone recommend an OCZ 700W power supply for systems that would never hit 400W under full load.

Anandtech did a test here in this article...
http://www.dailytech.com/article.aspx?newsid=4812

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I also don't get the "future considerations" when it comes to buying a power supply. Sure many people may add a hard drive, or a second optical drive, to their system. But the two main components that draw the most power are the cpu and video card. We've already seen what Intel and AMD can do with cpu efficiency, especially with Intel's Core technology. But Nvidia and ATI (or AMD now) are headed down the same path. I honestly think that we are seeing a peak in regards to how much power video cards use, and it will not be long before we see video cards using a lot less power.

OEM's don't just use el cheapo cards. Dell has always offered the top-dog video card with their higher-end systems, like the current XPS 410. They are offering the 8800 GTX card with that system and its 375W (dual 12V @ 18A) power supply. So there is not just a push by consumers for more efficient gpu's, but also from OEM's who would like to offer the highest-end video cards without having to run them with higher-wattage power supplies.

While I tend to be in complete agreement with you, bamacre - buy a power supply that will support your CPU (with overclock) and GPU - I had a friend who just hit that limit. He wanted to add a second hard drive to an SLI'd 7900GS AthlonX2-3000 system and the PSU couldn't handle any more - he actually got a BIOS message (that's what he said).

As far as video cards starting to use less power - the sooner the better. I can't believe my 8800 GTS gets so hot and uses so much power. Even though it's freakin' fast, it's in many ways a step backwards in terms of power efficiency. I mean, performance per watt might have improved... but not much - it's a power hungry beast.
yeah, I got a BIOS message with my 350W cheapie PSU with an opteron 180/7600gt/2xSATA system, too. I would get to the BIOS and the computer would shut down. I'd call that a message...

Haha... that's a nice story. I've totally got to get my friend on here to post his experience with his BIOS and actually share what happened. I think it would be enlightening.
 

magreen

Golden Member
Dec 27, 2006
1,309
1
81
Originally posted by: jonnyGURU
Originally posted by: bamacre
Originally posted by: jonnyGURU
Originally posted by: bamacre

With a quad core Intel cpu, and an 8800 GTX video card, their test system hit about 321W under full load.

I would hardly call those PC's under "full load" using those benchmark programs.

Really? Well I certainly cannot disagree, not my numbers.

What would you estimate that system would hit under full load?

Probably about 400 or 450W if you had both cores working hard as well as the GPU optimized with proper DX10 drivers, etc.

Also, the test system is just one HDD, no optical, not TEC or water cooled, so those variables are going to add onto that number.
First of all, jonnyGURU, thank you for gracing us all with your contributions here.


Secondly, great thread. :thumbsup:

I'd like to know what you think of this data: AnandTech measured power consumption of an X6800 & 8800GTX system and found it to be about 275W. Again, they're stress testing the GPU in those tests, so it could be the X6800 CPU was not under full load. But how much power difference in the CPU are we talking about going from 50% load to 100% load on both cores? Maybe 30W? Here's a link: AT tested power consumption of X6800 at idle and full CPU load. They tested it at the wall outlet and got 172W under full load and 114W at idle... that's a 58W difference. So if you figure about half of that for our purposes (since the CPU wasn't idle in the GPU stress test, let's say it was using about 50% of the processor) then it's about a 30W difference. Bringing us up to 305W.

And maybe another 40W if you overclock the CPU? We're still talking 350W max draw for an OC'd dual core system. So according to these numbers, 400W to 450W for a non-OC'd quad core seems a bit high. What do you think?
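Putting the same arithmetic in one place (the measured figures are from the AT articles as quoted; the 50% CPU-load split and the 40W overclock margin are the post's own assumptions):

```python
# Back-of-the-envelope wall-power estimate from the post above.
gpu_stress_draw = 275         # X6800 + 8800GTX system, GPU stress test (W)
cpu_swing = 172 - 114         # full CPU load vs idle at the wall: 58W
cpu_headroom = cpu_swing / 2  # assume the CPU was already ~50% loaded
oc_margin = 40                # assumed extra draw from overclocking

print(f"CPU and GPU both maxed: ~{gpu_stress_draw + cpu_headroom:.0f}W")
print(f"Plus overclock margin: ~{gpu_stress_draw + cpu_headroom + oc_margin:.0f}W")
# -> ~304W and ~344W, i.e. the ~305W and ~350W figures above after rounding
```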

Btw, there's also a decent thread here with some info.

Cheers.
 