Originally posted by: mpilchfamily
Well what about overall efficiency? Isn't the lower wattage unit going to offer better efficiency than the 1200W unit? Not because one has a better efficiency rating than the other but because of the overall loads put on each unit.
Wha?
The lower wattage unit IS NOT going to offer better efficiency.
1st off, and I know you already know this, a computer is only going to use however much power it needs.
Just want to get that statement out of the way.
Whether or not one power supply is more efficient at lower loads than another completely depends on the individual power supply. Not how much power they put out.
Yes, a power supply typically has a sort of "bell curve" to its efficiency. It starts at a relatively low number, peaks in the middle and then drops off at the end... typically. So typically that would mean that a PC that consumes less power (say 20% of the PSU's capability) would be less efficient than a PC that uses more power (say 50% of the PSU's capability) on the same power supply...
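To put rough numbers on that, here's a quick Python sketch. Every figure in the curve below is made up for illustration; it's not any real unit's data, and real curves vary from one PSU to the next:

```python
# Illustrative only: a made-up "bell curve" efficiency profile for a 1200W unit.
CURVE = [(0.10, 0.78), (0.20, 0.82), (0.50, 0.86), (0.80, 0.84), (1.00, 0.80)]

def efficiency_at(load_w, rated_w=1200):
    """Linearly interpolate efficiency between the points above."""
    frac = load_w / rated_w
    for (f0, e0), (f1, e1) in zip(CURVE, CURVE[1:]):
        if f0 <= frac <= f1:
            return e0 + (e1 - e0) * (frac - f0) / (f1 - f0)
    return CURVE[0][1] if frac < CURVE[0][0] else CURVE[-1][1]

for dc_load in (150, 300, 600):
    eff = efficiency_at(dc_load)
    print(f"{dc_load}W DC load -> {eff:.0%} efficient -> {dc_load / eff:.0f}W at the wall")
```

Run that and you can see why the same box wastes a larger fraction of its wall draw at 12% load than at 50% load... on this hypothetical curve, anyway.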
BUT....
I've seen 1200W power supplies that are still 80% efficient or better at loads as low as 150W and I've seen other, lower wattage power supplies, touted as "super efficient", just hitting 80% at 150W and certainly never getting up around 85 or 87% efficient at ANY load.
So... you CAN NOT make a blanket statement like "this lower wattage power supply is more efficient than a bigger power supply," simply because EVERY power supply is COMPLETELY different from the next.
Even in the case of the Thermaltake Toughpowers. I don't NEED a 1kW power supply. But if I had to choose between a Toughpower 750W and a Toughpower 1200W, I would still choose the 1200W (if the monetary budget allowed it) because the 1200W has better voltage regulation and the efficiency is better at THE SAME LOADS than the 750W.
Originally posted by: Phlargo
So to follow up, Jonny - how much power do we actually need? We have some power supply calculators giving a rough estimate (even if purportedly too high) but, as you pointed out earlier, even if those were correct they assume a single point in time. Do we have histogram information on power supplies expressing their efficiency and output over time?
It's hard to say. Calculators tend to calculate all hard drives spinning, all opticals spinning, CPU and GPU all under load at the same time... How often does that really happen? I think the best thing to do is to buy a Kill-A-Watt, or at least a clamping ammeter, and see what your PC is currently using.
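If you go the clamp-ammeter route, keep in mind you're reading amps on the AC side, so you need your mains voltage and a guess at power factor to turn that into real watts. A minimal sketch, assuming 120V mains and a 0.95 power factor (roughly right for an active-PFC unit; passive-PFC units run lower, so check your own):

```python
# Rough wall-power estimate from a clamp-ammeter reading on the AC line.
def wall_watts(amps, volts=120.0, power_factor=0.95):
    """Real power = volts x amps x power factor."""
    return volts * amps * power_factor

print(f"{wall_watts(2.1):.0f}W")  # a 2.1A reading -> ~239W of real power
```

A Kill-A-Watt does that math for you, which is why it's the easier tool.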
If you're building a new PC from the ground up, I'm going to say just buy the best power supply you can buy that fits your needs. I just can't comprehend the mentality of "what's the least I can get by with" when it comes to power supplies. Why am I going to spend $300 each on two video cards, but try to keep my power supply cost under $100? I mean, there's some darn good 400W power supplies out there, but if I've only got $60 to spend on a PSU and expect the PSU to last two years, I would also have to consider... do I really need SLI? Would I be Ok with the E6600 over the E6700?
Originally posted by: Phlargo
What is the average "rate of decay" for power supplies? If I need 400 Watts for my system and I buy a power supply now that I want to last for 2 years, what kind of overshoot is necessary to account for that decay?
Depends. Different PSUs have different derating curves. And everyone's PC has different loads. Different operating temperatures. Different configurations that may require the PSU to suck up more heat than someone else's. There is no set answer to that question. It's not like when they tell you a particular car has a life expectancy of 300,000 miles, because that car is an assembled product, manufactured to operate within certain parameters. Take that car and put it in a demolition derby. Your 300,000 miles just turned into 30 minutes tops. Let's say you take that engine out of that car and drop it into a race car. Guess what? You probably cut that engine's life expectancy down to 1/20th.
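If you're curious what a derating curve even looks like, here's a toy example. Every number in it is invented; a real unit's datasheet (when the manufacturer publishes one) has its own curve:

```python
# Hypothetical derating: a "500W" unit that holds full capacity up to 40C,
# then loses 10W of capacity per degree C above that. Invented numbers.
def derated_capacity(temp_c, rated_w=500, knee_c=40, w_per_c=10):
    if temp_c <= knee_c:
        return rated_w
    return max(0, rated_w - w_per_c * (temp_c - knee_c))

for t in (25, 40, 50):
    print(f"{t}C -> {derated_capacity(t)}W usable")
```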
Originally posted by: Phlargo
On another front, did anyone hear about the recent EPA regulation proposal requiring higher efficiency from power supplies (computer included) - how do you think that'll affect the PC power supply industry?
I think it's good. But I'm also growing tired of all of the talk about Google's "super efficient" computers. The ignorance is unrivaled.
One Google representative is quoted as saying that a PC only uses half of the power delivered to it. Half. That's 50%. Even a crappy $20 Allied power supply is at least 75% efficient. In fact, for a PSU to be "ATX12V," one of the requirements is that the PSU is at least 75% efficient. Not 50% efficient. Unless of course Google has been powering their servers with wall-wart power packs all of these years.
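The arithmetic shows how far off that "half" claim is. A quick sketch, using an example 200W DC load (the load figure is just for illustration):

```python
# Wall draw for the same 200W DC load at the claimed 50% vs. the
# 75% floor that ATX12V requires.
dc_load = 200  # watts the hardware actually uses (example figure)
for eff in (0.50, 0.75):
    print(f"{eff:.0%} efficient -> {dc_load / eff:.0f}W pulled from the wall")
# 50% -> 400W at the wall; 75% -> ~267W.
```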
They also boast that their new power supplies are 90 to 95% efficient. What they fail to mention is that these are DC to DC power supplies. DC to DC power supplies tend to be 90 to 95% efficient. But how efficient is the AC to DC conversion that's feeding all of these power supplies? THAT is where most of your power loss occurs! Not in DC distribution, but in AC to DC conversion. Assuming that their AC to DC conversion is as high as 90%, 90 to 95% of 90% is 81% to 85.5% efficient overall! It would have been cheaper and lower maintenance to just install a standard, 80 Plus certified, AC to DC power supply in each of their servers! But you don't read that in any of the press, do you?
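The math is just multiplying the stages together, and you can check it yourself (the 90% AC-to-DC figure is my assumption from above, not a published spec):

```python
# Chained conversion: overall efficiency is the product of each stage.
ac_to_dc = 0.90  # assumed efficiency of the facility's AC-to-DC front end
for dc_to_dc in (0.90, 0.95):
    print(f"{ac_to_dc:.0%} x {dc_to_dc:.0%} = {ac_to_dc * dc_to_dc:.1%} overall")
# 81.0% and 85.5% -- right in the range of a decent 80 Plus ATX supply.
```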