Originally posted by: Jax Omen
taltamir, even if my entire 700W PSU were running at full blast all day (presumably consuming 700W from the wall 24/7), it wouldn't be a significant chunk of my power bill. No more than $5-$10 a month. A GPU isn't even close to that.
Originally posted by: thilan29
Sometimes it isn't just about the money. If all computers consumed less electricity, that would be a GREAT thing, not only for our own electricity bills but also for the environment (yes, yes, I'm a bit of a tree hugger), since...you know...you actually have to PRODUCE that electricity you're consuming. In my area it costs about 7-9 cents/kWh (that doesn't include delivery and all those other charges on the bill...another thing most people don't factor in when calculating money saved/spent), and only about 24% of the grid power is made from fossil fuels, so it isn't that big of a deal here. However, in some areas that percentage may be much higher, so it does make a difference.
To each his own though...if you don't care...well...you don't care.
EDIT: Just to go along with your 700W PSU theory:
0.7 kW * 24 hours * 30 days * $0.15/kWh (total cost including GST, etc. in my area) = $75.60/month
That's not insignificant, and much more than the $5-$10 you assumed. Obviously a regular computer (regular for ATers anyway) consumes only around 200W at idle, so the cost would be less.
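For anyone who wants to rerun these numbers with their own wattage and rate, here is a minimal sketch of the same arithmetic in Python (the 700W and 200W draws and the $0.15/kWh all-in rate are just the figures quoted above, not measurements):

    # Monthly electricity cost: kW * hours/day * days/month * $/kWh
    def monthly_cost(watts, hours_per_day=24, rate_per_kwh=0.15, days=30):
        return watts / 1000 * hours_per_day * days * rate_per_kwh

    print(round(monthly_cost(700), 2))  # 700W around the clock -> $75.60/month
    print(round(monthly_cost(200), 2))  # a 200W idle box       -> $21.60/month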
Originally posted by: taltamir
Congrats to thilan29 on actually doing the math. I told people to do the math like I did and see for themselves, and I got replies like "no dude, it totally costs only $5-$10 a month."
Also congrats on remembering the extra charges. There is ALSO the extra cost of AC... if your PC generates all that extra heat, you have to pay more on your AC bill because the AC has to remove all that extra heat. In fact, I found that in freezing winter I have to turn the COOLING on if I have more than 2 computers on at the same time.
During winter:
0-1 PCs on: heating on
2 PCs on: both off
3+ PCs on: cooling on
@nRolle: $10 a month might be pocket change to you, but when figuring out the price/performance of a certain part it makes a serious difference. If you keep a card a year (oftentimes more) and it costs $10 more a month than another card, then that is $120 a year. A significant difference.
If I left my PC on 24/7 I would give utmost priority to hybrid power exactly because of that.
(Since I live in Texas, it's $10 per 100 watts per month, not $10 for 200 watts like you get, and it is mostly from coal, although I opted for the cleaner natural gas.)
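That "$10 per 100 watts per month" figure is easy to sanity-check with the same arithmetic as above (the $10 is taltamir's figure, the rest follows from it); it implies an all-in rate of roughly 14 cents/kWh, about double thilan29's 7-9 cent base rate:

    # 100W running 24/7 over a 30-day month:
    kwh = 100 * 24 * 30 / 1000   # = 72 kWh
    implied_rate = 10 / kwh      # $10 / 72 kWh ~= $0.14 per kWh
    print(kwh, implied_rate)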
Originally posted by: apoppin
What about for me, where the PC provides heat for my living room and warms my toes in the morning?
- Right now it's almost 9:30 PM and it is in the upper 50s F outside - the "extra" heat is welcome; only July and August are really hot here.
My home is far more "cool" than "warm" and I NEVER - ever - turn on the AC except to test it once a year; I use evaporative cooling in the high desert, and my electricity bill averages less than $90 a month total, averaging all 12 months; my stove and water heater are electric! I have another apartment on the same meter with a storage heater [1750W] that is on in winter at least 12 hours a day!
Now how much am I really saving over using a GPU that saves only UP TO "200W"? We are not "saving" all 850W, you know. My PSU is 850W, I have HD2900 CrossFire, and my rig is on 12 hours a day, mostly for gaming and benchmarking.
Your math sucks. NO one leaves their rig running games 24 hours a day on max load. PERIOD.
Maybe $3 to $8 a month; I would say mostly less than $3 a month of practical DIFFERENCE [between using 750W and using, say, 500W with the "power saver" on].
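For what it's worth, that estimate can be bracketed with the same formula as above (the 250W delta and the 12 hours/day are apoppin's own figures, the rates are the ones quoted earlier in the thread, and whether the card actually holds the full delta for all 12 hours is an assumption):

    # Saving 250W for 12 hours a day over a 30-day month:
    kwh_saved = 250 * 12 * 30 / 1000  # = 90 kWh/month
    for rate in (0.07, 0.09, 0.15):   # $/kWh figures quoted in this thread
        print(rate, round(kwh_saved * rate, 2))  # -> 6.3, 8.1, 13.5 $/month

So the $3-$8 range holds at the cheaper rates, or when the card spends only part of the day at its full savings.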
You care that much? What kind of CAR do you drive? If it is an SUV, you are a hypocrite; switch to CFLs or turn the thermostat up or down 2 degrees to make a *real* difference; try insulation and replace older appliances, instead of trying to save a nickel a day on entertainment.
get real