Originally posted by: nZone
Originally posted by: Genx87
Originally posted by: Lord Athlon
Originally posted by: Ocguy31
No offense, but if you have to worry about a couple of dollars in power bills, a top-of-the-line card probably isn't for you anyway.
This card would be much more attractive @ $599, which is where I'm sure it will end up soon. EVGA has it listed for $649, and e-tailers generally have the cards for around $50 less after launch.
I disagree with the first part.
You pay for the card once, but you pay for electricity every month, so it's not really comparable, especially if your bill goes up by $50 or more.
Show me how one of these will run your bill $50 higher per month than it already is. Even if you ran this thing at 100% full tilt 24/7, I bet it would amount to less than 10 bucks. That said, most people will likely run it full tilt for only a few hours a week, which will be a few bucks, tops.
It is valid. The electric bill does go up quite a bit. I have the older 8800 GTX, and on average I spent about 5 hours a day playing games. The electric bill usually ran around $175/month. These last couple of months I had some other things to do and didn't spend a single hour gaming, and my electric bill dropped to $130/month.
You don't need to guess, you can use the power of MATH!
X = how many watts the device draws (from the wall)
Y = hours per day you use said device (so if you are checking a video card and you play an average of 2 hours a day EVERY day, this would be 2)
Z = cost of electricity in your region in $/kWh (in Texas it is 14 cents per kWh, so that makes it $0.14/kWh)

X watts * 0.001 kW/watt * Y hours/day * Z $/kWh = $/day cost of operation
X watts * 0.001 kW/watt * Y hours/day * Z $/kWh * 365 days/year = $/year cost of operation
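
Here's a quick Python sketch of those two formulas if you'd rather not do it by hand; nothing official, the 150 W card and 2 hours a day are just placeholder numbers matching the example below, and the rate is the Texas one:

def cost_per_day(watts, hours_per_day, dollars_per_kwh):
    # X watts * 0.001 kW/watt * Y hours/day * Z $/kWh = $/day
    return watts * 0.001 * hours_per_day * dollars_per_kwh

def cost_per_year(watts, hours_per_day, dollars_per_kwh):
    # same thing, times 365 days/year
    return cost_per_day(watts, hours_per_day, dollars_per_kwh) * 365

# placeholder example: 2 hours a day of gaming at 150 W, at $0.14/kWh
print(cost_per_day(150, 2, 0.14))   # 0.042 -> about 4 cents a day
print(cost_per_year(150, 2, 0.14))  # 15.33 -> about $15 a year
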
If you are running a video card, you should do this twice.
Once with the hours per day you game, at the card's max power draw (e.g., 2 hours a day at 150 watts).
And a second time with the difference in idle power consumption, times the hours per day you have the computer on but don't game (e.g., 10 hours a day of web surfing at 30 watts minus the 5 watts of an IGP; or 30 watts for the new card minus 35 watts for your current card = -5 watts, i.e., you actually save power at idle).
If it is a GTX 280 in a media center, you should do it a third time, for however many hours a day you use it to decode video (the card has three power modes: 25 watts at idle, 35 watts in video decode mode, and I think over 150 watts during gaming).
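
And here's the same sketch with all three passes added up for a hypothetical GTX 280 media center box; the wattages are the ones quoted above (25 W idle, 35 W decode, ~150 W gaming), but the hours per day are made up purely for illustration:

RATE = 0.14  # $/kWh, the Texas rate from earlier

def yearly_cost(watts, hours_per_day):
    # X watts * 0.001 kW/watt * Y hours/day * Z $/kWh * 365 days/year
    return watts * 0.001 * hours_per_day * RATE * 365

gaming = yearly_cost(150, 2)   # 2 h/day at full gaming load
decode = yearly_cost(35, 3)    # 3 h/day decoding video
idle   = yearly_cost(25, 10)   # 10 h/day sitting at the desktop
# (for an upgrade, subtract your current card's idle draw from the new
# card's before the idle pass, like the -5 watt example above)

print(round(gaming + decode + idle, 2))  # about $33.47 a year
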