ATI 4xxx Series Thread


ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: allies
The only reason Nvidia is so ahead of ATi right now is because of the leap the 8800GTX provided. It was eerily similar to the 9700Pro leap. I bet next generation ATI is a similar leap.

While I wish you were right, I kind of doubt it. I think GPUs have hit stagnation for the most part; from here on we will mostly see marginal increases. Again, I hope I am wrong, as I would be absolutely thrilled if either company came out with another 9700Pro/8800GTX-type card, but those kinds of leaps can't keep happening as often as we would all hope.

Time will tell... Right now the new R700 looks to be pretty interesting. If they can double the performance of the 3870, I might be in the market for a new card. But with GT200 close behind it, who knows what will happen. Even now, GT200 looks to be a marginal increase if the rumors (just rumors) are true.
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Originally posted by: batmang
I hope the 4800 series destroys the 9800 series and GT200. It's time for ATi to take the power back!!! (RATM FTW.)

That is a given; it just depends on which 9800 series you are talking about (ATI or Nvidia).

 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: Jax Omen
taltamir, even if my entire 700W PSU was running at full-blast all day (presumably consuming 700W from the wall 24/7) it wouldn't be a significant chunk of my power bill. No more than $5-$10 a month. A GPU isn't even close to that.

Sometimes it isn't just about the money. If all computers consumed less electricity, that would be a GREAT thing, not only for our own electricity bills but also for the environment (yes, yes, I'm a bit of a tree hugger) since...you know...you actually have to PRODUCE that electricity you're consuming. In my area it costs about 7-9 cents/kWh (that doesn't include delivery and all those other charges on the bill...another thing most people don't factor in when calculating money saved/spent), and only about 24% of the grid power comes from fossil fuels, so here it isn't that big of a deal. However, in some areas that percentage may be much higher, so it does make a difference.

To each his own though...if you don't care...well...you don't care.

EDIT: Just to go along with your 700w PSU theory:
0.7 kW * 24 hours * 30 days * $0.15/kWh (total cost including GST, etc. in my area) = $75.60/month

That's not insignificant, and much more than the $5-$10 you assumed. Obviously a regular (regular for ATers anyway) computer consumes only around 200W at idle, so the cost would be less.
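
That arithmetic generalizes to a one-line formula; a minimal sketch in Python (the function name and the 30-day month are illustrative conventions; the rates and draws are the ones quoted in this thread):

```python
def monthly_cost(draw_kw, rate_per_kwh, hours_per_day=24, days=30):
    """Cost of running a constant electrical load for a billing month."""
    return draw_kw * hours_per_day * days * rate_per_kwh

# The 700W-around-the-clock case at $0.15/kWh all-in:
print(monthly_cost(0.7, 0.15))  # 75.6 -> $75.60/month

# A more typical ~200W idle rig at the same rate:
print(monthly_cost(0.2, 0.15))  # 21.6 -> $21.60/month
```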
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: taltamir
Originally posted by: nRollo
On power: I'd note that the amount of power a PC used half the day consumes is pretty much a drop in the bucket on your electricity bill anyway.

On upcoming parts: I have a hunch you won't be surprised, BFG. Good stuff coming from the Santa Clara team (or so say the rumors, anyway).

No, it isn't. It is a significant expense. Do the math and you will see for yourself.

Yes, it is. It's pocket change.

Energy costs $0.0767/kWh where I live. Let's say I run a 200W video card 24/7.

That's 4,800Wh a day, or 4.8kWh/day.

4.8kWh * $0.0767 = $0.37/day to run 200W of video card 24/7.

That's $11.47 a month.

I tipped the bartender more than that for a couple drinks last night after work.

P.S. Of course this $11.47/month is only if you're using 200W of video juice all day long every day.

Me, I don't come anywhere close to using my computer 24/7- I have a career, family, other hobbies.

IMO if $11/mo is a concern, computer gaming doesn't make sense economically.
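
A quick check of those numbers; a minimal sketch (the $11.47 figure comes from rounding to $0.37/day first and assuming a 31-day month):

```python
# nRollo's example: 200W running 24/7 at $0.0767/kWh
kwh_per_day = 0.2 * 24               # 4.8 kWh/day
cost_per_day = kwh_per_day * 0.0767  # ~$0.368/day
print(round(cost_per_day, 2))        # 0.37
print(round(cost_per_day * 31, 2))   # 11.41 (0.37 * 31 = 11.47 if you round per-day first)
```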
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: nRollo
Heh- Thilan and I must be psychically linked....

Don't forget the extra costs other than the cost of electricity itself...(i.e. GST, delivery, transmission, and a bunch of others I forgot). For me it's about $0.15/kWh including everything.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
Originally posted by: thilan29
Don't forget the extra costs other than the cost of electricity itself...(i.e. GST, delivery, transmission, and a bunch of others I forgot). For me it's about $0.15/kWh including everything.

We just had an avalanche take out three towers from our hydro plant, and we are going to be charged $0.50+/kWh for the next three months or so because we are now running about 80% on diesel. Normally we pay about $0.11/kWh.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: thilan29
Originally posted by: Jax Omen
taltamir, even if my entire 700W PSU was running at full-blast all day (presumably consuming 700W from the wall 24/7) it wouldn't be a significant chunk of my power bill. No more than $5-$10 a month. A GPU isn't even close to that.

Sometimes it isn't just about the money. If all computers consumed less electricity, that would be a GREAT thing, not only for our own electricity bills but also for the environment (yes, yes, I'm a bit of a tree hugger) since...you know...you actually have to PRODUCE that electricity you're consuming. In my area it costs about 7-9 cents/kWh (that doesn't include delivery and all those other charges on the bill...another thing most people don't factor in when calculating money saved/spent), and only about 24% of the grid power comes from fossil fuels, so here it isn't that big of a deal. However, in some areas that percentage may be much higher, so it does make a difference.

To each his own though...if you don't care...well...you don't care.

EDIT: Just to go along with your 700w PSU theory:
0.7 kW * 24 hours * 30 days * $0.15/kWh (total cost including GST, etc. in my area) = $75.60/month

That's not insignificant, and much more than the $5-$10 you assumed. Obviously a regular (regular for ATers anyway) computer consumes only around 200W at idle, so the cost would be less.

Congrats thilan... I told people to do the math like I did and see that it is otherwise, and I got things like "no dude, it totally costs only $5-10 a month"...

Congrats to thilan on actually doing the math.

Also congrats on remembering the extra costs. There is ALSO the extra cost of AC... if your PC generates all that extra heat, you have to pay more on your AC bill because the AC has to remove all that extra heat. In fact, I found out that in freezing winter I have to turn the COOLING on if I have more than 2 computers on at the same time.
During winter:
0-1 PCs on: heating on
2 PCs on: off
3+ PCs on: cooling on.

@nRollo: $10 a month might be pocket change to you, but when figuring out the price/performance of a certain part it makes a serious difference. If you keep a card a year (oftentimes more) and it costs $10 more a month than another card, then that is $120 a year. A significant difference.

If I left my PC on 24/7 I would give utmost priority to hybrid power exactly because of that.
(Since I live in Texas it's $10 per 100 watts per month, not $10 for 200 watts like you get, and it is mostly from coal, although I opted for the cleaner natural gas.)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
Congrats thilan... I told people to do the math like I did and see that it is otherwise, and I got things like "no dude, it totally costs only $5-10 a month"...

Congrats to thilan on actually doing the math.

Also congrats on remembering the extra costs. There is ALSO the extra cost of AC... if your PC generates all that extra heat, you have to pay more on your AC bill because the AC has to remove all that extra heat. In fact, I found out that in freezing winter I have to turn the COOLING on if I have more than 2 computers on at the same time.
During winter:
0-1 PCs on: heating on
2 PCs on: off
3+ PCs on: cooling on.

@nRollo: $10 a month might be pocket change to you, but when figuring out the price/performance of a certain part it makes a serious difference. If you keep a card a year (oftentimes more) and it costs $10 more a month than another card, then that is $120 a year. A significant difference.

If I left my PC on 24/7 I would give utmost priority to hybrid power exactly because of that.
(Since I live in Texas it's $10 per 100 watts per month, not $10 for 200 watts like you get, and it is mostly from coal, although I opted for the cleaner natural gas.)

What about for me, where the PC provides heat for my living room and warms my toes in the morning?
- right now, it's almost 9:30 PM and it is in the upper 50s F outside - the "extra" heat is welcome; only July and August are really hot here
My home is far more "cool" than "warm" and i NEVER - ever - turn on the AC except to test it once a year; i use evaporative cooling in the high desert, and my electricity bill averages less than $90 a month total, averaged over all 12 months; my stove and water heater are electric! I have another apartment on the same meter with a storage heater [1750W] that is on in winter at least 12 hours a day!

Now how much am i really saving over a GPU that saves only UP TO "200W"? We are not "saving" all 850W, you know
- my PSU is 850W, i have HD2900 CrossFire .. and my rig is on 12 hours a day > mostly for gaming and benchmarking

Your math sucks.

NO one leaves their rig running games 24 hours a day on max load.

PERIOD

Maybe $3 to $8 a month; i would say mostly less than $3 a month of practical DIFFERENCE [between using 750W and using, say, 500W with the "power saver" on].
You care that much? What kind of CAR do you drive? If it is an SUV you are a hypocrite; switch to CFLs or turn the thermostat up or down 2 degrees to make a *real* difference; try insulation and replace older appliances - instead of trying to save a nickel a day on entertainment.
get real
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: apoppin
What about for me, where the PC provides heat for my living room and warms my toes in the morning?
- right now, it's almost 9:30 PM and it is in the upper 50s F outside - the "extra" heat is welcome; only July and August are really hot here
My home is far more "cool" than "warm" and i NEVER - ever - turn on the AC except to test it once a year; i use evaporative cooling in the high desert, and my electricity bill averages less than $90 a month total, averaged over all 12 months; my stove and water heater are electric! I have another apartment on the same meter with a storage heater [1750W] that is on in winter at least 12 hours a day!

Now how much am i really saving over a GPU that saves only UP TO "200W"? We are not "saving" all 850W, you know
- my PSU is 850W, i have HD2900 CrossFire .. and my rig is on 12 hours a day > mostly for gaming and benchmarking

Your math sucks.

NO one leaves their rig running games 24 hours a day on max load.

PERIOD

Maybe $3 to $8 a month; i would say mostly less than $3 a month of practical DIFFERENCE [between using 750W and using, say, 500W with the "power saver" on].
You care that much? What kind of CAR do you drive? If it is an SUV you are a hypocrite; switch to CFLs or turn the thermostat up or down 2 degrees to make a *real* difference; try insulation and replace older appliances - instead of trying to save a nickel a day on entertainment.
get real

Of course they don't run games 24/7. It's funny, but the power saving features we mentioned ONLY matter if you are NOT playing games 24/7... the MORE games you play, the LESS effective power saving features become, since they only come into play during IDLING, not during actual gameplay.

But we were discussing the ability to completely turn off a video card when it is not in use (hybrid power). And many video cards today take 100-150 watts of IDLE power. Per video card (so SLI/CrossFire have it worse).

You are just pulling numbers out of thin air. How did you calculate $3-8?
It is fairly simple to calculate if you know how many hours a day you play games, and how many hours a day you do 2D stuff or just leave the computer idling.
You calculate load power and idle power, and then just multiply it out.

If it is an SUV you are a hypocrite; switch to CFLs or turn the thermostat up or down 2 degrees to make a *real* difference; try insulation and replace older appliances - instead of trying to save a nickel a day on entertainment.
Bold and silly assumptions to make. Do you honestly think someone who says they are concerned about the environment has not done all that? I drive a 4-cylinder high-efficiency Honda, I install insulation, I switched to CFLs, etc... way to go catching the "hypocrisy".

You insist again and again on just tossing numbers and NOT doing any math.
Tell me your video card, how many hours a day you have the computer idle but on, how many hours a day off, and how many hours a day you game (average hours for all of these) and I will do the math for you.
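
The math being offered here amounts to weighting idle and load draw by hours per day; a minimal sketch, where the wattages, hours, and rate in the example are illustrative assumptions rather than measurements of any particular card:

```python
def card_monthly_cost(idle_w, load_w, idle_h_per_day, game_h_per_day,
                      rate_per_kwh, days=30):
    """Monthly cost of a video card from its idle/load draw and daily hours."""
    kwh_per_day = (idle_w * idle_h_per_day + load_w * game_h_per_day) / 1000.0
    return kwh_per_day * days * rate_per_kwh

# Illustrative card: 120W at idle for 10h/day, 250W for 3h/day of gaming,
# at $0.15/kWh all-in:
print(round(card_monthly_cost(120, 250, 10, 3, 0.15), 2))  # 8.78

# With hybrid power shutting the card off at idle, only gaming hours count:
print(round(card_monthly_cost(0, 250, 10, 3, 0.15), 2))    # 3.38
```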
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Seems odd to be a month away without some type of 3DMark score. The prices still make it seem like a midrange solution to me.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: ArchAngel777
Originally posted by: allies
The only reason Nvidia is so ahead of ATi right now is because of the leap the 8800GTX provided. It was eerily similar to the 9700Pro leap. I bet next generation ATI is a similar leap.

While I wish you were right, I kind of doubt it. I think GPUs have hit stagnation for the most part; from here on we will mostly see marginal increases. Again, I hope I am wrong, as I would be absolutely thrilled if either company came out with another 9700Pro/8800GTX-type card, but those kinds of leaps can't keep happening as often as we would all hope.

Time will tell... Right now the new R700 looks to be pretty interesting. If they can double the performance of the 3870, I might be in the market for a new card. But with GT200 close behind it, who knows what will happen. Even now, GT200 looks to be a marginal increase if the rumors (just rumors) are true.

I didn't mean the RV770! Oops... if anything, it will be NEXT generation, not this one, that makes such a huge leap. IMO.
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: Killrose
Originally posted by: thilan29
Don't forget the extra costs other than the cost of electricity itself...(i.e. GST, delivery, transmission, and a bunch of others I forgot). For me it's about $0.15/kWh including everything.

We just had an avalanche take out three towers from our hydro plant, and we are going to be charged $0.50+/kWh for the next three months or so because we are now running about 80% on diesel. Normally we pay about $0.11/kWh.

Wow that's a lot more expensive. I'm lucky in that there aren't really any natural disasters in my area.

Originally posted by: kreacher
Tom's news link - http://www.tomshardware.com/ne...-radeon-4800,5223.html

Thanks for the link. Some nice info if all is true.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
Of course they don't run games 24/7. It's funny, but the power saving features we mentioned ONLY matter if you are NOT playing games 24/7... the MORE games you play, the LESS effective power saving features become, since they only come into play during IDLING, not during actual gameplay.

But we were discussing the ability to completely turn off a video card when it is not in use (hybrid power). And many video cards today take 100-150 watts of IDLE power. Per video card (so SLI/CrossFire have it worse).

You are just pulling numbers out of thin air. How did you calculate $3-8?
It is fairly simple to calculate if you know how many hours a day you play games, and how many hours a day you do 2D stuff or just leave the computer idling.
You calculate load power and idle power, and then just multiply it out.
Like you did ... only i did not assume 750W .. the difference is more like 100-150W .. i divided your assumed figure by ~4

If it is an SUV you are a hypocrite; switch to CFLs or turn the thermostat up or down 2 degrees to make a *real* difference; try insulation and replace older appliances - instead of trying to save a nickel a day on entertainment.
Bold and silly assumptions to make. Do you honestly think someone who says they are concerned about the environment has not done all that? I drive a 4-cylinder high-efficiency Honda, I install insulation, I switched to CFLs, etc... way to go catching the "hypocrisy".
i actually honestly do. IMO, only a hypocrite will b!tch that we should save a nickel on something while they personally don't give a sh!t about their own carbon footprint and only bring it up to make a point about someone else.

I am NOT talking about *you* specifically - rather, "in general" - yes, it is very hypocritical to be conscious of the pennies while wasting dollars.
You insist again and again on just tossing numbers and NOT doing any math.
Tell me your video card, how many hours a day you have the computer idle but on, how many hours a day off, and how many hours a day you game (average hours for all of these) and I will do the math for you.

i already told you "how" .. i divided your unrealistic average by ~4.

i also have taken into consideration my OWN electricity bills and how much more they went up over 3 months after adding an overclocked 2900Pro .. less than a dollar for each month [over last year] as far as i can see. $3 was my gift to your faulty math.
 

kb2114

Member
May 8, 2006
86
0
0
Hmm... based on that article it looks like I'll be picking up either dual 4870s or the X2. Unless Nvidia can offer a single card solution that blows both out of the water...
 

phexac

Senior member
Jul 19, 2007
315
4
81
I am certainly hoping that the new line of ATI cards does great. But I will likely hold off buying until both ATI's and NVIDIA's next gen are out on the market.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: taltamir
Congrats thilan... I told people to do the math like I did and see that it is otherwise, and I got things like "no dude, it totally costs only $5-10 a month"...

Congrats to thilan on actually doing the math.

Also congrats on remembering the extra costs. There is ALSO the extra cost of AC... if your PC generates all that extra heat, you have to pay more on your AC bill because the AC has to remove all that extra heat. In fact, I found out that in freezing winter I have to turn the COOLING on if I have more than 2 computers on at the same time.
During winter:
0-1 PCs on: heating on
2 PCs on: off
3+ PCs on: cooling on.

@nRollo: $10 a month might be pocket change to you, but when figuring out the price/performance of a certain part it makes a serious difference. If you keep a card a year (oftentimes more) and it costs $10 more a month than another card, then that is $120 a year. A significant difference.

If I left my PC on 24/7 I would give utmost priority to hybrid power exactly because of that.
(Since I live in Texas it's $10 per 100 watts per month, not $10 for 200 watts like you get, and it is mostly from coal, although I opted for the cleaner natural gas.)

No. This math is retarded. First, no one has a 700 watt power supply that is forced to output 700 watts 24 hours a day, 7 days a week. The vast majority of people are probably well under the max output of their power supply. Secondly, you are comparing running a 700 watt power supply at 100% load 24/7 to running nothing. You are not comparing two rigs with different video cards. Rig A may be running a video card that takes 100 watts. Rig B may be running a video card that swallows up 130 watts. Assuming everything else is the same, you are looking at a 30 watt difference, not a 700-watts-to-nothing difference. Not to mention that is only when the video card is at 100% load... so unless you game 24/7 I don't see that adding up to much.
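
This framing reduces to costing only the draw difference between the two cards; a minimal sketch using the 100W vs. 130W figures above, with the hours and rate as illustrative assumptions:

```python
def delta_cost(watts_a, watts_b, hours_per_day, rate_per_kwh, days=30):
    """Monthly cost difference between two cards, counting only the draw delta."""
    delta_kwh = abs(watts_a - watts_b) / 1000.0 * hours_per_day * days
    return delta_kwh * rate_per_kwh

# 100W vs. 130W card, 4h/day at full load, $0.10/kWh (assumed values):
print(round(delta_cost(100, 130, 4, 0.10), 2))  # 0.36 -> about 36 cents a month
```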
 

Jax Omen

Golden Member
Mar 14, 2008
1,654
2
81
Thilan, I am so sorry about energy costs in your area. Holy crap.

75% of the power around here is hydro-electric. The rest I dunno, never cared to find out.

It's like $0.05/kWh here. Seriously. So OK, maybe $25 for the entire system per month, if it was consuming every ounce of energy the PSU could provide.

The most power-hungry GPU I've heard of is, what, 250W? I'll go with Rollo's numbers. $10-$15 a month isn't significant to anyone who can afford a high-end GPU.
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: SlowSpyder
No. This math is retarded. First, no one has a 700 watt power supply that is forced to output 700 watts 24 hours a day, 7 days a week. The vast majority of people are probably well under the max output of their power supply. Secondly, you are comparing running a 700 watt power supply at 100% load 24/7 to running nothing. You are not comparing two rigs with different video cards. Rig A may be running a video card that takes 100 watts. Rig B may be running a video card that swallows up 130 watts. Assuming everything else is the same, you are looking at a 30 watt difference, not a 700-watts-to-nothing difference. Not to mention that is only when the video card is at 100% load... so unless you game 24/7 I don't see that adding up to much.

Ummm...I thought it was pretty obvious that I only chose 700W because in the post I quoted, the poster said 700W would only be $5-$10 per month. I was only showing that, had that been the case, it would NOT have been an insignificant amount. I did that math more to show that there are other costs besides the basic charge per kWh which should be factored in. Here's what I wrote right afterwards:
"Obviously a regular (regular for ATers anyway) computer consumes only around 200W at idle, so the cost would be less."

So yes, I know that most computers don't consume 700W. And going back to my original post...besides the cost to the individual consumer, if all computers could, for example, turn off their discrete video cards, then taken as a whole those are huge energy savings, which benefits almost everyone.


Originally posted by: Jax Omen
Thilan, I am so sorry about energy costs in your area. Holy crap.

75% of the power around here is hydro-electric. The rest I dunno, never cared to find out.

It's like $0.05/kWh here. Seriously. So OK, maybe $25 for the entire system per month, if it was consuming every ounce of energy the PSU could provide.

Ours is mostly hydro and nuclear...about 24% from fossil fuels. It's about $0.07/kWh here for the basic charge, but remember there are other charges that should be outlined in your utility bill; for me they bring the total to about $0.15/kWh, so it more than doubles the basic charge.
 

Jax Omen

Golden Member
Mar 14, 2008
1,654
2
81
I'm well aware of the other charges. I get charged $0.025/kWh for the power, another $0.025/kWh for delivery/taxes/etc., and a $5/month service fee.

Yes, he chose 700W because of my comment. And while I did estimate low based on power costs in my area, I wasn't *that* far off.



I still stand by the claim that anyone who can afford high-end GPUs shouldn't be affected by the power costs associated with said GPUs. It's pocket change by comparison. And if they are? Turn off your damn AC/heat! Those consume more power than everything else combined in the average home. Next most power-hungry is the fridge/freezer. PCs are pretty far down the list.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
After seeing firmer specs on the RV770, I don't think there's too much good news here. The only good news for ATI is that they'll have the fastest single-GPU card for a month or two until GT200 releases, at which point they'll get lapped again in terms of performance. Then an X2 version might put them in a competitive position again, at which point NV will respond with a die shrink or an SLI-on-a-card solution of their own, or both. All while maintaining a comfortable lead at the high end with a $2000 GT200 Tri-SLI solution.

As for the 4870, I don't think it'll be much faster than the 9800GTX/8800GTX/Ultra in terms of performance. Maybe 15-25% faster, max. Going from 16 to 32 TMUs seems to be the biggest gain here, and texturing was specifically mentioned as a major bottleneck for ATi's R600 parts. Still, that only puts ATI's texture fill rate equivalent to a 9600GT's, not counting any advantages from different vendor designs. The rest of the specs seem rather unspectacular with questionable gains, although shaders may also scale well, as that seemed to be another weak point of R600. Going from 64 to 96 real shaders (320 to 480 superscalar), along with unlinked shader clocks, should help close any gaps in shader performance in unoptimized games where NV previously held a lead.

This part would've been a great answer to G80/G92 six months ago when RV670 released, or even a year ago when R600 released. But at this point I think it'll be obvious that it's too little, too late, mostly competing with G80/G92 and made obsolete again when NV fires back with GT200 later this quarter.
 