dust
Golden Member
- Oct 13, 2008
The 6970/6950 were a step back for AMD power wise.
Where did you get that?
That's not what I see here:
http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/24
Against the Radeon HD 5970, the HD 6970 consumed 3% less power yet we found that on average it was 15% slower, 26% slower in Crysis Warhead, which is what we used for stress testing. The Radeon HD 6970 used roughly the same amount of power as the GeForce GTX 570 consuming 1% more when stressed and 4% less at idle. Given that both graphics cards provide similar performance, this time around it seems they are equally matched.
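To connect those two numbers (a back-of-envelope sketch; the ~3% and ~15% figures come from the quote above, the rest is just arithmetic):

```python
# Rough perf-per-watt comparison of the HD 6970 against the HD 5970,
# using the figures quoted above: ~3% less power, ~15% slower on average.
power_ratio = 0.97  # 6970 draws ~97% of the 5970's power
perf_ratio = 0.85   # 6970 delivers ~85% of the 5970's performance

perf_per_watt = perf_ratio / power_ratio
print(f"6970 perf/W relative to the 5970: {perf_per_watt:.2f}")
# ~0.88, i.e. roughly 12% worse performance per watt -- presumably
# what "a step back power wise" refers to.
```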
:thumbsup: Ya, seriously. People who drop $300 on CPUs and overclock them, like a Core i7 920 @ 4.0-4.2GHz, or even build $200 Phenom II systems that suck power when overclocked, and then use the "power savings" argument for GPUs make me laugh!! I expect every single one of them to convert to SB immediately.
It's funny how none of the same people advocate moving towards the Core i5 661 for gaming...
But when the GTX 480 is overclocked, the power used increases a great deal.
LOL, the point went entirely over your head. Really, no card was competing with the GTX 480 until the 6970. 8 months? The 5870 was some 18% slower than the GTX 480, going by your math in another thread.
Unless you count the 5970? If you do, then I guess the GTX 295 was in competition with the 5870 in those 6 months.
ROFL, I knew you would say something bogus like this. The 3800 was a die shrink of the 2900, so please try again. Going by your standards, the 3000 series wasn't worthy of being a new generation either, and there were only like 4 cards released for the entire series. So it's quite funny you would use this for your argument, because it really doesn't support it at all.
HD3870 --> HD4870 was accomplished on the same 55nm process. The performance jump was enormous.
Not necessarily. I can tell you with certainty, when you bought a mid-high end or a high-end card, the performance difference was substantial.
- $500 9700Pro was 50-70% faster than a $200 9500Pro.
- $425 6800 Ultra was 70-100% faster than a $200 6600GT. $350 6800GT was an even better value.
- $500 X850XT was a good 30% faster than a $350 X850Pro
- $500 5950 Ultra was 70-100% faster than the $200 5700 Ultra.
- $300 HD4870 was 30-40% faster than the $200 HD4850
- $400 7800GTX was 70%+ faster than a $200 7600GT.
^^^There are far too many examples to list.
Today, the increase in price is not commensurate with the increase in performance of these $350-500 cards over the $200 offerings. This generation especially, the price/performance of high-end cards is one of the worst of the past 10 years. Just think about it: the HD5870 (high-end) was at least 70% faster on average than an HD5770 (mid-range). Is the HD6970 70% faster than an HD6850? Not even close.
It was almost unthinkable in the past to take a mid-range card and overclock it to a high-end card's performance. The only time this happened was the GeForce4 Ti 4200, if I remember correctly (unless you consider the 9500Pro unlocking into the 9700 series). It wasn't until the 8800GT that mid-range cards started to be so powerful. Before that, when you paid a premium for a high-end card, you got a massive performance increase!! It was really the 8800GT that opened our eyes to the world of mid-range $200 goodness.
I have been building computers for 10 years. I am of the view that in the past a high-end card really did justify the $400-500 price. Today, a $350 card is barely 20% faster than a $200 card. $500 cards are barely 40-50% faster than $200 cards. It's a free market, so people are free to buy those $500 cards. That doesn't change the fact that they are poor value. You may think my expectations are too high, but I think the market has changed, like you said. Nowadays, consumers just expect way less. Not sure why that is.
It really isn't an issue for anyone, unless you live in the boonies, eat cold food, hand wash and air dry your clothes, and don't use lights because you go to bed when the sun sets. Unless that's you, you're not allowed to speak of power savings when it comes to hardwired (non battery powered) electronic devices.
Your video card uses 1/100th of the power of all the other appliances and commodities you use on a daily basis that you could easily survive without. If you are concerned with saving money on your power bill, shut off your AC, shut off your electric water heater, don't use the oven, don't use the electric cooktop, don't use the dryer, and replace all your mechanical switches with electronic dimmers or replace all your light bulbs with bulbs of half the wattage.
That's how you save money, not by bragging that ATI uses 50W less than NVIDIA. 50W is the wattage of one halogen light bulb in your kitchen, for **** sakes.
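If anyone wants to check the math on that 50W, here's a quick sketch. The electricity rate and gaming hours are assumptions, not figures from this thread; plug in your own:

```python
# What a 50W difference between two cards actually costs per month.
watts_extra = 50     # the delta from the post above
hours_per_day = 4    # assumed daily gaming time
rate_per_kwh = 0.12  # assumed electricity price in $/kWh

kwh_per_month = watts_extra / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month")
# 6.0 kWh/month -> $0.72/month
```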
1. Go ahead and show me these people who overclock like mad and use the power argument. I doubt there are many of them, because obviously they prioritize performance.
What's funny is that I haven't come across a proper review of processor power consumption during gaming. You simply can't use the typical power consumption figures that reviewers publish, because they test processors under full 100% load, and that is definitely not the same situation a processor undergoes when gaming. So again, efficiency comes into play. It's more like: what power does a processor need when it's 10% loaded? 20%? 30%? 40%? 50%? 60%? 70%? 80%?
http://www.xbitlabs.com/articles/video/display/radeon-hd6970-hd6950_13.html#sect0
The 6970 performs similarly to a GTX 570 yet uses more power. So their smaller, more efficient chip strategy seems to have gone out the window.
According to that chart above from HardOCP, the 6970 uses 50 more watts than the 6950, and I've read that 6950s unlocked to 6970s can use an additional 25 or so. That's 50-75 watts more, an amount that's being argued about by the 'green power savers'.
So do owners unlock their 6950s and be wasteful? Does that argument about needing a better PSU come into play?
I'm being playfully sarcastic with some of these rhetorical questions.
It's called INEFFICIENCY. You see it in heat and added power use from silicon that's less perfect. It's similar to any electrical situation where added resistance is introduced.
Where did you read this nonsense? That's impossible; it's the exact same chip at the same clocks, at the exact same voltage. The only difference is that the memory chips used are rated for a slower speed, and that has nothing to do with the power consumption.
http://www.semiaccurate.com/2010/12/27/radeon-hd-6950-flashable-6970/
Also make sure you have a decent power supply, as the card will go from using about 169W at default settings to 202W with the Shaders unlocked all the way to 252W with the card overclocked to the same speeds as a 6970. That means you're actually using 24W more than what a real 6970 would do, again, not likely to be a huge concern for anyone that just wants a bit of extra performance out of their card.
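Breaking those quoted figures down (straight arithmetic on the numbers in the SemiAccurate quote; nothing here is independently measured):

```python
# Power draw figures as quoted from the SemiAccurate article above.
stock = 169          # W, 6950 at default settings
unlocked = 202       # W, shaders unlocked
overclocked = 252    # W, unlocked and clocked to 6970 speeds
over_real_6970 = 24  # W more than a real 6970, per the quote

print(f"Unlocking costs:   +{unlocked - stock} W")        # +33 W
print(f"Overclocking adds: +{overclocked - unlocked} W")  # +50 W
print(f"Implied real 6970 draw: {overclocked - over_real_6970} W")  # 228 W
```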
I was one of the ones that actually fell for the power issue being a huge concern. I figured I would be spending an extra $20 a month in electricity. Man what a fool I was. That marketing is some clever stuff. Makes me wonder what else I've foolishly believed over the years.
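For what it's worth, you can work backwards to see how big a delta $20/month would actually require. The electricity rate is an assumption:

```python
# How many extra watts, running 24/7, would add $20 to a monthly bill?
target_per_month = 20.0  # $ per month, the figure from the post above
rate_per_kwh = 0.12      # assumed electricity price in $/kWh

kwh_needed = target_per_month / rate_per_kwh  # ~167 kWh/month
watts_24_7 = kwh_needed / (24 * 30) * 1000    # continuous draw in watts
print(f"~{watts_24_7:.0f} W running around the clock")  # ~231 W
# No GPU delta discussed in this thread comes anywhere close to that.
```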
We see this all the time: better-binned CPUs/GPUs overclock further with less voltage, ergo they use less power.
You take a GPU that was binned as a 6950, probably because it's an inferior piece, clock it to non-standard clocks, and it uses more power.
Are we in denial over that?
Quoting HappyMedium: "No real reviewer uses Furmark any more."
From http://www.hardocp.com/article/2010/12/14/amd_radeon_hd_6970_6950_video_card_review/8 using BFBC2.
You're also accepting more performance, which has to be taken into consideration along with the increase in power. I've already explained this... In the above scenarios, I would unlock my 6950, just like I overclock my GTX 460s, and accept the power penalties.
At similar price points and performance, it makes no sense to buy the product that consumes more power.
Wow, you really exaggerated the 4850-to-4870 jump. It's more like 25%, not 30-40%.
Let's compare the GTX 580 ($500) to the GTX 460 ($200). The 580 is 70% faster!
The 6970 ($370) is 40% faster (a lot more in some games and situations) than the 6850 ($180), which doesn't seem too far off.
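Putting both pairs on the same footing (prices and percentages are the ones quoted above; perf/$ is just their ratio):

```python
# Price/performance of the quoted high-end vs mid-range pairs.
# Performance is normalized so the cheaper card = 1.0.
pairs = {
    "GTX 580 vs GTX 460": (500, 1.70, 200, 1.00),
    "HD 6970 vs HD 6850": (370, 1.40, 180, 1.00),
}
for name, (hi_price, hi_perf, lo_price, lo_perf) in pairs.items():
    ratio = (hi_perf / hi_price) / (lo_perf / lo_price)
    print(f"{name}: high-end delivers {ratio:.0%} of the mid-range perf/$")
# Both pairs land at about 68% -- you pay a similar premium either way.
```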