Would AMD be better off if they had not developed Bulldozer?


Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
This isn't last year. As I said above, I'm not talking about the future. I'm also not talking about the past. You can live in the past if you want, but I am comparing the best GPUs available today. Not a year ago, not 6 months ago- today.

You can make excuses for Nvidia if you want, but the fact is that right now, today, they are making cards which require more power and perform worse.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
A 2600K with a GTX 580 would probably beat a 7970 with a BD chip, since the AMD CPU is holding back the video card lol

In some games yes, in others no.

The latest DX11 FPS games like AVP, Crysis 2 DX11, BF3 and more don't scale with faster CPUs at and above 1080p.
 

blckgrffn

Diamond Member
May 1, 2003
9,198
3,185
136
www.teamjuchems.com
This isn't last year. As I said above, I'm not talking about the future. I'm also not talking about the past. You can live in the past if you want, but I am comparing the best GPUs available today. Not a year ago, not 6 months ago- today.

You can make excuses for Nvidia if you want, but the fact is that right now, today, they are making cards which require more power and perform worse.

Perhaps the key difference is that for a while they were the big kid on the block and that is what made the power consumption tolerable? It is only now they play second fiddle while carrying the burden of higher power consumption.

Plus, in compute they were (are?) nearly untouchable. Even now if you are running a CUDA app they are still the top card. (590 excluded)

If that is analogous to the lead Intel had before BD launched in gaming, AMD has not narrowed that gap.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
In some games yes, in others no.

The latest DX11 FPS games like AVP, Crysis 2 DX11, BF3 and more don't scale with faster CPUs at and above 1080p.

I suspect the Bulldozer + 7970 would use less power as well. Most sites only tested power with all cores pushed to 100%. When Bulldozer performs poorly in games, it's usually because the games are not very multi-threaded and would only push 2-3 of the 8 cores, which means power usage is greatly reduced compared to the tests people refer to when trying to prove how much power Bulldozer uses.

I have a Kill-A-Watt meter coming shortly; I plan to do a few simple power tests myself out of curiosity.
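For anyone wondering what those Kill-A-Watt tests could look like, here is a minimal sketch of the bookkeeping: record total wall power per scenario and take the difference between systems. All system names and wattages in it are hypothetical placeholders, not measurements from this thread.

```python
# Rough sketch of the kind of comparison a Kill-A-Watt allows: read total
# wall power for each scenario and subtract. Every wattage figure below is
# a made-up placeholder, NOT a measured result.

readings_watts = {
    "idle": {"i5-2500K + GTX 580": 110, "FX-8150 + HD 7970": 115},
    "gaming (2-3 threads busy)": {"i5-2500K + GTX 580": 380, "FX-8150 + HD 7970": 330},
    "all cores at 100%": {"i5-2500K + GTX 580": 320, "FX-8150 + HD 7970": 400},
}

for scenario, systems in readings_watts.items():
    (name_a, watts_a), (name_b, watts_b) = systems.items()
    delta = watts_b - watts_a
    print(f"{scenario}: {name_b} draws {delta:+d} W relative to {name_a}")
```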
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Perhaps the key difference is that for a while they were the big kid on the block and that is what made the power consumption tolerable? It is only now they play second fiddle while carrying the burden of higher power consumption.

Plus, in compute they were (are?) nearly untouchable. Even now if you are running a CUDA app they are still the top card. (590 excluded)

If that is analogous to the lead Intel had before BD launched in gaming, AMD has not narrowed that gap.

Interesting how your reply is timed before my post.

I agree with your analysis, and I think the same is the case with bulldozer-

If the 8150 had destroyed the i7 2600, the power usage would be an "oh, that sucks, but still an awesome CPU" type thing. Since its performance wasn't that good, people are really focusing on the power usage.

I just wanted to point out the similarity- a GTX 580 falls into the exact same sort of situation. It's not as fast as the best, AND it uses a lot more power.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
232
106
If the 8150 had destroyed the i7 2600, the power usage would be an "oh, that sucks, but still an awesome CPU" type thing. Since its performance wasn't that good, people are really focusing on the power usage.

I just wanted to point out the similarity- a GTX 580 falls into the exact same sort of situation. It's not as fast as the best, AND it uses a lot more power.
Spot on.
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
Wow, now we are only talking about cards that are out today, but the 14 months the 580 was on top doesn't count for anything, right?

In 3 months, when the new GTX hits the streets, we can compare power draw, but you have to compare the 580 against what it was made to go up against, not something that came out 10 days ago.

You are doing exactly the same thing, comparing a new card to an old card instead of comparing cards from the same gen. Good for AMD on getting their next gen out first, but you can't compare its power draw to Nvidia's last gen.

And the new card pulls a little less total power than the 580; it's maybe 30-50 watts in most games, and it's on a 28nm node.

If the rumors are right, the mid-range GTX will outperform the top-end AMD card, so I'm sure it will have a higher power draw, but if it's 15-20% faster than the competition I'm cool with that. I need to upgrade my cards soon anyway, and I'm holding off to see what Nvidia can do.

I have no problem running an AMD card, but I'm not about to drop $550 on a card that might get beaten by a $400 card in a few months.
 

blckgrffn

Diamond Member
May 1, 2003
9,198
3,185
136
www.teamjuchems.com
Interesting how your reply is timed before my post.

I agree with your analysis, and I think the same is the case with bulldozer-

If the 8150 had destroyed the i7 2600, the power usage would be an "oh, that sucks, but still an awesome CPU" type thing. Since its performance wasn't that good, people are really focusing on the power usage.

I just wanted to point out the similarity- a GTX 580 falls into the exact same sort of situation. It's not as fast as the best, AND it uses a lot more power.

I am from the future and this is how I have chosen to use my power.

Be careful on your drive home today.

 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
232
106
Electric kettles and vacuum cleaners use far more power.

All these power savings make sense in a data center. I wouldn't worry about a single computer.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
You are comparing 28nm to 40nm. What did AMD have last year, when we were comparing the 580 to the cards before the 7970, you know, the ones that were actually going up against the 580?

Radeon 5970 is faster and consumes slightly less power.
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
Radeon 5970 is faster and consumes slightly less power.

Exactly my point. Anything that is on the same level performance-wise is pulling just as much power, and not a hell of a lot less power like the other guy said.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
I suspect the Bulldozer + 7970 would use less power as well. Most sites only tested power with all cores pushed to 100%. When Bulldozer performs poorly in games, it's usually because the games are not very multi-threaded and would only push 2-3 of the 8 cores, which means power usage is greatly reduced compared to the tests people refer to when trying to prove how much power Bulldozer uses.

I have a Kill-A-Watt meter coming shortly; I plan to do a few simple power tests myself out of curiosity.

I'm interested in seeing that kind of data/analysis (power consumption during real world gaming at various resolutions versus 100% load synthetic tests).

Personally, I don't care about the differential in power cost (we're talking about maybe 100 watts during gaming for, say, 10-20 hours a week max; who cares). But the heat associated with increased power draw, and consequently the noise needed to cool it, are what matter to me.
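For context on that 100-watt figure, here is a quick back-of-the-envelope sketch of what such a delta costs over a year of gaming; the $0.12/kWh electricity price is an assumption, not a number from the thread.

```python
# Back-of-the-envelope cost of a 100 W wall-power difference for 10-20
# hours of gaming per week. The $0.12/kWh rate is an assumption; plug in
# your own local price.

def annual_cost(delta_watts, hours_per_week, price_per_kwh=0.12):
    kwh_per_year = delta_watts / 1000 * hours_per_week * 52
    return kwh_per_year * price_per_kwh

for hours in (10, 20):
    print(f"{hours} h/week: ~${annual_cost(100, hours):.2f} per year")
# -> 10 h/week: ~$6.24 per year, 20 h/week: ~$12.48 per year
```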
 

Abwx

Lifer
Apr 2, 2011
11,167
3,862
136
Certainly, the $15-20 a year saved on a BD's extra power consumption is the shortest road to getting much richer...
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
Certainly, the $15-20 a year saved on a BD's extra power consumption is the shortest road to getting much richer...
It will be a lot more than $15-20 if you try overclocking BD to above 4.5GHz, compared to overclocking SB to that speed.
 

bononos

Diamond Member
Aug 21, 2011
3,894
162
106
I suspect the Bulldozer + 7970 would use less power as well. Most sites only tested power with all cores pushed to 100%. When Bulldozer performs poorly in games, it's usually because the games are not very multi-threaded and would only push 2-3 of the 8 cores, which means power usage is greatly reduced compared to the tests people refer to when trying to prove how much power Bulldozer uses.

I have a Kill-A-Watt meter coming shortly; I plan to do a few simple power tests myself out of curiosity.

Power usage may be reduced, but it's still a ~30W difference between a 2500 and an 8150 running a single-threaded app. Games would typically be running a few threads, so the power usage would be higher.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
AMD wouldn't be better off; however, most of the people who got suckered into buying AM3+ boards before it released probably would be.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
New AMD FX series CPUs will be launching shortly:

FX4170 @ 4.2GHz, 125W TDP, and
FX6200 @ 3.8GHz, 125W TDP

The FX6100 and FX4100 should see a price reduction.

Also, an FX8150 Liquid Cooling edition.

 

BD231

Lifer
Feb 26, 2001
10,568
138
106
AMD wouldn't be better off; however, most of the people who got suckered into buying AM3+ boards before it released probably would be.

Yeah, because SLI was really an option before AM3+. Who the hell wants support for that when they already own an X6, right?
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I can say with certainty that most Phenom II X6 users are not better off with BD. They are still enjoying their 6-core PCs with no regrets about waiting for BD.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Power usage may be reduced, but it's still a ~30W difference between a 2500 and an 8150 running a single-threaded app. Games would typically be running a few threads, so the power usage would be higher.

I agree completely. My point is that the 30W difference is OBLITERATED by the 100W difference between using a GTX 580 vs a 7970. If a 30W difference makes a CPU undesirable, the incredible power usage of a GTX 580 should be noted as an even bigger negative.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I agree completely. My point is that the 30W difference is OBLITERATED by the 100W difference between using a GTX 580 vs a 7970. If a 30W difference makes a CPU undesirable, the incredible power usage of a GTX 580 should be noted as an even bigger negative.

OC both CPUs and see that power gap widen by almost a factor of 10. You are cherry-picking your argument here.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
OC both CPUs and see that power gap widen by almost a factor of 10. You are cherry-picking your argument here.

I don't get your point. Are you suggesting I should argue something when it is false?

I'm aware of bulldozer's power usage when overclocked. Guess what else increases in power usage when overclocked? GTX 580! Sounds to me like if anyone is cherry picking it is you, as you assume a gamer would overclock a CPU when the GPU is actually the greater performance bottleneck.

Also, you don't even have data to prove your point. If you do have a link, please share- because I'm pretty sure all the outrageous power usage overclocking tests are done with ALL cores pegged at 100%. That will not be the case while gaming, so it's not an accurate reflection of actual gaming power usage, overclocked or not.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yeah cus SLI was really an option before AM3+, who the hell wants support for that when they already own an x6 right?

In most cases SLI performs terribly on AMD setups.

However, for those few who did want SLI, there was a hack that allowed it on boards prior to AM3+ anyway. Not sure how much I'd value "support" in this situation; the lanes were already there, it simply lacked certification.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I don't get your point. Are you suggesting I should argue something when it is false?

I'm aware of bulldozer's power usage when overclocked. Guess what else increases in power usage when overclocked? GTX 580! Sounds to me like if anyone is cherry picking it is you, as you assume a gamer would overclock a CPU when the GPU is actually the greater performance bottleneck.

Also, you don't even have data to prove your point. If you do have a link, please share- because I'm pretty sure all the outrageous power usage overclocking tests are done with ALL cores pegged at 100%. That will not be the case while gaming, so it's not an accurate reflection of actual gaming power usage, overclocked or not.

Here are some links, since you are trolling the boat for AMD so much in this thread.

Stock freq: http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/9

2500K: 133.3W
8150: 229W

Overclocked power consumption: http://www.guru3d.com/article/amd-fx-8150-processor-review/7 and http://www.guru3d.com/article/core-i5-2500k-and-core-i7-2600k-review/7

2600K: +34W over stock
8150 (4.6GHz): +200W over stock

That is with each CPU at approximately a 1GHz OC. The 2600K will use some additional power to get to 4.6GHz, but you see the difference.

Literally, the 8150 uses 2X the power when overclocked. I would like to see the 7970 use half the power at max (standard) OC compared to the 580. Keep trolling, my friend.
 