AMD back in gear, Centurion FX


dastral

Member
May 22, 2012
67
0
0
Take a 3770K, do the "maximum stable undervolt at stock", and run Prime.
Take the same 3770K, do the "maximum stable overclock", and run Prime.

A +100 W difference (instead of the FX's +200 W) between the two scenarios wouldn't surprise me.
The total draw would differ, of course, but would there be a ginormous difference in overall consumption?

idontcare's numbers seem to point towards a 125 W + 194 W FX-8350 (let's assume there is zero loss from the MB/PSU).
I wouldn't be surprised to see a 77 W + 100 W 3770K in the same scenario.
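Spelling out the totals those figures imply (the 77 W and +100 W Intel numbers are my guesses, not measurements):

$$
P_{\text{FX, max OC}} \approx 125 + 194 = 319\ \text{W}, \qquad
P_{\text{3770K, max OC}} \approx 77 + 100 = 177\ \text{W},
$$

so even under these worst-case assumptions the gap between the two chips would be roughly $319 - 177 = 142$ W.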
 
Last edited:

Pilum

Member
Aug 27, 2012
182
3
81
You would struggle to get that sort of draw out of a 3930K, in fact; that's a monstrous amount of power.
Oh, that depends on clock speed. I just recently saw results from a 3930K drawing 500 W... at 4.5 GHz. Of course this is a different league of performance than anything an AMD CPU at 500 W can achieve; the crown in performance and performance/W always goes to Intel.
 

FlanK3r

Senior member
Sep 15, 2009
313
38
91
You don't seem to understand the temperatures, or how they're measured, or when AMD chips either throttle or simply shut down compared to Intel chips.

Also, the FX-8350 draws 140 W+ as IDC perfectly showed. Not to mention MSI, ASRock, etc. complain about the chip running out of spec for users and being throttled by the VRMs.

Just like all developers praise the PS4. Nobody praised the PS3, right?

That's because of bad VRM design (to cut cost). The FX-8350 is guaranteed a 125 W TDP, but that doesn't mean the chip's short-term peak draw must stay under 125 W. Low-end boards are a different story, for AMD and for Intel alike. ASRock, Gigabyte, and MSI low-end boards (around $50-70) are really big sh*ts. The situation is a bit better now, because Vishera has fewer problems than Bulldozer did. Bulldozer was hungrier for current, and 32nm was very strange: one Bulldozer could draw 40 W less under load than another.
Personally, I tested about 5 pieces of FX-8150; 2 were from the same batch, the others were not.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
I simply don't buy the power figures at idle in that review. They are not consistent with the reviews from any other site I have ever seen. And obviously, since they are measuring at the wall and the CPU is idle, most of those watts are coming from other devices by definition. They simply aren't meaningful.

At any rate, if you do want to accept their power numbers, then you also have to accept the performance numbers on the preceding pages, which show the i7-3770K utterly embarrassing the FX-8350 despite using less power under load.

The power figures are compatible with other sites'; e.g., the 47 W difference between the FX-8350 and the i7-3770K under load is reasonable. What matters is total consumption at the wall. You cannot isolate the power consumption of the CPU alone, and even if you could, the resulting value would be useless, because you never run the CPU alone.

Yes, I accept their performance numbers, but they are not "utterly embarrassing" as you believe. I am not going to do an exhaustive analysis here, but here are some basic points:

First, the review uses W7 SP1, an OS whose scheduler places threads poorly on the Bulldozer/Piledriver architecture. Microsoft released two fixes for the FX chips, but they do not work. This puts extra performance on the Intel side.

Second, the Intel chips run with stock or overclocked RAM. The i7-3930X extreme chips run with RAM overclocked up to 1.98 GHz!! The FX-8350 runs with underclocked RAM. This puts extra performance on the Intel side.

Third, the software used (e.g. Cinebench 11.5) is compiled with the Intel CPU dispatcher, which forces the code to run slower when it detects an AMD chip through the CPUID (see the sketch at the end of my post). This puts extra performance on the Intel side.

When you correct all that, the FX performs very, very well... and at a lower price.
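To make the third point concrete, here is a minimal sketch of a vendor-string dispatch, assuming x86 with GCC or Clang and <cpuid.h>; it only illustrates the technique and is not Intel's actual compiler code:

```c
/* Sketch: dispatching on the CPUID vendor string instead of on feature bits.
 * A dispatcher built this way takes the slow path on any non-Intel CPU,
 * even one that supports the same SSE/AVX instructions. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX, in that order. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    if (strcmp(vendor, "GenuineIntel") == 0)
        puts("optimized SSE/AVX code path");
    else
        puts("generic fallback code path"); /* what an AMD FX would get */

    return 0;
}
```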
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
another source showing the 130 W TDP 3960X using less power (interesting, the 125 W TDP CPU from AMD uses the same power as the 150 W TDP CPU from Intel)

From the same source that you cite:

A 150 W TDP should have been enough for us to know that high power consumption would be measurable. Indeed, the i7-3970X spikes up to use the most power under load.



and since the 8350 takes longer to complete things:


The FX-8350 takes more time than it really needs because the review favours the Intel chips. See my recent reply to Charles Kozierok for some technical details.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Yes, I accept their performance numbers, but they are not "utterly embarrassing" as you believe. I am not going to do an exhaustive analysis here, but here are some basic points:

First, the review uses W7 SP1, an OS whose scheduler places threads poorly on the Bulldozer/Piledriver architecture. Microsoft released two fixes for the FX chips, but they do not work. This puts extra performance on the Intel side.

Second, the Intel chips run with stock or overclocked RAM. The i7-3930X extreme chips run with RAM overclocked up to 1.98 GHz!! The FX-8350 runs with underclocked RAM. This puts extra performance on the Intel side.

Third, the software used (e.g. Cinebench 11.5) is compiled with the Intel CPU dispatcher, which forces the code to run slower when it detects an AMD chip through the CPUID (see the sketch at the end of my post). This puts extra performance on the Intel side.

When you correct all that, the FX performs very, very well... and at a lower price.

The answer to all your "points" (excuses) is "So what?". You have all these excuses for why Intel chips perform better, but the simple fact is that these are real-world conditions: Win7, Intel compilers, etc.

There's nothing to "correct"; it is what it is. Get it?
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Someone should do a reality-check chart on power usage, comparing the power used by a computer to the power used by an air conditioner, washer/dryer, dishwasher, plasma TV, etc.

These peak numbers are cute and all, but irrelevant for most users most of the time. Yeah I get it, if you want to run prime95 all the time then it might be important to you. Or maybe you could make the argument that people who run prime95 all the time obviously don't care about the environment or electricity waste, so what is an extra 100W?

It's a catch-22. Power usage is only remotely relevant if you run your CPU at 100% load 24/7, but if you are running your CPU at 100% load 24/7 you have already proven that you don't care about power usage. I may own 2 FX-8120 CPUs, but I can confidently say that with my usage patterns I have used less electricity with these two computers than any single i5 CPU of any generation would use running prime95 24/7 over the same time frame.

Your usage pattern has a huge impact on power; your CPU choice has a relatively minor impact. Reviews should make this clearer. I think a lot of clueless newbies are looking at the peak power usage charts and thinking that they actually use that much power when they are browsing the web. Highly deceptive, IMO.

People using the FX chip know that replacing a single incandescent bulb in their home provides more economic benefit than the extra 40-60 W of consumption under full load costs.
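Rough numbers, using typical (hypothetical) bulb figures: swapping one 60 W incandescent for a ~14 W CFL saves

$$
\Delta P_{\text{bulb}} \approx 60\ \text{W} - 14\ \text{W} = 46\ \text{W}
$$

for every hour the lamp is on, which is in the same range as the FX's extra 40-60 W, and the CPU only draws that extra power under full load.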
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
So your solution to AMD's massive power consumption problem is to..... not use the processor. Seems legit.

This also applies to Intel chips consuming between 76 W and 84 W more than the AMD chip. True?
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The power figures are compatible with other sites'; e.g., the 47 W difference between the FX-8350 and the i7-3770K under load is reasonable. What matters is total consumption at the wall. You cannot isolate the power consumption of the CPU alone, and even if you could, the resulting value would be useless, because you never run the CPU alone.

Yes, I accept their performance numbers, but they are not "utterly embarrassing" as you believe. I am not going to do an exhaustive analysis here, but here are some basic points:

First, the review uses W7 SP1, an OS whose scheduler places threads poorly on the Bulldozer/Piledriver architecture. Microsoft released two fixes for the FX chips, but they do not work. This puts extra performance on the Intel side.

Second, the Intel chips run with stock or overclocked RAM. The i7-3930X extreme chips run with RAM overclocked up to 1.98 GHz!! The FX-8350 runs with underclocked RAM. This puts extra performance on the Intel side.

Third, the software used (e.g. Cinebench 11.5) is compiled with the Intel CPU dispatcher, which forces the code to run slower when it detects an AMD chip through the CPUID (see the sketch at the end of my post). This puts extra performance on the Intel side.

When you correct all that, the FX performs very, very well... and at a lower price.


AMD is the problem here, not MS. It is not fair from a business perspective to expect MS to spend millions (probably more) trying to fix the scheduler in Windows 7, something that has worked for years without problems. AMD did a stupid thing here, releasing hardware that no current software can properly use (and expecting that software would be rewritten).

Your RAM comparison is utterly flawed. Same speed (same price) is the fair comparison. If one chip needs faster (and more expensive) RAM to perform, it should be docked for that. (Not to mention that if you are overclocking either CPU, overclocking the RAM is fair game.)

Obviously every piece of software must be using that compiler from over 6 years ago. Cinebench scores pretty much where you would expect, looking at other performance metrics.

In the end it comes down not to what the processor is capable of but to what the processor DELIVERS. I don't care (and pretty much no one else cares either) if it's twice as fast in theory when it takes twice as long to run the calculation. The Cell CPU (PS3) was supposed to be awesome (theoretically) but real-world computation fell short.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Same chip with slightly different cache fuse-off. Being allowed to go to a higher clock won't increase idle power consumption. Having more cache will only very slightly impact idle power consumption, at most. A nearly 50W delta between the two can't be justified by differences in the CPUs themselves.

25% more cache is not what I would call "slightly different". Moreover, I was referring to the slightly higher base clock {*}. I don't know more details of the chips; maybe the BIOS of the mobo plays some role here as well.


{*} I do not claim that this is the explanation, but a very rough computation using both effects gives about a 40 W delta. Is it a coincidence? Maybe.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,106
136
Well, I'm clocking out of this thread until we have concrete information as opposed to rumors. Cheers!
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
25% more cache is not what I would call "slightly different". Moreover, I was referring to the slightly higher base clock {*}. I don't know more details of the chips; maybe the BIOS of the mobo plays some role here as well.


{*} I do not claim that this is the explanation, but a very rough computation using both effects gives about a 40 W delta. Is it a coincidence? Maybe.

The great thing about SRAM is that it draws almost nothing when it isn't being accessed. I would definitely consider 25% more L3 cache to be slightly different in the context of its impact on idle power consumption.

And I don't see how a different base clock could possibly impact idle power consumption; if it's really idle then the clocks will never be anywhere close to the respective base clocks. Sorry, but I have no idea what you're saying in your footnote.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Did everyone miss the undervolt (1.268 V, measured) vs. 1.560 V for stability (measured)?

+200 W is ginormous, but going from a "stable undervolt" to "max stable OC" is a huge gap.
I don't doubt your values, but they are visually misleading unless someone reads your entire post.
You compare your "best vs. worst" scenarios under a specific load.

I'm willing to bet the same could almost be done with a 3770K: "stable undervolt" vs. "max OC".
And the results would probably be very similar (albeit not that enormous) with a 300 W CPU...

I believe I already wrote in a previous message that there are many issues with their numbers...
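For scale: dynamic CPU power goes roughly as $P \propto C V^2 f$, so the voltage jump quoted above already implies

$$
\left(\frac{1.560}{1.268}\right)^2 \approx 1.51,
$$

about 51% more dynamic power before any clock increase is even counted.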
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
The answer to all your "points" (excuses) is "So what?". You have all these excuses for why Intel chips perform better, but the simple fact is that these are real-world conditions: Win7, Intel compilers, etc.

There's nothing to "correct"; it is what it is. Get it?

If there is nothing to correct, why did Microsoft release the FX patches?

If there is nothing to correct, why was Intel obligated (court case) to add a disclaimer to its compiler?

If there is nothing to correct, why are programmers developing alternative compilers?

If there is nothing to correct, why do you need to underclock the RAM used with the FX down to 1.6 GHz, while the RAM used with the Intel extreme chips is overclocked up to 1.9 GHz, before comparing the two chips?

The facts are: run a more modern OS, non-cheated benchmarks, and stock RAM, and 'magically' the Intel chips do not perform as well as you believed...
 
Last edited:

galego

Golden Member
Apr 10, 2013
1,091
0
0
AMD is the problem here, not MS. It is not fair from a business perspective to expect MS to spend millions (probably more) trying to fix the scheduler in Windows 7, something that has worked for years without problems. AMD did a stupid thing here, releasing hardware that no current software can properly use (and expecting that software would be rewritten).

Millions? If nobody released hardware ahead of the available software and paradigms, no innovation would be possible.

Your claim that no current software can properly use the hardware is false.

Moreover, you missed my main point.

Your RAM comparison is utterly flawed. Same speed (same price) is the fair comparison. If one chip needs faster (and more expensive) RAM to perform, it should be docked for that. (Not to mention that if you are overclocking either CPU, overclocking the RAM is fair game.)

Please don't make me laugh. Using stock speed for the FX is flawed, but running the Intel extreme chips with RAM overclocked beyond 1.9 GHz is fair?

Obviously every piece of software must be using that compiler from over 6 years ago. Cinebench scores pretty much where you would expect, looking at other performance metrics.

No problem with that, if programmers had known from day one that the compiler has a biased CPU dispatcher. Intel denied it until, after the court case, it was obligated to add a disclaimer about the bizarre behaviour of its compiler, which does not optimize code according to the real capabilities of the CPU but according to the brand name.

Now programmers know about the issue and can choose which compiler they want to use.

In the end it comes down not to what the processor is capable of but to what the processor DELIVERS. I don't care (and pretty much no one else cares either) if it's twice as fast in theory when it takes twice as long to run the calculation. The Cell CPU (PS3) was supposed to be awesome (theoretically) but real-world computation fell short.

Exactly. That was my point. Some people take certain benchmarks/reviews as universal truth. Use another OS, use stock RAM, and use well-compiled software, and your processor delivers its real performance.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
The great thing about SRAM is that it draws almost nothing when it isn't being accessed. I would definitely consider 25% more L3 cache to be slightly different in the context of its impact on idle power consumption.

And I don't see how a different base clock could possibly impact idle power consumption; if it's really idle then the clocks will never be anywhere close to the respective base clocks. Sorry, but I have no idea what you're saying in your footnote.

How do you know that the cache is not being accessed? Aren't there hundreds of background tasks in a bloated OS like W7?

Take a look at the idle power consumption at both stock speed and overclocked: more cycles = more power.

Using the 25% extra cache and the higher clock, a very rough estimation gives 40 W more idle power for the extreme chip. Maybe a coincidence, but surprisingly close to the 50 W measured delta.
 

MustangSVT

Lifer
Oct 7, 2000
11,554
12
81
If there is nothing to correct, why did Microsoft release the FX patches?

If there is nothing to correct, why was Intel obligated (court case) to add a disclaimer to its compiler?

If there is nothing to correct, why are programmers developing alternative compilers?

If there is nothing to correct, why do you need to underclock the RAM used with the FX down to 1.6 GHz, while the RAM used with the Intel extreme chips is overclocked up to 1.9 GHz, before comparing the two chips?

The facts are: run a more modern OS, non-cheated benchmarks, and stock RAM, and 'magically' the Intel chips do not perform as well as you believed...

Damn, too bad we all use "old" OSes, cheated benchmarks and RAM. I wish we could live in AMD magic land. I do, I do!
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Millions? If nobody released hardware ahead of the available software and paradigms, no innovation would be possible.

Your claim that no current software can properly use the hardware is false.

Moreover, you missed my main point.



Please don't make me laugh. Using stock speed for the FX is flawed, but running the Intel extreme chips with RAM overclocked beyond 1.9 GHz is fair?



No problem with that, if programmers had known from day one that the compiler has a biased CPU dispatcher. Intel denied it until, after the court case, it was obligated to add a disclaimer about the bizarre behaviour of its compiler, which does not optimize code according to the real capabilities of the CPU but according to the brand name.

Now programmers know about the issue and can choose which compiler they want to use.



Exactly. That was my point. Some people take certain benchmarks/reviews as universal truth. Use another OS, use stock RAM, and use well-compiled software, and your processor delivers its real performance.

Um, I hardly think that RAM speed is going to hurt LGA 2011 processors that badly (it's quad-channel, and even at the slower frequency, quad-channel 1600 gives more than 50% more bandwidth than the FX's dual-channel 1866; see the arithmetic below). There are also a huge number of reviews showing that RAM speed makes little to no difference to performance (latency matters more than bandwidth). In benchmarks, RAM should be at the same speed.
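Back-of-envelope peak-bandwidth arithmetic (DDR3 transfers 8 bytes per channel per transfer):

$$
4 \times 1600\ \text{MT/s} \times 8\ \text{B} = 51.2\ \text{GB/s}, \qquad
2 \times 1866\ \text{MT/s} \times 8\ \text{B} \approx 29.9\ \text{GB/s},
$$

so quad-channel DDR3-1600 on LGA 2011 has roughly 71% more peak bandwidth than dual-channel DDR3-1866.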
I don't know these reviews; can you please point them out?

Do you seriously think that every program uses that compiler? In reality few programs do (most use Visual Studio's MSVC or GCC), and I believe that problem has now been fixed. Cinebench performs where you would expect it to, given the 8350's performance in applications such as rendering or encoding (the periodic wins of the FX are also likely because Cinebench scales well with Hyper-Threading, which some programs exploit poorly or not at all).

If I'm making money using CS6 and it runs better on an Intel CPU, then ultimately that's all I care about. I don't care about crippled performance or theoretical yields, only what gets my job done the fastest.
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
At the end of the day, AMD just wasn't targeting the right market by bringing CMT to desktop/laptop users. No one cares if it is theoretically faster with the 'right' compilers; this is not HPC. Software developers write software to reach as wide a market as possible (within reason), and AMD should have realised this.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
IMO they should save this type of halo product for 28nm Steamroller. A 5 module Steamroller part AMD certified to hit 5GHz is much more likely to make me consider a premium price than anything they can deliver with 32nm Vishera.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
How do you know that the cache is not being accessed? Aren't there hundreds of background tasks in a bloated OS like W7?

Because idle, by definition, means nothing is running. The great majority of background tasks spend all of their time sleeping as well, and the few that wake up do so only very intermittently. Look at CPU time: if it says 1%, that means 99% of the time is spent between halt instructions, where the cache isn't being accessed. The times when it is accessed don't really contribute anything power-wise; the latency of going in and out of even the lowest power modes is not high.

Take a look at the idle power consumption at both stock speed and overclocked: more cycles = more power.

I don't have an explanation for that, but a lot of other things about those measurements both defy common sense and are highly inconsistent with what other sites have reported, which makes it hard to be motivated to try to explain anything here.

Using the 25% extra cache and the higher clock, a very rough estimation gives 40 W more idle power for the extreme chip. Maybe a coincidence, but surprisingly close to the 50 W measured delta.

You can see that in laptops SB i7s use only a few watts at most while idle; if this weren't the case, battery life would be a much bigger problem. These are the same chips as the desktop ones. The bulk of the power consumption you see at the wall while a modern CPU is idling comes from the rest of the system.

When the system is really idle it falls into a lower power state where the clocks are gated; you can read about it in the ACPI specifications. Even when it's not idle, if you're not using the system heavily it will step the clocks back to a minimum speed like 800 MHz. Look at what Windows reports for the clock speed when you're not using the CPU.
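A quick way to check that yourself on Linux (a sketch of the analogous check; the point above is about what Windows reports, but the kernel exposes the same per-core clock estimate in /proc/cpuinfo):

```c
/* Minimal sketch: print each core's currently reported clock on Linux.
 * At idle the values should sit near the minimum P-state (e.g. ~800 MHz),
 * far below the base clock. Illustration only, not a power-measurement tool. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *f = fopen("/proc/cpuinfo", "r");
    char line[256];

    if (!f) {
        perror("fopen /proc/cpuinfo");
        return 1;
    }
    while (fgets(line, sizeof line, f)) {
        /* Matching lines look like: "cpu MHz         : 800.000" */
        if (strncmp(line, "cpu MHz", 7) == 0)
            fputs(line, stdout);
    }
    fclose(f);
    return 0;
}
```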

There's simply no plausible explanation for why an SB-E would use 20 times more power at idle than an SB, nor for why an SB-E with slightly more cache enabled would use 40 W more power at idle. I really don't know how you would estimate 40 W more power consumption from a rather small base-clock increase and a bit less cache fused off; that sounds like making numbers up.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
AMD is the problem here, not MS. It is not fair from a business perspective to expect MS to spend millions (probably more) trying to fix the scheduler in Windows 7, something that has worked for years without problems. AMD did a stupid thing here, releasing hardware that no current software can properly use (and expecting that software would be rewritten).

Damn, too bad we all use "old" OSes, cheated benchmarks and RAM. I wish we could live in AMD magic land. I do, I do!

When Intel introduced Hyper-Threading (HT) it wasn't working properly with Windows 2000/Linux, and there were only a few applications supporting it on the desktop. In fact, people were advised by Intel to deactivate HT on Windows 2000 because of the performance degradation when HT was enabled.
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
The facts are: run a more modern OS, non-cheated benchmarks, and stock RAM, and 'magically' the Intel chips do not perform as well as you believed...

Actually the facts are quite the opposite.
Windows 8 benefits both Intel and AMD by ~3-4%. No magic gains for AMD only here.
RAM speed, excluding WinRAR benchmarks, gives no performance gains.
And lastly, I highly doubt that every commercial product uses an Intel compiler that purposely gimps AMD CPUs. That's a fairy tale for "I WANT TO BELIEVE" fanboys who simply cannot accept that AMD's performance has been disappointing in recent years and so blame everything else.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
When Intel introduced Hyper-Threading (HT) it wasn't working properly with Windows 2000/Linux, and there were only a few applications supporting it on the desktop. In fact, people were advised by Intel to deactivate HT on Windows 2000 because of the performance degradation when HT was enabled.

When HT was released on the desktop, Windows XP was at least a year old and supported HT properly. There was a clear gain in many things, but since most software had no use for more than 1 core/thread, this was not always obvious.
 