A10-5800K + HD7770?


AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
@SlowSpyder

Don't think so, the 22nm Core i3 will have lower power consumption even at 100% load. But, and that is a BIG but, the FX6300's power consumption in gaming will not be that much higher. Using the 970 chipset instead of the 990FX will lower power consumption by 10W to 15W at full load. At idle, an FX6300 at the default 3500MHz on a 970 chipset motherboard with an HD7770 + 128GB SSD was measured at ~50W.
 

pauldun170

Diamond Member
Sep 26, 2011
9,142
5,089
136
@SlowSpyder

Don't think so, the 22nm Core i3 will have lower power consumption even at 100% load. But, and that is a BIG but, the FX6300's power consumption in gaming will not be that much higher. Using the 970 chipset instead of the 990FX will lower power consumption by 10W to 15W at full load. At idle, an FX6300 at the default 3500MHz on a 970 chipset motherboard with an HD7770 + 128GB SSD was measured at ~50W.

My i3-2100 and MSI HD7770 idle around 50W. This is with an SSD + 2 WD Black 1TB drives. I haven't measured load, but it certainly doesn't stress out my EA380 PSU. I use this box as an HTPC/gaming/general-use box.

An i3 is better for gaming with the stock cooler.
You can overclock the A8-5600K (or comparable) to achieve parity, or slightly better than parity, with the i3, but then you really should think about aftermarket cooling, which pushes up the cost. You also need to consider the quality of the board you pair it with, to make sure it can handle the overclock.

An i3 can be paired with the cheapest of cheapo boards and whatever vid card meets your needs, and be a solid gaming box and an excellent general-use PC.

If you want to go the AMD route, that's fine, and it will get the job done. However, as those in the AMD camp have already noted, if you want to really compete you should overclock, and if you overclock you should set a little more money aside for a quality board and an aftermarket cooler.

In summary... buy whatever is on sale, and buy a nicer vid card if this will be a gaming box (screw CrossFire/SLI... just buy one good vid card for a budget box).
 
Aug 11, 2008
10,451
642
126
Those benchmarks (7-Zip, Cinebench, x264, POV-Ray and more) are what AT reviews use; I haven't seen anyone here proclaiming those are niche, cherry-picked benchmarks until a cheaper overclocked AMD CPU beat Intel's Core i5.

Those are 4 benchmarks out of the 20 or more that Bench uses. And of course you don't show any gaming benchmarks, because Intel is far ahead in that area. So to me that IS the epitome of cherry-picking.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
And of course you don't show any gaming benchmarks, because Intel is far ahead in that area.


Really ???

Post 20,
http://forums.anandtech.com/showpost.php?p=34351894&postcount=20

Two games from the AT review show the FX6300 vs the Core i3 3220; the FX6300 at default clocks (3500MHz) is faster.

Post 24
http://forums.anandtech.com/showpost.php?p=34352017&postcount=24

I have provided the Hardware Canucks review link; the FX6300 is on par with the Core i3 3220 in 1080p gaming. It wins in some, it loses in others; at default clocks they are equals in gaming.

Also, the uk.hardware.info review shows the FX6300 (default 3500MHz) on par with the Core i3 3220 in 1080p gaming.

Post 39
http://forums.anandtech.com/showpost.php?p=34356031&postcount=39

Post 39 shows how fast the FX6300 @ 4.2GHz is in MT applications.

Again, the FX6300 at 4.2GHz is faster than the Core i3 in MT applications and in the majority of games. At the same price point there is no competition: the FX6300 is the clear winner.

PS: for the 100th time, no need for better cooling for a 4.2GHz OC.
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
Really ???

Post 20,
http://forums.anandtech.com/showpost.php?p=34351894&postcount=20

Two games from the AT review show the FX6300 vs the Core i3 3220; the FX6300 at default clocks (3500MHz) is faster.

Post 24
http://forums.anandtech.com/showpost.php?p=34352017&postcount=24

I have provided the Hardware Canucks review link; the FX6300 is on par with the Core i3 3220 in 1080p gaming. It wins in some, it loses in others; at default clocks they are equals in gaming.

Also, the uk.hardware.info review shows the FX6300 (default 3500MHz) on par with the Core i3 3220 in 1080p gaming.

Post 39
http://forums.anandtech.com/showpost.php?p=34356031&postcount=39

Post 39 shows how fast the FX6300 @ 4.2GHz is in MT applications.

Again, the FX6300 at 4.2GHz is faster than the Core i3 in MT applications and in the majority of games. At the same price point there is no competition: the FX6300 is the clear winner.

PS: for the 100th time, no need for better cooling for a 4.2GHz OC.

You cherry-pick 3-4 graphs from each review to prove your point. Your first Skyrim graph has the i3 destroying the FX, and then you link another site where, in Skyrim again, the FX is 1 fps faster, essentially proving yourself wrong.
The Hardware Canucks review also consistently shows the i3 2100 faster than the 3220, and the i5 2400 faster than higher i5 CPUs, therefore proving there are errors and false results in their methodology.

You also spam 7-Zip, x264 (2nd pass, no 1st) and POV-Ray, repeating the same phrase all over again: "in MT apps".
There are no pure "MT" and "ST" apps; it is not just 0 and 1. 99% of the real world consists of lightly to moderately threaded apps (2-4 threads), where the FX is equal to or slower than the i3, and games, the vast majority of which are faster with the i3 (don't bother linking the exceptions again).

Conclusion: yes, the FX is a nice competitor to the i3. You need an aftermarket cooler, since the stock CPU cooler makes A LOT of noise. I would probably also take the FX 6300 over an i3 for the usage I do.
However, you took your fanboyism too far trying to say that it's also better than an i5, which is not the case: minus some exceptions, the i5 destroys the FX in overall performance in apps and games, in power consumption, and in overclocking.

Yes, the FX 6300 is sweet value for money, but your exaggerated attempts to market it aren't helping. I know you are selling AMD CPUs, but your hyperbole-filled advertising gets tiring after a while.
A good product can stand on its own.
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,059
413
126
Here's a recommendation from Tom's Hardware:

http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-3.html

It compares the AMD FX 4170 and the Intel i3-3220, both priced at $120.
Judge for yourself which compromise you want to go with. :whistle:

FX 6100 is $15 cheaper than the 4170
http://www.newegg.com/Product/Produc...&Tpk=fx%206100

Now, it's slower for gaming without an overclock, but you can probably overclock it quite easily to around 4GHz, it will probably require the same level of motherboard/cooling for stable operation (?) as the 4170, and I think it's a better CPU...
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
FX 6100 is $15 cheaper than the 4170
http://www.newegg.com/Product/Produc...&Tpk=fx%206100

Now, it's slower for gaming without an overclock, but you can probably overclock it quite easily to around 4GHz, it will probably require the same level of motherboard/cooling for stable operation (?) as the 4170, and I think it's a better CPU...

Meh, with the obvious improvements of Piledriver over Bulldozer I wouldn't recommend a 6100 over a 6300.
 

inf64

Diamond Member
Mar 11, 2011
3,864
4,546
136
Meh, with the obvious improvements of Piledriver over Bulldozer I wouldn't recommend a 6100 over a 6300.
The FX6300 is just a much better chip than any FX61xx/FX41xx. It clocks to the same level as the FX83xx and can do ~4.2GHz with minimal or no voltage increase (just a multiplier change). Couple the clocking headroom with the higher IPC (especially in games) and the lower power draw (either stock or OCed), and you should never think of first-gen Bulldozer again.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
If you game or encode for 4 hours a day, an FX6300 will easily consume an extra 400 watts per day. At 4.2 GHz it could be as much as 600 watts a day. Even at 400 extra watts a day, you're looking at an extra 5 cents a day or $60 over 3 years. So for anyone who games or encodes for more than a few hours a day, and intends to do so for 3 or more years, the FX6300 is directly competing with an i5-33xx or even a 3570k (it all depends on exactly how much power you're using). That's why so many enthusiasts are so hard on AMD right now. Once you factor in the amount of power that the average enthusiast is using, the comparisons become laughable.
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
If you game or encode for 4 hours a day, an FX6300 will easily consume an extra 400 watts per day. At 4.2 GHz it could be as much as 600 watts a day. Even at 400 extra watts a day, you're looking at an extra 5 cents a day or $60 over 3 years. So for anyone who games or encodes for more than a few hours a day, and intends to do so for 3 or more years, the FX6300 is directly competing with an i5-33xx or even a 3570k (it all depends on exactly how much power you're using). That's why so many enthusiasts are so hard on AMD right now. Once you factor in the amount of power that the average enthusiast is using, the comparisons become laughable.

Given that you can't get the unit for energy right, I'm not inclined to trust your pulled-out-of-your-arse estimates for energy costs.

Clue: watts are a unit of power.

EDIT: Okay, let's do this right.

First off, I'll assume a worst case scenario for power usage- running x264 constantly, meaning your CPU is at 100% load all of the time. This is significantly more than the power usage during gaming, might I add.

According to Anand's review, the difference in power usage between the FX-6300 system and the i5-3570k system is 44.4W, or 0.0444kW. (About the same as a lightbulb.)

According to Wikipedia, US energy prices range from $0.08/kWh to $0.17/kWh. I shall pick a number in the middle of this, $0.12, for my calculations, but adjust this for your local energy prices.

Finally, we shall use your figure of an average 4 hours' use a day over 3 years, for 4,380 hours. This gives us our final sum:

Cost over 3 years = 0.12 * 0.0444 * 4380 = $23

So, far lower than your guesstimate. And let me reiterate, that is assuming the PC is at full load the entire time - a highly unrealistic assumption for anyone not running Distributed Computing, a rendering farm, etc. The difference in idle power consumption is only 14.7W, about a third of that at load, and we can assume that average power consumption will be somewhere between the two.

Next time, do the maths instead of making it up and hoping no-one will call you on it.
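
If anyone wants to rerun the numbers, here is the same sum as a minimal Python sketch. The 44.4W delta and the $0.12/kWh price are the figures quoted above; the hours per day and the number of years are the assumptions, so change them to taste:

Code:
# Back-of-the-envelope cost of a 44.4W power delta (AT review figure).
power_delta_w = 44.4    # extra draw at 100% load, in watts
price_per_kwh = 0.12    # $/kWh, middle of the quoted US range
hours_per_day = 4       # assumed daily hours at full load
years = 3               # assumed lifetime of the comparison

total_hours = hours_per_day * 365 * years       # 4380 hours
extra_kwh = power_delta_w / 1000 * total_hours  # ~194.5 kWh
print(f"Extra cost over {years} years: ${extra_kwh * price_per_kwh:.2f}")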
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,059
413
126
Meh, with the obvious improvements of Piledriver over Bulldozer I wouldn't recommend a 6100 over a 6300.

Well, that one is more expensive, while the 6100 is cheaper than the A8-5600K.
The A8 might have an improved architecture, but it lacks L3 cache (significant for gaming) and an extra module.
 

inf64

Diamond Member
Mar 11, 2011
3,864
4,546
136
If you game or encode for 4 hours a day, an FX6300 will easily consume an extra 400 watts per day. At 4.2 GHz it could be as much as 600 watts a day. Even at 400 extra watts a day, you're looking at an extra 5 cents a day or $60 over 3 years. So for anyone who games or encodes for more than a few hours a day, and intends to do so for 3 or more years, the FX6300 is directly competing with an i5-33xx or even a 3570k (it all depends on exactly how much power you're using). That's why so many enthusiasts are so hard on AMD right now. Once you factor in the amount of power that the average enthusiast is using, the comparisons become laughable.

Where did you pull those 400W and 600W numbers from? They're just ridiculously high. Also, OCing an FX6300 at stock Vcore with just the multiplier will not raise the power draw by 200W; that sort of OC would raise the power drawn by the CPU alone by a factor of 1.1-1.2 (depending on what clock you take as the reference point). And the stock FX6300 TDP rating is 95W, so at most 95 x 1.2 ≈ 115W, or ~20W more. A linear increase in clock without a Vcore change results in a (roughly) linear increase in power drawn.
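
To put numbers on that scaling claim, here is a rough Python sketch; it treats the 95W TDP as a stand-in for actual draw (an approximation) and assumes power scales linearly with clock at fixed Vcore:

Code:
# Rough CPU power scaling at constant Vcore: P roughly proportional to f.
stock_clock_ghz = 3.5
oc_clock_ghz = 4.2
stock_power_w = 95  # FX6300 TDP, used here as a stand-in for actual draw

scale = oc_clock_ghz / stock_clock_ghz  # 1.2
oc_power_w = stock_power_w * scale      # ~114W, i.e. ~20W more than stock
print(f"x{scale:.2f} clock -> ~{oc_power_w:.0f}W "
      f"(+{oc_power_w - stock_power_w:.0f}W over stock)")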
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Given that you can't get the unit for energy right, I'm not inclined to trust your pulled-out-of-your-arse estimates for energy costs.

Clue: watts are a unit of power.

EDIT: Okay, let's do this right.

First off, I'll assume a worst case scenario for power usage- running x264 constantly, meaning your CPU is at 100% load all of the time. This is significantly more than the power usage during gaming, might I add.

According to Anand's review, the difference in power usage between the FX-6300 system and the i5-3570k system is 44.4W, or 0.0444kW. (About the same as a lightbulb.)

According to Wikipedia, US energy prices range from $0.08/kWh to $0.17/kWh. I shall pick a number in the middle of this, $0.12, for my calculations, but adjust this for your local energy prices.

Finally, we shall use your figure of an average 4 hours' use a day over 3 years, for 4,380 hours. This gives us our final sum:

Cost over 3 years = 0.12 * 0.0444 * 4380 = $23

So, far lower than your guesstimate. And let me reiterate, that is assuming the PC is at full load the entire time - a highly unrealistic assumption for anyone not running Distributed Computing, a rendering farm, etc. The difference in idle power consumption is only 14.7W, about a third of that at load, and we can assume that average power consumption will be somewhere between the two.

Next time, do the maths instead of making it up and hoping no-one will call you on it.

My units of "watts per day" is perfectly valid when describing differences in power consumption. You're just nitpicking in an attempt to sound like you know something other than how to sound condescending. First of all, I was comparing an i3-3220 vs an FX6300, genius. Those are the chips the OP is trying to choose from. The difference in power between those two chips is about 80 watts under a gaming load. And second, Mr. Genius, no one pays 12 cents a kWh. When you itemize out your electric bill, you find that with delivery and surcharges, everyone is paying 15-20 cents per kWh. My guesstimate is exactly that, a good guesstimate, and it is a hell of a lot more accurate than yours. Someone who games 4 hours a day on an i3-3220 will save very close to $60 on their electric bill over 3 years, not the $23 you claim as accurate. So take your condescending blathering and stick it.

The i5 was thrown in there to point out that this is what you could buy with the extra money from the electricity savings; I never claimed that this much higher performing i5 would not consume extra power; of course it would.
 
Last edited:
Aug 11, 2008
10,451
642
126
Given that you can't get the unit for energy right, I'm not inclined to trust your pulled-out-of-your-arse estimates for energy costs.

Clue: watts are a unit of power.

EDIT: Okay, let's do this right.

First off, I'll assume a worst case scenario for power usage- running x264 constantly, meaning your CPU is at 100% load all of the time. This is significantly more than the power usage during gaming, might I add.

According to Anand's review, the difference in power usage between the FX-6300 system and the i5-3570k system is 44.4W, or 0.0444kW. (About the same as a lightbulb.)

According to Wikipedia, US energy prices range from $0.08/kWh to $0.17/kWh. I shall pick a number in the middle of this, $0.12, for my calculations, but adjust this for your local energy prices.

Finally, we shall use your figure of an average 4 hours' use a day over 3 years, for 4,380 hours. This gives us our final sum:

Cost over 3 years = 0.12 * 0.0444 * 4380 = $23

So, far lower than your guesstimate. And let me reiterate, that is assuming the PC is at full load the entire time - a highly unrealistic assumption for anyone not running Distributed Computing, a rendering farm, etc. The difference in idle power consumption is only 14.7W, about a third of that at load, and we can assume that average power consumption will be somewhere between the two.

Next time, do the maths instead of making it up and hoping no-one will call you on it.


You are using the same units that he was. His calculation process is correct. I don't feel it was necessary to be so condescending about his mathematical capabilities. The difference is that you are estimating a different delta in power usage and using only 4 hours per day at load vs 8 hours per day for the other poster. If you estimate 8 hours per day at load, with your numbers rounded to 50 watts and 12 cents per kWh, I come up with approximately $18.00 per year. That is not including the taxes, surcharges, etc., which are invariably tacked on to energy bills and could increase the cost by another 10 to 20 percent.
And no one knows how much energy prices will increase in the future. Granted, they have been stable or declining recently, but that trend will reverse at some point.
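
For reference, that estimate as a quick Python sketch; the 50W delta, 8 hours/day and $0.12/kWh are the rounded figures above, and the 15% surcharge is just an illustrative assumption:

Code:
# Yearly cost with the rounded figures: 50W delta, 8h/day, $0.12/kWh.
yearly_kwh = 50 / 1000 * 8 * 365     # 146 kWh/year
yearly_cost = yearly_kwh * 0.12      # ~$17.52/year
with_surcharge = yearly_cost * 1.15  # assumed +15% delivery/taxes
print(f"${yearly_cost:.2f}/year, ~${with_surcharge:.2f} with surcharges")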
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
My units of "watts per day" is perfectly valid when describing differences in power consumption.

No it isn't. Watts are a rate of consumption of energy; watts per day would be the rate of change of that rate. Watts == energy/time, watts per day == energy/time^2. Watt-hours, on the other hand, would work.

First of all, I was comparing an i3-3220 vs an FX6300, genius. Those are the chips the OP is trying to choose from.

Heh, that one is a good spot. The conversation went into an i5 vs 6300 rant at some point, and I got muddled about which one you were referring to.

The difference in power between those two chips is about 80 watts under a gaming load.

I'd like to see some figures to back that one up. Given that (again, according to the Anandtech review) the difference at 100% load is 65W, I sincerely doubt that it's about 80 watts under a gaming load.

And second, Mr. Genius, no one pays 12 cents a kWh. When you itemize out your electric bill, you find that with delivery and surcharges, everyone is paying 15-20 cents per kWh.

Hey, I was just using the figures I could find online from Wikipedia (sourced from a US government report, so hey), and I deliberately shot for the middle of the range they quoted. I am a European, though, so I will admit to a lack of knowledge of how US pricing tariffs work.

My guesstimate is exactly that, a good guesstimate, and it is a hell of a lot more accurate than yours. Someone who games 4 hours a day on an i3-3220 will save very close to $60 on their electric bill over 3 years, not the $23 you claim as accurate. So take your condescending blathering and stick it.

Again, you are basing your figures on the energy consumption at 100% load on your CPU (in fact, slightly over that amount), which is not a realistic figure for gaming.

You are using the same units that he was. His calculation process is correct. I don't feel it was necessary to be so condescending about his mathematical capabilities. The difference is that you are estimating a different delta in power usage and using only 4 hours per day at load vs 8 hours per day for the other poster. If you estimate 8 hours per day at load, with your numbers rounded to 50 watts and 12 cents per kWh, I come up with approximately $18.00 per year. That is not including the taxes, surcharges, etc., which are invariably tacked on to energy bills and could increase the cost by another 10 to 20 percent.

His post stated that he was basing it on 4 hours a day, not 8. Not really sure where you got 8 from.

I will admit I got a bit condescending and snippy. I did Physics as a degree, so seeing someone misuse power and energy units is like a red flag to a bull for me (I see it all the time in news reports).

As I said above, I am a European, so I'm just basing my pricing of units off online figures.

And no one knows how much energy prices will increase in the future. Granted, they have been stable or declining recently, but that trend will reverse at some point.

We'll see; the latest predictions actually have US energy prices dropping, due to the expansion of shale gas extraction providing cheap gas. A single misjudged war on Iran could easily cancel that out though, of course!
 
Aug 11, 2008
10,451
642
126
No it isn't. Watts are a rate of consumption of energy; watts per day would be the rate of change of that rate. Watts == energy/time, watts per day == energy/time^2. Watt-hours, on the other hand, would work.



Wow, maybe we should transfer this to the "highly technical" forum and delve into a lesson in integral calculus or differential equations. Bottom line: his calculations, using the assumptions that he made, were correct. You can argue about his assumptions, and he probably did overestimate, but his mathematical process was correct, even if he did not state it in a manner that met your criteria.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
If you game or encode for 4 hours a day, an FX6300 will easily consume an extra 400 watts per day.

That means the FX6300 uses 100W more than the Core i3 while gaming:

400Wh / 4 hours = 100W

At 4.2 GHz it could be as much as 600 watts a day.

That means the FX6300 @ 4.2GHz would use 150W MORE in gaming than the Core i3. ARE YOU SERIOUS ??? Where did you see those numbers ???


Even the FX8350 at full load DOESN'T consume more than 115W more than the Core i3 in x264, and it is 2X faster.
Meaning the FX8350 will consume almost the same energy as the Core i3 to finish the same job, but in HALF THE TIME.
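
Just to show where those implied deltas come from, here is a quick Python sketch that reads his "watts per day" figures as watt-hours per day, spread over his own 4 hours of gaming:

Code:
# Implied average power delta if the FX6300 really drew an extra
# 400Wh (stock) or 600Wh (@4.2GHz) over 4 hours of gaming per day.
hours = 4
for extra_wh in (400, 600):
    print(f"{extra_wh}Wh / {hours}h = {extra_wh / hours:.0f}W average delta")
# 400Wh / 4h = 100W and 600Wh / 4h = 150W -- far above any measured delta.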

ps: FX6300 @ 4.2GHz will not use more power than FX8350.
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
No it isn't. Watts are a rate of consumption of energy; watts per day would be the rate of change of that rate. Watts == energy/time, watts per day == energy/time^2. Watt-hours, on the other hand, would work.



Wow, maybe we should transfer this to the "highly technical" forum and delve into a lesson in integral calculus or differential equations.

Highly technical?! Any 14-year-old knows how to get their units right!

EDIT: Dammit, I need to stop going into cranky physicist mode! That came out harsher than intended. It's still not "highly technical", though.

EDIT 2: Okay, useful analogy time! The difference can be compared to the differences between distance, speed and acceleration.

Distance == m ~ Energy == J
Speed == m/s ~ Power == J/s == W
Acceleration == m/s^2 ~ Power/time == W/s

Where ~ indicates "is analogous to".

[/ramble]
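
And if code is clearer than analogies, the same point as a trivial Python sketch (illustrative numbers only):

Code:
# Energy = power x time; power = energy / time. "Watts per day" is neither.
def energy_wh(power_w, hours):
    """Energy in watt-hours from a constant power draw over a duration."""
    return power_w * hours

def average_power_w(e_wh, hours):
    """Average power in watts implied by an energy figure."""
    return e_wh / hours

print(energy_wh(100, 4))        # 100W for 4 hours -> 400.0 Wh (energy)
print(average_power_w(400, 4))  # 400Wh over 4 hours -> 100.0 W (power)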
 
Last edited:
Aug 11, 2008
10,451
642
126
Highly technical?! Any 14-year-old knows how to get their units right!

EDIT: Dammit, I need to stop going into cranky physicist mode! That came out harsher than intended. It's still not "highly technical", though.

EDIT 2: Okay, useful analogy time! The difference can be compared to the differences between distance, speed and acceleration.

Distance == m ~ Energy == J
Speed == m/s ~ Power == J/s == W
Acceleration == m/s^2 ~ Power/time == W/s

Where ~ indicates "is analogous to".

[/ramble]

I have a minor in physics, with several semesters of calculus, so I know about rates of change (speed = first derivative, acceleration = second derivative). And I can balance equations and cancel out units as well as the next guy. But I don't really care about that. All I am saying is that he did the calculation correctly, albeit with some very strange estimates of the power differences between the two processors.

Edit: I definitely agree with your first edit, and a bit of a sense of humor wouldn't hurt either. I was only being facetious about moving it to the highly technical forum. I thought that would be obvious from the quote marks and the way I made the statement.
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Remember, gaming often uses only two cores. Sometimes it uses four. And when a game does want more than four cores, the FX6300 will use more power, but it should also deliver better performance. I think some of you overestimate CPU load while gaming... especially since the OP is looking at a 7770-level card, which will often be the limiting factor.
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
Edit: I definitely agree with your first edit, and a bit of a sense of humor wouldn't hurt either. I was only being facetious about moving it to the highly technical forum. I thought that would be obvious from the quote marks and the way I made the statement.

Heh, yeah. I need to stop going into "arguing with everyone" mode.
 

lyssword

Diamond Member
Dec 15, 2005
5,630
25
91
My buddy bought a $130 6300, stock volts/stock cooler @ 4.1GHz mild OC, with a $70 970 chipset mobo.
 
Last edited:

infoiltrator

Senior member
Feb 9, 2011
704
0
0
Which 970 motherboard, and is he happy?
Does it play the games he wants to play as well as he wishes?
Cheers
 