Can an i3 CPU run most games?


skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
An i3 would most likely suck for BF4. I disabled 2 cores on my i7-3770, left Hyper-Threading on, clocked it at 3.1GHz, and BF4 pretty much ran like crap. I used to have an i3-2100 and am curious how smoothly such a chip could run BF4, if only I still had it. :biggrin:

Guess an i7 with 2 cores disabled honestly makes for a better "i3", considering the i7 has a bit more cache. Maybe someone with a true i3 could confirm.
 
Apr 20, 2008
10,162
984
126
Skipsneeky, were you playing multiplayer? If so, that's a huge reason why: there are more threads than cores in that case. If it was single player, then that's just what it was.

I really think the people who are holding out with dual cores are the same people who held onto the "Fast Single Core Will Always Be Better" mantra, be it the Athlon XP or Core 2 Duo folks. When I was choosing between an E8400 and a Q8200 on several forums back in ~'09, I specifically stated I'd hold onto it for as long as possible and that long-term performance was far more important. Look what happened. The Q8200 is still able to play BF3 online with ~32-48 players very well. The E8400? Complete garbage in multiplayer.

In every game made back then, the Q8200 was overkill for the video cards paired with it, such as my 8800GTS/3850/4830. Unless you're pairing these i3's with the very high-end video cards used in the reviews, you won't get the performance listed. You'll also be limited in what games you can play from here on, with all the console ports trickling in that are starting to use 7 threads.

An i7, FX-6xxx, FX-8xxx, and maybe an i5 are the only long-term viable gaming processors out there. Everything else is going to struggle with many new games in the very near future. Not only that, but more threads keep a system far more responsive for much longer.

Has anyone here used a P4 3.2GHz w/HT compared to an Athlon 3500+? The P4 is worlds better in modern software because of the responsiveness its simultaneous threading provides. That trend is only going to continue, not regress. Faster, lower core/thread count CPUs are only good for the present. We've seen what happened to all those idiots out there who championed the 2.4GHz AMD 4000+ single core over the 1.9GHz X2 3600. Which of the two can play more games and still be a productive work computer, even to this day? In the games that came out a couple years later and utilized two full cores, like TF2, that dual core offered 60% more performance, given that you had a video card that could push it that far. In its day a 1.9GHz Athlon 64 was still fast enough to push even the highest-end video cards. The second core really future-proofed it and offered a ton more performance once the software caught up. Look at what's happening today and the parallels are huge. The FX series is just as fast as the i3 in low-threaded games given that you don't have a $400 GPU, and offers far greater performance in games that require more threads, which is something an i3 chokes on (NBA 2K15 and well-ported games from here on out).
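The 60% figure above is just naive clock-times-cores arithmetic (ignoring IPC differences, which later posts in this thread dig into); a quick check:

```python
# Naive raw-throughput comparison: 2.4GHz single core vs 1.9GHz dual core.
# Ignores IPC and scaling losses; purely clock * cores.
single = 2.4 * 1
dual = 1.9 * 2
print(f"{dual / single - 1:.0%} more raw throughput")  # -> 58%, i.e. the ~60% quoted
```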

Those who fail to learn from history are doomed to repeat it. Are you all going to be sheep and buy into what the forum shills tell you to do or use your critical thinking skills and get what's best for the money and future? An i7 or 8xxx, 6xxx and i5 is the only way to go, in that order.
 
Last edited:

crashtech

Lifer
Jan 4, 2013
10,547
2,138
146
I'd just like to point out that even though modern consoles have 6 cores available, they are so slow and their IPC so poor that a modern, fast-clocked single Haswell core can do the work of more than 2 Jaguar console cores.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I'm not arguing against your points (though I disagree that an FX-6 is a better chip than an i5), Scholzpdx, but what do you mean by "The FX series is just as fast as the i3 in low threaded games given that you don't have a $400 GPU..." ? What does how expensive or high end your video card is have to do with how many frames your CPU can deliver?

I see something like this fairly often and it has never made sense to me.
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
After thinking on it for a minute, I'd like to expand on what crashtech posted.

The comparison of a 2.4GHz single core vs a 1.9GHz dual is not valid here, considering the IPC and architecture differences. A 1.9GHz dual has almost exactly 60% more processing power than a 2.4GHz single core, which is why you get a 60% FPS improvement.

Stock vs stock, a Haswell core has about 60% higher IPC than half of an FX module. Due to modules sharing resources, an FX-6300 only has about 4.9x the speed of a single core when you load up all of the cores (taken from AnandTech Bench), which works out to roughly an 18% loss from shared resources, or, turned around, about a 22% gain if nothing were shared. Some math:

Note: Yurimarks are based on an average of Anandtech benchmarks that show excellent scaling with core count at similar/same clockspeed.

1 FX core = 1.0 Yurimarks
6 FX cores = 4.9 Yurimarks
8 FX cores = 6.5 Yurimarks

1 Haswell core = 1.6 Yurimarks
2 Haswell cores with H/T = 4.2 Yurimarks
4 Haswell cores = 6.4 Yurimarks
4 Haswell cores with H/T (32% from HT, from anandtech bench) = 8.45 Yurimarks

Interestingly enough, if you look at the benches, you'll find that 9 times out of 10 an FX-8350 will perform just ahead of a Haswell i5 (nearly margin of error) if you can load up all 8 cores and have linear scaling in the software you use, and a Haswell i3 will trail an FX-6300 by about 15%.

EDIT: The value proposition of FX's is that an FX-8350 is considerably cheaper than an i5, and an FX-6300 is a bit cheaper than an i3 while also offering about 15% better multithreaded performance. However, you're not going to see FX-8350's performing twice as well as i5's in *anything* years down the road. A more apt comparison is that of a Haswell Pentium to a Core2Quad Q8200 (2.33GHz). In programs that scale well, the quad *will* be a little faster, but which is more future proof? I feel the answer isn't as easy as it appears at first glance.
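The arithmetic above can be sanity-checked in a few lines; a rough sketch where every constant is Yuriman's estimate from AnandTech Bench (and "Yurimarks" his invented unit), not a measured value:

```python
# Sanity check of the "Yurimark" arithmetic above. Every constant is
# an estimate from AnandTech Bench figures, not a measurement.

haswell_core = 1.6          # ~60% higher IPC than half an FX module (= 1.0)

fx6 = 4.9                   # six FX cores scale to only ~4.9x one core
fx8 = 6.5
sharing_loss = 1 - fx6 / 6  # ~18% throughput lost to shared module resources

i3 = haswell_core * 2 * 1.31   # 2 cores plus ~31% from Hyper-Threading
i5 = haswell_core * 4          # 4 full cores
i7 = haswell_core * 4 * 1.32   # ~32% from HT, per AnandTech Bench

print(f"sharing loss: {sharing_loss:.1%}")   # ~18.3%
print(f"i3 {i3:.2f} vs FX-6300 {fx6}")       # i3 trails the FX-6 by ~15%
print(f"i5 {i5:.1f} vs FX-8350 {fx8}")       # FX-8 just ahead of the i5
print(f"i7 {i7:.2f}")                        # ~8.45
```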
 
Last edited:
Apr 20, 2008
10,162
984
126
If your budget is only $120 for a CPU and you get an i3, I can only imagine what your video card budget is. A GeForce 550? R7 260? An i3, i5, i7, and any FX will be held to similar frame rates in games that only use a couple/few threads. In games that use 5-7 threads (most games from here on out), you are going to have a rough time running the game on an i3, regardless of the GPU.

Also, developers on consoles don't have the huge API layers that PC games suffer from to fiddle with to get the most out of performance. In middling PC ports like MX vs ATV Reflex, a 4.5GHz i7 only gets ~25-35fps during races while console gamers got a full 60fps experience. GTA4 is another game that got serious close-to-metal performance that just wasn't possible on PC hardware. While that should be less apparent now with similar architectures, the DX API still sits in the middle of every single instruction passed from the CPU to the GPU. As stated earlier, NBA 2K15 is having huge issues on i3, FX-4xxx, and some i5 CPUs, with audio, video, and input not keeping in sync, causing massive intermittent stuttering. When I disable two modules, I get the same stuttering. When I cut my CPU speed in CCC to 2.1GHz with all 8 cores, no issues, and still near 60fps. That reinforces that the more available threads, the better. The 6-core i7 is probably the best CPU for gaming.

When you're using a couple of powerful cores to do the work of twice as many threads, you're at the mercy of the Windows thread scheduler to keep entire processes synced up. Good luck with that. Let the choppy performance change your perception.

Consoles don't have this issue. They can assign a task to each core or split it efficiently to keep every instance at about the same interval. Now run those 7 threads on two powerful cores with SMT, paired to a non-configurable scheduler.
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Would it be fair to say that programs with huge losses from context switching are just as poorly written as those that don't take advantage of many threads? I suppose 2K15 is to an i5 what Guild Wars 2 is to an FX-8350: you're going to have miserable performance on an FX chip in that game (19fps where an i5 gets ~30), similar to how badly 2K15 runs on an Intel quad.

2K15 is the first example of a game that hasn't run well on an Intel quad, but it's interesting to see that bad programming can impact both.
 
Last edited:

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Are you all going to be sheep and buy into what the forum shills tell you to do or use your critical thinking skills and get what's best for the money and future? An i7 or 8xxx, 6xxx and i5 is the only way to go, in that order.
i7's have very slight performance increases over i5's in gaming (but i5's are still regarded as "best for the money"). However, an i5 is significantly faster than an FX-6xxx in pretty much every game & benchmark going, including many multi-threaded 2013-2014 ones (The Evil Within being a whopping 92% faster vs the FX-6300, Assassin's Creed 4 being 76% faster, Shadow of Mordor 33% faster, Thief 38% faster, etc). None of these games are more than 12 months old, so there's a little more to "critical thinking" than yelling "2 more cores = more fps for modern games & everyone here but me is a paid shill".

If your budget is only $120 for a CPU and you get an i3, I can only imagine what your video card budget is.

Exactly the same as it would be if buying a $120 AMD CPU...

Also, developers on consoles don't have huge api layers to fiddle with to get the most out of performance that PC games suffer from. In middling PC ports like MX vs ATV reflex, a 4.5ghz i7 only gets ~25-35fps during races while console gamers got a full 60fps experience. GTA4 is another game that was able to get serious close-to-metal performance that just wasn't possible with PC hardware. While that should be not as apparent with similar architecture, the dx api is still in the middle of every single instruction given from the CPU GPU.
Not only is it "not as apparent", it's been virtually inverted, with Haswell i3's + console equivalent 7790 / 260X / 750Ti's capable of 1080p @ 30-50fps whilst the XB1 (with 7790 class GFX) is stuck at 720p @ 30fps on equivalent "Medium" quality in several games (Watch Dogs, etc). The traditional "Direct to metal" advantage has been virtually dead on the current x86 consoles since launch vs what their equivalent desktop dGPU counterparts (7790 / 7850 / 750Ti) can manage without enforced resolution / 30fps caps, even with DX11. A lot of the extreme examples (ie, 60fps consoles vs 30fps PC's) are usually either broken ports or "apples and oranges" settings comparisons, ie, Ultra PC on a low-end card vs Medium equivalent setting console (minus AA, fewer shadows / shaders, etc) on same low-end card.

As stated earlier, NBA 2k15 is having huge issues on i3, fx-4xxx, and some i5 cpus with audio, video, and input not keeping in sync, causing massive intermittent stuttering.
NBA 2K15 is a massively buggy & broken game on many systems: massive graphical corruption, crashes to desktop, complete freezes for several seconds that have nothing to do with performance / core counts, refusal to start, crashes when face-scanning (and it resembles something out of Alien 3 when that does half-work), corrupt saves, always-on DRM that boots even console users back to the main menu when it loses the server connection, etc. One guy with an FX-8350 had constant stutter that was cured by disabling his sound card. Another case was SSAA-related. The game is massively buggy on a lot of systems: Intel / AMD / Nvidia alike. It's even freezing up on the XB1! The only sane way to benchmark ANY CPU on it is to wait until they fix its many issues. And even then it won't stop being a bad port (as many console-franchise sports games are).
 
Last edited:

Ranulf

Platinum Member
Jul 18, 2001
2,407
1,305
136
I really think the people who are holding out with dual cores are the same people who held onto the "Fast Single Core Will Always Be Better" mantra, be it the Athlon XP, and Core 2 Duo folks. When I was choosing between an E8400 and a Q8200 on several forums back in ~09, I specifically stated I'd hold onto it for as long as possible and long term performance was far more important. Look what happened. The Q8200 is still able to play BF3 online ~32-48 players very well. The E8400? Complete garbage in multiplayer.

Bah, not this debate again. That was then, this is now. Yet in the end the debate is still just longevity vs cheap in the short term. Sure, the Q8400 I bought in January 2010 is better 4 years later than the E6600 I bought in 2007. If I'd been smart, though, I would have just bought a replacement motherboard for the E6600 instead of listening to everyone say "go quad core!", and upgraded 1-2 years later when it would have been a more noticeable upgrade. That Q8400 sure didn't seem much faster than the E6600. It didn't help that the Q8400 system ended up with a fried mobo 1.5 years later. Mega regrets not moving up to DDR3 in 2010. Oh well.
 
Apr 20, 2008
10,162
984
126
i7's have very slight performance increases over i5's in gaming (but i5's still regarded as "best for the money"). However, an i5 is significantly faster than an FX6xxx in pretty much every game & benchmark going [...]

Exactly the same as it would be if buying a $120 AMD CPU...

Not only is it "not as apparent", it's been virtually inverted [...]

NBA 2K15 is a massively buggy & broken game on many systems. [...]

You missed the entire point of which processors to get for gaming from here on out. I also agree that, at the moment, an i5 handily beats the FX-6xxx CPUs in games with 4 threads or fewer. There's no doubt about it. The FX-6300, however, is $95, and the cheapest i5 (a Haswell, to boot) is the i5-4430, which is $190.

At each price point below $160, AMD is far better for gaming and future gaming, unless we're living with blindfolds and earplugs. AMD also has fairly huge sales at retailers. When an i5 costs $190 for the slowest locked Haswell, and an FX-6300 is $90 or an FX-8320/8350 is $125, you can take that extra $95 and, instead of getting an R7 250X 1GB, get an R9 290 4GB.

So, which of these two combinations would be better? Especially for next gen, 5-7 thread gaming?

i5 4430 and an AMD R7 250x 1GB

or

FX 6300 and an R9 290 4GB?

You know the answer. This is exactly why I'd rate the FX-6300 above the i5 line.

However, if you have the money for an i7, you likely have the money for a decent video card as well, which makes the AMD CPU options irrelevant. The i3 competes in price with the FX-6300 and FX-8320/50, and for next-gen gaming the i5 doesn't make much sense. When you can typically buy an FX-8320/50, a 970 ATX mobo, and 8GB of DDR3 for $250, and the only i5 worth its price is the 4670K at $235 by itself... it makes almost zero sense to go Intel until you hit those super-high-margin price points. And even then you'd want to spend $100 more for the 4770K, as you're going to need those extra threads, as evidenced by the 2K15 thread-syncing debacle. And then for $55 more you can get the six-core i7-5820K, which is the best reasonable high-end CPU there is.

Unless you have a 120/144fps monitor, the FX 63xx and especially 8xxx and up series is fast enough for 60fps in practically everything out there, and has the threads to be future-resistant for the upcoming wave of highly-threaded games. On a CPU hungry game like 2K15, if the 8350 can get very high performance with a decent video card, that's enough evidence that future games are handled easily with enough threads. It was a quick port as they didn't even announce it as next gen until a couple months before release. They didn't take too much time on it yet it runs flawlessly with enough threads available.

By the way, the graphical corruption is because they're using an Intel IGP as well as an nVidia GPU in a laptop. The issue is that in nVidia control panel, the user needs to select high performance mode instead of using the IGP. The IGP under nVidia's drivers will do exactly what's happening in that picture.

The complete freezes you're speaking of are what I was talking about when I disabled two modules and ran the game with just 4 cores / 4 threads: constant pause-and-resume, plus stuttering left and right. Re-enabling the modules relieves it completely. Loading times are also faster with all 4 modules running. When the game is loading, CPU usage spikes to insane levels (90-100% even at 8c/8t 4.2GHz); they must be decompressing data before throwing it into RAM, something the consoles can do to reduce load times from disc.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
One thing that sucks very badly about FX AM3+ CPUs is the outdated platform, which doesn't even support PCI-E 3.0, not to mention a slew of other things like the new SSD interfaces.
I would be very hesitant to choose anything AMD over an Intel alternative unless I already had a very good cooler like an NH-D14 and intended to OC heavily; otherwise I would choose the Intel option, even an i3 over an FX-6300. An FX-6300 @ 4.7GHz should be better, power consumption be damned, but it only makes sense if you already have a good cooler.

UPDATE:
http://anandtech.com/bench/product/1197?vs=1289

Hmm, it still loses quite badly in ST benchmarks, and MT results aren't really applicable to an FX-6 vs i3 comparison, so even after an OC the choice is not clear.
 
Last edited:
Aug 11, 2008
10,451
642
126
Wow, haven't we heard this before, when the consoles were first coming out: that FX was the "future proof" CPU? Well, so far we haven't seen it. In the latest 7 games tested on GameGPU, the 4670K was 13 to 65 percent faster, the average being 42 percent, all while using less power. And in a good number of those games even an i3 was faster.
 
Apr 20, 2008
10,162
984
126
One thing that sucks very badly with FX AM3+ CPUs is its out-dated platform which doesn't even support PCI-E 3.0 [...] even after OC the choice is not clear.

$95 FX-6300 vs $160 i3 4360

http://anandtech.com/bench/product/1197?vs=699

The FX-6300 is more than fast enough for 60fps or better in 95% of single-threaded games, and it bests the i3 in multithreaded performance, saving $65 for a better video card or an SSD.

And the "heat" part isn't anywhere near as big a deal as people make it out to be. I average 37-43C while gaming in 2K15 (at least 50% utilization most of the time), and while stress-testing 4.4GHz on all 8 cores with Prime I averaged 52C and topped out at 55C. This is with a relatively cheap $18 Arctic Freezer 7 Pro Rev. 2. My case is this Raidmax Cobra, and I added three weak 120mm fans, two on top and one on the bottom of the case. I turned the default exhaust fan the other way to blow onto the side of my CPU fan and VRMs. The fan on the HSF blows upward into the case fan.



This is before I replaced the top two fans with a couple used Antec 120mm fans from a thrift store and switched the direction of the default case fan. The CPU stays very, very cool in this setup at idle. If an 8-core at 4.4Ghz can be pushed that hard and be kept that cool, a 6-core would be easier.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Absolute CPU temps on AMD CPUs are unreliable; the only way to compare them to Intel CPUs is to compare distance to Tjmax. 52C would be the equivalent of 92C on an Intel CPU.
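A quick sketch of the distance-to-Tjmax comparison being described; the 60C / 100C Tjmax constants are purely illustrative assumptions (real Tjmax varies by model, and the next post disputes the exact figure), chosen only because they reproduce the 52C-to-92C mapping claimed here:

```python
# Illustrative only: compare CPUs by thermal headroom (distance to Tjmax)
# instead of absolute temperature. The Tjmax constants are assumptions
# picked to reproduce the 52C -> 92C claim; real values vary by model.

AMD_TJMAX = 60      # hypothetical FX thermal limit (deg C)
INTEL_TJMAX = 100   # hypothetical Intel Tjmax (deg C)

def equivalent_intel_temp(amd_temp_c):
    """Intel temperature with the same headroom as the given AMD reading."""
    headroom = AMD_TJMAX - amd_temp_c
    return INTEL_TJMAX - headroom

print(equivalent_intel_temp(52))  # -> 92
```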
 
Apr 20, 2008
10,162
984
126
Absolute CPU temps of AMD CPUs are unreliable the only way to compare them to Intel CPUs is to compare distance to Tjmax, 52C would be an equivalent of 92C on an Intel CPU.

You think they don't account for that? It's right there in CoreTemp: Tjmax of 80C. Did the forum shills tell you that tidbit or not?
 

crashtech

Lifer
Jan 4, 2013
10,547
2,138
146
What is this forum shill crap? Lots of AMD users have complained about not being able to get an accurate core temp reading.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
The heat produced by an FX-6300 is more than that of an i3, but it's perfectly manageable. Even a stock FX-8 isn't bad at all if you have a recent stepping. FX chips generally don't draw enough extra power to add much cost in cooling or power supplies. I'm in agreement that both *can be* excellent buys.

I wouldn't pay $160 for an i3 (just as I wouldn't pay $130 for an FX-6350); slightly lower-clocked models are available for $110. Even so, you're getting about what you pay for with both CPUs. Both have strengths and weaknesses. However, neither an FX-6300 nor an FX-8300 is really comparable to a Haswell i5 on average (though they may be in a subset of tasks), and their price reflects this.

My experience has been that, since I first got a quad, I haven't experienced any major changes in computer usability and responsiveness, only FPS in games, and have no desire for more cores or even faster clocked ones right now. My Q6600 system with RAIDed SSDs felt about the same as my highly overclocked Ivy Bridge. My wife's PC has an i3 in it, and I have an AMD APU in the living room, and this applies equally to both of those. By spending less on a CPU that you may or may not need, you free up budget for other components. Depending on what games you play and programs you run, you may be better suited with many slow cores than fewer faster ones, but at present, a vast majority of games run better on fewer fast cores. Over time games have been becoming more multithreaded, but even so, a Haswell i5 has as much raw grunt as an FX-83xx and should perform similarly at worst (with one exception now) - while costing about $50-75 more. This doesn't make FX's bad CPUs. The outdated platform isn't a major issue yet either, but these are factors to consider.
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Spurred by the question, I decided to check GameGPU's CPU charts. The oldest review below is 2013. I had to edit out "most modern CPUs deliver 60FPS, so it's largely an academic matter" after seeing the numbers - it's really not true. In fact, it looks like in many, neither an FX-8 nor an i3 really cut it...

These are not cherry-picked. Rather, they're simply randomly chosen from some of the latest reviews and previews of games on GameGPU right now.
[GameGPU CPU benchmark charts]

And, last but not least:

[one more chart]
Last edited:

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
You missed the entire point of which processors to get for gaming from here on out.

No, really, I didn't. See the earlier benchmarks... Just because you personally "demand" every future game run 2x faster on FX chips won't make it so. The problem, as always, is not just technology (50-70% weaker IPC, or the fact game code does not "core scale" anywhere near 100% like video encoding does) but game-development money: developers will do what's cheapest, and spending 6 extra months hand-tweaking every line of code to be super-threaded is simply not on their agenda, as they prove over & over again. "An FX-6300 will thrash an i5 in future" is both deluded and pointless, since an i5 scores 30-90% higher in many games that are obviously well threaded, visible in the sometimes significant +40-50% i5/i7 vs i3 or FX-8350 vs FX-4350 scores. It's strangely reminiscent of the clueless fanboyism surrounding "next-gen consoles will destroy i5's purely because they have 8 cores, nothing else matters" - until they actually launched, that is...

"So, which of these two combinations would be better? Especially for next gen, 5-7 thread gaming? i5 4430 and an AMD R7 250x 1GB or FX 6300 and an R9 290 4GB?"

Not this moronic cr*p again. The general 'gaming rig' advice all round is to spend double on the GPU what you would on the CPU - and for such a budget, the Intel option would be an i3 + the same R9 290 card (you can pick up an i3-4160 for $99, too). Your comparison is as totally bogus and "leading" as me asking you "what would you buy - an FX6300 + R9 290, or an FX9590 + 7730 GFX card?" and then declaring "if you pick the former, then 8-core AMD FX's must suck for gaming, because it means less spent on a GFX card vs an FX6300".

The only people regularly churning out these totally fake, bogus & wildly unbalanced "comparisons" are "MOAR CORE" obsessed AMD users trying to prove some fake point to themselves by first inventing a "one size fits all" abnormally low "budget" for everyone on the planet, then secondly using an inconsistent 2:1 ratio GPU:CPU for AMD but 1:2 ratio for Intel as and when it suits them. Then thirdly declaring any game where an FX 6xxx loses to an i3 even with triple the cores to be the "wrong kind of game" and "just you wait for the 'right kind of game'" - which is always endlessly just over the horizon year after year...

Unless you have a 120/144fps monitor, the FX 63xx and especially 8xxx and up series is fast enough for 60fps in practically everything out there

Except it isn't (which, again, is why it's priced on par with an i3 and not an i5/i7)... :whistle:

Bottom Line - On average, for gaming, an i3 is roughly the same as an FX-63xx (which is precisely why they have similar price tags). Declaring an FX-6300 to be "eminently superior" to an i5 on the back of a personal "you just wait and see, I tell you!" futuristic fantasy is deluded to the core, and quite frankly embarrassing when you have a plethora of contrary benchmarks staring you in the face on games that obviously use more than 4 threads (if they didn't, then an FX-8xxx and FX-4xxx would be virtually neck & neck adjusted for clock, instead of showing visible +40% differences...). In short, the "i5 killing" games you're waiting for are already here (BF4, etc).

The reason the FX-6300 still often gets beaten by an i3, let alone an i5, in many games is the naturally diminishing returns of threading certain types of code: doubling the cores for a 10-40% boost matters far less than making a CPU +60-70% inherently more efficient (IPC) on all code. It's why adding 2 cores (FX-6300 vs FX-4350 = a 50% core increase) leads to only a 0-29% boost in fps, which falls further to a 0-18% boost for another 2 cores (the 33% core increase of the FX-8350 over the FX-6300 in BF4). If you stuck on another 8 cores for a 16-core FX, you'd be lucky to get another 5% out of it. Diminishing returns, plain & simple.

Your posts (as many others in the past) can be summed up as "All we need to do is wait - the rise of "many weak" cores is just around the corner to beat Intel - any minute now... any minute now... any minute now...". Problem is, we've been hearing that every single year since the first dual-cores in 2005, and the "rise" of the next-gen consoles haven't changed that either due to their cores being so weak it literally takes 8 of them just to match a Haswell i3, plus the fact multi-threading is not "free" (increased complexity of writing / debugging code, etc). Like it or not - IPC is still king and if the situation were reversed (as it was with P4 vs Athlon 64 - and I speak as a former proud A64/X2 owner who never bought any P4) - you'd be agreeing 100% with that too.

Likewise, these "dreams" of a "doubling in cores = a doubling in fps", merely reveal a naive mentality that does not understand Amdahl's Law or why for game code, the speedup is often far more like the blue line than the green line (let alone an imaginary perfect diagonal 100% line which some seem to fantasize about). That applies to all CPU's - not just AMD's (which is also the reason why i7's don't get exactly double the fps of i3's despite having more than twice the "parallel" horsepower).
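The diminishing-returns argument here is Amdahl's Law in action; a minimal sketch, with an arbitrary 60% parallel fraction that isn't a measurement of any real game engine:

```python
# Amdahl's Law: speedup on n cores when only a fraction p of the work
# is parallelizable. p = 0.6 is an arbitrary illustration, not a
# measurement of any real game engine.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 2, 4, 6, 8, 16):
    print(f"{cores:2d} cores: {amdahl_speedup(0.6, cores):.2f}x")

# The ceiling is 1 / (1 - p) = 2.5x no matter how many cores you add,
# which is why each extra pair of FX modules buys less than the last.
```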
 
Last edited:

4ghz

Member
Sep 11, 2010
165
1
81
Forum shills = people who recommend CPUs that give consistent performance across all games? Sigh...

I will never recommend a FX series chip for gaming and here's why:

Intel I series cpu performance in games = --------------
AMD FX series cpu performance in games = /\/\/\/\/\/\/\/\/

Consistency is key. Who would choose a restaurant where the food is excellent half the time and crap the other half over one that consistently serves good food? All one has to do is look at the charts Yuriman posted: the FX-8xxx chips are often beaten soundly by the Haswell i3. I would love to recommend AMD chips for gaming, because I hate the way Intel has locked all their chips below $230. It seems like a really cheap and petty thing to do considering how little overclockers probably affect Intel's bottom line. But I'm not doing it until they fix those weak cores, which are still slower than Intel cores released in 2009. So who exactly am I shilling for?
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
What is this forum shill crap? Lots of AMD users have complained about not being able to get an accurate core temp reading.

That "forum shill" stuff makes me laugh. I had only AMD CPUs until the Core 2 Duo - that is, for as long as they made sense. I don't give a crap who produces my CPU, or my GPU for that matter. I mostly had ATI/AMD cards.

The general 'gaming rig' advice all round is spend double on GPU what you would on CPU

If I followed that advice I would end up with a needless 5960X. It's almost always worth it to invest more money in the graphics card, but that isn't so with CPUs; a 5960X would provide me no improvement in FPS for a long time to come.
on games that obviously use more than 4 threads (if they didn't then an FX-8xxx and FX-4xxx would be virtually neck & neck adjusted for clock instead of visible +40% differences...)
You can't make that kind of comparison between those CPUs: the FX-8 series is also faster than the FX-4 in 4-threaded workloads due to the module scaling penalty, so a 4-threaded game will also run faster on the FX-8 series.
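The module-scaling point can be sketched with a toy throughput model. The per-core penalty when both cores of a Bulldozer module are busy is an assumed round number here, purely for illustration:

```python
# Toy model of Bulldozer "module scaling": two cores share a module (front
# end, FPU) and each loses some throughput when both are active.
# SHARED_PENALTY is an assumed illustrative value, not a measured figure.

SHARED_PENALTY = 0.8  # assumed per-core throughput when its module-mate is busy

def throughput(modules: int, threads: int) -> float:
    """Relative throughput, assuming the scheduler fills one core per
    module first and only then doubles threads up within modules."""
    doubled_modules = max(0, threads - modules)   # modules with both cores busy
    solo_cores = min(threads, modules) - doubled_modules
    paired_cores = doubled_modules * 2
    return solo_cores * 1.0 + paired_cores * SHARED_PENALTY

print(throughput(modules=2, threads=4))  # FX-4xxx: all cores share, 4 * 0.8
print(throughput(modules=4, threads=4))  # FX-8xxx: one core per module, 4 * 1.0
```

Under this assumption an FX-8 runs a 4-threaded load at full per-core speed while an FX-4 pays the sharing penalty on every core, which is why clock-for-clock they are not neck and neck even at 4 threads.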
 
Last edited:

crashtech

Lifer
Jan 4, 2013
10,547
2,138
146
The 2:1 rule of thumb works fairly well except on the very low and very high end; it's more like an admonition not to go CPU-crazy on a gaming rig where the GPU matters more.

I think twin Titans are somewhat of an outlier, don't you?
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
The 2:1 rule of thumb works fairly well except on the very low and very high end; it's more like an admonition not to go CPU-crazy on a gaming rig where the GPU matters more.

I think twin Titans are somewhat of an outlier, don't you?

as are most of the things in my PC like 1200W PSU...
 

Firetrak

Member
Oct 24, 2014
131
0
76
If you've already bought it and can find / borrow a cheap Z77 motherboard, the best thing is to test it first. As mentioned, the "locked" Sandy and Ivy Bridge i5s have a "limited OC" feature: the ability to go 4 bins above max Turbo. The i5-2320's normal Turbo clocks are 3.1GHz (4T) / 3.2GHz (2-3T) / 3.3GHz (1T). So on a Z77 board, +400MHz = 3.5GHz (4T) / 3.6GHz (2-3T) / 3.7GHz (1T).
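The bin arithmetic above is simple enough to sketch. One "bin" is one 100MHz multiplier step, and the stock clocks below are the i5-2320 Turbo figures quoted in the post:

```python
# Sketch of the "limited OC" arithmetic for locked Sandy/Ivy Bridge i5s:
# on a Z-series board the Turbo multipliers can be raised 4 bins, where
# each bin is one 100MHz step of the base clock.

BIN_MHZ = 100
EXTRA_BINS = 4

# i5-2320 stock Turbo table (MHz) by number of active cores/threads:
stock_turbo = {"4T": 3100, "2-3T": 3200, "1T": 3300}

for load, mhz in stock_turbo.items():
    boosted = mhz + EXTRA_BINS * BIN_MHZ
    print(f"{load}: {mhz} -> {boosted} MHz")
# 4T: 3100 -> 3500, 2-3T: 3200 -> 3600, 1T: 3300 -> 3700
```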

I already bought an MSI Z77A-G45 desktop motherboard.

I got lost in all the charts and stuff, felt like I was back in school, so instead I had myself a....

 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
I'm not arguing against your points (though I disagree that an FX-6 is a better chip than an i5), Scholzpdx, but what do you mean by "The FX series is just as fast as the i3 in low threaded games given that you don't have a $400 GPU..." ? What does how expensive or high end your video card is have to do with how many frames your CPU can deliver?

I see something like this fairly often and it has never made sense to me.

Using slower GPUs will get you GPU limited in more games, simple as that.

A Core i3 may be faster in Game A using a GTX 980, but it will be GPU limited using a GTX 750 Ti. So even a Celeron will be fine with that GTX 750 Ti.
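That bottleneck logic boils down to "delivered fps is the minimum of what the CPU can feed and what the GPU can render." A toy sketch, with every number below hypothetical:

```python
# Toy bottleneck model: the frame rate you actually see is capped by
# whichever of the CPU or GPU is slower. All fps figures are hypothetical.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Fps you see when the CPU can prepare cpu_fps frames/sec and the
    GPU can render gpu_fps frames/sec."""
    return min(cpu_fps, gpu_fps)

# Hypothetical: an i3 prepares 120 fps, a Celeron 70 fps.
print(delivered_fps(120, 140))  # fast GPU: i3's 120 fps CPU limit shows
print(delivered_fps(70, 140))   # fast GPU: Celeron bottlenecks at 70
print(delivered_fps(120, 60))   # slow GPU: capped at 60
print(delivered_fps(70, 60))    # slow GPU: also 60, CPU gap is invisible
```

With the slow GPU both CPUs land on the same number, which is why the i3's lead in a GTX 980 review chart won't appear on a GTX 750 Ti.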
 