[Techspot] Athlon x4 860K vs Pentium G3258 deathmatch

Page 5 - AnandTech Forums

TheELF

Diamond Member
Dec 22, 2012
3,998
745
126
At 1024x768 you can play every game, even with Intel's Haswell HD graphics. At those low resolutions, every 2-3 year old (or older) CPU can play any game at 30fps or higher.
The problem with the dual-core Pentium is that it cannot play the latest AAA titles at 1080p with high-end GPUs without stuttering, when the quad-core Athlon can.
What do you think is changing at 1080p?
Does the game run with more threads?
Does the driver thread need more CPU cycles?
Show us some proof!
Not that there is any proof you could show, because this is totally stupid; nothing changes for the CPU.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
What do you think is changing at 1080p?
Does the game run with more threads?
Does the driver thread need more CPU cycles?
Show us some proof!
Not that there is any proof you could show, because this is totally stupid; nothing changes for the CPU.

You are aware that in all games, the CPU is tasked with doing the calculations for anything at all that is moving onscreen, aren't you? Sure, it isn't the >2.5x increase in workload that the GPU sees when moving from 1024x768 to 1920x1080, but just moving from 4:3 to 16:9 is an increase in workload for the CPU. See, 4:3 at 1080p is 1440x1080, which is only 75% the size of 1920x1080. The CPU has to calculate additional moving things when moving up from 4:3 to 16:9: 33% more, as a matter of fact (1920 is 1.33x as wide as 1440).

Admittedly, 33% more isn't that much, but it's nevertheless not identical, as you claim. Then, like he said, "with a high-end video card". So, what is the very first thing anyone changes when they move up from playing GTA V on their iGPU? 0x to 8x MSAA, with the other settings unchanged? Of course not; the very first thing we would all do is go from minimum draw distance to the maximum our card could handle, so at least 2.5x as far, if not 3 or 4x as far.

Guess what that just did? Yep, it made our CPU's workload increase by another 2.5-4x, on top of the additional 33%. Let's just say we increased the draw distance by only 300%/3x, to be fair. Now your CPU is doing exactly 400% of the work it was a few minutes ago, when we were using the iGPU only and playing with minimum draw distance at 1024x768. I don't know about you, but I'd personally consider +400% a fairly large increase, and most definitely not "nothing changing for the CPU".
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
3,998
745
126
You are aware that in all games, the CPU is tasked with doing the calculations for anything at all that is moving onscreen, aren't you?
Yes I am. I am also aware that no matter the resolution, the amount of "stuff" the CPU has to calculate stays exactly the same; it's not like a higher resolution suddenly gives you more enemies on screen or more scenery.
Yes, draw distance changes CPU utilization, but that's not what we were talking about.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
You think smaller GPUs result in higher minimums with a given CPU? Is this a theory, or is there any evidence we can look at? Not arguing, just curious.

When I play Battlefield 4, if I cap fps at 63 and use that perfgraph thingy you can open with console commands, it'll stay there with a perfectly flat CPU frametime line.

If I let it run uncapped, the game will sometimes freeze for a second, completely unresponsive, then continue on. That would count as "low minimum fps", I guess.

Capping fps with in-game limiters can improve things on quad cores as well: in Battlefield 3 the audio would sometimes cut out on my i5 750, but if I capped it, it never did this.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
You are aware that in all games, the CPU is tasked with doing the calculations for anything at all that is moving onscreen, aren't you? Sure, it isn't the >2.5x increase in workload that the GPU sees when moving from 1024x768 to 1920x1080, but just moving from 4:3 to 16:9 is an increase in workload for the CPU. See, 4:3 at 1080p is 1440x1080, which is only 75% the size of 1920x1080. The CPU has to calculate additional moving things when moving up from 4:3 to 16:9: 33% more, as a matter of fact (1920 is 1.33x as wide as 1440).

Admittedly, 33% more isn't that much, but it's nevertheless not identical, as you claim. Then, like he said, "with a high-end video card". So, what is the very first thing anyone changes when they move up from playing GTA V on their iGPU? 0x to 8x MSAA, with the other settings unchanged? Of course not; the very first thing we would all do is go from minimum draw distance to the maximum our card could handle, so at least 2.5x as far, if not 3 or 4x as far.

Guess what that just did? Yep, it made our CPU's workload increase by another 2.5-4x, on top of the additional 33%. Let's just say we increased the draw distance by only 300%/3x, to be fair. Now your CPU is doing exactly 400% of the work it was a few minutes ago, when we were using the iGPU only and playing with minimum draw distance at 1024x768. I don't know about you, but I'd personally consider +400% a fairly large increase, and most definitely not "nothing changing for the CPU".

This is not how it works. Resolution really does not affect CPU requirements other than going from 4:3 to widescreen due to the increased FOV. Go from 1080p to 4K and the CPU requirements in most games stay within 10%.

Draw distance generally increases CPU load but that is not resolution and the scaling is generally not linear. Increasing the draw distance by 2x will, in general, require more CPU power but not 2x more.

Other settings, such as physics and shadows, may be more demanding on high.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Do AMD's APUs in the consoles even have a competent implementation of AVX?

I wasn't aware that AVX was used much, or even at all, in console or PC games.

I was under the impression that the performance advantage of the i3 over the Pentium in PC ports was due to thread usage, and nothing more.

I don't think that I've ever seen "AVX" listed on a PC game's specs, either as required or recommended. I'd love to be proven wrong, though.

AVX support is regarded as one of the few wins for the console CPUs. It's really the only way the CPU has enough grunt to even run ports of floating-point-heavy games from the Xbox 360 and PS3.

AVX may not be required, but given the huge performance gap between the i3 and Pentium in some games (50% - 100% IPC difference), I wouldn't be surprised if the difference is due to AVX more so than hyperthreading. Games can support multiple code paths: AVX for newer CPUs, SSE2 for older ones. Even scarier is the idea that games could require AVX and the Pentium one day won't run games at all. This happened with the transition from SSE to SSE2, where otherwise-capable Athlon CPUs suddenly couldn't play a handful of games or fell back to a slow x87 path.

If you're just building a gaming rig for the next year, it probably doesn't matter, but I wouldn't want to build a gaming rig that's missing any of the capabilities the consoles have, because they are setting the baseline for minimum specs on future games.
 

TheELF

Diamond Member
Dec 22, 2012
3,998
745
126
AVX may not be required, but given the huge performance gap between the i3 and Pentium in some games (50% - 100% IPC difference), I wouldn't be surprised if the difference is due to AVX more so than hyperthreading.
Tell us one game besides GRID 2 that uses AVX. AVX can't give the i3 performance gains if it isn't used (in said games).

Goes for your last paragraph as well: until games come out that support AVX, all current mid/low CPUs will be obsolete anyway.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
I've pretty much stopped gaming (probably will until Fallout 4), but I did tinker with a G3258 last winter. Got it to 4.5 GHz, and in my old favorite Skyrim it would have hard stutters in many places that the hoary old Phenom II X4 980 BE @ 4.1 GHz didn't (with the same GPU). Skyrim isn't heavily threaded at all, and it still caused pretty noticeable hitches on the G3258. That said, the G3258 still had higher maximum FPS in the more CPU-limited areas (of course).
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Tell us one game besides GRID 2 that uses AVX. AVX can't give the i3 performance gains if it isn't used (in said games).

Goes for your last paragraph as well: until games come out that support AVX, all current mid/low CPUs will be obsolete anyway.

Do you build a new computer to play old games, or do you build it to play new games?

AVX support is guaranteed for any game optimized for the consoles. Whether or not games continue to support legacy code paths is up to game developers, but I can't see any big game developer crying over dropping support for Pentiums and Celerons when every AMD CPU from the last few years, and any Intel CPU aimed at anyone who cares about computing at all, supports AVX. Even Atoms support AVX now.

According to this forum post: http://www.techpowerup.com/forums/threads/intel-haswell-overclocking-clubhouse.185344/page-28
Crysis 3 and GRID 2 use AVX.
Both the Dolphin and PCSX2 emulators use AVX/AVX2 as well, and get good speed boosts under some circumstances.

DirectX/Visual Studio has support for AVX too.
http://blogs.msdn.com/b/chuckw/archive/2012/09/11/directxmath-avx.aspx

Any contributions Intel makes to games (of which there are more and more) are likely to use AVX as well.
http://www.geeks3d.com/20110107/intel-tech-demos-part-2-avx-cloth-and-onloaded-shadows/

It really depends on how often you upgrade. The Pentium is a really bad CPU choice for a build expected to last 5 years, merely due to developer apathy. There were already games that wouldn't run on it due to a lack of threads; it's very feasible games could code for an AVX path and neglect an SSE1/2/3 fallback.
 

TheELF

Diamond Member
Dec 22, 2012
3,998
745
126
AVX support is guaranteed for any game optimized for the consoles.
How do you figure this?



AVX has existed in CPUs since the beginning of 2011, and still only one game uses it.
If Crysis uses AVX instructions, then your whole argument is invalid, because it runs normally on any CPU that has no AVX.
And GRID has separate executables, so what is your argument there?

Because AVX gives some speed boosts under "some circumstances", you think devs are going to spend a whole lot of cash finding new ways to program for AVX, when they have so much money invested over the years and readily available, streamlined code at hand?

Also, why should any dev use a CPU instruction if the GPU could do the same thing much, much faster?

It really depends on how often you upgrade. The Pentium is a really bad CPU choice for a build expected to last 5 years, merely due to developer apathy. There were already games that wouldn't run on it due to a lack of threads; it's very feasible games could code for an AVX path and neglect an SSE1/2/3 fallback.
The same apathy that will keep them from using AVX...
If the games wouldn't run due to a lack of threads, then they would still not run today; but all the games were fixed pretty quickly, and now run faster on the Pentium than on AMD's quads. It was just the straight-port mentality that caused the problems: the consoles use the first two cores for the OS, so games start on the third core, and devs got enough bad press not to keep repeating the same stupid mistake.
(but then again... Batman: Arkham Knight)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,477
10,137
126
The Pentium is a really bad CPU choice for a build expected to last 5 years, merely due to developer apathy. There were already games that wouldn't run on it due to a lack of threads; it's very feasible games could code for an AVX path and neglect an SSE1/2/3 fallback.

There are still a lot of Core 2- and Phenom II-era gaming boxes out there. I think it's highly unlikely that devs would drop support for non-AVX code paths. Heck, it's only recently that we've started getting 64-bit Windows executables.
 
Aug 11, 2008
10,451
642
126
How do you figure this?



AVX has existed in CPUs since the beginning of 2011, and still only one game uses it.
If Crysis uses AVX instructions, then your whole argument is invalid, because it runs normally on any CPU that has no AVX.
And GRID has separate executables, so what is your argument there?

Because AVX gives some speed boosts under "some circumstances", you think devs are going to spend a whole lot of cash finding new ways to program for AVX, when they have so much money invested over the years and readily available, streamlined code at hand?

Also, why should any dev use a CPU instruction if the GPU could do the same thing much, much faster?


The same apathy that will keep them from using AVX...
If the games wouldn't run due to a lack of threads, then they would still not run today; but all the games were fixed pretty quickly, and now run faster on the Pentium than on AMD's quads. It was just the straight-port mentality that caused the problems: the consoles use the first two cores for the OS, so games start on the third core, and devs got enough bad press not to keep repeating the same stupid mistake.
(but then again... Batman: Arkham Knight)

Well, the bug of not running on a two-thread CPU may have been eliminated, but two of the worst games for dual cores, even with hyperthreading, are the most recent AAA releases: Witcher 3 and GTA V. Batman seems to do well on an i3 at least, but I am disregarding that for a moment. OTOH, I don't buy into the dual-core bashing some engage in either. Like the old movie line, "you just have to know your limitations".
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
When testing my 4.5 GHz G3258, one thing I noticed was much less stuttering when paired with Nvidia cards than with AMD. For instance, paired with a Maxwell 750 Ti, BF4 stuttered less than when paired with a Radeon 290. Skyrim and Trials also exhibited similar behaviour.

Just something to keep in mind when talking about how the dual cores stutter.
 

crashtech

Lifer
Jan 4, 2013
10,559
2,139
146
I've seen talk about how Nvidia drivers are more threaded, which would make one think they would not be advantageous to a poorly threaded CPU, yet Nvidia GPUs seem to have maintained their reputation for having better minimums with weak CPUs even so.
 

TheELF

Diamond Member
Dec 22, 2012
3,998
745
126
Well, the bug of not running on a two-thread CPU may have been eliminated, but two of the worst games for dual cores, even with hyperthreading, are the most recent AAA releases: Witcher 3 and GTA V.
I already uploaded a video of GTA V showing the G1820 playing it at 25fps with no drops; Witcher 3 plays at around 20fps, but with the CPU at around 70%, the R7 240 is just not strong enough for more than 20fps.
And that's the Celeron, and that's while recording, and that's in windowed mode.
For GTA V, at least, you have proof.

Batman was more an example of how devs still manage to publish totally broken games.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
When testing my 4.5 GHz G3258, one thing I noticed was much less stuttering when paired with Nvidia cards than with AMD. For instance, paired with a Maxwell 750 Ti, BF4 stuttered less than when paired with a Radeon 290. Skyrim and Trials also exhibited similar behaviour.

Just something to keep in mind when talking about how the dual cores stutter.

It might be that the Radeon being bigger than the GTX 750 Ti is a contributing factor to the stuttering.

http://forums.anandtech.com/showpost.php?p=37519804&postcount=91

With that mentioned, I would have thought Mantle with the R9 290X (on BF4) would have helped the issue.
 
Last edited:

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Well, the bug of not running on a two-thread CPU may have been eliminated, but two of the worst games for dual cores, even with hyperthreading, are the most recent AAA releases: Witcher 3 and GTA V. Batman seems to do well on an i3 at least, but I am disregarding that for a moment. OTOH, I don't buy into the dual-core bashing some engage in either. Like the old movie line, "you just have to know your limitations".

The Witcher 3 runs well on the Pentium, though. I have a GTX 670, which isn't fast enough to be CPU-bound at 1920x1200.

If I were to have a faster gpu and put everything on ultra I suppose it would stutter while running around on the horse.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Regarding frame rate and frame time testing, it would be interesting to see some more data on low-end CPUs (G3258 and Athlon X4 860K) coupled with either low-end (e.g. R7 250X) or lower-midrange (R9 270/R9 270X) GPUs (especially in GTA V).

Reason: though I appreciate the data points with the high-end cards, using lower-end cards is a more realistic, real-world pairing of hardware.
 
Last edited:

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Regarding frame rate and frame time testing, it would be interesting to see some more data on low-end CPUs (G3258 and Athlon X4 860K) coupled with either low-end (e.g. R7 250X) or lower-midrange (R9 270/R9 270X) GPUs (especially in GTA V).

Reason: though I appreciate the data points with the high-end cards, using lower-end cards is a more realistic, real-world pairing of hardware.

Agreed. Anyone building a complete gaming machine with a Pentium G3258 is likely on a tight budget, although I think Nvidia generally has less driver overhead (Mantle notwithstanding), so that's why I paired mine with a 750 Ti instead of something from AMD.
 