a good example of CPU limitations

toyota

Lifer
Apr 15, 2001
12,957
1
0
I got one from Batman: Arkham Asylum for you. At 1920x1080 on very high settings with PhysX on high, the CPU still makes a noticeable difference even with a 192sp GTX260. The E8500 at 2.0 is still faster than most older CPUs like a 5600 X2 and lower, so think of all the people using high end cards with those poky CPUs. Heck, there is some guy on here with 8800GT SLI and a stock 3800 X2. That's sad, because even one of those 8800GT cards is not getting fully utilized, and adding the second is nearly a complete waste in most games.

1920x1080 all very high settings, no AA and high physx

E8500 at 3.16 GTX260 at 666/1392/2200

Frames, Time (ms), Min, Max, Avg
2742, 66301, 31, 53, 41.357


E8500 at 2.0 GTX260 at 666/1392/2200

Frames, Time (ms), Min, Max, Avg
2086, 61778, 21, 40, 33.766



I can also say that in Far Cry 2, anything at or below 2.33 GHz on my E8500 will bottleneck the 192sp GTX260, even at 1920x1080 with very high settings. At 2.33 or lower, overclocking the card results in zero change.

Settings: Demo(Ranch Small), 1920x1080 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(None), VSync(No), Overall Quality(Very High), Vegetation(Very High), Shading(Very High), Terrain(Very High), Geometry(Very High), Post FX(High), Texture(Very High), Shadow(Very High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)

E8500 at 2.33 and gtx260 at 576/1188/1990

Total Frames: 2485, Total Time: 51.00s
Average Framerate: 48.72
Max. Framerate: 75.22 (Frame:395, 6.59s)
Min. Framerate: 32.19 (Frame:1671, 34.07s)

E8500 at 2.33 and gtx260 overclocked to 666/1392/2200

Total Frames: 2448, Total Time: 51.01s
Average Framerate: 47.99
Max. Framerate: 73.21 (Frame:407, 6.94s)
Min. Framerate: 32.33 (Frame:1665, 34.60s)

E8500 at 3.16 and the gtx260 overclocked to 666/1392/2200

Total Frames: 3024, Total Time: 51.01s
Average Framerate: 59.28
Max. Framerate: 92.79 (Frame:489, 6.69s)
Min. Framerate: 40.40 (Frame:2042, 34.11s)
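
As a quick sanity check on the Batman scaling above, here is the arithmetic written out as a small Python sketch. It only divides frames by seconds from the posted numbers; nothing here is new data.

# Rough scaling check on the Batman: AA runs above (avg fps = frames / seconds)
runs = {
    "E8500 @ 2.00 GHz": (2086, 61.778),   # frames, seconds
    "E8500 @ 3.16 GHz": (2742, 66.301),
}
fps = {name: frames / secs for name, (frames, secs) in runs.items()}
cpu_gain = 3.16 / 2.00 - 1                                      # ~58% more clock
fps_gain = fps["E8500 @ 3.16 GHz"] / fps["E8500 @ 2.00 GHz"] - 1
print(f"CPU clock: +{cpu_gain:.0%}, avg fps: +{fps_gain:.0%}")  # ~ +58% clock -> ~ +22% avg fps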






 

biostud

Lifer
Feb 27, 2003
18,672
5,400
136
Increasing your CPU speed by 58% increases your average framerate by ~23%.

In the article, increasing the CPU speed by 30% increases the framerate by more than 100% in some games. That is quite surprising to me, and makes me happy that my "poky" but overclocked CPU is running fine with my 8800GT.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
I would have to agree quite readily with the FC2 results. When I went from my E5200 @ 3.2GHz to my Q9650 @ stock I gained a good 17FPS minimum increase with my 9800GTX @ 1680x1050, and the gains were similarly notable at 1920x1200. Overclocking the Q9650 past 4GHz netted gains which I don't have exact details on at the moment, but if you look at the AnandTech bench it should be greater than the difference between a Q9650 and an E8400 vs a Q9770 and an E8400. My E5200 @ 3.2GHz benches in the same ballpark as an E8400, so it's a good point of reference in benchmarks. The gains in FC2 using the settings shown were ~4.5fps with a 200MHz boost in CPU speed and an increase in the FSB speed from 333x4 to 400x4. I'm running my FSB @ 467x4 with the CPU at a 9x multiplier, netting me ~4.2GHz, or a 1GHz gain over the Q9770 plus a 67MHz increase in bus speed (identical to the difference between a Q9650 and a Q9770). Based on the benchmarks shown, the approximate gain was in the 30% range, from 45FPS to 65FPS in the benchmark (E8400 vs Q9770), and in the 1% range going up 200MHz + 67MHz on the system bus. Based on this, I would say I probably got an extra 7% increase in FPS. The interesting thing about this whole thing, though, is that none of my cores are fully utilized in FC2. My guess is the game does some load balancing across the cores, which benefits greatly from 2 extra cores but is still capable of benefiting from increased clock speeds. I have seen the same scenario play out in other less well threaded games when run on just a dual core: core 1 will go up to 80% CPU utilization but core 2 will stay around 30%, yet the game is still CPU limited, since changes in graphical settings beyond a certain point stop giving notable increases or decreases in performance, depending on which direction the slider goes. The game I have in mind is EverQuest 2, which was updated about 10 months ago to utilize dual core CPUs to some extent to boost engine performance.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,274
41
91
^^ Paragraphs, please

Well, I have Far Cry 2, Left 4 Dead, and the Crysis demo along with an Athlon X2 and an HD3850. I just might try to replicate these results, although I'm not sure if I want to use the same settings as they did. Using low and medium settings in Far Cry 2, for example, seems silly to me. I played the game on High, Very High, and sometimes Ultra with AA. It did get choppy in places, but there is no reason for me to turn the settings to low other than to benchmark.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: biostud

Left 4 Dead is clearly limited by the Athlon X2 3800+ CPU--look at the difference 600 MHz makes. Our Athlon 64 X2 4200+ at Athlon 64 X2 5000+ speeds is doubling the frame rate.
That's because they're running a lightweight game engine at medium settings (LMAO) on a prehistoric processor, so is it really surprising?

Originally posted by: toyota

1920x1080 all very high settings, no AA and high physx
That's a low resolution with no AA; furthermore, 2 GHz is an artificially slow clockspeed to be testing at.



Here are the benchmarks from my 8800 GTS vs 8800 Ultra on a E6600 (2.4 GHz):

http://episteme.arstechnica.co...965/m/130007356831/p/1

Note the massive performance gains across the board.

And here the benchmarks on a 8800 Ultra on a E6600 (2.4 GHz) vs a E6850 (3.0 GHz):

http://episteme.arstechnica.co...007078831#492007078831

Note that most games showed absolutely zero difference on a 8800 Ultra from this upgrade, even at middling settings like 1600x1200 with 4xAA.

On the same CPU (E6850) I again witnessed around a 30% performance gain by moving from a GTX260+ to a GTX285.

CPU limitations are grossly exaggerated, with most differences coming from artificially created scenarios: either by using hopelessly slow CPUs, and/or by using game settings that no one buying a $1000 quad-core CPU is going to be running.

Quad-core is just a way for CPU vendors to make more money through marketing, and I'll probably be picking up a dual-core i5 when they become available. Even then I won't expect much performance gain (3.73 GHz over 3 GHz is only ~24% faster), but at least my bank account won't get raped like it would with those horrifically overpriced i7 systems.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
BFG I think the point of the article is to show that when the CPU is the major bottleneck, the graphics card is completely out of the equation.

In a similar case, you have shown that when a processor is sufficient enough (i.e. E6600), then the performance growth shifts towards different videocards. Even then I bet that minimum framerates in gaming of a Core i7 @ 4.0ghz and say 4890 will give you a major boost in latest games compared to a stock E6600.

We can logically deduce that one is obviously going to have a situation where either the CPU or the videocard is the major bottleneck, assuming one of those components is very old tech (3+ years old). In other words, it's not like going from a Core 2 Quad at 3.4GHz to a Core i7 @ 4.5GHz with a GTX 285/4890 is suddenly going to increase your framerates by 30%. It's all about balance and knowing when to upgrade the processor/videocard. The fact that an X2 near 5000+ speeds still does OK is already impressive considering it is so old and is not a quad core.

These benchmarks are revealing because when a cpu becomes a major bottleneck, performance of the system falls like a rock. With today's prices for hardware so low, it is likely that most 'gamers' with X2 A64s will shift to Phenom II, and Core i5/7 1156 platform anyway. In which case any modern Core i5/i7 or Phenom II will no longer be a bottleneck and we should see little to no performance difference between the lowest speeds and highest speeds of the quad core models.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: RussianSensation

In a similar case, you have shown that when a processor is sufficient enough (i.e. E6600), then the performance growth shifts towards different videocards.
Yep, my stance is summed up in one sentence: as long as you have a half-decent CPU (e.g. my E6850), the GPU is by far the biggest bottleneck in performance overall.

I've seen this first-hand time and time again in real games that I actually play, at real settings I actually play them at. This trend actually seems to be on the rise given Cryostasis beats my GTX285 to a bloody pulp even without AA/AF and with PhysX disabled at 1680x1050. I know it's the graphics card because dropping the resolution significantly increases the framerate.

Even then I bet that minimum framerates in gaming of a Core i7 @ 4.0ghz and say 4890 will give you a major boost in latest games compared to a stock E6600.
We'll just have to see about that when I upgrade.

I've had the upgrade itch for a new CPU/platform for quite a while now, but I absolutely cannot see anything current that I could class as an upgrade.

Also I saw this the other day and it's quite appropriate for this thread:

http://www.techspot.com/review...performance/page8.html

Even without AA and AF, a 1 GHz underclock didn't even dent the framerate.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
BFG I think the point of the article is to show that when the CPU is the major bottleneck, the graphics card is completely out of the equation.

In a similar case, you have shown that when a processor is sufficient enough (i.e. E6600), then the performance growth shifts towards different videocards. Even then I bet that minimum framerates in gaming of a Core i7 @ 4.0ghz and say 4890 will give you a major boost in latest games compared to a stock E6600.

We can logically deduce that one is obviously going to have a situation where either the CPU or the Videocard is the major bottleneck, assuming one of those components is very old tech (3+ years old). In other words, it's not like going from Core 2 Quad 3.4ghz going to increase your framerates by 30% if you suddenly got a Core i7 @ 4.5ghz with a GTX 285/4890. It's all about balance and knowing when to upgrade the processor/videocard. The fact that an X2 near 5000+ speeds still does OK is already impressive considering it is so old and is not a quad core.

These benchmarks are revealing because when a cpu becomes a major bottleneck, performance of the system falls like a rock. With today's prices for hardware so low, it is likely that most 'gamers' with X2 A64s will shift to Phenom II, and Core i5/7 1156 platform anyway. In which case any modern Core i5/i7 or Phenom II will no longer be a bottleneck and we should see little to no performance difference between the lowest speeds and highest speeds of the quad core models.

First of all, Quad over dual is not important except in a couple of RTS games
- if you use *extremes* like a 3800+, then you will see differences with an *ancient* processor

Secondly, there is not a lot of PRACTICAL difference playing with a fairly low end $80 Athlon II X2 at 3.9 GHz vs a $250 Core2Quad at 4.0 GHz - even with an HD 4870-X2 as the graphics
-- somewhere around 3.4 GHz is plenty of speed for any modern dual core paired with a fairly fast GPU like a GTX 275 or 280 or perhaps an HD 4890 - when you go to CrossFired 4870s, you want closer to 4 GHz

I have done quite a bit of testing with this and will continue on with Core i7/i5
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: BFG10K

Also I saw this the other day and it's quite appropriate for this thread:

http://www.techspot.com/review...performance/page8.html

Even without AA and AF, a 1 GHz underclock didn't even dent the framerate.

Maybe so, but I question the validity of just one site.

Look at this:

Legion Hardware - Wolfenstein - 1920x1200 8AA/8AF Max

Using Intel Core i7 965 Extreme Edition, Microsoft Windows 7 (64-bit), Nvidia Forceware 190.38, ATI Catalyst 9.8
GTX 275 = 50
4890 = 47
GTX 260 = 43


vs. what Techspot got - 1920x1200 8AA/8AF

using Intel Core 2 Quad Q9650 (LGA775), Microsoft Windows 7 Ultimate (64-bit), Nvidia Forceware 190.38, ATI Catalyst 9.8
GTX 275 = 67
GTX 260 = 57
4890 = 55

So before we start drawing conclusions from TechSpot's numbers, I'd wait for benches from more reputable sites. How can they be getting 17 more frames on a slower processor with a GTX275?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: RussianSensation

How can they be getting 17 more frames on a slower processor with GTX275?
It's extremely likely the two sites are running completely different benchmarks, and are thus not comparable to each other.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: BFG10K
Originally posted by: RussianSensation

In a similar case, you have shown that when a processor is sufficient enough (i.e. E6600), then the performance growth shifts towards different videocards.
Yep, my stance is summed up in one sentence: as long as you have a half-decent CPU (e.g. my E6850), the GPU is by far the biggest bottleneck in performance overall.

I agree 100%... people make too big of a deal out of CPU bottlenecks. Obviously the CPU plays an important role, but you can game on a middle-of-the-road processor most times without much issue. Sure, you may get a few more FPS by going high end, but at the settings most of us use (usually high settings with some AA, etc.) the GPU is the biggest bottleneck more often than not.

Tell me, would you guys rather game on a low-to-mid range CPU like a Phenom II X3 720 (2.8GHz) coupled with a higher end GPU like a Radeon 4890, or would you rather game on a high end 4GHz i7 with a mid range GPU like a Radeon 4670? I know I'd take the PhII 720 / 4890 combo in a heartbeat.
 

vj8usa

Senior member
Dec 19, 2005
975
0
0
I honestly don't understand how the numbers Tomshardware is getting are even possible. According to them, a 30% clock speed increase is netting over a 200% increase in framerates in some cases (FC2 on the 3850 goes from 20.2 to 61.6 FPS, for instance). How can the performance increase be greater than the clockspeed increase?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: apoppin

First of all, Quad over dual is not important except in a couple of RTS games
- if you use *extremes* like a 3800+, then you will see differences with an *ancient* processor

Secondly, there is not a lot of PRACTICAL difference playing with a fairly low end $80 Athlon II X2 at 3.9 GHz vs a $250 Core2Quad at 4.0 GHz

ARMA 2 1280x1024
Core i7 920 @ 3.5ghz = 36.9 average, 33 minimum
Core 2 Quad @ 3.5ghz = 29.8 average, 27 minimum
Phenom II X4 955 @ 3.5ghz = 25.5 average, 24 minimum
Core 2 Duo E8400 @ 3.0ghz = 22.8 average, 20 minimum

WiC: Dual Core vs. Quad Core with AVS Scan in the background
Quad Core Minimum = 82 frames, average 141
Dual core minimum = 32, average 71

Although I personally don't run Anti-virus while gaming, there are certainly other things people do while they game besides Anti-virus scanning (which also brings most mechanical hard drives to a crawl anyway).

-- Also, I don't understand your comparison of an $80 X2 to a $250 quad core. This hardly makes any sense, since a ton of people purchased Core 2 Quads for $300 two years ago - recall summer 2007 - and most of those computer techies have had theirs at 3.2-3.6GHz all this time. Not even taking the time value of money into the equation, they have had a very fast processor for 24 months.

-- No one would pay $250 for a Core 2 Quad today, especially since you can get a Q9550 for $170 at Microcenter. So why would you compare an $80 X2 to a $250 C2Q? You should compare a $199 Core i5 @ 4.0GHz to that Athlon, and then your argument will be stronger (though I have a feeling it will mop the floor with that Athlon).

-- Moreover, you will be able to get that Core i5 to speeds which exceed a Core i7 965 for just $200 + a $50 cooler -> essentially getting $1000 CPU speed for $250. To me, the idea of getting a $200 processor to perform faster than a $1000 processor brings a smile to my face, but spending $80 on a processor which can barely compete with the C2Q 6600 @ 3.4GHz I got for $300 two years ago is hardly exciting. What happened to the term Computer Enthusiast?

-- Bang for the buck, the X2 falls on its face especially once you consider other tasks like LAME MP3 conversion and MKV video encoding. A $199 Core i5 750 with a $140 mobo at 4.0GHz is the next thing to look out for. For a mere $120 more for the CPU, this setup will give you another 2+ years of life. AMD had better be ready to bring their X4 3.4 to $150.

-- Finally, recall what happened to the A64 4000+ single core vs. the X2 3800+ dual core. All dual core systems will eventually suffer this fate. There is a reason it costs $80. And then once you realize that your dual is not good enough, you'll rush out and get a $100 Phenom II X4 3.4GHz drop-in for your Socket AM3 board, essentially paying $80 for the X2 --> $100 for that new X4, <OR> you could have all that right up front for $200 with a Core i7 920 / i5 750... in much the same way I bought a C2Q 6600 for $300 over the C2D 6600 at $150 at the time. Now those people are browsing online FS forums for used G0s and Q9550s.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
The story of my 8800 GTX: when I bought it I had an X2 5200+. A year later I upgraded to an X2 6400+ and saw a 1/3 jump in FPS in a game like Oblivion. This summer I built an X4 955 rig, and FPS went up just 20% more in that game. I "think" I have finally maxed out the 8800 GTX's ability after 2 1/2 years.

More importantly, in scenes with lots of spell effects/special lighting and lots of NPCs running around, my game no longer bogs down to the teens like it could with the Windsor. While the benchmark runs changed by a fairly small amount, the overall game experience is much better and smoother. Games may be more GPU limited in general, but there are still plenty of CPU-heavy activities and scenes in many of them.

CPU still plays a big role.
 

Zensal

Senior member
Jan 18, 2005
740
0
0
I was having trouble playing TF2 with my Opty 165 @ 2.4GHz no matter the resolution. FPS would drop into the teens on larger servers. Upgraded to an E5200 @ 3.7 and I was good to go, though it still seems like my CPU is my bottleneck.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: vj8usa
I honestly don't understand how the numbers Tomshardware is getting are even possible. According to them, a 30% clock speed increase is netting over a 200% increase in framerates in some cases (FC2 on the 3850 goes from 20.2 to 61.6 FPS, for instance). How can the performance increase be greater than the clockspeed increase?

It doesn't make sense at all.

If the rate-limiting step in the computation sequence that produces each frame is improved by 30%, then at best the resulting fps is going to improve by 30%.

To derive a 200% fps improvement requires something to have been bottlenecking the computation loop quite severely, and that something had to be improved upon by 3x without anything else in the computation loop becoming the rate-limiting step.
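
To put that bound in concrete terms, here is a toy serial frame-time model; the split between CPU and GPU work below is purely illustrative, not measured from any game.

# Toy model: each frame costs some CPU time plus some GPU time, in series.
# Making the CPU 30% faster can never buy more than a 30% fps gain.
cpu_ms, gpu_ms = 20.0, 10.0                    # hypothetical per-frame costs
base_fps = 1000 / (cpu_ms + gpu_ms)            # ~33 fps
fast_fps = 1000 / (cpu_ms / 1.30 + gpu_ms)     # CPU portion 30% faster
print(round(base_fps, 1), round(fast_fps, 1), round(fast_fps / base_fps - 1, 3))
# 33.3 fps -> 39.4 fps, an ~18% gain; even with gpu_ms = 0 the cap is 30%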
 

biostud

Lifer
Feb 27, 2003
18,672
5,400
136
Originally posted by: BFG10K
Originally posted by: biostud

Left 4 Dead is clearly limited by the Athlon X2 3800+ CPU--look at the difference 600 MHz makes. Our Athlon 64 X2 4200+ at Athlon 64 X2 5000+ speeds is doubling the frame rate.
That's because they're running a lightweight game engine at medium settings (LMAO) on a prehistoric processor, so is it really surprising?


Not everyone has the newest hardware, and if the article had shown a linear progression with CPU speed, I wouldn't have posted it, as the results would have been expected. But a 30% increase in CPU speed resulting in up to a 200% increase in framerates is not something you see every day.

Personally I think the term bottleneck is a bit overused when you talk about 60+ fps in a game, but in this case going from ~11.5fps to ~32.5fps (1680x1050, medium settings, HD 3850) in FC2 with a 600MHz CPU increase is quite amazing.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: BFG10K

Originally posted by: toyota

1920x1080 all very high settings, no AA and high physx
That's a low resolution with no AA; furthermore, 2 GHz is an artificially slow clockspeed to be testing at.

1920 is a low res? lol. If I raised it, the game wouldn't even be playable in the first place. 2.0 on the E8500 is not artificially low, and is still faster than a 5600 X2, which is faster than what the majority of PC users still have. Also, all of that is with an old 192sp GTX260. CPU limitations, if anything, are certainly not exaggerated. Last year I had horrific experiences trying to run a decent card on a 5000 X2. There were a few games like UT3 that still had the same min framerate with a 4670, 9600GT, or 8800GT as they did with a wimpy 8600GT. Lowering settings doesn't really help when the CPU is the culprit for low performance. A 5000 X2 is like having my E8500 at 1.6-1.8, yet half the people on here would tell somebody to get a GTX285 if they asked about getting one.
 

Forumpanda

Member
Apr 8, 2009
181
0
0
Originally posted by: Idontcare

It doesn't make sense at all.

If the rate-limiting step in the computation sequence that produces each frame is improved by 30%, then at best the resulting fps is going to improve by 30%.

To derive a 200% fps improvement requires something to have been bottlenecking the computation loop quite severely, and that something had to be improved upon by 3x without anything else in the computation loop becoming the rate-limiting step.
I suspect it's something like this: the game uses some set amount of processing power for logic/AI/sound, independent of the part of the engine that draws the graphics.

Thus it is plausible that by increasing the CPU power by 30% they are increasing the power available for graphics processing in the engine by a higher percentage.

But the gain still seems sensationally large and is probably fabricated to make people read the article.
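
A toy version of that mechanism looks like this; the numbers are invented purely to illustrate the idea and are not measured from Far Cry 2 or any other game.

# Suppose the game burns a fixed amount of CPU per second on logic/AI/sound
# (real-time work), and whatever CPU is left over drives frame rendering.
def fps(cpu_per_second, fixed_load=70.0, cpu_cost_per_frame=1.0):
    return max(cpu_per_second - fixed_load, 0) / cpu_cost_per_frame

print(fps(100.0))   # baseline CPU budget: 30 "fps"
print(fps(130.0))   # 30% more CPU: 60 "fps" -- a 100% gain from a 30% boost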
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: vj8usa
I honestly don't understand how the numbers Tomshardware is getting are even possible. According to them, a 30% clock speed increase is netting over a 200% increase in framerates in some cases (FC2 on the 3850 goes from 20.2 to 61.6 FPS, for instance). How can the performance increase be greater than the clockspeed increase?

They probably had vsync on, in which case you either hit the target fps or you drop significantly.
It could be even worse, I suppose, if there is some other time-dependent loop like AI or input that also gets missed.
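
A rough sketch of that quantization, assuming a 60 Hz display with plain double-buffered vsync (triple buffering and adaptive vsync behave differently); the frame times below are hypothetical.

import math

# With double-buffered vsync, a frame that misses a 60 Hz refresh waits for the
# next one, so the effective frame rate snaps to 60, 30, 20, 15, ...
refresh_ms = 1000 / 60
for render_ms in (15.0, 17.0, 25.0, 34.0):            # hypothetical frame times
    intervals = math.ceil(render_ms / refresh_ms)      # refreshes consumed per frame
    print(f"{render_ms} ms -> {1000 / (intervals * refresh_ms):.0f} fps")
# 15 ms -> 60 fps, 17 ms -> 30 fps, 25 ms -> 30 fps, 34 ms -> 20 fps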
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Obvious benchmarking errors.

28 avg fps to 60 avg fps doesn't make sense.

Tom's Hardware has been going to the pits ever since forum users started benchmarking and writing articles. Nubs at that.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Idontcare

It doesn't make sense at all.

Sure. A lot of things in life don't make sense because we may lack the knowledge or the technical expertise. I don't have an explanation of how a 30% boost can give you a 100% to 200% performance increase, but here is another site with another bench:

X2 3800+ = 2 x 2.0ghz A64
A64 4000+ = 1 x 2.4ghz A64

So we would 'logically' say you can expect at most a 2x performance increase, right? Apparently that would be wrong.

Far Cry 2 - X2 3800+ vs. A64 4000+
X2 3800+ = 29.1 average, 20 minimum
A64 4000+ = 9.2 average (>3x slower), 5 minimum (4x slower)

Also notice how the minimum framerates of the Q6600 at 2.4GHz almost match the average framerates of the E6600 at 2.4GHz, for the "dual core is enough" camp.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: RussianSensation

Far Cry 2 - X2 3800+ vs. A64 4000+
X2 3800+ = 29.1 average, 20 minimum
A64 4000+ = 9.2 average (>3x slower), 5 minimum (4x slower)

Well, all they do is shut off one core of an X2, and that doesn't always work correctly.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Forumpanda

I suspect it's something like this: the game uses some set amount of processing power for logic/AI/sound, independent of the part of the engine that draws the graphics.

Thus it is plausible that by increasing the CPU power by 30% they are increasing the power available for graphics processing in the engine by a higher percentage.

But the gain still seems sensationally large and is probably fabricated to make people read the article.

I had drafted practically the same post as you, then I thought about it and realized we are talking about purported 200% gains from a mere 30% increase in computing resources. Sure, I can craft a pretty ridiculous background-processing-load scenario to generate such a corner case, but it's just absurd at a practical level... so I deleted that draft and made the post which you see above. I just had to quote you because we were both on the exact same brainwave here :thumbsup:

Originally posted by: RussianSensation
Originally posted by: Idontcare

It doesn't make sense at all.

Sure. A lot of things in life don't make sense because we may lack the knowledge or the technical expertise. I don't have an explanation of how a 30% boost can give you a 100% to 200% performance increase, but here is another site with another bench:


Then by all means speak for yourself and your lack of technical expertise, but I don't consider myself to be so ignorant of fundamentally generic rate-limiting attributes in throughput scenarios as to be willing to suspend my disbelief and accept on faith alone that the Occam's razor explanation here is that a 30% increase in computing resources is the cause of a 200% increase in fps.

I don't question the data, although we could just as easily do that, but I do question the claims that the only change in the system was the processor's speed.
 