Thief 4 CPU benchmarks


SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
The gamegpu test was not made using the built-in benchmark, if I understand this correctly (via Google Translate):

"CPU-testing we conducted on 17 models of basic configurations that are relevant today. The test was conducted in those places where the value of video games for the minimum and its load was less than 99%, this time at a resolution of 1920x1080 at the highest quality setting. Testing at maximum quality settings 1920x1080"
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
FXAA, as far as I know, is not going to decrease performance significantly, so it wouldn't change much. The test was made at 1080p very high; the pcgameshardware test was at 720p, and with strange results, indicating a test with lower CPU load,

The pcgameshardware test uses 16x anisotropic filtering while gamegpu uses trilinear. Also, pcgameshardware uses FXAA and perhaps SSAA (it doesn't state) while gamegpu does not.

The pcgameshardware settings are more representative of real-world performance than gamegpu's. Even with slower GPUs everyone will use 16x AF rather than trilinear, and people with high-end GPUs will at least enable FXAA.

We can clearly see below that with 16x AF and FXAA enabled the game is GPU-limited at 1080p even with SSAA off. So once again, if you want to play with very high quality graphics and AA filters, it is better to have a fast GPU with a weaker CPU than the other way around.
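To put rough numbers on that idea (a toy sketch in Python; the figures below are invented purely for illustration, not taken from either site):

# Toy model: the delivered framerate is roughly capped by whichever ceiling is lower.
def delivered_fps(cpu_ceiling_fps, gpu_ceiling_fps):
    return min(cpu_ceiling_fps, gpu_ceiling_fps)

# Invented numbers: a CPU good for ~90 fps in a given scene.
print(delivered_fps(90, 140))  # light IQ settings       -> 90 (CPU-limited)
print(delivered_fps(90, 55))   # 16x AF + FXAA (+ SSAA)  -> 55 (GPU-limited)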



 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
The pcgameshardware test uses 16x anisotropic filtering while gamegpu uses trilinear. Also, pcgameshardware uses FXAA and perhaps SSAA (it doesn't state) while gamegpu does not.

The pcgameshardware settings are more representative of real-world performance than gamegpu's. Even with slower GPUs everyone will use 16x AF rather than trilinear, and people with high-end GPUs will at least enable FXAA.

We can clearly see below that with 16x AF and FXAA enabled the game is GPU-limited at 1080p even with SSAA off. So once again, if you want to play with very high quality graphics and AA filters, it is better to have a fast GPU with a weaker CPU than the other way around.






Now you think a 720p test with 70fps minimums across all CPUs is more representative? OK, it is simply a poor choice for a CPU test: a GPU-limited area is not representative of the CPUs' performance.

FXAA and 16x AF are not very heavy; you should know that.

As you can see in the graph you posted, the 780 Ti with AF and FXAA is faster than in their CPU test, and it's easy to understand why: first, AF and FXAA are not very heavy, and second, maybe they did the right thing and tested the CPUs in a CPU-limited area and the GPUs in a GPU-limited area (with a faster CPU so there is no CPU bottleneck). In a CPU-limited area some CPUs are slower than others, as you can see:



which will clearly also be the case with FXAA and 16x AF,

but SSAA can save you and give you a nice GPU bottleneck. From what I see most people are playing with it off, because it's normally not worth the performance hit,
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
The QX9650 has a high enough clock speed at 3GHz; I wouldn't expect it to be so slow when all the others manage a 70fps minimum.
Something is very wrong with PCGH's C2Q numbers. C2Q is no spring chicken, but we're still talking about a Penryn CPU at 3GHz. A 2500K is simply not going to be 3x faster than a fast C2Q. We didn't make that much progress in only 3 years.

http://www.anandtech.com/bench/product/49?vs=288
 
Last edited:

coolpurplefan

Golden Member
Mar 2, 2006
1,243
0
0
Something is very wrong with PCGH's C2Q numbers. C2Q is no spring chicken, but we're still talking about a Penryn CPU at 3GHz. A 2500K is simply not going to be 3x faster than a fast C2Q. We didn't make that much progress in only 3 years.

http://www.anandtech.com/bench/product/49?vs=288

I don't think anyone wants me to link to a competitor, but I saw a graph on another site that showed the Intel Core 2 Duo E8400 as the base (100%) and, if I remember correctly, the 4670 at over 200% (for a gaming benchmark).
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I don't think anyone wants me to link to a competitor, but I saw a graph on another site that showed the Intel Core 2 Duo E8400 as the base (100%) and, if I remember correctly, the 4670 at over 200% (for a gaming benchmark).
Now if you want to compare a dual core to a quad core, that's a different story. If you have the ability to make good use of more than 2 threads then you should definitely see better gains.

But in this case we're comparing a 4c/4t processor to another 4c/4t processor.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
Something is very wrong with PCGH's C2Q numbers. C2Q is no spring chicken, but we're still talking about a Penryn CPU at 3GHz. A 2500K is simply not going to be 3x faster than a fast C2Q. We didn't make that much progress in only 3 years.

http://www.anandtech.com/bench/product/49?vs=288

Yes, the difference looks too high for some reason. It's also a shame they didn't test anything between the 6350 and the Core 2 Quad, but considering how close all the other CPUs are, I still think there is something flawed with their testing (like a less CPU-dependent test scene). The C2Q result makes it all more difficult to understand, though.


I don't think anyone wants me to link to a competitor, but I saw a graph on another site that showed the Intel Core 2 Duo E8400 as the base (100%) and, if I remember correctly, the 4670 at over 200% (for a gaming benchmark).


Maybe it's what I posted previously:
http://media.bestofmicro.com/I/G/395944/original/Combined-Average-Gaming-Performance.png





It's 2.83GHz and 3.4GHz rather than 3.00GHz, but it gives you an idea, considering how hard "Welcome to the Jungle" in Crysis 3 was (it should be a lot more challenging for Intel quad cores than Thief).
Let's call it roughly 25/16 (average/minimum) for the C2Q at 3GHz and 38/26 for the 3570.
In the pcgameshardware Thief test it's 43/17 for the C2Q and 114/74 for the 2500K; the minimum-fps difference in particular is intriguing. From roughly 60% gains on minimums in Crysis to over 300% gains here? Thief must really hate Core 2 Quads!
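Spelling that minimum-fps arithmetic out (same figures as above, just computed in Python):

# Minimum-fps gains over the ~3GHz C2Q, using the numbers quoted above.
crysis3_min_gain = (26 - 16) / 16 * 100   # 3570 vs. C2Q in Crysis 3            -> ~62%
thief_min_gain   = (74 - 17) / 17 * 100   # 2500K vs. C2Q in PCGH's Thief test  -> ~335%
print(round(crysis3_min_gain), round(thief_min_gain))  # 62 335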

Now, IF PCGH actually used an E8400, their results would make more sense: Thief has high utilization across 4 cores, so the E8400 would be way slower than everything else (which all have 4 cores).

I still think there is something wrong with the pcgameshardware Thief test, considering every other CPU is hitting over 70FPS minimum along with a high average, which means it was not one of the hardest parts of the game for the CPU, as you can see in the gamegpu test. Yet the slowest quad core in their test (the FX-4100) is also in big trouble compared to the i5s,
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I don't think anyone wants me to link to a competitor, but I saw a graph on another site that showed the Intel Core 2 Duo E8400 as the base (100%) and, if I remember correctly, the 4670 at over 200% (for a gaming benchmark).

No one wants you to link it because it would be meaningless.

Big difference between a Core 2 Duo and a Core 2 Quad.

Duo = 2 cores
Quad = 4 cores
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Now you think a 720p test with 70fps minimums across all CPUs is more representative? OK, it is simply a poor choice for a CPU test: a GPU-limited area is not representative of the CPUs' performance.

No, I think the graphics settings (not the resolution) are more representative of how gamers with high-end GPUs will play the game.

FXAA and 16x AF are not very heavy; you should know that.

As you can see in the graph you posted, the 780 Ti with AF and FXAA is faster than in their CPU test, and it's easy to understand why: first, AF and FXAA are not very heavy, and second, maybe they did the right thing and tested the CPUs in a CPU-limited area and the GPUs in a GPU-limited area (with a faster CPU so there is no CPU bottleneck). In a CPU-limited area some CPUs are slower than others, as you can see:

Your assumption of a CPU-limited area is wrong. The first graph, with FXAA and 16x AF, is with a 4.9GHz Core i7-3970X. Using a 4.9GHz CPU at those IQ settings with the GTX 780 Ti makes it both CPU- and GPU-limited, compared to the CPU scaling graph and the SLI/CF cards.
But if you had a slower GPU you would only be GPU-limited.


The second graph is again with the 4.9GHz Core i7, with SSAA at Low. Now the game is GPU-limited even with the GTX 780 Ti, because it produces fewer fps than the CPU scaling graph.



The third graph is with SSAA at High: completely GPU-limited, but the GTX 780 Ti still produces almost a 60fps minimum. Since this is a single-player game, I would say any card producing above a 30fps minimum would be fine at those settings.



The CPU scaling tests were run at Default frequencies.
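The rule of thumb behind reading those graphs, sketched out in Python (my own simplification, not anything published by gamegpu):

# If a card delivers fewer fps than the same CPU manages in the CPU scaling run
# (i.e. when the GPU is not the constraint), the GPU is the limiter.
def limiter(fps_with_this_gpu, cpu_ceiling_fps):
    return "GPU-limited" if fps_with_this_gpu < cpu_ceiling_fps else "CPU-limited"

# Hypothetical numbers only:
print(limiter(58, 85))  # e.g. SSAA High on a single card -> GPU-limited
print(limiter(85, 85))  # card reaches the CPU ceiling    -> CPU-limited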


which will clearly also be the case with FXAA and 16x AF,

but SSAA can save you and give you a nice GPU bottleneck. From what I see most people are playing with it off, because it's normally not worth the performance hit,

If you have a GTX 780 Ti or R9 290 etc., you get close to a 60fps minimum with SSAA at High. There is no need to turn this feature off if your GPU can deliver fps that high. Everyone spending $600-800 on a GPU will turn on every available IQ setting possible.
Obviously, if you have a lower-end GPU you will use lower settings, but then you will be GPU-limited once again. You would have to pair a dual core or a low-end Trinity CPU with a high-end GPU in order for the game to become CPU-limited, something that nobody will do in real life.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
No, I think the graphics settings (not the resolution) are more representative of how gamers with high-end GPUs will play the game.

So resolution is not considered a setting with an impact on graphics performance?
The gamegpu (CPU) test is closer to reality in terms of GPU load, IMO.

Do you really think 1280x720 vs 1920x1080 is a smaller difference than toggling FXAA and 16x AF on and off?
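Just on pixel counts alone (plain arithmetic, nothing site-specific):

# Resolution is the dominant factor in GPU load here.
px_720p  = 1280 * 720    #   921,600 pixels per frame
px_1080p = 1920 * 1080   # 2,073,600 pixels per frame
print(px_1080p / px_720p)  # 2.25 - 1080p shades 2.25x as many pixels as 720p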


Your assumption of a CPU-limited area is wrong. The first graph, with FXAA and 16x AF, is with a 4.9GHz Core i7-3970X. Using a 4.9GHz CPU at those IQ settings with the GTX 780 Ti makes it both CPU- and GPU-limited, compared to the CPU scaling graph and the SLI/CF cards.

Read what you quoted: "(and faster CPU for no CPU bottleneck)" is what I said, clearly. Even if it's the exact same scene (I don't know; it wouldn't make sense to use the same one, but the overclocked 3970X is so fast that it really can change that), it shows that even the stock 4670K/3970X can still hold the 780 Ti back a little. Pairing a fast graphics card with a slower CPU doesn't seem like a good idea for this game; maybe Mantle will change that.


But if you had a slower GPU you would only be GPU-limited.
http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-MMO-ArcheAge-1920_2.jpg

The second graph is again with the 4.9GHz Core i7, with SSAA at Low. Now the game is GPU-limited even with the GTX 780 Ti, because it produces fewer fps than the CPU scaling graph.

http://gamegpu.ru/images/remote/htt...-stories-Test_GPU-MMO-ArcheAge-1920_2_low.jpg

The third graph is with SSAA at High: completely GPU-limited, but the GTX 780 Ti still produces almost a 60fps minimum. Since this is a single-player game, I would say any card producing above a 30fps minimum would be fine at those settings.

http://gamegpu.ru/images/remote/htt...stories-Test_GPU-MMO-ArcheAge-1920_ssaa_h.jpg

You can't guarantee the game has no other moments of even higher CPU load, and those graphs only show one fixed setting when the game lets you go lower. If you target 60FPS you can lower any other graphics setting, as you can see: avoid SSAA, and so on,

The CPU scaling tests were run at Default frequencies.
http://gamegpu.ru/images/remote/htt...-stories-Test_GPU-Action-Thief_-test-proz.jpg



If you have a GTX 780 Ti or R9 290 etc., you get close to a 60fps minimum with SSAA at High. There is no need to turn this feature off if your GPU can deliver fps that high. Everyone spending $600-800 on a GPU will turn on every available IQ setting possible.
Obviously, if you have a lower-end GPU you will use lower settings, but then you will be GPU-limited once again. You would have to pair a dual core or a low-end Trinity CPU with a high-end GPU in order for the game to become CPU-limited, something that nobody will do in real life.

Yes, if you use a slower GPU you can lower settings for a higher framerate, which is why having a faster CPU is never a bad thing.
Even with SSAA, the 680 is bottlenecked by CPUs slower than the i5/i7,
and if you turn settings down with slower cards, the amount of GPU needed before you're bottlenecked by the lower CPUs on the graph keeps going lower and lower.

HardOCP, for example, thinks using SSAA is always a bad idea, because they go for a smooth framerate and higher resolution instead; a lot of people will follow that approach, because SSAA is brutal on GPU load.
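To put a rough number on that hit (the exact factor behind Thief's Low/High SSAA options isn't stated, so take a generic 2x2 supersample as an illustration):

# Generic SSAA cost sketch; the 4x factor is an assumption, not Thief's documented setting.
base_pixels = 1920 * 1080
ssaa_2x2    = base_pixels * 4   # 2x2 supersampling shades 4x the pixels
print(ssaa_2x2, 3840 * 2160)    # 8294400 8294400 - roughly 4K's per-frame pixel load at 1080p output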
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I never use SSAA even if I can. I don't see the difference over MSAA and I'd rather not take the performance hit.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
That Russian site is pretty much BS.

I think they have improved over the years. Their test platform used to be many (entirely) different PCs with different people testing, if I remember correctly, but now it looks more organized.

Still, it's always good to have more sources.

PCLab test


http://pclab.pl/art56634-9.html

It would be good to have some information on how they tested, and minimum FPS figures, but...
there is also more GPU limitation here.

They used Windows 8.1, while gamegpu used Windows 7 (I don't know why some review websites are still using Win7).

One interesting thing: while gamegpu is not showing much difference between the 2-, 3- and 4-module AMD CPUs, PCLab seems to show a nicer spread. Maybe it's the thread scheduler in Windows 8.1 and how it's assigning work to the modules? Because looking at the 4-core Intel CPUs, there is still no gain from HT.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
http://www.pcgameshardware.de/Thief...ehen-Sie-ohne-Ruckler-auf-Diebestour-1110793/

and

http://www.neogaf.com/forum/showthread.php?t=774299&page=4

Indeed, an i5 or above is sufficient, Core 2 needs to go in the bin, and surprise surprise, FX is actually excellent for once. For posterity, my i7 4770 + 780 Ti GHz gets a smooth 60FPS at 1920x1200 with every setting maxed out or turned on, Vsync on. Looks like next gen is finally bringing in well-multithreaded games.

Side question: are you not getting horrible tearing when turning on V-sync? Double buffering was better than triple buffering, but still bad.

Back to the point. My 3550K at 4.8GHz with a 780 runs at 70-100fps with no problems at maximum settings, 1080p. I don't think this is a CPU-intensive game.

Edit: SSAA on High, but the 780 is clocked 120MHz over stock.
 
Last edited:

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Found my problem: don't use Fraps. With it you can't use exclusive fullscreen, and if you don't use exclusive fullscreen you will get horrible tearing.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Side question: are you not getting horrible tearing when turning on V-sync? Double buffering was better than triple buffering, but still bad.

Back to the point. My 3550K at 4.8GHz with a 780 runs at 70-100fps with no problems at maximum settings, 1080p. I don't think this is a CPU-intensive game.

Edit: SSAA on High, but the 780 is clocked 120MHz over stock.

Actually, virtually no tearing, period. Triple buffering is on. A brief look shows GPU usage at 99% with that 4770 (non-K) at 1200p.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Spoke too soon:

http://www.techspot.com/review/787-thief-benchmarks/page4.html

"That said, even at 4.50GHz with 53fps the FX-8350 was still slower than a Core i3 processor clocked at 3.3GHz, which is highly disappointing."

and

"The 2.5GHz Core i7-4770K was able to match the 4.5GHz FX-8350, which shows just how superior Intel's core efficiency is right now."

On Windows 8.1, which should be mandatory on review sites.
 

Durp

Member
Jan 29, 2013
132
0
0
Spoke too soon:

http://www.techspot.com/review/787-thief-benchmarks/page4.html

"That said, even at 4.50GHz with 53fps the FX-8350 was still slower than a Core i3 processor clocked at 3.3GHz, which is highly disappointing."

and

"The 2.5GHz Core i7-4770K was able to match the 4.5GHz FX-8350, which shows just how superior Intel's core efficiency is right now."

On Windows 8.1, which should be mandatory on review sites.

Yeah, Vishera is GREAT for gaming.

These quotes sound like a reviewer comparing a three+ generation old processor to Haswell and not something selling on the shelf right next to it.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Yeah, Vishera is GREAT for gaming.

These quotes sound like a reviewer comparing a three+ generation old processor to Haswell and not something selling on the shelf right next to it.


What's wrong with Vishera for gaming? Even a lower end, cheap-o FX 4320 hums along at a quite acceptable 45FPS in this title.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
Why are we arguing about CPU performance in yet another Unreal Engine 3 based game? Who cares, the game looks like crap. Why don't we post some Half-Life 2 CPU benchmarks?
 

Durp

Member
Jan 29, 2013
132
0
0
What's wrong with Vishera for gaming? Even a lower end, cheap-o FX 4320 hums along at a quite acceptable 45FPS in this title.

If you read that review and still ask "what's wrong with Vishera for gaming?" with a straight face, then I can't really take you seriously.

That's a 3.3GHz Ivy Bridge i3 beating a 4.5GHz 8350, and it's not even a Haswell or even the fastest Ivy i3.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
If you read that review and still ask "what's wrong with Vishera for gaming?" with a straight face, then I can't really take you seriously.

That's a 3.3GHz Ivy Bridge i3 beating a 4.5GHz 8350, and it's not even a Haswell or even the fastest Ivy i3.


Right, Sandy, Ivy, Haswell (hell, probably Nehalem) are more efficient than Vishera. That's well understood. But what is wrong with Vishera for gaming? Specifically this game. A lower end Vishera based processor that was released 15 months ago at a price of ~$120 still pushes 45FPS in a brand new AAA game at real world resolution and settings. Seems like Vishera is doing well enough.
 
Last edited: