PeterScott
Platinum Member
1280x720? ........
They were trying to make it a CPU benchmark, not a GPU benchmark. The higher the resolution, the closer all the CPUs get to each other.
1280x720? ........
720P is the best way to test CPU gaming performance as the GPU is not the limiting factor. If you want 1080P (or higher) scores, they are in the link. But the difference between the 8600K and 8700K would be even smaller at higher resolutions.
This could be due to TDP throttling. Undervolting the CPU might let it run at 4GHz in all states.

I don't think that is the cause, because Intel overclocking motherboards don't set the power limit to the processor's TDP.
That assumes that lowering resolution has no other impacts on GPU and/or CPU performance.

Great. Another thread derailed. I don't know why some people cannot understand the concept of "removing the GPU limit by using low resolutions to find out a CPU's true performance". It's not quantum science.
Obviously you are not going to get the absolute, exact performance difference, but you can get very close. Using lower resolutions is the best way to find out a CPU's potential. Even Digital Foundry does it. Do you know a better way?

You are not finding a CPU's true performance with low-resolution benchmarks; you are trying to create "faster GPUs" and extrapolate how the CPU will perform with future faster GPUs.
First try to understand the point of low res tests before calling out others' logic.

Another thread derailed. Some people use this logic to justify their buying decisions. It's wrong. Period.
Don't waste your time and energy; the sig tells us all we need to know and what he's advocating for in an Intel thread.
My sig indicates what I own. Not what I advocate. Remember, when I bought what I did Intel was simply not competitive.
The reason I want clock-for-clock comparisons is because Intel did what they've always done: jumped ahead on process.
My point was that some people do run their CPUs at 100%, and running above roughly 75°C for long periods is NOT good.
1280x720? ........
First try to understand the point of low res tests before calling out other's logic.
What are you even talking about?! What price? No one mentioned the word "price". Do you even have a point? Did you understand the concept of low res tests?

Try to understand my logic; it's NOT based on price.
For those still struggling with how benchmarks work:
CPU Test = You minimize being bottlenecked by using the fastest GPU you can find, and then lower the resolution if that's not enough, to load the CPU cores before the GPU hits 100% usage. Anything else just results in different CPUs idling / downclocking from Turbo while waiting on the GPU, at which point you're not properly testing the CPU at full load. Lowering the resolution gives a better idea of how a CPU that's bottlenecked by a GPU today will perform in the future on a stronger GPU. The reason why no-one tests 1440p / 4K in CPU reviews is the same reason why no-one tests 5GHz i7s on a GTX 1030.
GPU Test = You minimize being bottlenecked by using the fastest CPU you can find, and then raise the resolution if that's not enough, to load the GPU cores before the CPU hits 100% usage. Anything else just results in different GPUs idling / downclocking from Turbo while waiting on the CPU, at which point you're not properly testing the GPU at full load. Raising the resolution gives a better idea of how a GPU that's bottlenecked by a CPU today will perform in the future on a stronger CPU. Again, it's the same reason why no-one reviews a GTX 1080 Ti on a Celeron.
Real-world tests (e.g., "But I play at 1440p / 4K / with 60fps VSync on and only want to see how that limitation affects CPU/GPU balance so I don't overbuy") are an entirely fair personal metric for gauging the best average CPU/GPU pairing for a given tier of hardware / budget / constraints. But in arguments between people favoring different brands, it often gets abused into cherry-picking: bottlenecked low-fps numbers get discarded one minute when one's favored "team" is behind, after previously being included when it was the other way around. That's not aimed at anyone here personally, but it is a highly visible and tedious trait among brand fanboys who populate YouTube comments sections and certain other clickbaity "rumor" sites...
In short, 4K / 1440p gamers should still take note of the 720p benchmarks, as they basically show how much headroom your CPU has after your next GPU upgrade (i.e., a longer lifespan between upgrades). The real-world ones, like this recent one from TechSpot, are primarily there to help lower-requirement gamers avoid huge mismatches. And even those are only a rough guide, as people can and do play on Med/High vs Ultra, which reduces the GPU bottleneck (often up into the next tier; e.g., in Witcher 3, a 1050 Ti on Medium = a 1060 on Ultra), which then makes even a budget CPU's limitations more pronounced.
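The CPU-test vs GPU-test logic above can be sketched with a toy model (not from the thread; all per-frame costs below are made-up numbers): each frame takes as long as the slower of the CPU work and the GPU work, so shrinking the GPU's share by lowering resolution is what exposes the gap between CPUs.

```python
# Toy model: frame rate is limited by whichever of CPU or GPU takes longer
# per frame. The millisecond costs here are hypothetical, for illustration only.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when each frame must wait on both CPU and GPU work."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 5.0, 8.0   # two hypothetical CPUs' per-frame cost (ms)

# "4K"-like load: GPU dominates, so both CPUs post identical fps.
gpu_4k = 16.0
print(fps(cpu_fast, gpu_4k), fps(cpu_slow, gpu_4k))      # 62.5 62.5

# "720p"-like load: GPU cost shrinks and the CPU gap becomes visible.
gpu_720p = 3.0
print(fps(cpu_fast, gpu_720p), fps(cpu_slow, gpu_720p))  # 200.0 125.0
```

The same function also illustrates the GPU-test case: hold `gpu_ms` fixed and make `cpu_ms` small, and differences between GPUs show through instead.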
What are you even talking about?! What price? No one mentioned the word "price". Do you even have a point? Did you understand the concept of low res tests?
That's not exactly the subject here, but ok, let's compare (based on Newegg prices): in BF1 multiplayer at 1080p, an 8400 is 23% faster than a 1600X, and its 99th percentile is 10% higher. A 1600 costs $215, a 1600X $220, and an 8400 $190.

I guess you're limited to the AnandTech forum. Some use it as perf/price.
It's for high-end WoW players. You basically can't get 144fps in the game, it's so single-threaded CPU bound.

The i3-8350K price is way too high. I'd rather get the low-end six-core i5 at that price. They need to lower it to $150 or something.
That's not exactly the subject here, but ok, let's compare (based on Newegg prices): in BF1 multiplayer at 1080p, an 8400 is 23% faster than a 1600X, and its 99th percentile is 10% higher. A 1600 costs $215, a 1600X $220, and an 8400 $190.
Now obviously this is only the CPU. As of now the only compatible boards are Z370s and they cost more than cheap AMD boards, so a MB+CPU combo is about $35 cheaper for AMD (with a B350 board for OCing).
Intel: 190 + 120 = $310
AMD: 215 + 60 = $275
The Intel combo is about 13% more expensive. Is it worth it? That's up to you. This would change once Intel's B- and H-series boards arrive.
P.S. the AMD boards have the advantage of supposedly supporting up to Zen 2. Intel's 300 boards' future support is unknown at this point.
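The "about 13%" figure above can be verified with quick arithmetic on the prices quoted in the post:

```python
# Combo prices quoted above (CPU + motherboard, USD).
intel_combo = 190 + 120   # i5-8400 + Z370 board
amd_combo = 215 + 60      # Ryzen 5 1600 + B350 board

diff = intel_combo - amd_combo              # $35 absolute difference
pct = (intel_combo / amd_combo - 1) * 100   # ~12.7%, i.e. "about 13%"
print(f"${diff} more, {pct:.1f}% premium")  # prints "$35 more, 12.7% premium"
```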
Is it a fact that the 8-core we're going to see next year will be Coffee Lake, with no chance it will be 10nm Ice Lake?