If each core is already running two threads, then it's at full utilization.
You can look at the 4c/8t Zen parts to see how much performance you lose by not having those threads running for your game.
Yes it would, that's what I said, but the 9400F will also have a nice FPS advantage to eat into.
The 8400 is 100 MHz slower than the 9400F and has a 135 FPS average compared to the 109 FPS average of the 2600X. That's 6 cores vs. 6 cores with SMT, and the 8400 still has a 24% advantage in average FPS you can eat into.
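For reference, a quick sanity check on that 24% figure, using just the two averages quoted above:

```python
# Relative average-FPS advantage of the 8400 over the 2600X,
# from the figures quoted above (135 FPS vs. 109 FPS).
fps_8400 = 135
fps_2600x = 109
advantage = (fps_8400 - fps_2600x) / fps_2600x
print(f"{advantage:.1%}")  # -> 23.9%, i.e. roughly the quoted 24%
```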
Excellent attempt. Your hypothesis that "two threads per core means full utilization" doesn't bear out, because if the core's pipelines aren't full, there's still headroom. The scheduler can spread partial workloads across all 12 threads, and SMT lets a second thread soak up execution slots the first one leaves idle. Running two threads per core does not equate to full utilization.
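If you'd rather measure this than argue about it, here's a minimal sketch, assuming a 6-core/12-thread CPU like the 2600X (the spin/throughput helper names are just made up for illustration): run the same CPU-bound work with 6 workers and then 12 and compare throughput. Any scaling above 1.0x means the pipelines weren't full at one thread per core.

```python
import time
from multiprocessing import Pool

def spin(n: int) -> int:
    # Integer-heavy busy loop: keeps the core's ALUs occupied
    # without leaning much on memory bandwidth.
    acc = 0
    for i in range(n):
        acc ^= (i * 2654435761) & 0xFFFFFFFF
    return acc

def throughput(workers: int, tasks: int = 48, n: int = 2_000_000) -> float:
    # Tasks completed per second with the given worker count.
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(spin, [n] * tasks)
    return tasks / (time.perf_counter() - start)

if __name__ == "__main__":
    t6 = throughput(6)    # one worker per physical core
    t12 = throughput(12)  # one worker per hardware thread
    print(f"6 workers:   {t6:.2f} tasks/s")
    print(f"12 workers:  {t12:.2f} tasks/s")
    print(f"SMT scaling: {t12 / t6:.2f}x")  # > 1.0x means there was headroom
```

Processes rather than threads, so Python's GIL doesn't serialize the work. On an SMT part this usually lands noticeably above 1.0x; on a 6c/6t chip like the 8400 it would flatline at around 1.0x, because there are no spare hardware threads to absorb the extra workers.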
In any case, guessing is nice, but benchmarks and reviews are nicer.
There are reviews comparing the 8400 to the 2600, and nearly all of them state that if you're doing anything more than just gaming, they'd take the 2600 - and that's not even talking about the 2600X. TomsHardware said that the 1600X is a better gaming-plus-streaming platform than the 8600K (which is better than the 8400, of course; and naturally the 2600X is better than the 1600X).
GamersNexus compared the 2600X to the 8600K and recommended the AMD chips for streaming, specifically mentioning that the 8600K often fails to even encode any frames; they created a new metric, 1FENET (one frame every now and then), because the 8600K was so bad at streaming alongside gaming.
Whatever you personally think of how resource utilization works (and you should probably read up on it), it doesn't matter. The proof is in the performance. When measuring streaming + gaming, AMD beats the pants off the 8600K. It would surely do so against the less powerful 8400, but mercifully, no one has felt the need to perform such a bloodbath.