Once I had that test data, I would have grouped the graphs by L3 cache size, not by core count.
He was patting himself on the back for busting the myth, held by so many, that 8 cores was the better investment. Hence he wanted the compare-and-contrast between 6 and 8 cores as well as cache size. Of course he didn't do the in-depth testing necessary to reach that conclusion. Instead he fired the arrow and then painted the bullseye around it.
His Spider-Man, Baldur's Gate 3, and A Plague Tale saves, where he runs his 30-second passes, are not peak CPU use for those games. In Spider-Man you pick a save when Broadway is chock-a-block with NPCs and swing fast along it close to the street, or run and zip-line around there. In Plague Tale, the marketplace full of NPCs is a tough spot. His BG3 save has to be Act 1; Gamers Nexus uses an Act 3 save a viewer provided. US Steve uses CL14 3200 RAM and a 4090 too. HUB gets 132 avg / 100 1% lows, while GN gets 115 avg / 65 1% lows because their save is more CPU-heavy on the 5800X3D.
That's not to say a 5600X isn't a good gaming CPU. I picked up a 5600 and a 5600X here in the forums for $100 each, and they've been great. I'm just doing my usual rant about lazy testing followed by "See? I told you so." Then forum dwellers reference that lazy bigger-bar-is-better chart like it's the gold standard.
If he really wanted to squash the beef, he should test CPUs from when the debate over core count was at its peak. The consoles were using a CPU a lot like the 3700X, so the advice was to match that to ensure your system would enjoy a long life. Intel 9th gen was a major factor in those discussions too.
Anyways, I'll wrap this up. I'm just enjoying that Zen 3 continues to be relevant even though it'll be four years old this November. The schadenfreude comes from haters claiming $300 for the 5600X was too much money, but here we are and it still gets the job done nicely. You could sell it and upgrade to a 5700X3D for about $150 out of pocket if you wanted. No board or RAM to buy, no reinstalls or data migrations. Good stuff.