Seems like software can/does change a lot more often than CPU guts do.
I can't help but think it's easier to write software than to design/build a CPU, no? What is the AMD CPU now, four years old, five?
Following on from that, the more I think about it, the more I have a real problem with the difference between testing software and testing hardware. One could design an amazing CPU, one with unheard-of features, doing things previously thought impossible, an engineering triumph. But you test it with software that is totally unaware of these amazing advances, and it bombs.
Is the software crap, or is the CPU crap?
Maybe I'm a product of my era, but I have a hard time blaming the hardware. What if DX12 comes along and does what is promised? What if you could apply it to the oft-mentioned "64-player MP BF4" scenario that is always trotted out as a bad spot for an FX chip, and it suddenly allowed the FX to perform acceptably? Is the CPU suddenly better? It didn't change; how can it? Does software = hardware? Does current software dictate the perceived quality/value of hardware?
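To put some meat on that DX12 what-if: the claimed win is that DX11 funnels nearly all per-draw CPU work through one thread, while DX12 lets you record command lists on several threads in parallel. On a chip like the FX, with lots of weak-ish cores, spreading that work out is exactly the kind of thing that could turn "unacceptable" into "acceptable" without the silicon changing at all. Here's a toy sketch I knocked up to show the idea; to be clear, this is NOT real Direct3D code, and the draw counts and per-draw costs are invented for illustration:

// Toy model: one thread doing all the per-draw CPU work (DX11-ish)
// vs the same work split across eight threads (DX12-ish command lists).
// Build with: g++ -O2 -pthread toy.cpp
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the CPU-side cost of validating/encoding one draw call.
static void fake_draw_call() {
    volatile int sink = 0;
    for (int i = 0; i < 2000; ++i) sink = sink + i;  // burn a little CPU
}

// Run total_draws fake draws spread evenly across the given thread count,
// returning the wall-clock time in milliseconds.
static double run(int threads, int total_draws) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    int per_thread = total_draws / threads;
    for (int t = 0; t < threads; ++t)
        pool.emplace_back([per_thread] {
            for (int i = 0; i < per_thread; ++i) fake_draw_call();
        });
    for (auto& th : pool) th.join();
    std::chrono::duration<double, std::milli> ms =
        std::chrono::steady_clock::now() - start;
    return ms.count();
}

int main() {
    const int draws = 100000;  // made-up number of draw calls per frame batch
    std::printf("1 thread  (DX11-ish): %.1f ms\n", run(1, draws));
    std::printf("8 threads (DX12-ish): %.1f ms\n", run(8, draws));
}

Same transistors, very different utilization; that's the whole crux of the question above.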
There are IMO some philosophical questions lurking here that I really wish were not a part of this hobby. It's hard enough to determine what hardware is good and what is best, let alone when it depends on what damn software you are running. I know it's not likely to change, and I know the easy answer is that you judge hardware by the software you have at hand, but that's a shallow answer. I'm not into shallow. Especially not in technical hobbies.