Regarding gaming reviews.
The only way to test a processor for gaming is to create a - partly unrealistic, okay - CPU bottleneck. This is done by dropping the resolution and disabling AA while keeping the other graphical settings high. With the GPU freed up, your CPU has to work overtime to keep up with the huge number of frames the GPU can now push out.
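For what it's worth, here is roughly how those results get turned into the numbers you see in reviews: log the render time of every frame, then reduce the log to an average FPS and a "1% low". A minimal sketch in Python - the sample data is made up, and the exact definition of "1% low" varies between outlets (I'm assuming the common "average of the slowest 1% of frames" one):

```python
# Minimal sketch: reducing a captured frame-time log (milliseconds per frame)
# to the two numbers most CPU gaming reviews report. The "1% low" definition
# used here (average FPS over the slowest 1% of frames) is one common choice;
# outlets differ on the details.

def summarize(frame_times_ms):
    n = len(frame_times_ms)
    # Average FPS: total frames divided by total time in seconds.
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # 1% low: the slowest 1% of frames, averaged. This is what exposes
    # stutter that a plain average hides.
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

if __name__ == "__main__":
    # Hypothetical capture: mostly ~7 ms frames (~144 fps) with a few spikes.
    sample = [7.0] * 990 + [25.0] * 10
    avg, low = summarize(sample)
    print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")  # ~139 fps avg, 40 fps 1% low
```

The point of the two numbers together: in a CPU-bound test, a processor can post a fine average while its 1% lows fall apart, and that is exactly the difference the low-resolution methodology is designed to surface.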
The relevance of such a test for real-life scenarios is this: a CPU that performs markedly worse in that environment will serve the consumer well for less time, since gamers tend to replace GPUs faster than CPUs, and a future GPU upgrade combined with more demanding games will turn that CPU into a bottleneck (with worse real-life performance) sooner.
Having said that, there is also a benefit for the consumer in seeing gaming tests at real-life settings, simply because that is what their own experience will be if they purchase the product. For this reason, I value reviews that show performance in both scenarios. Remember, gaming reviews are in the end trying to extrapolate holistic gaming performance from the specific component under test, be that a monitor, a GPU, a CPU, etc. What you get right now by using this or that part is still valuable info, since PC gamers run many different resolutions, aspect ratios, monitor refresh rates and so on.
At the same time, an argument about gaming has been going around for some time now (since the Bulldozer architecture was introduced, I think). It goes like this: "As time passes, game engines and APIs take advantage of more parallelization, so a good performer right now is not as safe a buy as we might think, unless it has the spare resources (which current games don't utilize but future games will) to be future-proof." This is a partly loaded argument, for two reasons.
1. The people who really value gaming performance and sit at the enthusiast end of the market follow the changes in both software and hardware closely. The whole idea of future-proofing is moot for someone whose main use is gaming and who replaces a $700 GPU every year or so. Those people will simply buy whatever gives them the best performance at any given moment.
2. The other (non-enthusiast) parts of the gaming market - which, by the way, make up the majority of it as far as software sales are concerned - use much more modest hardware (take a look at the Steam hardware survey averages for resolution, CPUs, GPUs, etc. to see what I mean). They also upgrade hardware much more slowly than the enthusiast segment. The software providers know this very well, and the rate of adoption of tech that would price out their own client base is - not surprisingly - much slower than the "the future is now, your expensive four cores will stutter in a year or so" crowd wants to admit.
Somewhere inside the whole debate there is an additional argument about the software market being driven by consoles, which have that many cores, so in the future games will... but I'm really tired of typing now.