Thanks. Like the performance of that Ninja 4. More looking ahead.
The only thing that bugs me here is the inconsistency between the different sites. I don't expect them all to have the same numbers, but there's way too much variance in how these coolers compare to each other.
That's the bane of 'controlled' testing. Too many variables, from ambient and cooler intake temps to different platforms running under different conditions.
Some cooler sites like FrostyTech use a heating-plate rig, which is great for consistent tests over time. However, those rigs only simulate a CPU's rated TDP, so their results don't correlate that well to real-world usage - at least not since 22/14nm and AVX2 arrived.
Then there's SPCR, who still use the same LGA1366 test rig from 2008. Historically, probably the most reliable review site - if you want to know how a cooler is going to perform on a non-overclocked X58 quad core from 2009. (Eerily, the Ninja 4 is one of their top-rated coolers, so it's not just for 1150/1151.)
The best one can hope for is consistency within a review site - and even that isn't always possible, since many update their platforms and the historical comparisons are immediately invalidated. Some will retest a few of their old coolers on the new platform, so some context is preserved.
One could take all the reviews in aggregate and compare. The top coolers will still come out on top. Sure, some will rate higher in certain reviews, but other than the odd payola review or broken testing process, clear patterns can be discerned.
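To make the "aggregate and compare" idea concrete, here's a minimal sketch of averaging a cooler's rank position across every site that tested it. The site names and rankings below are made up purely for illustration, not real review data:

```python
from collections import defaultdict

def aggregate_ranks(site_rankings):
    """Average each cooler's rank across every site that tested it.

    site_rankings: dict mapping site name -> list of cooler names,
    ordered best-first. Returns (cooler, mean_rank) pairs sorted
    best (lowest mean rank) first. Coolers missing from a site are
    simply averaged over the sites that did test them.
    """
    positions = defaultdict(list)
    for ranking in site_rankings.values():
        for position, cooler in enumerate(ranking, start=1):
            positions[cooler].append(position)
    means = {c: sum(p) / len(p) for c, p in positions.items()}
    return sorted(means.items(), key=lambda kv: kv[1])

# Hypothetical rankings from three fictional sites:
sites = {
    "SiteA": ["Ninja 4", "CoolerX", "CoolerY"],
    "SiteB": ["CoolerX", "Ninja 4", "CoolerY"],
    "SiteC": ["Ninja 4", "CoolerY", "CoolerX"],
}
print(aggregate_ranks(sites))
# The consistent performer rises to the top even though
# no single site's ordering is treated as authoritative.
```

The point being: one noisy review shuffles positions, but the mean rank across several sites washes most of that noise out.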
Or one can purchase enough stuff over the years to find correlation with certain review sites and their processes. I'm not saying it's the best method, but I take pride in doing my consumer duty to support the CPU cooling industry. ;-)
Can't really blame the media on this one tho. Shams and lazy practices aside, most reviewers attempt to create a stable, repeatable, verifiable testing system. It's just really, really hard to do right. So many variables.
It's not like running a trusted benchmark where the highest score wins. Ambient temp/humidity need to be controlled. Is testing on an open bench valid? Testing in a case is more real-world - but which case? How is the case set up for airflow? What do the other components add to the thermal footprint? Are cooler intake temp readings repeatable and verifiable? (+/- 3-5 degrees if not.) How consistent are the mounts? That's +/- 2-3 degrees right there. Choice of TIM and its application? That's another couple of degrees. How well does the cooler base fit the specific IHS? Number of test mules? Test applications? Number of test runs? And then there's the stuff enthusiasts really want to know: how well does it cool an overclocked chip? Great...now the silicon lottery is under test too.
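Just to put a rough number on why those individual tolerances matter: if you treat the error sources above as independent, they combine in quadrature (root-sum-square) rather than simply adding. The values below are illustrative midpoints of the ranges mentioned, not measurements:

```python
import math

# Back-of-envelope error budget. Midpoints of the +/- ranges
# mentioned above - illustrative assumptions, not measured data.
error_sources = {
    "intake temp reading": 4.0,  # +/- 3-5 C  -> ~4 C
    "mount consistency": 2.5,    # +/- 2-3 C  -> ~2.5 C
    "TIM application": 2.0,      # +/- "a couple" C -> ~2 C
}

# Independent errors add in quadrature, not linearly.
combined = math.sqrt(sum(v ** 2 for v in error_sources.values()))
print(f"combined uncertainty: +/- {combined:.1f} C")
```

Roughly +/- 5 C of combined noise, which is often bigger than the gap between a mid-pack cooler and a chart-topper - exactly why rankings shuffle between sites.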
I take all tests in the YMMV spirit and weigh my choices. Most of the time they're good. Sometimes they're not. If cooler reviews were objectively, 100% perfectly correlated, it would make the enthusiast pursuit less fun, imo. A little uncertainty can be very entertaining. ;-)
Or maybe we need to set up a 'What's the Best Cooler for my Rig' service. You send your rig in to one of the better testing guys, they run their testing process on your top 5 cooler picks and report the results. Then the whole rig goes to SPCR for noise/cooling results. Shouldn't cost more than a grand or two per test.