Where have all the great English-language reviewers gone? I really miss Anand and Thomas from Tom's Hardware. The state of most review processes today is terrible. Even if they skewed toward their favorites from time to time, their processes were impeccable.
Between what Adored brought up today and just now seeing a reviewer present a slide, for "visibility," that made one setup look 200% better over a two-FPS difference, it occurred to me that whether or not Adored is right, this review cycle would never have happened back then. I might have had an issue with Tom being so sure so early in the Athlon vs. PIII fight, constantly stacking the deck in favor of the PIII or ignoring problems with the 1.13GHz PIII that had to be recalled. But his testing methodology was impeccable. Anand was so in-depth in architecture analysis that you almost never had to read the benchmark section of a review, because by then you would already know what the performance would look like. There were others too, like Overclockers.com and Ars Technica (which is still good in overall tech understanding, but not as great as it was).
These guys would notice issues like the DX12 results while testing, report them, and caution us not to use those results to reach any kind of consensus. Anand would contact every manufacturer and software company involved in the tests to get to the bottom of a benchmark that looked off, and if he was under the gun he would table it for a Part II where he would dig into the corner cases. They used to actually drive companies to fix problems before release, and it was nearly impossible to disregard their results.
Now almost no one spends the time on reviews anymore. Adored, whether right or wrong, brings up a good point: AMD not having a high-end GPU makes it hard to use their cards in tests. But at least in AnandTech CPU reviews, they would tell us when they expected to see a performance difference with different video cards, and such severe differences between DX versions like 11 and 12 (or in this case 8 and 9) wouldn't be written off.
I just wish we were back in the mid-2000s. There was so much less flash and so much more depth. Instead of it taking a random small-time website a month to dig through the results, for lack of both time and equipment, to bring these issues to light, we would have known on day 1; by day 3 we would have known why; and within a week we would have known whether the problem was going to get fixed.