These are new and popular games people play. I'm sorry you have an issue with using games people are actually playing today.
Perhaps you should speak to the game developers about their choice of 3D features in their own games?
We will continue to use new game releases, popular game releases, and games people are actually playing on the PC.
You need to understand that PC gamers don't play only 4-5 games a year, most of which are GW titles. We want a general overview of the card's overall performance. If you won't provide that, then discuss performance in specific games and don't draw overall conclusions from 4-5 titles. Your reviews lack variety (no Total War games, no racing games, no strategy games of any kind, no MMOs, etc.), which means the game samples in your modern reviews are too small to support conclusions about a card's overall performance.
Secondly, your conclusions are out of this world:
Bottom line section:
"In terms of gaming performance, the AMD Radeon R9 Fury X seems like better competition for the GeForce GTX 980 4GB video card, rather than the GeForce GTX 980 Ti. GTX 980 cards are selling for as low at $490 today."
How can anyone think the 980 and the Fury X are in the same class? :sneaky:
When almost every professional site online disagrees with you, the problem probably lies with your review or your testing methodology (not testing enough games to draw an accurate conclusion).
http://www.sweclockers.com/test/20730-amd-radeon-r9-fury-x/17#content
or
http://www.computerbase.de/2015-06/amd-radeon-r9-fury-x-test/5/
or
Even HWC shows a 24-30% lead over the 980.
http://www.hardwarecanucks.com/foru...682-amd-r9-fury-x-review-fiji-arrives-22.html
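To be clear about what those lead figures mean: a per-game "lead" is just the relative fps ratio between the two cards, and the overall figure is an average over the tested games. A minimal sketch with made-up fps numbers (these are hypothetical, not taken from any of the linked reviews):

```python
# Hypothetical average fps per game for two cards (illustrative numbers only).
fury_x = {"Game A": 62.0, "Game B": 55.0, "Game C": 48.0}
gtx_980 = {"Game A": 50.0, "Game B": 42.0, "Game C": 39.0}

# Per-game lead of the Fury X over the 980, as a percentage.
leads = {game: (fury_x[game] / gtx_980[game] - 1.0) * 100.0 for game in fury_x}

# Simple average across the tested games.
avg_lead = sum(leads.values()) / len(leads)

print({game: round(pct, 1) for game, pct in leads.items()})
print(round(avg_lead, 1))
```

The point of averaging over many games is exactly the sampling argument above: with only 4-5 titles, one outlier can swing the average by several percentage points.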
You might really want to either augment your reviews with more games, or state up front that you aren't assessing the card's overall performance across a wide variety of games, only in the games you like to play and the latest AAA releases.
Even your previous analysis of 4GB vs. 6GB of VRAM, where you kept insisting that 6GB is the bare minimum for 4K, was totally flawed, since 980 Ti SLI outperforms Titan X in nearly every game you've ever tested at 1440p or 4K where SLI scales. You seem to be confusing the VRAM a game actually requires with the VRAM it dynamically allocates, simply by reading MSI Afterburner data.

You also aren't calling out horrible PC optimizations, such as Dying Light's draw-distance setting, which cripples VRAM usage for little to no benefit at higher draw distances. Most gamers would rather hold a constant 60 fps than suffer dips below 60 in Dying Light, so we aren't going to crank draw distance to the maximum for a 2% improvement in graphics. That means your testing is 100% catered to how you want to play (less emphasis on higher fps, in exchange for a minimal improvement in graphical quality despite a massive performance hit), not how most PC gamers play Dying Light. Can you honestly say the performance hit from 100% draw distance is worth dipping below 60 fps compared to 50-75% draw distance?