Someone is on a mission, so the community should follow. Dirt Showdown, a game that was derided by someone at one point, is now held up as a shining example of how it's supposed to be done. Sorry, I don't see the logic or point.
I said Dirt Showdown's global illumination setting was not worth its performance hit (I did criticize that setting's penalty), but I didn't criticize the way F+ MSAA was implemented in Dirt Showdown. It is actually done very efficiently as a result of the F+ model and the use of compute shaders. NV's problem in that game is that its compute shader performance isn't fast enough for the global illumination or contact hardening shadows settings, which also happen to use compute shaders. However, set aside global illumination/contact hardening shadows and look at how well the MSAA works in Dirt Showdown in the context of its performance hit. You wouldn't want this in future games? You would rather keep spending $500 on new GPUs because SMAA/MSAA drops your performance 33-52%?
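For anyone who hasn't looked at why F+ is so cheap: the engine splits the screen into small tiles, runs a compute pass to find which lights actually touch each tile, and then shades each pixel in an ordinary forward pass using only that short light list, so the hardware's normal MSAA still works. Here's a rough CPU-side sketch of that culling step in Python; the tile size, light format, and names are my own illustration, not Codemasters' actual code:

```python
# Illustrative sketch of Forward+ tiled light culling (normally a GPU
# compute shader); tile size, light format, and names are made up here.
import math

TILE = 16  # pixels per tile side, a common choice in tiled renderers

def cull_lights(width, height, lights):
    """Build a per-tile list of lights. Each light is (x, y, radius)
    in screen space; a real shader would test a 3D tile frustum."""
    tiles_x = math.ceil(width / TILE)
    tiles_y = math.ceil(height / TILE)
    tile_lights = [[] for _ in range(tiles_x * tiles_y)]
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            # Tile bounds in pixels
            x0, y0 = tx * TILE, ty * TILE
            x1, y1 = min(x0 + TILE, width), min(y0 + TILE, height)
            for i, (lx, ly, r) in enumerate(lights):
                # Clamped distance from light center to tile rectangle
                dx = max(x0 - lx, 0, lx - x1)
                dy = max(y0 - ly, 0, ly - y1)
                if dx * dx + dy * dy <= r * r:
                    tile_lights[ty * tiles_x + tx].append(i)
    return tile_lights

# 100 lights on a 1080p screen: each tile ends up shading only the
# handful of lights that overlap it, instead of all 100.
lights = [(i * 19 % 1920, i * 37 % 1080, 60) for i in range(100)]
lists = cull_lights(1920, 1080, lights)
print(max(len(l) for l in lists), "lights max in any one tile")
```

Because every pixel is still shaded in a plain forward pass after the culling, MSAA just works as the hardware intended, with no extra per-sample passes over a fat G-buffer.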
Ok, here is my point: Crytek promised us a game with next generation graphics that would melt our PCs, presumably because of those next gen graphics. I don't see anything of the sort. The only thing melting an HD7970GE/GTX680's performance is the mind-boggling anti-aliasing penalty typically incurred under deferred lighting game engines. It is not so much next generation graphics pushing these GPUs, as shown by their ability to easily exceed 50 fps without AA. When you first fired up Far Cry 1 or Crysis 1, the reason you needed a GPU upgrade was not some SMAA/MSAA setting. It was because their graphics were revolutionary for the time.
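To put a number on that penalty: in a deferred engine, MSAA means the whole G-buffer has to be stored and re-read per sample, not per pixel. A back-of-the-envelope calculation (assuming a generic 4-target, 32-bit layout, not CryENGINE 3's actual format) shows where the fps goes:

```python
# Rough G-buffer memory math for deferred MSAA; the 4-target, 32-bit
# layout below is a generic assumption, not CryENGINE 3's real format.
width, height = 1920, 1080
targets = 4          # e.g. albedo, normals, depth, material params
bytes_per_target = 4 # 32 bits per pixel per target

def gbuffer_mb(samples):
    return width * height * targets * bytes_per_target * samples / 2**20

for s in (1, 4):
    print(f"{s}x: {gbuffer_mb(s):.0f} MB of G-buffer to write and re-read")
# 1x: ~32 MB, 4x: ~127 MB -- every lighting pass touches this buffer,
# so 4xMSAA roughly quadruples G-buffer bandwidth on top of the extra
# per-sample shading work.
```

In a forward renderer, by contrast, the extra samples only live in the framebuffer being resolved, which is why the MSAA hit there is a fraction of this.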
In other words, the reason I brought up Dirt Showdown is that if Crysis 3 had been coded on an F+ game engine, the GTX680/HD7970GE would be hitting far higher FPS with MSAA, and it would have been obvious that Crytek completely failed to deliver on their promise of next generation graphics (or, as they claimed, that C3 will be the best looking PC game for at least 2 years).
What Crytek has done here is take Crysis 2, barely improve the graphics, and instead add native AF support, a bunch of AA modes, and "high resolution" textures shipped natively rather than as a patch like they did with Crysis 2. That native SMAA/MSAA support at the VHQ level drops performance 20-30 fps. So now Crytek can say, "Yes, we melted your PC." No **** Sherlock!! What did you expect with a deferred lighting game engine? But despite melting our GPUs, Crytek didn't actually deliver on their promise of making this a PC game with next gen graphics.
Even the in-game options menus are nearly a carbon copy of Crysis 2's. Crysis 3 is just another console port with the minor tweaks I listed: a high-res texture option and native AF/AA modes.
[Screenshots: Crysis 2 vs. Crysis 3 options menus]
Crytek is pulling the marketing wool over our eyes and saying, "Look, the performance cripples even $500 cards like the GTX680/7970GE -- it's a next generation game!!" Can't you see that performance drops from 57 fps on an HD7970GE to 27 fps with just one setting, and that setting happens to be anti-aliasing? What does AA have to do with next generation PC graphics? Nothing.
This does not look like a next generation PC game:
The main reason the game runs like a dog is that those geniuses decided to force SMAA/MSAA onto a deferred lighting game engine, not because the game looks like this. In other words, Crytek is full of it.
You've missed the anti-AMD conspiracy part:
Poor gamers... only 8 anti-aliasing modes in Crysis 3. Must be NVIDIA shoving TXAA down their throats.
It's not an anti-AMD conspiracy theory. It's about doing what's best for PC gamers in terms of the balance between graphics quality and performance. If you are going to go all out on graphics and claim you will bring next gen graphics, you had better deliver. Right now, if you turn on MSAA, you lose almost 20 fps on a GTX680 (down to an unplayable 34 fps) and 30 fps on an HD7970GE (down to an unplayable 27 fps). Is that acceptable to you? Did it not occur to you that if the game had been made with an F+ codepath, a GTX680 would play it fine with MSAA? Guess what happens then: NV can't promote its next $500-600 GPU to entice you to upgrade so you can play games like Crysis 3 smoothly. It's in their best interest to make sure you upgrade and buy their next GPU. Why would they promote a more efficient way to code games, especially when that more efficient codepath requires compute shaders that happen to run slower on their GPUs than on their competitor's? So what if 8 AA modes were added? The entire engine is horribly inefficient when using them.
I call it how I see it. Compute shaders can be used to accelerate graphical effects that would normally run slower using traditional methods, so let's use and promote those methods. Instead, PC gamers keep accepting this BS from developers that a huge performance penalty is just part of the game (i.e., the game of how developers code these titles and "work closely" with GPU makers to make sure we get a "good" experience). It's one thing to cripple GPUs because the game looks gorgeous, and another thing entirely because you can't code or optimize the game engine properly.
Look at the graphics of Crysis 3 and tell me it's acceptable that an HD7970GE and a GTX680 run this game at 27-34 fps at 1080p with 4xMSAA, given its level of graphics. Don't worry, NV will gladly sell you a $900 Titan in a couple of months to solve your "performance issues." It will be an even more effective marketing technique if Titan ships with a free copy of Crysis 3 as a bundle, to make you feel like that $900 upgrade was worth it.
Looks 95% like Crysis 2, runs at 32 fps on a $500 GPU:
I have no problem spending $400-500 on another GPU, but when a game still runs like a dog on $800 worth of GPUs and my jaw doesn't hit the floor from its graphics, there is a problem -- an optimization problem.