He addressed that when he said GW is HBAO+, TXAA, etc. What you're quoting doesn't say that developers can't optimize for AMD hardware; it says developers can't share with AMD how to optimize GW-specific features. There's a BIG difference between the way you're trying to spin what Nvidia said and what they actually said.
GW doesn't allow the developer itself to change the code in any way without NV's permission; and since the developer is also banned from sharing the code with AMD, neither the developer nor AMD can ever optimize GW's proprietary game code for AMD hardware, ever!
The end result is that in all GW titles, AMD is no longer in control of its own performance. While GameWorks doesn't technically lock vendors into Nvidia solutions, a developer that wanted to support both companies equally would have to work with AMD and Nvidia from the beginning of the development cycle to create a vendor-specific code path. It's also impossible for AMD to provide a quick after-launch fix if some of the source code in the game is vendor specific and AMD has no direct access to it.
This kind of maneuver ultimately hurts developers in the guise of helping them. Even if the developers at Ubisoft or WB Montreal wanted to help AMD improve its performance, they can't share the code. If Nvidia decides to stop supporting older GPUs in a future release, game developers won't be able to implement their own solutions without starting from scratch and building a new version of the library from the ground up. This subtle point may explain why Kepler GPUs show very poor performance in many titles where Maxwell shows an unusual performance advantage. Developers who rely on GW may also discover that their games run inexplicably poorly on AMD hardware with no insight into why: they can't share the code with AMD, and a developer bound by the GW program can't alter the GW code itself to speed up performance on AMD hardware.
Nvidia’s GameWorks program is conceptually similar to what Intel pulled on AMD 8-10 years back. In that situation, Intel’s compilers refused to optimize code for AMD processors, even though AMD had paid Intel for the right to implement SSE, SSE2, and SSE3. The compiler would search for a CPU string rather than just the ability to execute the vectorized code, and if it detected AuthenticAMD instead of GenuineIntel, it refused to use the most advantageous optimizations. The situation here is different, in that we’re discussing third-party libraries and not the fundamental tools used to build executables, but the end result is similar.
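To make the Intel compiler analogy concrete, here is a rough sketch of the difference between vendor-gated and capability-based dispatch. This is illustrative Python with hypothetical function names; the real check happened in compiled runtime dispatch code reading the CPUID vendor string, not in anything like this:

```python
SSE2_BIT = 1 << 26  # CPUID leaf 1, EDX bit 26 advertises SSE2 support


def pick_code_path(vendor: str, edx_features: int) -> str:
    """Vendor-gated dispatch, as the old Intel compiler runtime reportedly
    behaved: the fast vectorized path is chosen only when the CPU brand
    string reads "GenuineIntel", even if the CPU reports SSE2."""
    if vendor == "GenuineIntel" and edx_features & SSE2_BIT:
        return "sse2"
    return "generic"


def pick_code_path_fair(vendor: str, edx_features: int) -> str:
    """Capability-based dispatch: test the feature bit, ignore the brand."""
    return "sse2" if edx_features & SSE2_BIT else "generic"


# An AMD CPU that genuinely supports SSE2 (vendor string "AuthenticAMD"):
print(pick_code_path("AuthenticAMD", SSE2_BIT))       # -> generic
print(pick_code_path_fair("AuthenticAMD", SSE2_BIT))  # -> sse2
```

The point of the sketch: the capability is identical in both calls; only the brand check differs, which is exactly why AMD chips that had paid for and implemented SSE2 still got routed to the slow path.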
Thus far you've provided no valid counter-argument to the point that GW's blocking of source-code access hurts AMD directly. If AMD pulled the same tactic, you would basically need both an AMD and an NV GPU for optimal gaming performance, or a much faster NV card to compensate for the massive performance penalty in AMD's locked source code. I personally think that if AMD's GE were to follow NV's GW, we would actually need both an AMD and an NV card for optimal performance across a wide variety of titles - an absurd concept! How would you feel if in every single GE title, SLI didn't work for 3-6 months until AMD gave the developer the go-ahead to release a patch for working SLI? You'd still buy those games for $50? How would you feel if AMD was 10X the size of NV, bought out 99% of all AAA developers, and inserted proprietary closed game source code optimized specifically for AMD cards? Is that fair competition?!
Also, AMD is doing exactly the opposite - stupidly, in my view - by freely sharing their GE source code and presenting publicly at GDC and other tech conferences on how best to optimize it. They are spending their $ to help NV optimize for GE titles. Not very clever at all given what NV is doing against them.
That's because PC gaming/software development has historically been open. How far would we have gotten in PC hardware and software if, from day 1, everyone had made everything proprietary? The solution lies with the developers, but for marketing $ they'll bend over and do whatever is asked. What this means is there is little integrity left in PC game development at many studios. Why? Because if a major developer willingly accepts vendor-specific locked source code that can never be altered or optimized without that vendor's permission, this developer is knowingly favouring all consumers who own products from that very vendor, and thus ignoring everyone else in the market who is gaming on a video card from a competing vendor (i.e., AMD, Matrox, Intel, etc.).
Let's hypothetically imagine we had 6 GPU makers in the market and NV had 50% market share, while everyone else had 10% apiece. By accepting proprietary source code optimized specifically for NV's hardware, the developer would be consciously and willingly sacrificing the game's performance for the remaining 50% of the market comprised of the other 5 GPU makers. If you are the developer, how do you accept such an insanely biased business decision if you have any sort of business ethics and actually care about your consumers? If you willingly favour one type of consumer over another, knowing that the outcome of your choices will be a favourable situation for only one part of the market, you are a biased game developer.
If 100% of the source code in the game can be shared with anyone, the developer is fully cleared of bias because the optimization now rests with the 3rd-party GPU maker(s). In this case, not only does GW hurt AMD, it also hurts everyone with an Intel GPU/APU, because Intel can't optimize GW's source code either. Now just imagine if Intel threw hundreds of millions of dollars around and made almost all source code in AAA games proprietary!? :hmm: I am curious how many of the gamers defending GW/thinking there is nothing wrong with this are proponents of Apple's proprietary business practices. I would bet a large fraction.