So it's AMD's fault that tessellation usage is so wasteful?
http://www.computerbase.de/2016-01/rise-of-the-tomb-raider-benchmarks/2/
20-25% performance loss for zero visual difference as noted by the reviewer.
Let's reverse that: whose fault do you think that is? Crystal Dynamics, for making such horrendously unoptimized use of tessellation (which would have crippled the consoles even harder, so no, it's a "PC special feature"), or could it possibly be NV's fault for sponsoring these guys and giving them an incentive to do such things?
What about HairWorks in The Witcher 3? Is it AMD's fault that the default x64 tessellation looks visually identical to x16 but performs much worse? Why would x64 be the default in the first place, so bad that CDPR had to invest in accessing and modifying the source themselves just to optimize it?
What about GodRays in Fallout 4: why does the Ultra setting look identical to High or even Medium while performing worse? Whose fault is that?
Someone at NV created this tech and set the defaults to a very unoptimized level. It must be AMD's fault, right? And you're calling others crazy... nice one.