Yes, apart from the fact that GK110 is twice as big as today's GPUs, on which Nvidia already has poor 28nm yields. Make the die bigger and yields fall further, so each GPU costs more than double to produce. There is no commercial sense in shipping a less profitable GPU. The only way they would do this is if they have a few million reject Tesla GK110 cores sitting around with parts of the GPU turned off.
The GPU has 7 billion transistors and is apparently a fabrication nightmare. The last time Nvidia tried something this big, on 40nm, they started out with 2% yields, and the best they ever managed was about 20%.
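The "bigger die costs more than double" point follows from standard yield arithmetic: a larger die means fewer candidate dies per wafer *and* a lower fraction of them working. A minimal sketch using a simple Poisson yield model — the die areas and defect density below are illustrative assumptions, not Nvidia's actual numbers:

```python
import math

def good_dies_per_wafer(die_area_mm2, defect_density_per_mm2, wafer_diameter_mm=300):
    """Rough good-die count from a simple Poisson yield model.

    Yield fraction = exp(-D * A). Gross die count is approximated as
    wafer area / die area (edge losses ignored for simplicity).
    """
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    gross_dies = wafer_area // die_area_mm2
    yield_fraction = math.exp(-defect_density_per_mm2 * die_area_mm2)
    return int(gross_dies * yield_fraction)

# Illustrative only: a ~294 mm^2 GK104-class die vs a ~550 mm^2 GK110-class
# die, at an assumed defect density of 0.002 defects per mm^2.
small = good_dies_per_wafer(294, 0.002)
large = good_dies_per_wafer(550, 0.002)
print(small, large, round(small / large, 2))
```

With these assumed numbers the big die yields roughly a third as many good chips per wafer as the small one, i.e. the per-chip cost more than doubles even though the die is less than twice the area.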
There is a ton of information about this around the web. A GTX 780 will use a faulty core, because Nvidia won't sell a perfect GK110 to anyone other than Tesla customers.
You don't know the yields for Nvidia's GPUs, 40nm or 28nm. I guess you are basing your claims on Charlie Demerjian from SemiAccurate, a known Nvidia-phobe. There is simply no proof of what you are saying.
As far as my sources go, GK110@GTX780 would not have been profitable for a launch in September 2012. It is four months later now, and the launch may not take place for another 2-3 months. Surely improvements can be made in that timeframe.
Then:
There is no 15-SMX Tesla to begin with. The professional market uses far fewer cards than the gaming market. While demand for the K20(X) seems to be high, it will be fulfilled sooner or later. Wafer output also increased substantially in Q3 2012, and I would expect it to increase further towards Q2 2013.
Finally:
Nvidia needs a strong GPU to compete with the HD 8970. They cannot do that with a GK114, GK204 or any other GK104 derivative, because it has too little memory bandwidth. If they were to build a GK104 derivative with a 384-bit memory bus, more units and so on, they would basically have a GK110.
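The bandwidth point checks out with back-of-envelope arithmetic: peak GDDR5 bandwidth is just bus width in bytes times the per-pin data rate. A quick sketch, using the 256-bit/6 Gbps configuration of GK104 cards as the baseline (the 384-bit figure is the hypothetical wider bus from the quote):

```python
def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (per-pin Gbps)."""
    return bus_width_bits / 8 * data_rate_gbps

# GK104-style 256-bit bus at 6 Gbps vs a 384-bit bus at the same memory speed.
print(gddr5_bandwidth_gbs(256, 6.0))  # 192.0 GB/s
print(gddr5_bandwidth_gbs(384, 6.0))  # 288.0 GB/s
```

So widening the bus to 384-bit alone buys a 50% bandwidth increase at the same memory clock — which is exactly why a "GK104 derivative with a 384-bit bus" starts to look like a GK110.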
Developing and taping out a GPU costs a great deal of money. GK110 already exists, so no further development costs are needed. An additional "HPC-less" GK110 would incur extra, unnecessary costs, and given the limited volume of professional solutions, GK110's R&D costs would probably never be recouped from that market alone. So you see, it would be a bad idea not to use GK110 for consumers.
It's not on the consumer market. It's on the HPC market, in those Tesla cards.
It's a monster GPU that draws 350W of power. The heat will be insane, and so will the associated costs.
That's fine when you're selling to enterprise, not so good when it's inside your gaming rig.
Please stop inventing stuff and stop lying. GF110 went into both the GTX 580 and the GTX 570, or have you already forgotten?
Nvidia has never released a single-GPU card with a 350W TDP in their entire history, and they will not start now. Your claims are completely uninformed and very poorly thought out.