When NV/AMD cannot sell high enough volumes of GPUs to generate sufficient profits, they are forced to raise prices per mm² of die and per grade of GPU (for example, what was once viable to sell at $150 now has to sell at $250). However, doing so means that fewer and fewer gamers are enticed to buy GPUs as they become less affordable. As volumes fall, NV/AMD are pressured to raise prices even further to cover R&D and manufacturing costs, which in turn causes a vicious cycle of rising Average Selling Prices and falling demand.
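That feedback loop can be sketched as a toy model. Every number below is a made-up illustration (hypothetical fixed costs, marginal cost, starting volume, and an assumed price elasticity), not real market data:

```python
# Toy model of the ASP/demand death spiral. All figures are illustrative
# assumptions, not real NV/AMD numbers.
fixed_costs = 100_000_000   # hypothetical R&D + manufacturing overhead ($)
unit_cost = 100             # hypothetical marginal cost per GPU ($)
volume = 2_000_000          # hypothetical starting unit sales per generation
elasticity = 1.5            # assumed: fraction of volume lost per 1% price rise
exodus = 0.10               # assumed: 10% of buyers leave for consoles each gen

price = unit_cost + fixed_costs / volume  # break-even ASP at starting volume
for gen in range(5):
    new_volume = volume * (1 - exodus)              # gamers drift to consoles
    new_price = unit_cost + fixed_costs / new_volume  # ASP must rise to break even
    pct_up = (new_price - price) / price
    new_volume *= max(0.0, 1 - elasticity * pct_up)  # price hike sheds more buyers
    price, volume = new_price, new_volume
    print(f"gen {gen}: ASP ${price:,.0f}, units {volume:,.0f}")
```

Each pass through the loop, falling volume pushes the break-even ASP up, and the higher ASP sheds still more buyers, so price and volume diverge monotonically.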
*snip*
I am interested to see what other people's thoughts are on why the desktop discrete GPU market has declined so dramatically in recent years. Please share your opinions.
tl;dr for anyone who doesn't want to read a wall of text: gamers are moving from PC to console, but the console market leeches off the success and R&D of the PC market, so long-term diminishing numbers in the PC market are unsustainable for the entire gaming ecosystem.
Long version - This is all because of the mass adoption of cheap consoles. They have a much slower generational cycle of 6-8 years, while in that same span we see 3-4 generations of video cards, CPUs, memory types, etc.
Consoles used to be their own discrete engineering projects with their own specialized hardware, but slowly over time this became less the case and they have trended towards becoming more like PCs. The last few generations have seen the consoles go directly to Nvidia and AMD for their GPUs, and even adopt DirectX as a standard.
All of this stuff exists because of the dGPU market and the PC gaming market. The reason that 8 years after the PS3 we get a PS4 with roughly a 16x increase in hardware speed is that, between those releases, the PC market ran an aggressive 18-24 month generational cycle where gamers were constantly upgrading and paying for the aggressive R&D that drives this constant growth. It's expensive, but hey - we want that power because we love graphics and gaming.
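The 16x figure is just doubling arithmetic: an 8-year console gap divided by a roughly 2-year PC cycle gives about 4 GPU generations, and under the idealized assumption that each generation doubles performance, the gains compound to 2^4 = 16x:

```python
# Idealized compounding: ~4 PC GPU generations fit in one 8-year console gap,
# and doubling performance each generation (an assumption, not a guarantee)
# compounds to a 16x speedup.
console_gap_years = 8
pc_cycle_years = 2  # the 18-24 month cadence, rounded to 2 years

generations = console_gap_years // pc_cycle_years
speedup = 2 ** generations
print(f"{generations} generations -> {speedup}x")
```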
Also in that time we see improvements in DirectX and rendering standards. These mostly come from whitepapers on new features written and implemented on PCs, which are then integrated into PC gaming engines like Unreal Engine and CryEngine. Remember that even these engines were PC exclusives not all that long ago, but now they've adapted to be multi-platform, so again consoles can lag behind the times and benefit from the improvement in tech more or less for free, because the R&D was paid for by the PC gaming community.
My long-winded point here is that the core of progress and development doesn't come from consoles, it comes from PCs. But the console platforms are the biggest and most mainstream, and increasingly they're pulling in gamers who might otherwise have been PC gamers and invested their money there instead. The reason is largely price: the initial buy-in cost of consoles is MUCH cheaper, or at least perceived that way, and of course anyone with any knowledge knows that their business model is a cheap buy-in with a high consumables cost (the games), via royalties etc.
In the long run this is all one large relative balancing act. The console market EXPLICITLY relies on the success of the PC market, because between console generations we do 8 years of constant R&D which is prohibitively expensive; MS and Sony cannot shoulder that cost themselves. They're simply buying hardware at long intervals from a market whose R&D costs are offset not by console GPU sales but largely by PC sales, by people who pay premiums for top-of-the-range hardware. Those premiums don't go into Nvidia and AMD's pockets; they mostly go to offset the insane R&D they have to do.
Consider for a second what might happen if, say, Nvidia and AMD shut down their GPU R&D and exited the PC dGPU market, keeping only dGPUs for business and a few specialist ones for CAD. Demand for chips from TSMC and other providers plummets and the whole market segment grinds to a halt. Meanwhile, 8 years later, MS and Sony want to revamp their consoles because sales are low. So what do they do? They have to pay to do the R&D from scratch - they need to invest in the 8 missing years of hardware development, or they face releasing a next-gen console with barely any new horsepower. What does that do to the price of the hardware? It shoves prices through the roof, and that cost is passed on to the consumers, the console gamers.
R&D is super expensive and someone has to pay for it - it's the people on the bleeding edge who pay for it. Right now consoles are favored for being cheap and ubiquitous; they lose that edge the moment the PC market dies, because then they become the bleeding edge and finally have to pay for their own R&D.
To add to all of that, we've had a relative slump in dGPU progress for a while, stuck on the 28nm node for way too long. That is a blip I feel will shortly be over; the next gen of Nvidia tech using FinFETs sounds amazing and potentially an insane leap in power. If anything can refresh the market, it's really coming back around to the more aggressive 2x performance jump between generations that we were closer to around the 8800 GTX era and prior.