But, did AMD create this future?
No, not at all. It wasn't even their vision. TI released a fusion chip in 1986, the TMS34010.
The future has been created thanks to smartphones, tablets, and Intel if marketshare numbers are anything to show for it.
But the iGPU is a long way off from matching up against ~$100 dGPUs. A GT 740 or GTX 750 will blow them out of the water. That'll change, but we're not there yet. The only time I see favorable comparisons for iGPUs is with odd graphics settings versus some crap dGPU like an R7 240 DDR3.
That argument doesn't really hold water, IMO. No doubt a GTX 750 is much faster than the HD 4600 in raw performance; however, convincing the average casual PC gamer, who already plays games that only require low-end hardware, that the 750 is a tangible enough real-world upgrade is a different matter.
Discrete high-end GPUs are soon to be a dead dodo, though; that much I agree with. They're trapped in a vicious cycle of escalating costs, lower PC demand, and the death of AAA PC games that push GPUs to the limit while providing very visible graphical gains, like Crysis did.
I don't think anyone knows what 'soon' is. As shown in this post: JPR sales figures, revenue for gaming systems is going up. Of course, as best I can tell, so have the ASPs of GFX cards. So the number of cards sold is going down, but the business is still a healthy one sales-wise.
I have expected, at some point, that integrated graphics will replace dGPUs due to cost first and performance second. It seems that gamers are still willing to pay for higher-performance dGPUs, with sufficient sales to support at least one GPU company. That, and iGPUs are up against a moving target, performance-wise, as screen resolutions keep creeping up (and VR could be an even bigger game changer). So the question becomes: at what lagging process node do GFX cards get stuck for too long and allow iGPUs to catch up (and even here, we see Intel slowing down as moving to new nodes becomes technically more difficult)? 5 years? 10 years? 20 years? I used to think it would be around 2020, but with the changing dynamics of CPU/iGPU development and the changing gaming performance demanded, it could take much longer. Much of this is going to be dependent on macro-economic conditions (how much cash consumers of the future will have to spend on PC gaming systems).
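For what it's worth, the revenue-up-while-units-down point above is just arithmetic (revenue = units sold × average selling price). A toy sketch with entirely made-up numbers, not JPR's actual figures:

```python
# Revenue = units sold x average selling price (ASP).
# All figures below are hypothetical, for illustration only.

def revenue(units_millions: float, asp_usd: float) -> float:
    """Add-in-board revenue in millions of USD."""
    return units_millions * asp_usd

earlier = revenue(60, 150)  # more cards sold at a lower ASP
later = revenue(50, 200)    # fewer cards sold at a higher ASP

# Unit sales fell, yet revenue grew: the business can stay healthy
# sales-wise even as card counts decline.
assert later > earlier
```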
There is very little tangible benefit of a S550 over a Hyundai Genesis.....
Doesn't stop a LOT of people from getting an S550.
I don't think the high-end GPU is dead. Luxury is just that, luxury. I knew I was getting very little tangible benefit moving from 1080p to 1800p VSR.
I still did it... and I definitely will be looking into the $1000 Dual Chip WC cards in the future. There will always be people who want to be at the high end.
If your point goes in that direction, then we're discussing a back-to-the-future kind of thing, as graphics processing was done by CPUs first. Is the TI chip good at both parallel and serial tasks? The description sounds like this is the case with that FP extension.
I'd say AMD was half-right. They thought the future was hot fusion (125W TDP). The future is actually cold fusion (4.5W TDP).
As long as gamers demand that power (e.g. with StarVR), and HPC needs high FLOP density (as the periphery costs space and power too), there will be synergies. And most GPU components can be multiplied to get higher performance.

High-end GPUs will also vanish when the ROI disappears.
Chip design still costs.
Gamers don't dictate when the dGPU will vanish due to ROI. It's just another hopeless hope people hold onto.
You will however end up seeing the same GPU uarch on the same node for many years when it happens.
There is no use for hope when estimating where the markets will go. What drives markets is demand. And this here sounds like calling the death of desktop PCs due to tablets.
Chip design costs depend on many factors. In the end you can have scalable designs, as already happens. What are the costs then?
BTW.. "hopeless hope"... Well, nvm.
If we use Steam, the entire AMD 200 series plus all Maxwell cards sold about 12.5-15 million units. And that includes cards the iGPUs will kill off at the bottom. If we look at the GTX 970+ and the R9 series, we are down to 5-7 million cards.
Now do the financial math and tell me if you still believe in it. Because not even nVidia believes in it when you look at their actions.
The attach rate of GPUs (including integrated and discrete) to PCs for the quarter was 137%, down 10.82% from last quarter, and 26.43% of PCs had discrete GPUs, down 4.15%.
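Those two percentages fit together with simple arithmetic. A sketch assuming a hypothetical 100M PCs shipped (the quarter's actual shipment count isn't given here); only the two rates come from the quote:

```python
# An attach rate above 100% is possible because a single PC can ship
# with both an integrated GPU and a discrete GPU.

pcs_shipped = 100.0          # millions, hypothetical
attach_rate = 1.37           # 137% GPUs per PC, from the quote
discrete_rate = 0.2643       # 26.43% of PCs have a dGPU, from the quote

total_gpus = pcs_shipped * attach_rate        # ~137 million GPUs
pcs_with_dgpu = pcs_shipped * discrete_rate   # ~26.43 million PCs

assert total_gpus > pcs_shipped  # more GPUs shipped than PCs
```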
What scaling factor are you using from Steam to the whole market? And newer models need a while to see their full revenue over their lifetime. Remember the simplest product-lifecycle models for what I mean.
5-7M cards still means at least $500M in revenue. Mask sets were 1-2% then. Slightly different designs (remember: reusing parts means having building blocks, including whole testing etc.) mean a distribution of costs over all GPU models of that family. So the revenue of all the smaller models has to be added too.
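A back-of-the-envelope version of that math, reading "mask sets were 1-2%" as 1-2% of that revenue. The $100 ASP is my own assumption for a conservative floor; high-end cards sell for more:

```python
# 5M cards (the low end of the 5-7M range) at an assumed $100 ASP
# gives the $500M revenue floor; a mask set at 1-2% of that is $5-10M.

units_low = 5_000_000
asp_floor = 100                        # USD, assumed

revenue_floor = units_low * asp_floor  # $500,000,000
mask_set_low = revenue_floor // 100    # 1% -> $5M
mask_set_high = revenue_floor // 50    # 2% -> $10M

assert revenue_floor == 500_000_000
```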
Intel also doesn't design their Pentiums from scratch. And Nvidia seems not to care about your statement with their ~600 mm² Pascal GPU.
There are more options than discussed here: make smaller dies and combine them if necessary with some $25 interposer. Aren't ARM's GPUs performance-scaled by using multiple fragment-processor instances?
You mix gaming GPUs and everything else. Even counting the GTX 750/750 Ti and R7 in the gaming aspect is adding too much. The point is gamers can't pay for the ROI.
As I've said before I think the iGPU will eventually kill off the dGPU, but not anytime real soon. I give the dGPU a minimum of 3 years to live, after that I think it's just a guess.
No, it isn't. Yours is a hardcore enthusiast view. Go to Newegg and click graphics cards, then performance cards. You'll see the 260/260X/750/750 Ti there. In mainstream you'll see 740s and 250s.
Anyone spending more than $60 on their GPU either wants to play games, is a professional of some kind, or is just a poor consumer.
I predict the opposite. With Broadwell and then Skylake, Intel has allocated more and more die space to the iGPU. We're at a point at which the iGPU inside the latest Intel chip actually has equal die space versus the CPU. At this point, you can even start to think of the latest Intel "APU" as having an integrated CPU. Actually, if Intel does start allocating more die space to the iGPU than to the CPU, then it really is an iCPU. It might be the GPU that eventually kills off the CPU. The casual user only needs to open a PDF file so fast, but the demand for visual computing will never slow down.
And those are the cards being hit hard by IGPs.
The GTX 750 and GTX 750 Ti combined have fewer users than the GTX 970 on Steam. AMD's R7 series has even less, at 0.58%.
These are the bread and butter cards they sell to many non gamers.
Discrete GPUs aren't shrinking 40% a year due to gamers. They shrink 40% a year due to non-gamers.