It is extremely critical for their laptop business. A CPU without integrated graphics is useless there.
I'm not suggesting they will abandon the IGPs at all. Just the dGPUs.
Intel themselves have predicted several more quarters of declining market share (through 2024). This can be considered a best case.

Go count how many negative quarters AMD has had in the last 20 years. A negative quarter is not a valid reason to abandon a critical part of the business.
Back in 2015 when things were going badly for AMD there was a rumor that they were going to spin off the GPU business:
AMD is considering breakup, spin-off graphics business unit – report (KitGuru, www.kitguru.net)
I remember saying back then that it was nonsense, since the GPU business was critical to AMD (and it was).
You could easily have made the same financial argument: that AMD couldn't afford its poorly performing GPU business that was dragging it down. But likely that was never seriously considered, since GPUs are a critical part of the modern CPU business.
It's going to take more than a bad quarter and nonsense from MLID to convince me that Intel is abandoning the discrete GPU business. Especially when Intel themselves are saying the opposite.
I find it hilarious that people keep believing sketchy rumors contradicted by the companies involved, and when those rumors completely fail to pan out, they jump right to believing the next sketchy rumor to come along.
Intel doesn't have the same situation, since their iGPUs are basically a product-dumping strategy for them.
A big one that seems to have deeply pissed off the server guys is Optane/3D XPoint. Several years validating it, and then it was yanked. I see this as a very big issue in light of AMD's dominance in servers. Why would you take Intel's promises and projections seriously anymore? It'll be their "show me the money" moment.

While there certainly is an element of Schadenfreude in some responses, that is mainly because Intel and/or Raja have behaved so bullishly for the whole past 5 years.
And what Asterox linked to earlier (the story about when Intel exited their attempts at mobile) was hardly the first time Intel spent billions with nothing to show for it.
I can remember: networking (okay, they have some Ethernet, but back in the 1990s they were aiming for Cisco and Juniper), Larrabee, Itanium, Atom in mobiles, 5G modems, and probably more I've forgotten.
A comment at a hardware forum said he worked with Gelsinger before, and that he's very good at cutting cruft and focusing on the core competencies of the company he leads.
He said that there is more to get rid of and that they want to focus on "logic". We know they can't and won't abandon graphics completely, because of iGPUs. But are dGPUs a distraction in Gelsinger's eyes? Perhaps they are? Unless he's really talking about smaller units like the NUC.
Let's say the expansion cards go away entirely (including PV) and we get Falcon Shores on the server and post-Meteor Lake CPUs with massive GPU tiles on them.
Even on the server, if they perfect their tile approach, then instead of having multiple cards like they do now, they'd have a Falcon Shores chip with 15 GPU tiles and 1 CPU tile. And you'd be able to choose or customize whatever ratio you want, as long as the total is equal to or less than the top configuration. So you could get 4 GPU + 12 CPU.
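To make the arithmetic concrete, here is a purely hypothetical sketch (the 16-tile budget and the free CPU/GPU split are assumptions from the post above, not an actual Intel SKU list) enumerating the tile mixes that fit such a package:

```python
# Hypothetical sketch: enumerate CPU/GPU tile mixes that fit a fixed
# package budget, where the assumed top configuration from the post is
# 15 GPU tiles + 1 CPU tile (16 tiles total).
TILE_BUDGET = 16  # assumed total tiles per package


def valid_mixes(budget: int = TILE_BUDGET) -> list[tuple[int, int]]:
    """Return (gpu_tiles, cpu_tiles) pairs that use the whole budget,
    with at least one tile of each kind."""
    return [(gpu, budget - gpu) for gpu in range(1, budget)]


print(valid_mixes()[:3])          # [(1, 15), (2, 14), (3, 13)]
print((4, 12) in valid_mixes())   # True: the 4 GPU + 12 CPU mix from the post
```

Under these assumptions there are 15 possible mixes; the 4 GPU + 12 CPU example is just one point on that line.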
I can see this, however bandwidth will be an issue for very powerful client GPU tiles. Servers use HBM. Top end gamer will "always" be discrete, but this is not the main volume market anyhow. It's not a bad future for them if implemented well.
They hired everyone with GPU experience: marketing, driver writers, architects, QA/QC folks, etc. Then everyone waited for the architects to come up with a design. That took quite a while. Whatever they proposed, Intel fab engineers would tell them, "That's not how it's done here. No, we are not going to change our CPU-centric process flow to accommodate your needs". After much back and forth, they started understanding each other more and things started moving forward at a steady pace.

Meanwhile, Brian Krzanich and Raja Koduri spent a lot of time arguing. Raja wanted the GPU to be called KPU, or Koduri Pixel Unit. BK didn't relent. Raja slowed down the GPU development in retaliation. When the first successful tape-out happened, Raja shifted the goalposts and added raytracing and XeSS to the required feature list. Those took considerable time to develop.

Then Patty entered the picture as the new CEO. Raja stressed upon him his need to get a big ego boost from the "GPU to KPU" name change. Pat held him off for a while, telling him "soon". But when the first game ran successfully, Raja gave his ultimatum: KPU, or he would pull a Koduri and smash the only working engineering sample on the floor. He climbed on top of a table with the card in his hand, ensuring that the card could attain sufficient velocity to be rendered kaput on impact. Pat was in a pickle. After a standoff lasting several hours, he managed to get Raja down on the ground by promising him a very lucrative promotion. However, Raja was to drop any mention of KPU in the future. Raja resigned himself to his sad fate and lost interest in his job.

Lisa Pearce kept asking him for a sample so she could have her team validate their driver on final silicon. Raja gave her an early silicon sample. Lisa and her team spent a few months checking and re-checking their code, wondering what had gone wrong. Finally, Lisa took the heatsink off and saw something peculiar. She called up Raja and asked, "What's a KPU?". Raja made up an excuse, saying that he had forgetfully handed her the wrong sample. So in May 2022, Lisa FINALLY got her hands on the final silicon. The rest is history.

I still wonder what they actually did in the years right after announcing that work on dedicated GPUs was underway.
They don't have to serve the dGPU market directly. Also, Apple doesn't need GDDR6 to get the performance they need.
I am leaning in the direction that the MLID rumor is nonsense, to put it kindly, but Pat saying there are "more businesses to exit" makes me wonder what that is, and whether it's something big and impactful like GPUs or not. It could really be the peripheral portions like NUCs, because at one point it seemed like they wanted to be like Apple and build you an entire Intel PC, while doing it all horizontally by providing the parts.
I award you an Oscar for writing, and a Razzie for hammiest performance.
See any of the mining threads
Can we agree that gamer discrete graphics cards probably have the lowest silicon dollar margins for the big 3? Why would Intel, entering a time of difficulties, want this as a burden?
Anyone with a bit of common sense will realize that the financial benefits of a client gaming GPU are more than what's shown at face value. The ability to survive and thrive in a hyper-competitive market, with a product with demanding requirements in multiple disciplines (especially software and drivers), while selling it at consumer-level prices, will translate into a much better GPU in the datacenter as well.
Nope, that was true but is no longer: datacenter and gaming have diverged and are rapidly moving further apart. A few more years will make this accepted by all.
The reason the original 3DFX cards were so successful was that the 3D capabilities of the Voodoo cards, in the $300 price range, got close to what took $30K dedicated machines from SGI. Image-quality-wise they were close enough, and at 1/100 the price nobody cared.
So hypothetically, if you were a company selling video cards for such workstations, you would have been completely screwed. And indeed that's what happened, with companies like 3D Labs exiting the market altogether. Either you do client GPUs successfully, or all your dGPU efforts will die in the long run.
You can see from the conservative attitude of the enterprise world toward server chips that you won't get something fantastic by competing only in that market. They are totally fine with whatever they've got. That's why the server market takes longer for AMD to penetrate, for example. Design teams that make chips only for that market will end up with the same mentality.
They could abandon Alchemist, as in not really even try to sell it properly, but with Battlemage I do think they could have a chance in the low-to-mid-end segments.
I believe the decision to pull out of the consumer dGPU market may be temporary, caused by the fact that they found they would need (at least) one more stepping for Arc to work properly, and that would cause such cost and delay that introducing it to the market is simply not rational.
If they keep on working on GPUs for servers they may reintroduce GPUs for consumers in the future.
Reading some posts here makes it seem as though some think money can be conjured out of nothing. If true, please send me a PM on how to replicate this feat. Most appreciated.
The divergence is more in the specialization, optimizing the hardware further for the respective workloads. That's mainly pushed by AMD right now, as it allows a lot of area reduction and power and performance optimization, which in turn lets the designs scale out much more. Nvidia for the last couple of gens actually pushed server-first designs to high-end gaming, though I do expect them to diverge the hardware designs more as well in the coming gens.
Btw, wccftech's sources say the MLID story is FUD.
Intel failed with the memory controller and the rBAR dependence; apart from that, the hardware isn't even bad. The driver is bad: no native DX9 support, bad DX11 performance, and so on.
I was very amused by his way of asking for the public's trust: something about him being the one who leaked info about Arc in the first place, and this Arc cancellation news being somewhat detrimental to him, as his business supposedly relies on giving hints to people so they can eventually purchase upcoming products. Except that's NOT his business model: he relies on being the first to talk about anything, having that exclusive leak you MUST know about, no matter if it's good or bad. For this he needs hype, needs controversy, and he will create controversy when he smells blood in the water.

MLID has been absolutely hyping everything to the moon, which I think is the WORST trait of the press.