News Intel GPUs - Battlemage officially announced, evidently not cancelled


maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Go count how many negative quarters AMD has had in the last 20 years. A negative quarter is not a valid reason to abandon a critical part of the business.

Back in 2015 when things were going badly for AMD there was a rumor that they were going to spin off the GPU business:

I remember saying back then that it was nonsense, since the GPU business was critical to AMD (and it was).

You could easily have made the same financial argument, that AMD couldn't afford its poorly performing GPU business that was dragging it down. But likely that wasn't being considered, as GPU is a critical part of the modern CPU business.

It's going to take more than a bad quarter and nonsense from MLID to convince me that Intel is abandoning the discrete GPU business. Especially when Intel themselves are saying the opposite.

I find it hilarious that people keep believing sketchy rumors contradicted by the companies involved, and when those rumors completely fail to pan out, they jump right to believing the next sketchy rumor to come along.
Intel themselves have predicted several more quarters of declining market share (until 2024). This can be considered a best case.
 

PingSpike

Lifer
Feb 25, 2004
21,733
565
126
Go count how many negative quarters AMD has had in the last 20 years. A negative quarter is not a valid reason to abandon a critical part of the business.

Back in 2015 when things were going badly for AMD there was a rumor that they were going to spin off the GPU business:

I remember saying back then that it was nonsense, since the GPU business was critical to AMD (and it was).

You could easily have made the same financial argument, that AMD couldn't afford its poorly performing GPU business that was dragging it down. But likely that wasn't being considered, as GPU is a critical part of the modern CPU business.

It's going to take more than a bad quarter and nonsense from MLID to convince me that Intel is abandoning the discrete GPU business. Especially when Intel themselves are saying the opposite.

I find it hilarious that people keep believing sketchy rumors contradicted by the companies involved, and when those rumors completely fail to pan out, they jump right to believing the next sketchy rumor to come along.

I agree generally (I don't think Intel should give up on GPUs, because in the long term it may kill them), but AMD couldn't abandon the GPU business even in their darkest hours. The console business was the life support system they were living off of back then, and that required at least a nominal continuance of the GPU business. And if you look back, they didn't really invest as much into GPUs as they should have. Because they were broke. Intel doesn't have the same situation, since their iGPUs are basically a product dumping strategy for them.
 
Reactions: Tlh97 and maddie

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Intel doesn't have the same situation, since their iGPUs are basically a product dumping strategy for them.

A commenter at a hardware forum said he had worked with Gelsinger before and that he's very good at cutting cruft and focusing on the core competencies of the company he leads.

He said that there is more to get rid of and that they want to focus on "logic". We know they can't and won't abandon graphics completely because of iGPUs. But are dGPUs a distraction in Gelsinger's eyes? Perhaps they are, unless he's really talking about smaller units like the NUC.

Let's say the expansion cards go away entirely (including PV) and we get Falcon Shores on the server and post-Meteor Lake CPUs with massive tiled GPUs on them.

Even on the server, if they perfect their tile approach, then rather than having multiple cards like they do now, they'll have a Shores chip with 15 GPU tiles and 1 CPU tile. And you'd be able to choose or customize whatever ratio you want, as long as the total is equal to or less than the top configuration. So you could get 4 GPU + 12 CPU.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
While there certainly is an element of Schadenfreude in some responses, that is mainly because Intel and/or Raja have been so bullish for the whole of the past 5 years.

And what Asterox linked to earlier (the story about when Intel exited their attempts at mobile) was hardly the first time Intel spent $billions with nothing to show for it.

I can remember: networking (okay, they have some Ethernet, but back in the 1990s they were aiming for Cisco and Juniper), Larrabee, Itanium, Atom in mobiles, 5G modems, and probably more I've forgotten.
A big one that seems to have deeply pissed off the server guys is Optane / 3D XPoint. Several years spent validating it, and then it was yanked. I see this as a very big issue in light of AMD's dominance in server. Why would you take Intel's promises & projections seriously anymore? It'll be their "show me the money" moment.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
A commenter at a hardware forum said he had worked with Gelsinger before and that he's very good at cutting cruft and focusing on the core competencies of the company he leads.

He said that there is more to get rid of and that they want to focus on "logic". We know they can't and won't abandon graphics completely because of iGPUs. But are dGPUs a distraction in Gelsinger's eyes? Perhaps they are, unless he's really talking about smaller units like the NUC.

Let's say the expansion cards go away entirely (including PV) and we get Falcon Shores on the server and post-Meteor Lake CPUs with massive tiled GPUs on them.

Even on the server, if they perfect their tile approach, then rather than having multiple cards like they do now, they'll have a Shores chip with 15 GPU tiles and 1 CPU tile. And you'd be able to choose or customize whatever ratio you want, as long as the total is equal to or less than the top configuration. So you could get 4 GPU + 12 CPU.
I can see this; however, bandwidth will be an issue for very powerful client GPU tiles. Servers use HBM. The top-end gamer market will "always" be discrete, but that is not the main volume market anyhow. It's not a bad future for them if implemented well.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I can see this; however, bandwidth will be an issue for very powerful client GPU tiles. Servers use HBM. The top-end gamer market will "always" be discrete, but that is not the main volume market anyhow. It's not a bad future for them if implemented well.

They don't have to serve the direct dGPU market. Also Apple doesn't need GDDR6 to get the performance they need.

I am leaning in the direction that the MLID rumor is nonsense, to put it kindly, but Pat saying there are "more businesses to exit" makes me wonder what those are, and whether they're big and impactful like GPUs or not. It could really be the periphery portions like NUCs, because at one point it seemed like they wanted to be like Apple and make you an Intel PC while doing it all horizontally by providing the parts.
 
Jul 27, 2020
17,923
11,691
116
I still wonder what they actually did in the years right after announcing that work on dedicated GPUs was underway.
They hired everyone with GPU experience: marketing, driver writers, architects, QA/QC folks etc. Then everyone waited for the architects to come up with a design. That took quite a while. Whatever they proposed, Intel fab engineers would tell them, "That's not how it's done here. No, we are not going to change our CPU-centric process flow to accommodate your needs". After much back and forth, they started understanding each other more and things started moving forward at a steady pace.

Meanwhile, Brian Krzanich and Raja Koduri spent a lot of time arguing. Raja wanted the GPU to be called KPU, or Koduri Pixel Unit. BK didn't relent. Raja slowed down the GPU development in retaliation. When the first successful tape-out happened, Raja shifted the goalposts and added raytracing and XeSS to the required feature list. Those took considerable time to develop.

Then Patty entered the picture as the new CEO. Raja stressed upon him his need to get a big ego boost from the "GPU to KPU" name change. Pat held him off for a while, telling him "soon". But when the first game ran successfully, Raja gave his ultimatum: KPU, or he would pull a Koduri and smash the only working engineering sample on the floor. He climbed on top of a table with the card in his hand, ensuring that the card could attain sufficient velocity to be rendered kaput on impact. Pat was in a pickle. After a standoff lasting several hours, he managed to get Raja down on the ground by promising him a very lucrative promotion. However, Raja was to drop any mention of KPU in the future. Raja resigned himself to his sad fate and lost interest in his job.

Lisa Pearce kept asking him for a sample so she could have her team validate their driver on final silicon. Raja gave her an early silicon sample. Lisa and her team spent a few months checking and re-checking their code, wondering what had gone wrong. Finally, Lisa took the heatsink off and saw something peculiar. She called up Raja and asked, "What's a KPU?". Raja made up an excuse, saying that he had forgetfully handed her the wrong sample. So in May 2022, Lisa got her hands on the final silicon, FINALLY. The rest is history.
 

Asterox

Golden Member
May 15, 2012
1,028
1,786
136
They don't have to serve the direct dGPU market. Also Apple doesn't need GDDR6 to get the performance they need.
I am leaning in the direction that the MLID rumor is nonsense, to put it kindly, but Pat saying there are "more businesses to exit" makes me wonder what those are, and whether they're big and impactful like GPUs or not. It could really be the periphery portions like NUCs, because at one point it seemed like they wanted to be like Apple and make you an Intel PC while doing it all horizontally by providing the parts.

What if someone at Intel forgot the key to a certain "problematic output or business" exit door?

 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,818
21,573
146
People at Intel proposing to take AMD's market position in GPUs over 5 years ago was a perfectly reasonable goal. We all know what shape AMD was in at the time. Now, in late '22, with how things have gone since? It sounds absurd.
 

gorobei

Diamond Member
Jan 7, 2007
3,713
1,067
136
They hired everyone with GPU experience: marketing, driver writers, architects, QA/QC folks etc. Then everyone waited for the architects to come up with a design. That took quite a while. Whatever they proposed, Intel fab engineers would tell them, "That's not how it's done here. No, we are not going to change our CPU-centric process flow to accommodate your needs". After much back and forth, they started understanding each other more and things started moving forward at a steady pace.

Meanwhile, Brian Krzanich and Raja Koduri spent a lot of time arguing. Raja wanted the GPU to be called KPU, or Koduri Pixel Unit. BK didn't relent. Raja slowed down the GPU development in retaliation. When the first successful tape-out happened, Raja shifted the goalposts and added raytracing and XeSS to the required feature list. Those took considerable time to develop.

Then Patty entered the picture as the new CEO. Raja stressed upon him his need to get a big ego boost from the "GPU to KPU" name change. Pat held him off for a while, telling him "soon". But when the first game ran successfully, Raja gave his ultimatum: KPU, or he would pull a Koduri and smash the only working engineering sample on the floor. He climbed on top of a table with the card in his hand, ensuring that the card could attain sufficient velocity to be rendered kaput on impact. Pat was in a pickle. After a standoff lasting several hours, he managed to get Raja down on the ground by promising him a very lucrative promotion. However, Raja was to drop any mention of KPU in the future. Raja resigned himself to his sad fate and lost interest in his job.

Lisa Pearce kept asking him for a sample so she could have her team validate their driver on final silicon. Raja gave her an early silicon sample. Lisa and her team spent a few months checking and re-checking their code, wondering what had gone wrong. Finally, Lisa took the heatsink off and saw something peculiar. She called up Raja and asked, "What's a KPU?". Raja made up an excuse, saying that he had forgetfully handed her the wrong sample. So in May 2022, Lisa got her hands on the final silicon, FINALLY. The rest is history.
I award you an Oscar for writing, and a Razzie for hammiest performance.
 
Reactions: igor_kavinski

DrMrLordX

Lifer
Apr 27, 2000
21,805
11,159
136
See any of the mining threads

Amusing given the subject of this thread. Also Intel DOES manufacture their own BTC mining ASICs now.

Can we agree that gamer discrete graphics card GPUs probably have the lowest silicon $ margins for the big 3? Why would Intel, entering a time of difficulties, want this as a burden?

As opposed to what? High-end gamer dGPUs have very nice profit margins. Maybe not compared to enterprise graphics/AI accelerators, but still.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Can we agree that gamer discrete graphics card GPUs probably have the lowest silicon $ margins for the big 3? Why would Intel, entering a time of difficulties, want this as a burden?

Anyone with a bit of common sense will realize that the financial benefits of a client gaming GPU are more than just what's shown at face value. The ability to survive and thrive in a hyper-competitive market, with a product with demanding requirements in multiple disciplines (especially software and drivers), while selling it at consumer-level prices, will translate into a much better GPU in the datacenter as well.

The reason the original 3DFX cards were so successful was that the 3D capabilities of the Voodoo cards, in the $300 price range, were able to get close to what took $30K dedicated machines from SGI! Image-quality-wise they were close enough, and at 1/100 the price no one cared.

So hypothetically, if you were a company selling video cards for such workstations, you would have been completely screwed. And indeed that's what happened, with companies like 3D Labs exiting the market altogether. Either you do client GPUs successfully, or all dGPU efforts will die in the long run.

You can see from the conservative attitude of the enterprise world with server chips that you won't get something fantastic by competing in that market. They are totally fine with whatever they've got. That's why servers take a longer time for AMD to penetrate, for example. The design teams that make the chips for them will have the same mentality.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,259
136
Anyone with a bit of common sense will realize that the financial benefits of a client gaming GPU are more than just what's shown at face value. The ability to survive and thrive in a hyper-competitive market, with a product with demanding requirements in multiple disciplines (especially software and drivers), while selling it at consumer-level prices, will translate into a much better GPU in the datacenter as well.

Another thing is to capture more laptop revenues. Probably as many dGPU chips end up in laptops as in cards.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Anyone with a bit of common sense will realize that the financial benefits of a client gaming GPU are more than just what's shown at face value. The ability to survive and thrive in a hyper-competitive market, with a product with demanding requirements in multiple disciplines (especially software and drivers), while selling it at consumer-level prices, will translate into a much better GPU in the datacenter as well.

The reason the original 3DFX cards were so successful was that the 3D capabilities of the Voodoo cards, in the $300 price range, were able to get close to what took $30K dedicated machines from SGI! Image-quality-wise they were close enough, and at 1/100 the price no one cared.

So hypothetically, if you were a company selling video cards for such workstations, you would have been completely screwed. And indeed that's what happened, with companies like 3D Labs exiting the market altogether. Either you do client GPUs successfully, or all dGPU efforts will die in the long run.

You can see from the conservative attitude of the enterprise world with server chips that you won't get something fantastic by competing in that market. They are totally fine with whatever they've got. That's why servers take a longer time for AMD to penetrate, for example. The design teams that make the chips for them will have the same mentality.
Nope, that was true, but no longer: datacenter and gaming have diverged and are rapidly moving further apart. A few more years will make this accepted by all.
 

Kocicak

Senior member
Jan 17, 2019
982
973
136
I believe that the decision to pull out of the consumer dGPU market may be temporary, caused by the fact that they found they would need (at least) one more stepping for Arc to work properly, and that would cause such cost and delay that introducing it to the market is simply not rational.

If they keep on working on GPUs for servers they may reintroduce GPUs for consumers in the future.
 
Reactions: Tlh97 and Vattila

Tup3x

Golden Member
Dec 31, 2016
1,011
1,001
136
I'm not suggesting they will abandon the IGPs at all. Just the dGPUs.
They could abandon Alchemist, as in not really even try to sell it properly, but with Battlemage I do think that they could have a chance in the low- to mid-end segments.
 

mikk

Diamond Member
May 15, 2012
4,173
2,210
136
I believe that the decision to pull out of the consumer dGPU market may be temporary, caused by the fact that they found they would need (at least) one more stepping for Arc to work properly, and that would cause such cost and delay that introducing it to the market is simply not rational.

If they keep on working on GPUs for servers they may reintroduce GPUs for consumers in the future.


Arc Alchemist is coming, although it could be a very limited launch. Apart from the ASRock Challenger A380 there is nothing; no sign of OEM versions of the A750/A770 shortly before they launch. Maybe it's mainly Intel's Limited Editions that are coming. Even the ASRock A380 was limited: it sold out after a few days on Newegg, has been out of stock since, and I have to wonder whether it will come back outside of Asia. And now Newegg has to sell the ASRock imported from China, which is much more expensive; it's bizarre. OEMs are not really interested, apparently.

Intel failed with the memory controller and the rBAR dependence; apart from that the hardware isn't even bad. The driver is bad: no native DX9 support, bad DX11 performance, and so on.

Btw wccftech sources say the MLID story is FUD.

 
Last edited:
Reactions: Ranulf and psolord

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,692
136
Reading some posts here makes it seem as though some think money can be conjured out of nothing. If true, please send me a PM on how to replicate this feat. Most appreciated.

The secret is making them breed. You do this by providing optimal financial conditions. That's all there is to it.

You also need some to breed them, of course.
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
Nope, that was true, but no longer: datacenter and gaming have diverged and are rapidly moving further apart. A few more years will make this accepted by all.
The divergence is more in the specialization, optimizing hardware further for the workloads. That's mainly pushed by AMD right now, as it allows a lot of space reduction and power and performance optimization, which in turn enables the designs to scale out much more. Nvidia for the last couple of gens actually pushed server-first designs into high-end gaming, though I do expect them to diverge the hardware designs more as well in the coming gens.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Nope, that was true, but no longer: datacenter and gaming have diverged and are rapidly moving further apart. A few more years will make this accepted by all.

That doesn't change the fundamentals, which are HUMAN in nature. Client is the harder of the two to succeed in, therefore the team will need to do better. That's why even the military uses commercial chip vendors for their projects. Volume is the key to success in these projects, and servers do not have it.

Btw wccftech sources say the MLID story is FUD.

LOL. You know what, I am inclined to believe them. MLID has been absolutely hyping everything to the moon, which I think is the WORST trait of the press.

We will see.

Intel failed with the memory controller and the rBAR dependence; apart from that the hardware isn't even bad. The driver is bad: no native DX9 support, bad DX11 performance, and so on.

Yea, so they didn't just port the integrated drivers to Arc but the hardware as well. Not surprising. They've been working on iGPUs for most of their entire careers. The mindset would have been entrenched, and they didn't foresee what was required.
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
6,395
12,827
136
MLID has been absolutely hyping everything to the moon, which I think is the WORST trait of the press.
I was very amused by his way of asking for the public's trust: something about him being the one who leaked info about Arc in the first place, and this Arc cancellation news being somewhat detrimental to him, as his business relies on giving hints to people so they can eventually purchase upcoming products. Except that's NOT his business model: he relies on being the first to talk about anything, on having that exclusive leak you MUST know about, no matter if it's good or bad. For this he needs hype, he needs controversy, and he will create controversy when he smells blood in the water.

Personally I think there's a high chance that MLID got "adored" with this piece of news, that is, if the sources are actual people and not simple props. I really doubt some Intel execs would put their financial security on the line just to entertain themselves by leaking such delicate info to a youtuber. It's one thing to leak high-level architectural info about upcoming products and generate some hype; it's completely another thing to leak information that could potentially affect the stock price of the company that employs you.
 