News Intel GPUs - Battlemage officially announced, evidently not cancelled

Page 113 - AnandTech Forums

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Your text made it sound like it's TSMC's problem when it isn't. In this case TSMC not only can dictate the price as long as there is sufficient demand (the recent earnings call reinforced there is), it can also earn extra by collecting money for contract breaches while freeing up wafers for other paying customers.
If 2 of your biggest customers want to cut confirmed/locked orders, it isn't looking all that good. Even AMD & Apple reduced orders. Do you think what's affecting these big 4 doesn't apply to the smaller ones?
 

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
If 2 of your biggest customers want to cut confirmed/locked orders, it isn't looking all that good. Even AMD & Apple reduced orders. Do you think what's affecting these big 4 doesn't apply to the smaller ones?
I consider the reports that AMD & Apple reduced orders dubious. Nvidia was a big customer of Samsung and is just moving back to TSMC, having to pay big bucks for that. Intel was not really a big customer yet.

But who cares about a few select customers? As I wrote, TSMC itself, in the recent earnings call from July 14th, reinforced that it faces no lack of demand, quite the opposite.

"Despite the ongoing inventory correction, our customers' demand continues to exceed our ability to supply. We expect our capacity to remain tight throughout 2022 and our full year growth to be mid-30% in U.S. dollar terms."


Who do you believe?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,063
7,489
136
Frankly, I don't see AMD cutting orders. They're on good terms with TSMC, and given they're the second player in both their markets, they would be happy with either higher ASPs or greater market share.
 

gorobei

Diamond Member
Jan 7, 2007
3,713
1,067
136
During the WAN Show, they talked about the rumors of Intel killing off Arc:
might as well get the details out there:
  • linus is reacting to the MLID leak: upper management is getting conflicting messages from the different divisions and is worried they have another 10nm-process-debacle boondoggle on their hands with arc. arc and battlemage have problems that can't be solved in drivers and will take until Q3-ish 2023 to solve. members of the arc team went on a rogue pr tour that wasn't coordinated or authorized by the main intel pr division.
  • linus thinks this may be a play by the arc team to save the project from being axed by getting some hype in the public that will get them more time to solve the problems.

the reality is probably that this is all internal gossip from all the misc divisions inside intel, so killing gaming gpus is probably the hyperbolic corner case.

given how gpu tile integration into chiplet apus in server is the direction most are heading, i can't see intel abandoning consumer gaming graphics, since they would lose an opportunity to recoup some dev cost by selling more chips. but as linus points out: they are spending money at tsmc to fab this, and making silicon for the micro profit margins of gaming vs a server apu means they would be losing money, which is hard to justify to shareholders.
 

CakeMonster

Golden Member
Nov 22, 2012
1,428
535
136
With a source that bad, I feel dirty even discussing the 'leak'. I mean, knowing big corps like Intel, it wouldn't surprise me if they are constantly evaluating what to invest in (and making dumb short-term profit decisions). But there's definitely a need for Intel to be in the parallel computing business long term, and there's a need for a basic functioning iGPU long term that is better than what they had. So is the dGPU market so bad for them that they need to cut their losses ASAP before they even have realized much of the development and driver potential? I kind of doubt that.

Also, the parts about these guys going on a 'rogue' tour to try and save the project read like bad fanfic. The hardware community is supposed to be science-focused, and we should be more critical than this. Even if there are grains of truth in whatever is being discussed at Intel, there are still clown 'leakers' who IMO are not met with proper critical thinking and should not be acknowledged as much as they are.
 
Reactions: igor_kavinski

jpiniero

Lifer
Oct 1, 2010
14,839
5,456
136
So is the dGPU market so bad for them that they need to cut their losses ASAP before they even have realized much of the development and driver potential?

The whole point of doing this in the first place was to fill the fabs - their fabs, not TSMC's. If they won't use the fabs internally because of yields or competitiveness, there's no real point in continuing, since it will never be profitable. Especially now that AMD has gotten their act together.
 
Reactions: Tlh97 and maddie

biostud

Lifer
Feb 27, 2003
18,402
4,965
136
Does AMD have the flexibility to shift between GPU and CPU production as long as it is on the same node?
 

Leeea

Diamond Member
Apr 3, 2020
3,698
5,432
136
Not really following Apple closely, but why has it supposedly "flopped"? I always thought it was a success?
Which is why I put "flopped" in quotes.

So Apple claimed a m1 iGPU would beat a rtx3090. On the spec sheet, it has all the specs to do it. Memory bandwidth, execution units, etc. The transistors are there.


In reality? The $2,100 m1 pro released in 2022 loses to a gtx1060 laptop edition released in 2016.


Equivalent to a rtx3090? Not even in the same ballpark.


Power efficiency? The 15watt ryzen 5700u iGPU beats the 60watt m1 pro in civ 6. You can get laptops with a 5700u in them for $700.


People will go into reasons. The software, the drivers, poorly optimized pipeline, etc. Which is my point, does not matter who you are, you cannot wave the magic wand and have a competitive GPU.


It takes time, time to develop the software, time to develop the engineers, time to develop the hardware. Both Apple and Intel have not put in the time.
 
Last edited:

biostud

Lifer
Feb 27, 2003
18,402
4,965
136
Which is why I put "flopped" in quotes.

So Apple claimed a m1 iGPU would beat a rtx3090. On the spec sheet, it has all the specs to do it. Memory bandwidth, execution units, etc. The transistors are there.


In reality? The $2,100 m1 pro released in 2022 loses to a gtx1060 laptop edition released in 2016.


Equivalent to a rtx3090? Not even in the same ballpark.


Power efficiency? The 15watt ryzen 5700u iGPU beats the 60watt m1 pro in civ 6. You can get laptops with a 5700u in them for $700.


People will go into reasons. The software, the drivers, poorly optimized pipeline, etc. Which is my point, does not matter who you are, you cannot wave the magic wand and have a competitive GPU.


It takes time, time to develop the software, time to develop the engineers, time to develop the hardware. Both Apple and Intel have not put in the time.

What I don't really understand is why it is so important to compare it with a high-end PC gaming card. If you buy an Apple product, gaming is most likely not your primary, secondary or tertiary reason.
 
Reactions: Mopetar and Leeea

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Does AMD have the flexibility to shift between GPU and CPU production as long as it is on the same node?
AFAIK, AMD and others order wafers. What they do with them is their own decision. If you have several designs that use the same process, why would you think they can't switch masks/designs as needed? There would be a minimum batch size for optimum costs, as you don't want to waste time/money switching them too quickly, but otherwise, it's up to you.
 
Reactions: biostud

moinmoin

Diamond Member
Jun 1, 2017
4,994
7,765
136
That allows for quite some flexibility. Do you know if console chips are ordered by AMD or by Sony and Microsoft?
AMD has to place the orders due to the x86 licensing, but AMD can't change the orders its semi-custom customers asked it to make unless the contracts are changed accordingly.
 

DrMrLordX

Lifer
Apr 27, 2000
21,806
11,161
136
Does AMD have the flexibility to shift between GPU and CPU production as long as it is on the same node?

Basically. I think it's more accurate to say that they can order what they want on a per-wafer basis (in batches, of course). As @moinmoin indicated, they are still restricted by contractual obligations with customers.
 

Leeea

Diamond Member
Apr 3, 2020
3,698
5,432
136
What I don't really understand is why it is so important to compare it with a high-end PC gaming card. If you buy an Apple product, gaming is most likely not your primary, secondary or tertiary reason.
Apple was the one who made that comparison first.

Why did Apple feel it was so important to compare it with a high-end PC gaming card?


If you buy an Apple product, gaming is most likely not your primary, secondary or tertiary reason.
Apparently Apple feels differently.
 
Reactions: xpea

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Which is why I put "flopped" in quotes.
So Apple claimed a m1 iGPU would beat a rtx3090. On the spec sheet, it has all the specs to do it. Memory bandwidth, execution units, etc. The transistors are there.
In reality? The $2,100 m1 pro released in 2022 loses to a gtx1060 laptop edition released in 2016.
Equivalent to a rtx3090? Not even in the same ballpark.
Power efficiency? The 15watt ryzen 5700u iGPU beats the 60watt m1 pro in civ 6. You can get laptops with a 5700u in them for $700.
People will go into reasons. The software, the drivers, poorly optimized pipeline, etc. Which is my point, does not matter who you are, you cannot wave the magic wand and have a competitive GPU.
It takes time, time to develop the software, time to develop the engineers, time to develop the hardware. Both Apple and Intel have not put in the time.

Ok, a few clarifications. The games you are comparing are being EMULATED. Very few games run natively on ARM-powered Macs at this point. Comparing games that are being emulated is a joke.

As for Apple's claim, it was in specific, cherry-picked use cases using the Metal API. And they are technically true.

Your claim of the M1 being a flop is wrong. The M1 is a huge improvement over the Intel-powered models. Of every laptop I have owned, and currently own (including a $3K Dell Precision for work), my M1 MBP is the fastest laptop I have ever used. And unlike the Intel-powered Precision, it won't burn my lap (or even get warm, really).
 

biostud

Lifer
Feb 27, 2003
18,402
4,965
136
AFAIK, AMD and others order wafers. What they do with them is their own decision. If you have several designs that use the same process, why would you think they can't switch masks/designs as needed? There would be a minimum batch size for optimum costs, as you don't want to waste time/money switching them too quickly, but otherwise, it's up to you.
I have no idea how contracts and production in semiconductors work, or if the 5nm production used for the CPU and GPU is interchangeable contract-wise, so I was curious if it was possible to balance between the two.
 

Leeea

Diamond Member
Apr 3, 2020
3,698
5,432
136
Ok, a few clarifications. The games you are comparing are being EMULATED. Very few games run natively on ARM-powered Macs at this point. Comparing games that are being emulated is a joke.
My claim was it takes years of time invested to get the software support.

Clearly, if Apple is emulating their competitors' systems, they do not have said support. They have not taken the time to create the ecosystem, and thereby their product is inferior.

Your claim that "very few games run natively on ARM-powered Macs" supports my claim.


If emulating games is a joke, and that is all the m1 has support for with the vast majority of titles, does that not make the m1 a joke?


As for Apple's claim, it was in specific, cherry-picked use cases using the Metal API. And they are technically true.
Care to provide a cherry picked 3rd party example of a game on Apple that outperforms a rtx 3090?

Your claim of M1 being a flop is wrong. The M1 is a huge improvement over Intel powered models. Of every laptop I have owned, and currently own (including a $3K Dell Precision for work), my M1 MBP is the fastest laptop I have ever used. And unlike the Intel powered Precision, it won't burn my lap (or even get warm really).
I claimed Apple's GPU flopped at meeting its marketing claims. Nothing in your quoted statement indicates otherwise.
 
Last edited:

biostud

Lifer
Feb 27, 2003
18,402
4,965
136
Apple was the one who made that comparison first.

Why did Apple feel it is so important to compare it with a high end PC gaming card?



Apparently Apple feels differently.
I can't remember, but wasn't the benchmark mostly targeted towards content creation, and not games? Or was it just the arbitrary "x2.4 more powerful" kind of thing?
 

Leeea

Diamond Member
Apr 3, 2020
3,698
5,432
136
I can't remember, but wasn't the benchmark mostly targeted towards content creation, and not games? Or was it just the arbitrary "x2.4 more powerful" kind of thing?
It was a stupid claim in a pre-release presentation, claiming to be faster than all existing discrete GPUs. At the time, the rtx3090 was the fastest discrete GPU. It made headlines.

On paper they should have the performance, so perhaps they did think at that point in time they could deliver on the claim.



My point is that GPUs are hard and companies like Intel and Apple should be prepared for the long haul if they plan on making competitive GPU products.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
I have no idea how contracts and production in semiconductors work, or if the 5nm production used for the CPU and GPU is interchangeable contract-wise, so I was curious if it was possible to balance between the two.
OK, I wondered if you knew something more specific.
 

Frenetic Pony

Senior member
May 1, 2012
218
179
116
That allow for quite some flexibility. Do you know if console chips are ordered by AMD or Sony and Microsoft?

AMD orders them for their console partners, which is beneficial to everyone as that means AMD gets more volume and lower prices for all its chips and Sony/MS don't have to bother with negotiations where they'd be at a disadvantage anyway.

Anyway, I wouldn't be surprised if GPUs were on the chopping block. The latest Intel numbers were below guidance, and despite the fact that they're getting bailed out by US taxpayers to the tune of billions of dollars, shareholders might still not be happy. Sacrificing something like GPUs, a hypothetical long-term prospect and a low-margin project, could appease them if it comes to Gelsinger being in any danger of being ousted.
 