Sorry if this was actually stated/explained, but are Intel and AMD possibly co-developing GPUs? It seems really bizarre that they announce the partnership on the one chip (along with a licensing deal? The way it was touted, it sounded more like a licensing setup than a single specific product), and then Intel hires the head of AMD's graphics division (with seemingly no non-compete clause?). It seems to me that licensing RTG GPUs and then doing a co-development deal would serve them both well. It lets Intel get up to speed on GPUs faster, and it gives AMD more market share (helping to speed software development). Most of all, it lets them both take on Nvidia, with both offering compatibility for HPC/compute/etc., where going with an Intel or AMD CPU gets you tight integration with the GPU. Intel would still have advantages (Optane, for instance), AMD keeps tight integration on APUs, and Intel becomes less reliant on Nvidia. Meanwhile, each can tweak GPUs as they see fit going forward (maybe Intel focuses more on the pro stuff, or they each pick particular niches) while still keeping some compatibility. Short term, it helps them chip away at Nvidia's market share (in consumer for AMD GPUs, but also in pro, where Intel and AMD could both use leverage against Nvidia; potentially in mobile too, where I could see Intel pairing an updated Atom, or maybe an ARM core, or a bit of each, with a mobile-focused GPU based on RTG and an integrated Intel modem).
Oh, and Intel gets to manufacture GPUs on their process, potentially giving them an advantage (maybe higher clocks or certain other features, maybe some certified external GPU setups for laptops), while AMD gets flexibility (possibly with Intel even as a foundry partner?).
To me, it would seem to make a lot of shorter-term sense: they're competing but not stepping on each other too much, since both are fairly entrenched in certain markets/segments, and then they can see how things go. But I'm not sure there's much actually supporting any of that?
Maybe, with Windows soon to be running on ARM chips/Nvidia Tegra in laptops, Intel/AMD will need something to counter it? Just guessing.
I mean, a new super-fast 7-watt Tegra chip in a laptop running Windows 10 sounds good to me.
edit: I also agree with the mobile chip theory; Nvidia has a massive lead and Intel/AMD want a piece of that pie too.
Is that possible at this point? What I mean is, Microsoft is moving toward that, but I thought they said it requires extensive bridging of code or special compiling, so they're basically focusing on a specific chip (meaning it won't be open to all ARM designs, even ones that follow the ARM instruction set), which I believe is the Snapdragon (835?) to start with. Sure, they can probably expand, but I wouldn't say there's a guarantee that Microsoft will put in the work. And I think Intel already threw a fit about Microsoft and Qualcomm doing even that (which is, I think, why things have grown pretty quiet on that front), so I doubt they'd be pleased with Nvidia adding to it.
But it's been a while since I've seen anything on that, so maybe things have changed.
I'm not sure about a massive lead. Intel is making big progress on modems, which alone were enough to give Qualcomm a near monopoly in the more premium segment (even when Apple started doing their own SoCs, they still used Qualcomm modems). And Intel is making other changes (part of their big new GPU push is for IoT and mobile). Plus, Nvidia's custom designs have had serious issues, and they haven't updated their standard ARM-based ones in what, 2-3 years? Either they've got problems with their custom CPUs again, or they can't make a business case for them beyond specific niches, which would indicate the issues are much bigger than just getting a chip made.
The only lead that Nvidia has in mobile is their dGPU. They have compelling products, but few new wins. Switch is a big deal, but it's a closed ecosystem. They need to push more on Chromebooks (at the expense of Rockchip) and really get in bed with MS on their next attempted ARM port of Windows.
Nvidia has also rubbed a lot of companies the wrong way. Nintendo was about the only major tech/gadget company that had not yet worked with (and subsequently had a sour experience with) Nvidia. That partnership seems to be a good fit now, but we'll see, as both are notoriously guarded about licensing (read: $$$) issues, which is what usually soured companies on working with Nvidia (who wanted premium pricing for what they claimed was premium hardware, which early Tegras really did not live up to). I think the Switch deal was actually more or less a matter of convenience: Nvidia was the only company with an ARM design featuring a particularly strong GPU that Nintendo could buy, and they also had developer tools ready (developer support being a notorious Nintendo problem, one which definitely hampered the Wii U). AMD had stalled or cancelled their ARM chip, and Jaguar, even on the latest process, likely wouldn't have been able to offer the battery life that ARM does.
Some of those companies have worked with Nvidia again, even somewhat agreeably. Microsoft most notably: after being mad about the Xbox, they used Tegra in the Zune HD and in the non-Pro Surfaces, although that's likely because the two maintained the Windows/PC hardware relationship. It will be interesting to see how the Intel/AMD thing works out, as the Surface would seem to be a good fit for that hybrid Intel CPU/AMD GPU part, but it'll be almost another year before they update it, and by then things could change. Sony was mad after the PS3 (where they basically turned to Nvidia at the last moment when it became clear Cell couldn't outdo a GPU for game rendering, and then felt that Nvidia wasn't willing to deal on pricing). Apple seems to have really soured on Nvidia; I think a large part was "bumpgate", which hurt Apple's reputation and cost them money. Apple has used them since, but doesn't any more, even though Nvidia has put in software work to try to win Apple back and would seem to have an advantage over AMD in laptops with their better perf/W. I seem to recall Samsung being bitter about the Tegra 2 or 3 that was in one of the Galaxy S phones. And in spite of Nvidia specifically touting design wins, few products with Tegra actually materialized.
I'm not sure what's up with Tegra at this point. Nvidia was touting custom cores (and acting like they had caught Apple), but I haven't seen those chips come out (other than maybe in some premium cars, where I'm not sure they're actually doing what Nvidia claims yet). They've talked mostly about automotive since (though they also lost Tesla deals to Intel and AMD, so I'm not sure that's going all that well for them either). Not sure if they made progress on their LTE modem (another Tegra thing Nvidia hyped that seemed to be a dud). It seems like they need to focus: either roll with standard ARM CPUs to get compatibility and play up their GPU and its software support, or focus on making Tegra more premium (only it has to actually live up to that). Or they need to be willing to take lower margins to build some goodwill, but we saw Intel try that almost to the extreme, and it was a total failure.