The article says something about "Display processors" supporting ARM. Does that mean AMD will develop video cards supporting ARM?
Without Meyer, what will AMD do next?
ARM Micro Devices
By Timothy Prickett Morgan
Posted in PCs & Chips, 12th January 2011 23:16 GMT
Analysis Things seemed poised to turn around for AMD in 2011. But the abrupt departure of CEO Dirk Meyer on Monday afternoon – at the exact same time that rivals Intel and Nvidia ceased their hostilities and a week after Nvidia jumped into the processor racket – indicates that AMD's board of directors sees challenges that aren't obvious to outsiders.
And to at least one ex-insider who is now looking for a job.
The chatter is that Meyer was shown the door at AMD because the chip maker had missed the boat on smartphones and tablets, markets that are growing a lot faster (in terms of units) than traditional CPUs and GPUs. As Nvidia president and CEO Jen-Hsun Huang said at the launch of his company's Tegra 2 system-on-a-chip last week at the Consumer Electronics Show, for a lot of people, their smartphone is "their most personal computer."
As El Reg previously reported, Nvidia has licensed the Cortex-A15 processor design from ARM Holdings – the company that controls the ARM RISC chip that dominates the smartphone and handheld market – and is set to announce its own multicore ARM processor, code-named Project Denver, in 2013 alongside its Maxwell family of GPU chips.
The landscape of the computing market is changing fast, and some of that shaking and quaking is due to AMD itself. AMD bought ATI Technologies in July 2006 for $5.4bn to get its hands on the GPU and chipset businesses that would help it become a platform player like rival Intel. And in March 2009, AMD abandoned its roots and chucked its chip fabs onto the shoulders of GlobalFoundries, along with rich Abu Dhabi backers who fancy controlling a business not based on oil. (AMD founder Jerry Sanders once famously observed that "real men have fabs," snarling at rivals who used third parties to bake their chips.)
Under Dirk Meyer, AMD survived the economic downturn (albeit not unscathed), dealt with product delays and changing roadmaps for standalone processors and converged CPU-GPU hybrids, and suffered through a massive platform upgrade last year with the Opteron 4100 and 6100. AMD even faced down a resurgent Intel's Nehalem family of Core and Xeon chips in 2009 and Westmere and Nehalem-EX desktop and server processors in 2010.
AMD's Fusion line of PC chips is now ready to face Intel's Sandy Bridge 2nd Generation Cores in 2011, and its Opterons arguably beat Intel's chips in terms of price/performance and performance/watt. GlobalFoundries is apparently planning to double its spending on chip-making factories and equipment in 2011, hitting $5.7bn. Oh, and AMD buried the legal hatchet with Intel back in November 2009, raking in $1.25bn and curbing some of Intel's bad behavior.
Shouldn't Meyer have been secure in his job after that hard slog over the past few years? Apparently not.
Rather than crediting those positive developments, AMD's board focused on what Meyer didn't get right. For example, AMD chips are not the CPUs in game consoles. And worse yet, AMD chips are not in cell phones, smartphones, or tablets. AMD has somewhere between 11 and 12 per cent of global microprocessor revenues, compared to Intel's roughly 80 per cent, and it peaked in the server racket a few years back when the Opterons were so much better than their Intel Xeon competition.
The problem is that while AMD was busy cleaning up its books, going fabless, and getting its processor roadmap back in order, a slew of other products – netbooks, tablets, e-readers, and truly smart phones – changed the market. These areas are growing while PCs and servers are losing steam. To be sure, there are hundreds of millions of PC and server chips being sold each year, and that will be true for as far as any of us can see. But today there are billions of other chips being consumed, and AMD can ill afford to ignore that fact.
AMD has the Geode LX low-powered chip, which it bought from National Semiconductor in 2003, and the company could have long since created an x64 alternative to the Athlon, Turion, or Opteron for devices such as tablets. If these Geode chips were inadequate, AMD could have partnered with or acquired VIA Technologies, another maker of low-powered x64 chips, and created something that could have taken on Intel's Atom and various ARM chips for small computing devices.
But AMD – or Meyer, it seems – apparently thought the company was on the right track in the embedded processor space with its low-powered Opterons for hyperscale server clusters and the combination of the Athlon II X2 embedded processor and the Radeon HD 6700M for high-end devices such as the Surface 2.0 touch desktop from Samsung and Microsoft, which debuted at CES 2011.
AMD just released its Fusion C-Series low-power APUs (accelerated processing units), and its Fusion G-Series embedded APUs are soon to hit the streets. Both are based on the company's new low-power Bobcat CPU cores – AMD's first new x86 core since 2003 – and both feature an integrated Radeon-based GPU core on the same silicon. While these CPU-GPU APUs are impressive, it's unlikely you'll see them in a smartphone – perhaps a tablet, but we'll have to wait and see.
To play the CPU-GPU limbo game, AMD could get under the power consumption broomstick in a much simpler way. The most obvious thing would be to do the same math that Nvidia did many years ago, see that at least some of the "computing" market was shifting to ARM processors, and license the Cortex designs from ARM Holdings. That way, AMD could come up with something new and interesting, like Nvidia is trying to do with the Denver multicore ARM chips for servers and their related Maxwell GPUs.
But the ARM market is rapidly getting crowded, with Calxeda and Nvidia now chasing servers, Nvidia chasing PCs, and a slew of companies – including Apple, Nvidia, Marvell, Qualcomm, Texas Instruments, and Samsung – making ARM-derived chips for netbooks and smartphones.
For the PC and server markets, ARM chips won't be fully suitable until they have higher clock speeds, more cores, and more memory and I/O capacity. So it's not too late for the new CEO at AMD to jump into the ARM fray, or to acquire someone who is already there. Considering that Marvell now has a market capitalization of $13.2bn and Nvidia is at just under $12bn, such acquisitions seem unlikely. And it's hard to imagine that Samsung, Texas Instruments, or Apple would want to buy AMD – although an Apple acquisition of AMD and a subsequent push into ARM chips would certainly be an interesting development.
An ARM license would not only be the cheapest alternative for AMD, but perhaps the only alternative – short of merging with Intel. (Wouldn't that be funny, watching Intel argue with US and EU antitrust authorities that the relevant market includes ARM devices, and that it doesn't have a monopoly on microprocessors?) An ARM license would put AMD in competition with lots of aggressive chip makers that peddle high-volume, low-margin chips, while at the same time Intel and AMD would fight the converged CPU-GPU wars on the desktop and in the laptop.
It's hard to see what AMD's next CEO will do – but it's pretty clear that it won't be an easy job.
This author believes AMD will design ARM SoCs, since he argues an ARM license would be their cheapest option.
I do know that AMD (like Nvidia) has the advantage of fast access to the half node (20/28nm) for its GPUs, because it designs and sells high-profit discrete cards. IMHO this should help them (as well as Nvidia) compete against other players wanting to field top-end SoCs for emerging software platforms.
http://www.theregister.co.uk/2011/01/12/what_does_amd_do_now/
But in terms of his strategic vision, I think a fair-minded person would have to give him and his team an F, for shutting themselves out of the markets with the most potential growth and basically putting the company on a perilous path a few years from now.
I generally agree with your entire post, though there is another way of looking at it. AMD has what, 10-15% market share in x86? The x86 market is huge and, even in the most optimistic ARM uptake scenario, is not going away anytime soon; given there are only two real players, AMD has huge growth potential, at least in theory. So there is something to be said for concentrating their efforts on x86.
OTOH, AMD's problem is the same it has been for the last 20+ years: it's pretty damn tough competing with Intel on equal ground. Not that AMD doesn't have excellent engineers, but just look at the R&D budgets. Intel can afford a plan B, C, D and, if those fail, still go ahead with plan E. Even if plan E is rubbish, they still make more money on it than AMD can from its most ingenious design. If AMD hits a single snag, they are in deep, deep trouble. Think of Barcelona and how it impacted AMD, and compare it to Intel's multi-billion-dollar failures like Itanium or Larrabee, which don't seem to impact their results one tiny bit. Or the other way around: look how relatively little money AMD made those few times when it actually had better products for brief periods, like the Athlon against the P3 or the K8 against late P4s. AMD still bled money more often than not.
Moreover, the dynamics of this market are against AMD; the cost of developing new chips (and manufacturing processes) skyrockets, so whatever disadvantage they already had over the past decades is only getting bigger.
That's why I agree they should have pursued alternative market opportunities when they could afford it, like Nvidia did rather cleverly over the last few years with Tesla, Tegra, and now Project Denver. None of these are making them a lot of money yet, but the potential is there and the risk is spread. AMD, OTOH, keeps betting the farm on that same old horse, which for the past decades hasn't really won races very often. A risky strategy, to put it mildly.
Now the kicker:
If it were simply about small-form-factor embedded devices, then this shift in the semiconductor industry would take the better part of a decade, if not longer. But a series of recent events, culminating in the MS + ARM announcement at CES, demonstrated that not only is the PC market overall being curbed by the growth of these embedded platforms, but the x86 ecosystem's share of the PC market itself will be threatened. Soon there will be serious challenges from ARM and others in segments of x86's core markets.
Given that applications are rapidly transforming into web services, that server virtualization is marching inevitably toward the cloud, and that the dominant client OS has now been ported to an alternative ISA, the niches where x86 can hide will diminish greatly in the next several years. Neither the client side nor the server side will be a safe haven for x86 systems even in the medium term; they will have to compete on merit, especially performance per watt.
So in this sense, most of us have anticipated an eventual shift to ARM and the like, but very few could have predicted that the pace of this shift would speed up so much. Nvidia has hedged in the right direction: even with its main business lagging, its future is reasonably secure, and it will be able to go toe to toe with the ARM design houses. Intel still stubbornly clings to its strategy (we'll see if that persists), and given its sheer size and resources, it can put up stiff resistance to ARM encroachment. AMD, on the other hand, has neither a hedge nor those resources, and may become fish food in five years' time, for the reasons I have outlined above. That's why the BoD made this decision, and that's why it had to be now.
That is very true, but it does not solve their underlying problem in terms of the architectural design they have been shooting for. And that's not anything you can change outside a 3-5 year development cycle.
A ground-up new ARM design, trying not to infringe on ARM's IP: ~5-6 years.
A licensed core with their own custom IP integrated: ~1-2 years.
Intel has chosen to shoot for full x86 compatibility, abandoning any alternative ISA, and has tried to leverage its vast engineering resources and superior manufacturing (compared to contract fabs such as TSMC) to brute-force its way into the market. Its goal is to maintain the vast influence of its ISA ecosystem and extend its uses into previously unreachable areas. Even given its engineering and marketing prowess, Intel has partially failed (fully or not, we shall see within the next couple of iterations) to make any dent in the smartphone and MID market, and has only a token presence among tablets. If Intel stumbles, it will still be a viable entity for a long time to come, simply due to its financial situation and its position as an industry-standard purveyor.
I asked a question about development time on another board. Here is the answer I got:
So maybe the stock Cortex-A15 would be achievable?
Yep, we were supposed to have 32nm Medfield back in 2010. But so far we haven't even seen Moorestown consumer tablets available at retail.
This seems like an awfully large delay to me. I just have to wonder if Intel is already planning (internally) a shift toward a uarch (even more efficient than ARM) for mobile?
I think you might be overstating the importance of MS porting Windows to ARM. I think it's more evidence of what is going to happen than a cause.
People buy Windows not because they like it so much, but because all the software they know runs on Windows. An ARM Windows port will run none of that, besides what MS provides, like apparently Office (which isn't exactly irrelevant, especially in the enterprise market, but still). If you are going to buy a platform that is not compatible with any of your existing software, you might as well buy (or get for free) the best OS with the widest range of software, and I just don't think that will be Windows for most consumers. Don't forget Windows has already been ported to MIPS, Alpha, and PowerPC, and utterly failed to catch on on those platforms.
Reread your arguments, and then apply them to a desktop Linux like Ubuntu. Ubuntu today is able to run a whole lot more software – even more Windows software (through Wine) – than Windows on ARM will. Why hasn't it taken off, then?
Now, there are some arguments to explain this – MS preinstalling Windows, users being forced to buy Windows, plain ignorance, and what not – but that's just not enough to explain Ubuntu's utter lack of success so far.
Ironically, I suspect Windows on ARM will suffer the same problem of "not being compatible", and while I really hope this ARM push will finally make people discover Ubuntu, I suspect new OSes like ChromeOS will do a lot better than either Windows or Ubuntu on ARM.
I do fully agree with your edit though.
Edit:
Actually, ARM-based systems will be closer to embedded devices with some form of app store/Android Market than to your typical desktop OS. So there should be a greater level of hardware/software integration, and fewer compatibility problems than even desktop Windows faces with x86's myriad hardware options. So compared to Linux distros, there is no comparison in ease of administration and maintenance of the OS – it's quite the opposite, really.
So Nvidia's Project Denver and desktop Google Android combined could be more like a console PC (possibly competing against the Windows desktop and the Xbox simultaneously)?
For a couple of very simple reasons.
Most of those reasons don't apply to factory installs. Let HP/Dell/whoever sort out the driver issue by selecting appropriate components and/or putting pressure on suppliers to provide Linux drivers, and the user would be shielded from that. Considering a Windows license costs around $50-$100, using Ubuntu would increase their profit margins many-fold, and you'd think OEMs would have jumped on that opportunity if they saw any consumer interest. So far, there's not been very much.
As for the UI: I've been using Ubuntu since 6.10. Ever since 8.04, IMO, it's been ready for Joe Sixpack, provided the OS were factory installed and properly configured, so the user doesn't have to modify kernel parameters to maybe make suspend/resume work, doesn't have to compile his own wifi drivers, or figure out how to get Flash/MP3/codecs working properly.
If an OEM would preinstall it with everything working, there is nothing wrong with Ubuntu 8.x. It's not like the UI has really changed since 8.04 or, heck, even since 6.10 (except for Unity, which incidentally I despise in its current state, but that's another discussion).