Will AMD sign a new deal to build ARM chips?


cbn

Lifer
Mar 27, 2009
12,968
221
106
It's saying something about "Display processors" to support ARM. Does that mean AMD will develop video cards supporting ARM?

Yup, I forgot "display processor" is Digitimes-speak for "GPU". It's still not clear what this really means for the strategy, though. Discrete AMD GPUs alongside ARM CPUs in notebooks/set-top boxes/SFF desktops?

Maybe AMD wants to take on Nvidia on the Android platform?

Lately Nvidia has been making big claims about Android, and I believe they are likely to aim high for this platform (perhaps arguing that games could run more efficiently if coded natively for a big-screen, lowest-common-denominator Android platform rather than ported from specialized consoles to the Windows desktop).
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
This author believes AMD will design ARM SoCs, claiming it would be their cheapest option.

I do know that AMD (like Nvidia) has the advantage of fast access to the half node (20/28nm) for GPUs, since it designs and sells high-profit discrete cards. IMHO this should help them (as well as Nvidia) compete against other players wanting to field top-end SoCs for emerging software platforms.

http://www.theregister.co.uk/2011/01/12/what_does_amd_do_now/

Without Meyer, what will AMD do next?

ARM Micro Devices

By Timothy Prickett Morgan

Posted in PCs & Chips, 12th January 2011 23:16 GMT

Analysis: Things seemed poised to turn around for AMD in 2011. But the abrupt departure of CEO Dirk Meyer on Monday afternoon – at the exact same time that rivals Intel and Nvidia ceased their hostilities and a week after Nvidia jumped into the processor racket – indicates that AMD's board of directors sees challenges that aren't obvious to outsiders.

And to at least one ex-insider who is now looking for a job.

The chatter is that Meyer was shown the door at AMD because the chip maker had missed the boat on smartphones and tablets, markets that are growing a lot faster (in terms of units) than traditional CPUs and GPUs. As Nvidia president and CEO Jen-Hsun Huang said at the launch of his company's Tegra 2 system-on-a-chip last week at the Consumer Electronics Show, for a lot of people, their smartphone is "their most personal computer."

As El Reg previously reported, Nvidia has licensed the Cortex-A15 processor design from ARM Holdings – the company that controls the ARM RISC chip that dominates the smartphone and handheld market – and is set to deliver its own multicore ARM processor, code-named Project Denver, in 2013 alongside its Maxwell family of GPU chips.

The landscape of the computing market is changing fast, and some of that shaking and quaking is due to AMD itself. AMD bought ATI Technologies in July 2006 for $5.4bn to get its hands on the GPU and chipset businesses that would help it become a platform player like rival Intel. And in March 2009, AMD abandoned its roots and chucked its chip fabs onto the shoulders of GlobalFoundries, along with rich Abu Dhabi backers who fancy controlling a business not based on oil. (AMD founder Jerry Sanders once famously observed that "real men have fabs," snarling at rivals who used third parties to bake their chips.)

Under Dirk Meyer, AMD survived the economic downturn (albeit not unscathed), dealt with product delays and changing roadmaps for standalone processors and converged CPU-GPU hybrids, and suffered through a massive platform upgrade last year with the Opteron 4100 and 6100. AMD even faced down a resurgent Intel's Nehalem family of Core and Xeon chips in 2009 and Westmere and Nehalem-EX desktop and server processors in 2010.

AMD's Fusion line of PC chips is now ready to face Intel's Sandy Bridge 2nd Generation Cores in 2011, and its Opterons arguably beat Intel's chips in terms of price/performance and performance/watt. GlobalFoundries is apparently planning to double its spending on chip-making factories and equipment in 2011, hitting $5.7bn. Oh, and AMD buried the legal hatchet with Intel back in November 2009, raking in $1.25bn and curbing some of Intel's bad behavior.

Shouldn't Meyer have been secure in his job after that hard slog over the past few years? Apparently not.

Rather than dwelling on those positive developments, AMD's board focused on what Meyer didn't get right. For example, AMD chips are not the CPUs in game consoles. And worse yet, AMD chips are not in cell phones, smartphones, or tablets. AMD has somewhere between 11 and 12 per cent of global microprocessor revenues, compared to Intel's roughly 80 per cent, and it peaked in the server racket a few years back when the Opterons were so much better than their Intel Xeon competition.

The problem is that while AMD was busy cleaning up its books, going fabless, and getting its processor roadmap back in order, a slew of other products - netbooks, tablets, ereaders, and truly smart phones - changed the market. These areas are growing while PCs and servers are losing steam. To be sure, there are hundreds of millions of PC and server chips being sold each year, and that will be true for as far as any of us can see. But today there are billions of other chips being consumed, and AMD can ill afford to ignore that fact.

AMD has the Geode LX low-powered chip, which it bought from National Semiconductor in 2003, and the company could have long since created an x64 alternative to the Athlon, Turion, or Opteron for devices such as tablets. If these Geode chips were inadequate, AMD could have partnered with or acquired VIA Technologies, another maker of low-powered x64 chips, and created something that could have taken on Intel's Atom and various ARM chips for small computing devices.

But AMD – or Meyer, it seems – apparently thought the company was on the right track in the embedded processor space with its low-powered Opterons for hyperscale server clusters and the combination of the Athlon II X2 embedded processor and the Radeon HD 6700M for high-end devices such as the Surface 2.0 touch desktop from Samsung and Microsoft, which debuted at CES 2011.

AMD just released its Fusion C-Series low-power APUs (accelerated processing units), and its Fusion G-Series embedded APUs are soon to hit the streets. Both are based on the company's new low-power Bobcat CPU cores – AMD's first new x86 core since 2003 – and both feature an integrated Radeon-based GPU core on the same silicon. While these CPU-GPU APUs are impressive, it's unlikely you'll see them in a smartphone – perhaps a tablet, but we'll have to wait and see.

To play the CPU-GPU limbo game, AMD could get under the power consumption broomstick in a much simpler way. The most obvious thing would be to do the same math that Nvidia did many years ago, see that at least some of the "computing" market was shifting to ARM processors, and license the Cortex designs from ARM Holdings. That way, AMD could come up with something new and interesting, like Nvidia is trying to do with the Denver multicore ARM chips for servers and their related Maxwell GPUs.

But the ARM market is rapidly getting crowded, with Calxeda and Nvidia now chasing servers, Nvidia chasing PCs, and a slew of companies – including Apple, Nvidia, Marvell, Qualcomm, Texas Instruments, and Samsung – making ARM-derived chips for netbooks and smartphones.

For the PC and server markets, ARM chips won't be fully suitable until they have higher clock speeds, more cores, and more memory and I/O capacity. So it's not too late for the new CEO at AMD to jump into the ARM fray, or to acquire someone who is already there. Considering that Marvell now has a market capitalization of $13.2bn and Nvidia is at just under $12bn, such acquisitions seem unlikely. And it's hard to imagine that Samsung, Texas Instruments, or Apple would want to buy AMD – although an Apple acquisition of AMD and a subsequent push into ARM chips would certainly be an interesting development.

An ARM license would not only be the cheapest alternative for AMD, but perhaps the only alternative – short of merging with Intel. (Wouldn't that be funny, watching Intel argue with US and EU antitrust authorities that the relevant market includes ARM devices, and that it doesn't have a monopoly on microprocessors?) An ARM license would put AMD in competition with lots of aggressive chip makers that peddle high-volume, low-margin chips, while at the same time Intel and AMD would fight the converged CPU-GPU wars on the desktop and in the laptop.

It's hard to see what AMD's next CEO will do – but it's pretty clear that it won't be an easy job.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
This author believes AMD will design ARM SoCs, claiming it would be their cheapest option.

I do know that AMD (like Nvidia) has the advantage of fast access to the half node (20/28nm) for GPUs, since it designs and sells high-profit discrete cards. IMHO this should help them (as well as Nvidia) compete against other players wanting to field top-end SoCs for emerging software platforms.

http://www.theregister.co.uk/2011/01/12/what_does_amd_do_now/

That is very true; but it does not solve their underlying problem: the architectural design point they have been shooting for. And that's not something you can change outside a 3-5 year development cycle.

Bobcat is a technically very sound chip, in just about every way. The engineering and planning are probably superior to anything Intel has in a similar thermal envelope and the same market segments, and it should do very well among sub-notebooks and net-tops. But the fundamental problem is that they will not be competing primarily with Intel, but with the various ARM vendors, if they are to have a prayer of growth a couple of years from now.

Of course, hindsight is always perfect, but it's useful to look back at their strategy. They basically made some very important bets from '07 to '09, and these had nothing to do with K10 or Bulldozer, or with the graphics stack. One was to abandon the development of ARM and other low-power ISA products, primarily Xilleon and Imageon (it actually goes back further, if you remember Alchemy), even though keeping them would have caused only minor cash-flow problems, if any. In cutting off everything that was not x86 or D3D, they basically bet against almost everyone else in the industry who was beginning to focus on low-power systems (well, Intel made much the same bet; they were just about the only other one).

At the same time, they anticipated that the low-power design point they needed to aim for was in the 5-15W range. If they did not have an ARM strategy, and knowing the design criteria of Silverthorne, they should have aimed to undercut Atom's TDP while maintaining similar performance, if they were prudently hedging their strategy. Instead, they decided to develop a fully out-of-order superscalar core with a complex front end and decode support for a wide range of x86 extensions in order to outperform Atom. Now we know that even Intel's strategy is problematic for competing in the MID market, let alone phones, where Bobcat simply has no chance, and won't have a chance during the lifetime of the design. It would have been much more prudent to design something in-order and 2-issue, with a single FP unit, a simplified branch predictor, and without most SIMD extensions, and pull the design point within striking distance of the ARM variants.

Meyer has been a fantastic CEO who steered them through the difficult period of the GF spinoff (mostly engineered by Ruiz), the Intel settlement, and other transitions. He has also righted a lot of the execution problems at AMD, so it will finally have a reasonably strong lineup across all segments, on both the sequential and throughput fronts. But in terms of his strategic vision, I think a fair-minded person would have to give him and his team an F, for shutting themselves out of the markets with the most growth potential and basically putting the company on a perilous path a few years from now.
 

P4man

Senior member
Aug 27, 2010
254
0
0
But in terms of his strategic vision, I think a fair-minded person would have to give him and his team an F, for shutting themselves out of the markets with the most growth potential and basically putting the company on a perilous path a few years from now.

I generally agree with your entire post, though there is another way of looking at it. AMD has what, 10-15% market share in x86? The x86 market is huge and, even in the most optimistic ARM uptake scenario, is not going away anytime soon; given that there are only two real players, AMD has huge growth potential, at least in theory. So there is something to be said for concentrating their efforts on x86.

OTOH, AMD's problem is the same as it has been for the last 20+ years: it's pretty damn tough competing with Intel on equal ground. Not that AMD doesn't have excellent engineers, but just look at the R&D budgets. Intel can afford a plan B, C, D and, if those fail, still go ahead with plan E. Even if plan E is rubbish, they still make more money on it than AMD can from its most ingenious design. If AMD hits a single snag, they are in deep, deep trouble. Think of Barcelona and how it impacted AMD, and compare it to Intel's multi-billion-dollar failures like Itanium or Larrabee and how they don't seem to impact Intel's results one tiny bit. Or the other way around: look how relatively little money AMD made those few times it actually had better products for brief periods, like the Athlon against the P3 or the K8 against the late P4s. AMD still bled money more often than not.

Moreover, the dynamics of this market are against AMD; the cost of developing new chips (and manufacturing processes) keeps skyrocketing, so whatever disadvantage they already had over the past decades is only getting bigger.

That's why I agree they should have pursued alternative market opportunities while they could afford it, like Nvidia did rather cleverly over the last few years with Tesla, Tegra and now Project Denver. None of these are making Nvidia a lot of money yet, but the potential is there and the risk is spread. AMD, OTOH, keeps betting the farm on that same old horse, which over the past decades hasn't won races very often. A risky strategy, to put it mildly.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
I generally agree with your entire post, though there is another way of looking at it. AMD has what, 10-15% market share in x86? The x86 market is huge and, even in the most optimistic ARM uptake scenario, is not going away anytime soon; given that there are only two real players, AMD has huge growth potential, at least in theory. So there is something to be said for concentrating their efforts on x86.

OTOH, AMD's problem is the same as it has been for the last 20+ years: it's pretty damn tough competing with Intel on equal ground. Not that AMD doesn't have excellent engineers, but just look at the R&D budgets. Intel can afford a plan B, C, D and, if those fail, still go ahead with plan E. Even if plan E is rubbish, they still make more money on it than AMD can from its most ingenious design. If AMD hits a single snag, they are in deep, deep trouble. Think of Barcelona and how it impacted AMD, and compare it to Intel's multi-billion-dollar failures like Itanium or Larrabee and how they don't seem to impact Intel's results one tiny bit. Or the other way around: look how relatively little money AMD made those few times it actually had better products for brief periods, like the Athlon against the P3 or the K8 against the late P4s. AMD still bled money more often than not.

Moreover, the dynamics of this market are against AMD; the cost of developing new chips (and manufacturing processes) keeps skyrocketing, so whatever disadvantage they already had over the past decades is only getting bigger.

That's why I agree they should have pursued alternative market opportunities while they could afford it, like Nvidia did rather cleverly over the last few years with Tesla, Tegra and now Project Denver. None of these are making Nvidia a lot of money yet, but the potential is there and the risk is spread. AMD, OTOH, keeps betting the farm on that same old horse, which over the past decades hasn't won races very often. A risky strategy, to put it mildly.

Yes, you make some very good points.

Another way of looking at this: there are three major component suppliers for the personal computer (I'm talking about all x86-compatible PCs and Macs): Intel, AMD, and NV. All of them have been threatened by low-power ISA designs since circa 2007, as many people in the industry foresaw. Each must adopt some kind of long-term strategy to cope with the coming tectonic shift in the industry.

Of the three:
  • Intel has chosen to shoot for full x86 compatibility, abandoning any alternative ISA, and has tried to leverage its vast engineering resources and superior manufacturing (compared to contract fabs such as TSMC) to brute-force its way into the market. The goal is to maintain the vast influence of its ISA ecosystem and extend its use into previously unreachable areas. Even given their engineering and marketing prowess, they have at least partially failed (whether fully or not, we shall see within the next couple of iterations) to make any dent in the smartphone and MID market, and have only a token presence among tablets. If they stumble, they will still be a viable entity for a long time to come, simply due to their financial situation and their position as an industry-standard purveyor.
  • Nvidia has elected to go the opposite route, partly because Intel has time and again declined to extend their ISA license, and has even tried to shut them out of the entire platform. They have been betting that there will be an eventual shift in form factors, and therefore a shift in what is demanded of component suppliers, and have started to hedge their bets with their own ARM design.
  • AMD, on the other hand, actually had the most experience with low-power designs (their own Alchemy line dates back to 2002, along with ATI's bets in the area that we have discussed to death), yet rather inexplicably decided that the perceived industry trend was either wrong or irrelevant to their future roadmap. They targeted their own low-power microarchitecture even higher than Intel did, basically above netbooks, at the ultraportable/nettop category. If Silverthorne did not succeed in smartphones/MIDs, then Bobcat basically has zero chance of making it in that market, no matter what other tricks they teach the pony to do. And AMD incidentally has far fewer resources than Intel, and can ill afford a wrong bet of such magnitude.

Now the kicker:
If it were simply about small-form-factor embedded devices, this shift in the semiconductor industry would take the better part of a decade, if not longer. But a series of recent events, culminating in the MS + ARM announcement at CES, has demonstrated that not only is the overall PC market being curbed by the growth of these embedded platforms, but the x86 ecosystem's share of the PC market itself will be threatened. Soon there will be serious challenges from ARM and others in segments of x86's core markets.

Given that applications are rapidly transforming into web services, that the march toward server virtualization in the cloud is inevitable, and that the dominant client OS is now being ported to an alternative ISA, the niches where x86 can hide will diminish greatly in the next several years. Neither the client side nor the server side will be a safe haven for x86 systems even in the medium term; they will have to compete on merit, especially performance per watt.

So in this sense, most of us had anticipated an eventual shift to ARM and the like, but very few could have predicted how sharply the pace of that shift would pick up. Now, NV has hedged in the right direction: even with their main business lagging, their future is reasonably secure, and they will be able to go toe to toe with the ARM design houses. Intel still stubbornly clings to their strategy (we'll see if they persist), and given their sheer size and resources, can put up stiff resistance to ARM encroachment. AMD, on the other hand, has neither, and may become fish food in five years' time, for the reasons I have outlined above. That's why the BoD made this decision, and that's why it had to be now.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Now the kicker:
If it were simply about small-form-factor embedded devices, this shift in the semiconductor industry would take the better part of a decade, if not longer. But a series of recent events, culminating in the MS + ARM announcement at CES, has demonstrated that not only is the overall PC market being curbed by the growth of these embedded platforms, but the x86 ecosystem's share of the PC market itself will be threatened. Soon there will be serious challenges from ARM and others in segments of x86's core markets.

Given that applications are rapidly transforming into web services, that the march toward server virtualization in the cloud is inevitable, and that the dominant client OS is now being ported to an alternative ISA, the niches where x86 can hide will diminish greatly in the next several years. Neither the client side nor the server side will be a safe haven for x86 systems even in the medium term; they will have to compete on merit, especially performance per watt.

So in this sense, most of us had anticipated an eventual shift to ARM and the like, but very few could have predicted how sharply the pace of that shift would pick up. Now, NV has hedged in the right direction: even with their main business lagging, their future is reasonably secure, and they will be able to go toe to toe with the ARM design houses. Intel still stubbornly clings to their strategy (we'll see if they persist), and given their sheer size and resources, can put up stiff resistance to ARM encroachment. AMD, on the other hand, has neither, and may become fish food in five years' time, for the reasons I have outlined above. That's why the BoD made this decision, and that's why it had to be now.

Yep, you make very good points here.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
That is very true; but it does not solve their underlying problem: the architectural design point they have been shooting for. And that's not something you can change outside a 3-5 year development cycle.

I asked a question about development time on another board. Here is the answer I got:

Ground-up new ARM design, trying not to infringe on ARM's IP: ~5-6 years.
Licensed, with their own custom IP integrated: ~1-2 years.

So maybe a stock Cortex-A15 would be achievable?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Intel has chosen to shoot for full x86 compatibility, abandoning any alternative ISA, and has tried to leverage its vast engineering resources and superior manufacturing (compared to contract fabs such as TSMC) to brute-force its way into the market. The goal is to maintain the vast influence of its ISA ecosystem and extend its use into previously unreachable areas. Even given their engineering and marketing prowess, they have at least partially failed (whether fully or not, we shall see within the next couple of iterations) to make any dent in the smartphone and MID market, and have only a token presence among tablets. If they stumble, they will still be a viable entity for a long time to come, simply due to their financial situation and their position as an industry-standard purveyor.

Yep, we were supposed to have 32nm Medfield back in 2010. But so far we haven't even seen Moorestown consumer tablets at retail.

This seems like an awfully large delay to me. I just have to wonder if Intel is already planning (internally) a shift towards a uarch (even more efficient than ARM) for mobile?
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
I asked a question about development time on another board. Here is the answer I got:



So maybe a stock Cortex-A15 would be achievable?

Well, that might be possible.

But consider AMD's grand strategy of Fusion, of integrating their own throughput SPMD design onto a heterogeneous platform: it will take much longer to do what they actually want to do, even with a stock ARM design. They need to forge a way forward for their Fusion strategy with greater integration and better memory consistency models, and taking someone else's design is never going to provide that.

The even more important fact is that AMD's greatest advantage over any run-of-the-mill company that can do a sequential design is their experience building high-performance, out-of-order, superscalar, speculative pipelines with the appropriate memory-reference model. To simply take a stock Cortex design is basically to consign themselves willingly to a vegetative state, purely for a base level of survival. That might be an even less desirable outcome for their management and shareholders than simply allowing themselves to be acquired and having their people, experience, and IP put to much better use.

In common terms, you can think of ARM as roughly the equivalent of the Borg, and NV as some version of Locutus. Except in this case Locutus at no point wants to be rescued and rejoin the Federation.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
Yep, we were supposed to have 32nm Medfield back in 2010. But so far we haven't even seen Moorestown consumer tablets at retail.

This seems like an awfully large delay to me. I just have to wonder if Intel is already planning (internally) a shift towards a uarch (even more efficient than ARM) for mobile?

I think you are referring to some "just in case" design by Intel. I have heard some tidbits from people in the industry for and against that hypothesis, but it's not really my place to talk about them in public.

As closely as industry watchers have been following the Windows-on-ARM story, I don't think the bulk of the lay press really appreciates all of the potential implications. One can think of the entire tussle between x86 and the ARM variants as a dam engineering project. The currents of ARM have tried in many ways to slither around the project, while Intel, the civil engineer, has tried to plug leaks and fractures in the dam face as well as in the layers of subsoil, with a reasonable rate of success. The dam itself is the dominant client OS, which despite numerous pressure points from the currents behind it has held back the bulk of their advances. Now the dam owner, MS, has decided to set a charge of nitroglycerin named W8 at the main locks, with the fuse set for 2012. So we get to sit back and watch Intel's valiant efforts, or else see the fireworks.


Back shortly after the ATI acquisition, with the talk of Torrenza, I thought there was a fair chance that AMD might pursue a heterogeneous system architecture involving x86, D3D, and some low-power architecture like ARM. Maybe even something with a few ARM cores as the main sequential processors and full IO virtualization, with x86 and D3D co-processors and an innovative directory-based coherence protocol that plays well with all of them. That type of system, if implemented right, would have allowed a smooth migration away from x86. Now AMD has basically blown any chance of that, and the time for it has more or less run out. That type of system may not be required any more, since the cloud and virtualization have made considerable progress with pretty much all the major software vendors.

I really don't know what will happen next, but I think people on this forum and elsewhere should prepare for the realistic possibility of AMD no longer existing in the form we have known, and of the industry no longer having a second source of x86 chips. Several years ago, even with AMD's terrible financial position, I always had confidence that they would pull through on the strength of their engineering talent and their IP base. But in the last several weeks, particularly after the events at CES, I'm no longer confident of that prediction.
 

P4man

Senior member
Aug 27, 2010
254
0
0
I think you might be overstating the importance of MS porting Windows to ARM. I think it's more evidence of what is going to happen than a cause.

People buy Windows not because they like it so much, but because all the software they know runs on Windows. An ARM Windows port will run none of that, besides what MS provides, like apparently Office (which isn't exactly irrelevant, especially in the enterprise market, but still). If you are going to buy a platform that is not compatible with any of your existing software, you might as well buy (or get for free) the best OS with the widest range of software, and I just don't think that will be Windows for most consumers. Don't forget Windows has already been ported to MIPS, Alpha and PowerPC, and utterly failed to catch on on those platforms.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
I think you might be overstating the importance of MS porting Windows to ARM. I think it's more evidence of what is going to happen than a cause.

People buy Windows not because they like it so much, but because all the software they know runs on Windows. An ARM Windows port will run none of that, besides what MS provides, like apparently Office (which isn't exactly irrelevant, especially in the enterprise market, but still). If you are going to buy a platform that is not compatible with any of your existing software, you might as well buy (or get for free) the best OS with the widest range of software, and I just don't think that will be Windows for most consumers. Don't forget Windows has already been ported to MIPS, Alpha and PowerPC, and utterly failed to catch on on those platforms.

Ten years ago, what you said would definitely have been true, but not any more.

You have to think about this from the perspective of the typical user of a client system on the street or in the home. The types of applications and workloads that people in tech fora like this one run are not representative of what people normally use:
  • Does the typical user own Photoshop? They probably can't afford it, and have no reason even to own Elements.
  • Does the typical user encode SD- or HD-quality video all day on their system? Unlikely; they are much more likely to subscribe to Hulu than to have a multi-terabyte collection of films.
  • Does the average user out there play D3D10/11 games? Not a chance; most have probably never even heard of the major titles that people here are familiar with, and few could even tell you what WoW means.
  • Do most people tweak their machines and have a 3DMark shortcut sitting on their desktop? Not a chance in this world.

The types of applications that are difficult to handle with cross-compilers (those that need multiple passes of IR optimization between semantic parsing and assembly/register allocation, those that require a specialized asm kernel to be fully optimized, or those that rely on very specific SSE or other x86 extensions) are precisely the types of applications I mentioned above. I think techies like us tend to have a rather narrow view of what applications are necessary for a fully functioning and satisfactory system. The typical owners of $300 desktops or $450 laptops would probably never use any applications of that ilk.

And those bear little resemblance to the typical apps the average user needs to access web documents, Facebook, an online music collection, a simple mail client, and so on. I know this well, since I have worked on cross-compiler frameworks, including LLVM, and on architectural simulators. Most of the common apps that most people use today (and will use tomorrow) are either cross-platform by nature, rely on some JIT compiler, or simply run as a web service. The one common app suite that a large portion of people run that is not naturally cross-platform would be Office, which incidentally is controlled by MS.
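To make that concrete, here is a minimal, made-up sketch of the kind of hand-vectorized kernel that ties an application to x86 (sum_floats() and the workload are invented for illustration): the fast path exists only on SSE-capable chips, and an ARM build either falls back to plain scalar code or has to be rewritten against NEON.

    /* Hypothetical sketch: a hand-vectorized kernel tied to x86 SSE.
       sum_floats() and the workload are invented for illustration. */
    #include <stddef.h>

    #if defined(__SSE__)
    #include <xmmintrin.h>                      /* x86-only SSE intrinsics */

    float sum_floats(const float *a, size_t n)
    {
        __m128 acc = _mm_setzero_ps();
        size_t i = 0;
        for (; i + 4 <= n; i += 4)              /* four floats per iteration */
            acc = _mm_add_ps(acc, _mm_loadu_ps(a + i));

        float lanes[4];
        _mm_storeu_ps(lanes, acc);
        float sum = lanes[0] + lanes[1] + lanes[2] + lanes[3];
        for (; i < n; ++i)                      /* scalar tail */
            sum += a[i];
        return sum;
    }
    #else
    /* Portable fallback: all an ARM build gets unless someone rewrites the
       kernel with NEON intrinsics or hand-tuned assembly. */
    float sum_floats(const float *a, size_t n)
    {
        float sum = 0.0f;
        for (size_t i = 0; i < n; ++i)
            sum += a[i];
        return sum;
    }
    #endif

A cross-compiler or a JIT can carry the fallback path anywhere, but the performance-critical path has to be redone per ISA, which is exactly why this class of software stays put on x86 while the JIT-hosted and web-service stuff moves freely.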

I think a few things from the last couple of years have come together into an almost perfect storm:
  • The proliferation of virtualization and web/cloud services, which is the direction most major software vendors are heading (e.g. EC2, Azure, Chrome OS), and which has really started to take off within the last couple of years.
  • The advent of the iPad and subsequent tablet-style embedded systems, which have the potential to replace much of the content-consumption role of the client side.
  • The rapid ascendancy of Android, which brings customers to many ARM manufacturers and provides a new open ecosystem for application development (along with iOS, BB, etc.) that is not tied to any ISA.
  • Intel's inability to gain any traction with Atom in any device more mobile than a netbook, which seriously blunts their mobile strategy and puts them on the defensive against ARM.
  • And now the capitulation of the Wintel alliance, which will allow large-scale encroachment of ARM devices onto x86's home territory.

I think when we put all of this together, it spells major changes in the balance of power between the two major client-side ISAs of our time. Only time will tell the exact outcome, but things are looking grimmer for x86 than at any time in its life.

Edit:
I think it's important to note that ARM doesn't have to convert all or even most users of personal systems away from x86. Even if only a minority of netbook, ultraportable, nettop, and SFF users swing to ARM systems, it will diminish the financial viability of the x86 ecosystem considerably, and that would heavily affect designs such as Atom, Zacate, and Nano. The number of users who absolutely need x86 for certain specialized apps is a minority, and probably a shrinking one.
 

P4man

Senior member
Aug 27, 2010
254
0
0
Reread your arguments, and then apply them to a desktop Linux, like Ubuntu. Ubuntu today is able to run a whole lot more software, even more Windows software (through Wine), than Windows on ARM will. Why hasn't it taken off, then?

Now there are some arguments to be made to explain this, like MS preinstalling Windows, forcing users to buy Windows, plain ignorance and whatnot, but that's just not enough to explain Ubuntu's utter lack of success so far.

Ironically, I suspect Windows on ARM will suffer the same problem of "not being compatible", and while I really hope this ARM push will finally make people discover Ubuntu, I suspect new OSes like Chrome OS will do a lot better than either Windows or Ubuntu on ARM.

I do fully agree with your edit though.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I feel that MS porting Windows (and Office) to ARM (while promoting cloud computing) is purely a reaction to Google's plans.

Therefore let us try to deduce what Google's plans are given what little information we already have.

From a hardware standpoint:

1. Nvidia has claimed Google was the company's future.

2. Nvidia AFAIK doesn't have any presence in future gaming consoles (rumors from Feb 2009 claim Intel has won the graphics contract for the Sony PS4).

3. Nvidia could kill two birds with one stone (Intel/AMD) by positioning themselves and Google as a more efficient alternative to playing PowerPC console ports on Windows. More performance from less hardware, because the software is programmed natively for the platform and a high-performance Cortex-A15? If Nvidia is already spending millions of dollars on the PC TWIMTBP program, why wouldn't they do the same for Google?

4. Having a dominant share on a desktop platform might help them expand CUDA/PhysX.

---------------------------------------------------------------------------------------

Maybe I am wrong about this, but I am having a hard time believing they wouldn't at least try to do something like that.

Hopefully AMD has time to contact Google to see what is happening with future versions of Android, and then, if necessary, adopt a high-performance Cortex-A15 implementation. I'm sure Google would love that, as it would bring more competition and lower prices to their platform.

---------------------------------------------------------------------------------------

Does anyone have opinions on the Software standpoint?

One theory I have is that ARM has more software development momentum behind it. For example, young people cutting their teeth writing new programs may naturally have gravitated towards ARM for various reasons.

As the ARM platforms mature, that young programming talent looks for opportunities to create more ambitious programs? Google (and others) understand this drive, so they try to accommodate it by opening the doors to more powerful (yet affordable) hardware/software platforms?

Maybe someone can shed more light on the issue here?
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
Reread your arguments, and then apply them to a desktop Linux, like Ubuntu. Ubuntu today is able to run a whole lot more software, even more Windows software (through Wine), than Windows on ARM will. Why hasn't it taken off, then?
Now there are some arguments to be made to explain this, like MS preinstalling Windows, forcing users to buy Windows, plain ignorance and whatnot, but that's just not enough to explain Ubuntu's utter lack of success so far.

For a couple of very simple reasons.
  • Desktop Linux never had a comprehensive strategy for capturing non-technical users until Ubuntu very recently (really since Lucid LTS, which I love and run on my main work laptop), and it never really targeted average end users' home systems until now. Only now do they actually have a good comprehensive strategy, and possibly a credible, consistent UI design underway (Unity; GNOME Shell is still too much in flux, and Mutter performance is still awful).


    If you have seriously used desktop Linux on a regular basis, you will know that it has never been user-friendly in any sense of the word (any of GNOME 2.x, KDE 2.x, 3.x, Xfce, LXDE, etc.). I have used it every day for the last 8-10 years, sometimes across multiple distributions, and not until recently (2010) had I seen a system with a credible, consistent UI designed according to good HCI principles. That simply did not exist until Shuttleworth's recent push. Even so, it still has a long way to go; I'm hoping for big things from Natty with Unity.

  • Drivers, drivers, drivers. If you use Linux distributions regularly, you know that many HW drivers have been problematic.
    • As late as 2007, many wireless chipsets required a manual ndiswrapper workaround (even from vendors as large as Broadcom).
    • Sound input/output often had problems (some versions of SUSE and Ubuntu Karmic had serious trouble with a lot of headphone/microphone combo jacks; my laptop was one of the affected systems).
    • Problems with a variety of peripherals, including certain printers due to incompatibilities in CUPS, and problems with webcams.
    • Problems with the X windowing system: numerous image corruptions, particularly with SVG images. Remember the infamous "SWcursor" and "EXANoDownloadFromScreen" options that had to be enabled as workarounds in Jaunty and Karmic (see the xorg.conf sketch after this list)? Or perhaps you recall the infamous X server 7.4 error that wiped out a good number of Mint and Ubuntu systems earlier in the year?
    • Naturally, I also have to mention graphics drivers: until 2007, fglrx for Radeons was notoriously crash-prone, and the situation only started to improve once the radeon and radeonhd OSS drivers matured. On the NV side, today if you try to install GNOME Shell on Fedora or Ubuntu with NV hardware, more often than not you will get a black screen or, if you are lucky, a fallback to Metacity. Until the nouveau project is fully ready, NV hardware with the new GNOME Shell or Unity DE is basically a no-go.
    • Perhaps you also remember a lot of ACPI-related errors, or how some notebooks' BIOS DSDTs could not be read properly by the 2.6.31 kernel in late '09 (on PM55/HM55 and a couple of other chipsets), causing some notebooks to overheat and shut down. That eventually required a patch in an early 2.6.33-rc to fix, which meant grabbing a mainline kernel before 2.6.33 was available in the major repos.
    • I could probably go on for 20 pages about the problems and potential problems a Linux user might face in terms of drivers and other hardware/firmware compatibility issues.

    I love using Linux distros, and there is no other OS with the functionality that I and many other IT professionals require. Once configured, patched, and administered properly, certain distros are an absolute joy to use, from the most advanced compiler frameworks to the most stunning GUIs. But I'm under no illusion that an average Linux distro is ready to be tossed at someone with little technical aptitude or experience to deal with the myriad problems he or she might face in the course of installing, updating, and using the OS. To expect the average non-technical person to use even Ubuntu without on-hand tech support is simply irresponsible and nonsensical.

    I actually have great hope that the upcoming Natty will begin to fulfill Shuttleworth's promise of finally delivering a consumer-worthy version of a major Linux distro. I would probably have even more confidence in 12.04, by which point Wayland should have replaced X.

  • Last, and very importantly, many software and hardware companies are less willing to deal with and develop for a completely OSS system, since there are always GPL licensing issues and entanglements to be worked out. These are usually no small matter, and they often prevent proprietary software from being developed for and deployed on OSS systems.
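For anyone who never had to touch those X workarounds, this is roughly what the xorg.conf hack looked like; a hedged sketch only, since the Identifier string and the exact options that helped varied by card and driver version:

    Section "Device"
        Identifier "Configured Video Device"
        Driver     "radeon"
        # Workarounds some Jaunty/Karmic users added by hand to stop cursor
        # and 2D-acceleration corruption; which options helped varied by card.
        Option     "SWcursor" "true"
        Option     "EXANoDownloadFromScreen" "true"
    EndSection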

Ironically, I suspect Windows on ARM will suffer the same problem of "not being compatible", and while I really hope this ARM push will finally make people discover Ubuntu, I suspect new OSes like Chrome OS will do a lot better than either Windows or Ubuntu on ARM.

I do fully agree with your edit though.

As I have said, I think Ubuntu in particular, among Linux distributions, has made great strides. But they are not quite there yet with a fool-proof system that an average Joe or Jane can use and manage. It is my hope that they will get there by 12.04 LTS.


Edit:
Actually, ARM-based systems will be closer to embedded devices with some form of app store / Android Market than to your typical desktop OS. So there should be a greater level of HW/SW integration, and fewer compatibility problems than even desktop Windows has across x86's myriad HW options. Compared with Linux distros, there is simply no contest in ease of administration and maintenance of the OS; it's quite the opposite situation, really.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Edit:
Actually, ARM-based systems will be closer to embedded devices with some form of app store / Android Market than to your typical desktop OS. So there should be a greater level of HW/SW integration, and fewer compatibility problems than even desktop Windows has across x86's myriad HW options. Compared with Linux distros, there is simply no contest in ease of administration and maintenance of the OS; it's quite the opposite situation, really.

So Nvidia's Project Denver and a desktop version of Google Android combined could be more like a console PC (possibly competing against the Windows desktop and the Xbox simultaneously)?
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
So Nvidia's Project Denver and a desktop version of Google Android combined could be more like a console PC (possibly competing against the Windows desktop and the Xbox simultaneously)?

I don't know about Xbox-level integration, but I think the hardware ecosystem would be considerably smaller and easier to QC than a typical x86 ecosystem, at least for the first couple of years.
 

P4man

Senior member
Aug 27, 2010
254
0
0
For a couple of very simple reasons.

Most of those reasons don't apply to factory installs. Let HP/Dell/whoever sort out the driver issue by selecting appropriate components and/or putting pressure on suppliers to provide Linux drivers, and the user would be shielded from that. Considering a Windows license costs around $50-$100, using Ubuntu would increase their profit margins manyfold, and you'd think OEMs would have jumped on that opportunity if they saw any consumer interest. So far, there hasn't been much.

As for the UI... I've been using Ubuntu since 6.10. Ever since 8.04, IMO it's been ready for Joe Sixpack, provided the OS were factory installed and properly configured, so the user doesn't have to modify kernel parameters to maybe make suspend/resume work, doesn't have to compile his own wifi drivers, and doesn't have to figure out how to get Flash/MP3/codecs working properly.
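(For what it's worth, "modify kernel parameters" in practice meant something like the sketch below; this is only an illustrative example, and whether the acpi_sleep workaround helps at all depends entirely on the machine.)

    # /etc/default/grub on Ubuntu 9.10 and later (older releases edit
    # /boot/grub/menu.lst instead). acpi_sleep forces the S3 video-repost
    # workarounds so the display comes back after resume on some laptops.
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi_sleep=s3_bios,s3_mode"
    # Then regenerate the boot configuration and reboot:
    #   sudo update-grub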

If an OEM preinstalled it with everything working, there is nothing wrong with Ubuntu 8.x. It's not as if the UI has really changed since 8.04, or heck, even since 6.10 (except for Unity, which incidentally I despise in its current state, but that's another discussion).
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
Most of those reasons don't apply to factory installs. Let HP/Dell/whoever sort out the driver issue by selecting appropriate components and/or putting pressure on suppliers to provide Linux drivers, and the user would be shielded from that. Considering a Windows license costs around $50-$100, using Ubuntu would increase their profit margins manyfold, and you'd think OEMs would have jumped on that opportunity if they saw any consumer interest. So far, there hasn't been much.

Yes, they do.

Dell already has an Ubuntu line of computers, and yet they often do not deal with these issues.

Most of the driver issues we have been talking about are kernel-level issues, which would require them to contribute to open-source projects under the GPL, something most of them are quite reluctant to do.

From this conversation, it doesn't seem that you have much experience developing software under the GPL, or that you have looked into the Linux kernel code. I encourage you to try; afterwards you will appreciate a lot more of what I have been saying.

As for the UI... I've been using Ubuntu since 6.10. Ever since 8.04, IMO it's been ready for Joe Sixpack, provided the OS were factory installed and properly configured, so the user doesn't have to modify kernel parameters to maybe make suspend/resume work, doesn't have to compile his own wifi drivers, and doesn't have to figure out how to get Flash/MP3/codecs working properly.

That depends entirely on your HW: if your system is lucky enough to have the proper development and support in place, then it will run fine with a stock kernel and stock distro. Kernel parameters should not need to be modified to get suspend/resume working with all of your devices; a problem there usually means something is amiss in the bootstrapping BIOS, perhaps the DSDT or SSDT. If you do have to modify them, it is a workaround.

If an OEM preinstalled it with everything working, there is nothing wrong with Ubuntu 8.x. It's not as if the UI has really changed since 8.04, or heck, even since 6.10 (except for Unity, which incidentally I despise in its current state, but that's another discussion).

I'm sure they can install any UI layer or window manager they like, but short of undertaking full-scale DE development on their own (such as Unity, which not every distro even has the resources to do), they are basically stuck with minor modifications and theming of a standard DE (the four most popular being GNOME 2.0, KDE4, Xfce, and LXDE). Of course the UI has not changed since 6.10; that's why I said I have high hopes for Unity (and possibly GNOME Shell) in Natty. GNOME 2.0 is not an acceptable UI paradigm for everyone, and plenty of people find major fault with it. If you don't believe me, see what kind of complaints Torvalds has about it: http://www.desktoplinux.com/news/NS8745257437.html . And there are numerous complaints about KDE4 as well.

And you do not represent the typical user. Think about your grandmother, and how she would look at a new OS, new interface, and having to do some amount of admin on top of that.

The problem is not with the distro; it's with the larger community-maintained projects such as X, GNOME, or the kernel itself.
 