Best possible scenario for post-Bulldozer AMD x86 CPUs?

Status
Not open for further replies.
Mar 10, 2006
11,715
2,012
126
Yep, it's true.

Per the thread's title question, the best possible post-bulldozer scenario is that Intel elects to become a division of UNICEF and thereby voluntarily chooses to manufacture AMD's future processors on a pro-bono basis. You know, for the betterment of mankind and all that.

Because right now AMD doesn't have too many other options when it comes to improving on price/performance. Only Intel's management and decision makers had the wisdom and fortitude years ago to see to it that they were smartly investing in the R&D they would need some 7-10 years later (now).

And for that they should be punished; nobody likes a smart kid who has planned smartly for his future. Bring in the FTC and bust them up!

If you believe AMD management, the move to ARM is going to fix everything!

The same comments about thread crapping go to you. This post added nothing of value to the conversation and was nothing more than a swipe at AMD
-ViRGE
 
Last edited by a moderator:

CakeMonster

Golden Member
Nov 22, 2012
1,428
535
136
How much current bias is there in these numbers?

Forgive me for not having a clue about foundries and materials, but looking at these graphs, 20nm and 16/14nm appear only slightly more expensive than 28nm. And given that 1) 20/16/14 is being actively pursued and in fact built by several companies, and 2) 28nm has improved massively during its lifetime, then why would anyone stay at 28nm for a long time when we should expect 20/16/14 to improve at least performance-wise, and probably in production efficiency and cost too after a while?

I realize that things are slowing down, and that 28nm will live longer than the nodes before it. But given what looks like a very small cost penalty for 20/16/14 from these numbers, I still don't see why those nodes would not take over after a relatively short while. I'm sure some 28nm factories can stay in business for a good while to serve customers that don't need the bleeding edge, but I can't see why we would not still move ahead, even if physics ends up stalling us completely at 14-ish (though I don't believe that either).
 
Last edited:

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
[...]then why would anyone stay at 28nm for a long time when we should expect 20/16/14 to improve at least performance-wise, and probably in production efficiency and cost too after a while?

Because "the gate utilization is an issue because of limitations of the design tools and parasitic effects. “The other factor is parametric yields, which are strictly tied into leakage control for the 20nm and of course for the 16nm FinFETs,” he said. “You can break this. Intel has shown that it can be broken and of course that’s an excellent achievement. But, it’s based on very high design costs, potentially $1 billion per design, so you need $10 billion in revenue. It also takes a number of years."
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I think the best possible scenario for AMD is one where they lose a significant chunk of their share of the consumer market but still get a viable x86 business by combining their small x86 share with their embedded business.

I don't think they have good prospects anywhere. With the world going mobile, they sorely lack the IP to get into that market bracket; they'll be lucky to retain any share, and they shouldn't expect any share of the server business beyond microservers.

Now, this is the good scenario. The bad scenario is another "unmitigated failure", effectively wiping them out in consumer and servers, leaving just embedded to carry the rest of the business, with dire consequences for the cash flow profile and R&D budget.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
Yep, it's true.

Per the thread's title question, the best possible post-bulldozer scenario is that Intel elects to become a division of UNICEF and thereby voluntarily chooses to manufacture AMD's future processors on a pro-bono basis. You know, for the betterment of mankind and all that.

Because right now AMD doesn't have too many other options when it comes to improving on price/performance. Only Intel's management and decision makers had the wisdom and fortitude years ago to see to it that they were smartly investing in the R&D they would need some 7-10 years later (now).

And for that they should be punished; nobody likes a smart kid who has planned smartly for his future. Bring in the FTC and bust them up!

My response is somewhat off-topic.

Analogy:

What if it were 1948, the transistor had recently been invented, and only Intel (which did not exist at that point in time) could make them, and they were considerably faster, cheaper, lower-power, and more reliable than everyone else's electronics, which would have been tubes (valves) and mechanical relays.

To further rub salt in the wound, let's say that Intel ONLY used these transistors to make mainframe computers (servers), which, although fast and good, were very overpriced, as Intel was the only one making (transistorized) mainframes, with no realistic competition.
Except AMD's terribly slow/expensive valve mainframes, which were bad compared to Intel's.

The world would get no (transistor-based) colour (color) TVs, radios, integrated circuits (as they need transistors), etc.

This would be very bad for the world.

So maybe sooner or later, something needs to be done, otherwise the whole world would suffer.
I.e., such a big, wonderful thing as the integrated circuit (the latest ones) needs to be shared, NOT given to just one person/company in the world.

Back on topic:

Everyone, AMD included, ideally needs the best IC chip plants, so having them in the possession of just one company (Intel), which mainly makes only one thing with them (CPUs), does not sound good, either for AMD or for the world as a whole.

Maybe the best/ideal solution for AMD would be to have an APU line which can be boosted/expanded into something FX-like (i.e. higher GHz, more cores, and other SoC stuff).
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
Even Samsung says the same.

Some comments:

* The graph does not provide any absolute numbers for the design cost, nor does it put it into relation to AMD's total R&D budget. So it's impossible to draw any conclusions from that.

* Price per transistor is not static over time. Likely 20/16/14 nm will come down in price going forward, just as has happened for all other nodes that precede them.

* Since AMD will be sticking to 28 nm for Carrizo/Excavator, chances are they can jump directly to Samsung/GF 14 nm for their next non-Bulldozer-based x86 APU generation. Thereby they would save the design cost for 20 nm.

* If your graph and theory are accurate, I assume Intel's ~1 year process tech lead on 14 nm compared to Samsung/GF is of little importance for desktop CPUs. Because 14 nm will apparently not bring much benefit compared to 28 nm on the desktop (not lower price initially, and not much higher frequency either). I.e. AMD might actually be smarter than Intel, by staying on 28 nm instead of 14 nm for now, since the cost is lower.
 
Last edited:

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
My response is somewhat off-topic.

Analogy:

What if it were 1948, the transistor had recently been invented, and only Intel (which did not exist at that point in time) could make them, and they were considerably faster, cheaper, lower-power, and more reliable than everyone else's electronics, which would have been tubes (valves) and mechanical relays.

To further rub salt in the wound, let's say that Intel ONLY used these transistors to make mainframe computers (servers), which, although fast and good, were very overpriced, as Intel was the only one making (transistorized) mainframes, with no realistic competition.
Except AMD's terribly slow/expensive valve mainframes, which were bad compared to Intel's.

The world would get no (transistor-based) colour (color) TVs, radios, integrated circuits (as they need transistors), etc.

This would be very bad for the world.

So maybe sooner or later, something needs to be done, otherwise the whole world would suffer.
I.e., such a big, wonderful thing as the integrated circuit (the latest ones) needs to be shared, NOT given to just one person/company in the world.
Your analogy (no TV, radio, IC, ...) fails because you assume that Intel won't innovate and will just milk its technology.

That doesn't reflect reality. Intel is 2-4 years ahead of any other semiconductor company. If they didn't innovate, another company would make Intel irrelevant by making better products.

The reason your assumption is wrong is that innovation doesn't only benefit consumers. Intel gains more market share and the transistors become cheaper: higher margins and higher revenue. Sure, there is the risk you mention, but even in theory, much competition (>2-3 companies) isn't sustainable in the semiconductor industry.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Some comments:

* The graph does not provide any absolute numbers for the design cost, nor does it put it into relation to AMD's total R&D budget. So it's impossible to draw any conclusions from that.

The numbers come from IBS. And they put the cost at around $400-500M for modems and around $1B for MPUs. That roughly means $10B in revenue to pay it back.
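As a sanity check, the payback arithmetic behind that claim can be sketched out, assuming (hypothetically; this ratio is not from IBS) that design cost can eat roughly 10% of a chip's lifetime revenue:

```python
# Back-of-the-envelope payback for the IBS figures above.
# The 10% design-cost-to-revenue ratio is an assumed rule of thumb,
# not a number from IBS.
design_cost_mpu = 1_000_000_000       # ~$1B design cost for an MPU
design_share_of_revenue = 0.10        # assumed: design <= 10% of revenue

required_revenue = design_cost_mpu / design_share_of_revenue
print(f"${required_revenue / 1e9:.0f}B")  # $10B
```

With a 10% ratio, a $1B design needs $10B in revenue, which is exactly the payback figure quoted.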

* Price per transistor is not static over time. Likely 20/16/14 nm will come down in price going forward, just as has happened for all other nodes that precede them.



Still hopeful?

* Since AMD will be sticking to 28 nm for Carrizo/Excavator, chances are they can jump directly to Samsung/GF 14 nm after that for their next APU. Thereby they would save the design cost for 20 nm.

It's not getting any cheaper. And they simply keep losing more market share at an ever faster pace with ever-more-dated products. The world doesn't wait. And 14/16nm FF, aka 20nm FF, is even more expensive.

* If your graph and theory are accurate, I assume Intel's one year process tech lead on 14 nm compared to Samsung/GF is of little importance for desktop CPUs. Because 14 nm will apparently not bring much benefit compared to 28 nm on the desktop (not lower price, and not much higher frequency either). I.e. AMD might actually be smarter than Intel, by staying on 28 nm instead of 14 nm for now, since the cost is lower.

Intel's lead is around 3½ years. And Intel has the lower transistor cost, unlike smaller companies, due to its ability to pay for the IC design that allows it.
 
Last edited:

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
* Since AMD will be sticking to 28 nm for Carrizo/Excavator, chances are they can jump directly to Samsung/GF 14 nm for their next non-Bulldozer-based x86 APU generation. Thereby they would save the design cost for 20 nm.
Nope. AMD confirms 20nm in 2015.

AMD CEO Rory Read also commented on the transition to 20nm and mentioned the plain business logic behind it. Rory says that you move to a new technology at the crossover point between profitability, the cost of the technology, and the cost of the product the company can sell. Chipmakers are not moving to 20nm simply because they can; a company moves to a lower manufacturing node to make higher-performance products and make some money selling them.

Rory also answered a question from John Pitzer of Credit Suisse, who inquired about the 20nm rollout. Rory told him that there is a lot of life in 28nm and that we will see some mix of both 28nm and 20nm chips for AMD.

* If your graph and theory are accurate, I assume Intel's ~1 year process tech lead on 14 nm compared to Samsung/GF is of little importance for desktop CPUs. Because 14 nm will apparently not bring much benefit compared to 28 nm on the desktop (not lower price initially, and not much higher frequency either). I.e. AMD might actually be smarter than Intel, by staying on 28 nm instead of 14 nm for now, since the cost is lower.
It isn't 1 year. It's more like 2-3 years. And Intel's 14nm has huge advantages compared to 28nm. About 3x higher density and up to 4x lower power consumption.
 

positivedoppler

Golden Member
Apr 30, 2012
1,112
174
106
It's not possible for AMD to make an x86/ARM core on 20nm and below without having higher transistor cost than 28nm, or an IC design cost they can never recoup. So that alone limits the options drastically.

The title is "Best possible scenario" not "Should AMD just give up and go home"
 

positivedoppler

Golden Member
Apr 30, 2012
1,112
174
106
Getting back on topic, one of the most interesting possibilities is pin compatible ARM and x86 SoCs. This will let OEMs offer Windows and Android devices with the same design, varying only a single component. This will hopefully reduce both design and inventory costs.


Yes, getting back on topic, as opposed to all these doom-and-gloom predictions in a thread titled "Best Possible Scenario".
AMD makes a 14/16nm CPU that is competitive with Intel's while its iGPU vastly beats Intel's. ARM CPUs and servers take off, and Microsoft decides to offer variants of Windows 9 that support ARM. ARM overtakes x86 for home PCs and laptops. Intel gets a lot smaller, AMD and a whole bunch of other companies get bigger, and there are competitive CPUs made by multiple designers, thus offering greater variety and more price options for all consumers.

I miss the days when it was Intel, AMD, and Cyrix. Intel had the best FP, Cyrix (the cheapest) had the best integer, and AMD was in between.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
Not for their next post-Bulldozer x86 APU generation, which is what I wrote. So far they've mentioned sticking to 28 nm for Carrizo. The next jump for an AMD x86 big core after that could be directly to 14 nm.
It isn't 1 year. It's more like 2-3 years.
According to this post Apple will push out A9 SoC on Samsung's 14LPE in late 2015. That's ~1 year after Intel on 14 nm.
And Intel's 14nm has huge advantages compared to 28nm. About 3x higher density and up to 4x lower power consumption.

Source for the 4x lower power consumption? And no Intel PR slide please.

Regardless, density does not matter much if the cost per transistor ends up higher anyway. Lower power consumption does not matter much either on desktop, as long as you stay at roughly the current TDP numbers.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Not for their next post-Bulldozer x86 APU generation, which is what I wrote. So far they've mentioned sticking to 28 nm for Carrizo. The next jump for an AMD x86 big core after that could be directly to 14 nm.

And you read that where?

According to this post Apple will push out A9 SoC on Samsung's 14LPE in late 2015. That's ~1 year after Intel on 14 nm.

It's more comparable to Intel's 22nm than to its 14nm. It shouldn't be a surprise, since the fabled 14/16nm from TSMC and Samsung is still 20nm. So no, they are much further behind Intel. Assuming they even deliver on any of the claims on time.

Regardless, density does not matter much if the cost per transistor ends up higher anyway. Lower power consumption does not matter much either on desktop, as long as you stay at roughly the current TDP numbers.
But the cost per transistor is lower due to the IC design that enables it.
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
It's not possible for AMD to make an x86/ARM core on 20nm and below without having higher transistor cost than 28nm, or an IC design cost they can never recoup. So that alone limits the options drastically.
The title is "Best possible scenario" not "Should AMD just give up and go home"

Yes, it's amazing that he managed to turn this thread into a "Worst possible scenario for post-Bulldozer AMD x86 CPUs?" in just a few posts. Somehow I'm not surprised. :\
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
But the cost per transistor is lower due to the IC design that enables it.

So what do the graphs you showed earlier apply to? "Good" vs "Bad" design?

To make any definitive conclusions we need those graphs for both good vs bad design, and then compare them. We also need absolute numbers for the design costs in both cases.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
59
91
* If your graph and theory are accurate, I assume Intel's ~1 year process tech lead on 14 nm compared to Samsung/GF is of little importance for desktop CPUs. Because 14 nm will apparently not bring much benefit compared to 28 nm on the desktop (not lower price initially, and not much higher frequency either). I.e. AMD might actually be smarter than Intel, by staying on 28 nm instead of 14 nm for now, since the cost is lower.
Intel's 14nm is not the same as Samsung's 14nm for a host of reasons, including the fact that Intel is delivering a full node's worth of areal shrink over its existing 22nm node.

Part of the reason no one else is able to deliver reduced xtor expense per node is because no one but Intel has invested in the tools necessary to enable a full shrink for the BEOL for the 14nm node.
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
3,938
408
126
And you read that where?
Here and here for example.
It's more comparable to Intel's 22nm than to its 14nm. It shouldn't be a surprise, since the fabled 14/16nm from TSMC and Samsung is still 20nm. So no, they are much further behind Intel. Assuming they even deliver on any of the claims on time.
Not much is known about Samsung/GF 14 nm. If you make such a bold statement that Samsung 14 nm is comparable to Intel 22 nm, please show us metrics for transistor density, transistor cost, power consumption, etc. to prove it. Do you have such metrics for Samsung 14 nm and Intel 14 nm to back your statements up?
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
According to this post Apple will push out A9 SoC on Samsung's 14LPE in late 2015. That's ~1 year after Intel on 14 nm.
If you think going to 3D transistors is enough to name your node 1 step lower, sure, that's ~1 year after Intel on 10nm.

See what I did there?

(If you also count strained silicon and HKMG, we'll be at 5nm by the end of this year.)

Source for the 4x lower power consumption? And no Intel PR slide please.
22nm has up to 2x lower power consumption than 32nm, and Broadwell-Y consumes 0.7x the power of Haswell-Y. That was silicon that wasn't optimized, however, so final silicon plus a lower voltage for Atom could bring that close to 0.5x. Note that I said 'up to'.
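The jump from 0.7x to ~0.5x follows from the usual dynamic-power relation (P ≈ C·V²·f); here is a rough sketch, where the extra voltage reduction is an assumed illustrative number, not a figure from the post:

```python
# Dynamic power scales roughly with C * V^2 * f.
# The 15% voltage drop below is hypothetical, chosen only to show how
# 0.7x measured power could approach 0.5x on tuned silicon.
measured_ratio = 0.7       # Broadwell-Y vs Haswell-Y, per the post
voltage_scale = 0.85       # assumed additional voltage reduction

combined = measured_ratio * voltage_scale ** 2
print(f"{combined:.2f}")   # 0.51, close to the 'up to 0.5x' claim
```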

Regardless, density does not matter much, if the cost per transistor is higher anyway in the end. Lower power consumption does not matter much either on desktop, while staying at roughly the current TDP numbers.
Density does matter. 20FF will not be less expensive per transistor because it does not have a higher density than planar 20nm. 14nm will be ~2.2x more dense according to Intel, while wafer cost increases less rapidly.
Power consumption does matter too. I could also have said higher efficiency or higher performance because those things are about the same, from a transistor POV.
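To see why density dominates: cost per transistor is wafer cost divided by transistors per wafer, so a ~2.2x density gain beats a smaller wafer-cost increase. A quick illustration, where the 1.6x wafer-cost figure is an assumption for the example, not a number from the post:

```python
# Cost per transistor = wafer cost / transistors per wafer.
# Density gain per Intel's claim; the wafer cost increase is assumed.
density_gain = 2.2          # ~2.2x more transistors per wafer
wafer_cost_increase = 1.6   # hypothetical: wafer cost rises more slowly

old_cost_per_xtor = 1.0                                  # normalized
new_cost_per_xtor = wafer_cost_increase / density_gain   # ~0.73

print(new_cost_per_xtor < old_cost_per_xtor)  # True: cheaper transistors
print(f"{new_cost_per_xtor:.2f}")             # 0.73
```

As long as the density gain outpaces the wafer-cost increase, cost per transistor keeps falling; the whole dispute in this thread is about whether that still holds below 28nm for anyone but Intel.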

 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Here and here for example.

Not much is known about Samsung/GF 14 nm. If you make such a bold statement that Samsung 14 nm is comparable to Intel 22 nm, please show us metrics for transistor density, transistor cost, power consumption, etc. to prove it. Do you have such metrics for Samsung 14 nm and Intel 14 nm to back your statements up?

Your links show no such thing as 14nm or 16nm for AMD.

You make it sound like Samsung's 14nm is some big secret.

 
Last edited:

positivedoppler

Golden Member
Apr 30, 2012
1,112
174
106
If you refuse to work within realistic parameters, then we can always come up with one fantasy after another to get to whatever predefined goal you wish.

Fair enough, but now that you've defined the parameters, I assume you came to this thread to answer the title.

What is your take on "Best Possible Scenario for post bulldozer AMD?"
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Not much is known about Samsung/GF 14 nm. If you make such bold statements that Samsung 14 nm is comparable to Intel 22 nm, please show us metrics for transistor density, transistor cost, power consumption, etc to prove that. Do you have such metrics for Samsung 14 nm and Intel 14 nm to back your statements up?
It isn't a bold statement.

We have Intel 32 and TSMC/Samsung 28.

Intel increases density and adds FinFET and calls it 22.
TSMC/Samsung increases density and then adds FinFETs afterwards, calls it 20 and then 16/14.

See what they did? 20nm should have been called 20LP and the FinFET version should have been called 20HP, just like they did with 28nm.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Fair enough, but now that you've defined the parameters, I assume you came to this thread to answer the title.

What is your take on "Best Possible Scenario for post bulldozer AMD?"

Possibly bigger 28nm dies. Perhaps a few 20nm dies with reduced or stagnant transistor count to enable lower power consumption at a roughly equal production price compared to today.

But competition-wise, we are not talking performance products. Embedded is the future for them.
 

positivedoppler

Golden Member
Apr 30, 2012
1,112
174
106
Possibly bigger 28nm dies. Perhaps a few 20nm dies with reduced or stagnant transistor count to enable lower power consumption at a roughly equal production price compared to today.

But competition-wise, we are not talking performance products. Embedded is the future for them.

So after BD, you expect AMD to stay on 28 and 20 nm?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
so after BD, you expect AMD to stay on 28 and 20 nm?

For the foreseeable future, yes. When/if EUV ever comes, it could possibly change that and enable lower transistor cost again for those companies, but that remains to be seen. 450mm wafers may also help, assuming 28nm doesn't get 450mm.
 