[Techpowerup] AMD "Zen" CPU Prototypes Tested, "Meet all Expectations"

Page 45

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
No, I don't. AMD is anything but a focused company: they still have two wars to wage against much bigger and better-resourced competitors, and the stakes in both markets only get higher, not lower. In fact, on top of waging those two wars they have the semi-custom distraction, because they need to somehow make profits.

AMD's strategy of late has been a reversal and dismantling of the strategy Rory Read tried to develop for the company. Rory tried to take AMD outside of the competition against Intel and into new markets; once this strategy backfired he was fired, and now Lisa Su has reverted to trying to reassert their place in the x86 market while still competing against Nvidia. Not really radical, not really intelligent, not really worth their investors' money. They seem to think that with improved execution they will be able to survive and even thrive in the medium term. I don't think they understand one of their most fundamental problems today, which is that they lack the resources to do everything they are supposed to do in the markets they are supposed to compete in, and because of that they won't be able to execute well. By executing well I mean developing good products, then manufacturing, marketing, and supporting them through the supply chain.

I don't think they will get any traction with their HSA approach; they don't have the software muscle to support such a venture, and even if they did, the high end isn't moving towards general-purpose hardware like AMD is advocating, but towards dedicated accelerators on the MPU and FPGAs (Altera alone was almost as big as Nvidia in terms of market cap).

Maybe with a little luck ATI will become a separate company again.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
That is a terrible comparison.
In 2010 and the years before it (you know ASICs are multi-year) you had:
Small core APU
CON core CPU
CON core APU
Llano APU development
HSA design and planning
Server interconnect (HyperTransport 3.x)
Memory technology (Z-RAM anyone, GDDR4, GDDR5, the eDRAM/ROPs of the 360)
GPUs
Middleware for the gaming industry (Bullet, OpenCL Havok, etc.)
Semi-custom MPU business (Xbox 360, Flipper)

All that has happened is the CAT core got dropped for an ARM core, and you still have an overlap in APUs, to bring the server part to market first. The ARM core is also going to have more in common with Zen than CAT ever did with CON.

seriously poor quality post there........

Speaking of 2010 and HyperTransport, that was the last time the HyperTransport Consortium had any activity.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Maybe with a little luck ATI will become a separate company again.

Unfortunately I think that dream is dead. They seem to be years behind on the GPU R&D front too. Starved to death.

So even if they become separate, I don't think they are going to become anything near the good old ATI.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
You can use that argument for mainstream desktop CPUs too, which do have iGPUs. And yes, it's a win-win situation.

No, it is not. You don't see many people cheaping out on workstations, because the hardware cost of a workstation is in quite a few cases irrelevant compared to the software and the job at hand, but you do see a lot of people cheaping out on desktops (otherwise AMD would be dead already). A faster workstation means more work produced and more money coming in; a faster desktop processor doesn't mean much for almost every consumer out there. So you should be adding more CPU performance to the former but not to the latter, as price in the latter case carries a much bigger weight in the purchase decision.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
That is a terrible comparison.
In 2010 and the years before it (you know ASICs are multi-year) you had:
Small core APU
CON core CPU
CON core APU
Llano APU development
HSA design and planning
Server interconnect (HyperTransport 3.x)
Memory technology (Z-RAM anyone, GDDR4, GDDR5, the eDRAM/ROPs of the 360)
GPUs
Middleware for the gaming industry (Bullet, OpenCL Havok, etc.)
Semi-custom MPU business (Xbox 360, Flipper)

All that has happened is the CAT core got dropped for an ARM core, and you still have an overlap in APUs, to bring the server part to market first. The ARM core is also going to have more in common with Zen than CAT ever did with CON.

seriously poor quality post there........

OpenCL and Havok weren't AMD creations, and conceptualizing HSA is far different from developing it for the ecosystem. The console business was also far smaller, as AMD was mostly an IP provider, not the IHV responsible for the entire package. Same with memory technologies.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,926
404
126
No, it is not. You don't see many people cheaping out on workstation, because the hardware cost of a workstation in quite a few cases is irrelevant when compared to the software and the job at hand,
Yes, you do. Most companies I've worked for have paired Intel workstation CPUs with only low/mid-range GPU cards. A decent iGPU would do just fine if available. Why add a more powerful GPU than that when it is not needed for the work tasks that the computer is aimed at?

Just because the computer is expensive to begin with does not mean there is any point in wasting money on a powerful GPU that has no use.
A faster workstation means more job produced, and more money incoming, a faster desktop processor doesn't mean much for almost every consumer out there, so you should be adding more CPU performance but not to the latter, as price on this case carries a much bigger weight in the purchase decision.
Sure, and that's why they buy a workstation CPU to begin with, because they need good CPU performance. But a faster GPU in a workstation is pure waste for many typical workstation tasks. A decent iGPU will do just fine.

As for desktop CPUs, the need for CPU performance varies. For gaming, compiling source code, video encoding and similar you definitely benefit from more CPU performance. And for those with a discrete GPU the iGPU is a waste. So they would be better off with more CPU cores instead.

It all differs depending on use case.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Yes, you do. Most companies I've worked for have paired Intel workstation CPUs with only low/mid-range GPU cards. A decent iGPU would do just fine if available. Why add a more powerful GPU than that when it is not needed for the work tasks that the computer is aimed at?

Just because the computer is expensive to begin with does not mean there is any point in wasting money on a powerful GPU that has no use.

Sure, and that's why they buy a workstation CPU to begin with, because they need good CPU performance. But a faster GPU in a workstation is pure waste for many typical workstation tasks. A decent iGPU will do just fine.

Why would someone who actually needs CPU performance be willing to sacrifice some of it to cheap out on a $50-100 dGPU?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
This is where we disagree. Because I do think the iGPU will be used. Typical tasks like SW development, video editing, and office-type side-activities like document editing and web surfing require it. A decent iGPU that is, nothing more, nothing less. You can of course always add a discrete graphics card, but then the total platform cost will be higher, except for the cases where a really powerful GPU is needed. Same reasoning as for mainstream desktop CPUs/APUs.

Rumor has it that Cannonlake will have eight cores as mainstream. So in that case I'll bet a person can get the octo-core with a small iGPU.
 
Aug 11, 2008
10,451
642
126
Rumor has it that Cannonlake will have eight cores as mainstream. So in that case I'll bet a person can get the octo-core with a small iGPU.

Source? I think we will be very, very lucky to see six cores on the mainstream. We have been stuck on 4 cores for what, 10 years now? What is Cannonlake, supposedly 2 years out, 3 at the most? Do you seriously think Intel will double the core count in that time after 10 years with no increase?

I think at best we will see cheaper 8-cores on the HEDT platform. And actually, as much as I have argued for a hex-core on the mainstream, unless some new software pops up, I don't really think there will be much of a market for 8. What I would like to see is six cores on the mainstream, and the HEDT start at 8.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,926
404
126
Why would someone who actually needs CPU performance be willing to sacrifice some of it to cheap out on a $50-100 dGPU?
It doesn't work like that. This has already been discussed and clarified before.

Also, again you can use your exact same argument for the iGPU on mainstream desktop CPUs. But there it's even worse, since the iGPU can occupy ~50-70% of the die area, compared to only ~10-15% for the workstation CPU in the case I mentioned before.

Anyway, I think this discussion is looping now and not getting much further. We've both made our points by now. I suppose we just have different opinions on this and will have to agree to disagree.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
It doesn't work like that. This has already been discussed and clarified before.

Workstations that do CPU-bound tasks by definition need all the compute power they can get, so when you put a GPU in there you are making the product worse, not better, because the GPU will at the very least consume TDP that would otherwise be destined for CPU computing, if not also complicate the validation process of an already complex product.

Whenever you say that an iGPU makes sense for workstations, you are saying that in order to save 100 dollars on a dGPU plus a bit of power consumption, it is worth making every single other job that the workstation delivers slower. I can't stress enough what a bad business decision this is, one that nobody with actual money at stake would make.

Don't even bring mainstream chips into this discussion; they do not belong to the same realm. Your office employees won't churn out more PPTs or XLSs if you double the number of cores, but data will get analyzed faster, videos will get edited faster, and software will get compiled faster if you double the number of cores in a workstation. This is why what makes sense for a workstation doesn't always make sense for a mainstream computer.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,867
3,418
136
In my experience these workstations don't really exist anymore, in the enterprise at least. They are the first to be moved to Citrix/View VDI. They are the one class of desktop where you can get a hardware ROI just by moving them, ignoring all the other advantages like quick environment builds, backup/restore, etc.


edit:
Just for the details:
- large amounts of memory required only some of the time
- large amounts of CPU time required only some of the time
- systems that require end-to-end ECC
These make them, hardware-wise, the perfect target for virtualization.
 
Last edited:

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
In my experience these workstations don't really exist anymore, in the enterprise at least. They are the first to be moved to Citrix/View VDI. They are the one class of desktop where you can get a hardware ROI just by moving them, ignoring all the other advantages like quick environment builds, backup/restore, etc.

We have a few cases on remote project sites, and a few of these sites have connectivity issues.

But don't forget that most workstation chip dies are also server dies, where most of the considerations I made are still valid.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,926
404
126
Workstations that do CPU-bound tasks by definition need all the compute power they can get, so when you put a GPU in there you are making the product worse, not better, because the GPU will at the very least consume TDP that would otherwise be destined for CPU computing, if not also complicate the validation process of an already complex product.

Whenever you say that an iGPU makes sense for workstations, you are saying that in order to save 100 dollars on a dGPU plus a bit of power consumption, it is worth making every single other job that the workstation delivers slower. I can't stress enough what a bad business decision this is, one that nobody with actual money at stake would make.

Don't even bring mainstream chips into this discussion; they do not belong to the same realm. Your office employees won't churn out more PPTs or XLSs if you double the number of cores, but data will get analyzed faster, videos will get edited faster, and software will get compiled faster if you double the number of cores in a workstation. This is why what makes sense for a workstation doesn't always make sense for a mainstream computer.
The total platform cost will be lower with a decent iGPU on-chip in the workstation CPU case too. It's simply cheaper to add ~10% die area for an iGPU than to produce a complete separate GFX card.

Also, you're assuming that all mainstream desktop PCs are only used for office type work requiring minimal CPU performance, which is incorrect.

But all of this has already been explained before. See previous posts.
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
3,926
404
126
In my experience these workstations don't really exist anymore, in the enterprise at least. They are the first to be moved to Citrix/View VDI. They are the one class of desktop where you can get a hardware ROI just by moving them, ignoring all the other advantages like quick environment builds, backup/restore, etc.


edit:
Just for the details:
- large amounts of memory required only some of the time
- large amounts of CPU time required only some of the time
- systems that require end-to-end ECC
These make them, hardware-wise, the perfect target for virtualization.
Often in SW development you have dedicated build servers, typically used for continuous integration builds and similar. So they work as you described (lots of CPU, RAM, and disk being shared). But you also need to compile locally during SW development, testing and debugging. So you need a powerful workstation type CPU (and fast I/O and RAM) as well to do that quickly. These local builds have not been done through Citrix or similar, at least not at the places I've worked. Oh, and BTW the same workstation PC is also often used for other office-type work, so it needs basic iGPU/GPU too (but nothing more).
 
Last edited:

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
The total platform cost will be lower with a decent iGPU on-chip in the workstation CPU case too. It's simply cheaper to add ~10% die area for an iGPU than to produce a complete separate GFX card.

You are thinking in terms of upfront costs for the consumer, but you are not thinking in terms of the time wasted because of lower CPU performance (the iGPU will eat some of the TDP that would otherwise go to the CPU). Penny wise, pound foolish.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,926
404
126
You are thinking in terms of upfront costs for the consumer, but you are not thinking in terms of the time wasted because of lower CPU performance (the iGPU will eat some of the TDP that would otherwise go to the CPU). Penny wise, pound foolish.
I'm thinking in terms of production costs and TCO. And you're going to need a GPU anyway (whether as a discrete GFX card or iGPU) in most cases. So it'll consume platform TDP anyway. If you need more CPU cores just buy a chip with more cores (and iGPU). It's not either or. You're somehow assuming that it's either 8 cores + iGPU, or 9 cores and no iGPU. But it can just as well be 9 CPU cores + iGPU. Or 16 CPU cores + iGPU for that matter.

For a pure server CPU it's different though, since they often run without any display at all, and then no iGPU or GFX card is needed.
 
Last edited:

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I'm thinking in terms of production costs and TCO.

This isn't about TCO, it's about DCF. Let's say that in order to save this $100 I need to give up just 5% of the CPU power (added circuitry and static leakage) I would otherwise have available in the workstation. Sounds like a great deal, right? It's not:

- So if you work with video encoding, that means 5% fewer deliverables per year.

- If you work with an expensive programmer ($150,000+), that means he'll spend 5% more time at the coffee machine waiting for code to compile.

Nobody, except you it seems, would make this trade off.
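
To put rough numbers on that trade-off, here is a back-of-the-envelope sketch in Python. The $100 saving and the 5% penalty are the figures from this post, the $150,000 is the programmer cost from the bullet above; the assumption that all of the programmer's paid output scales with CPU speed is the worst case implied here, not a measured figure.

```python
# Back-of-the-envelope version of the trade-off described above.
# The $100 dGPU saving and the 5% CPU hit are from the post; the
# programmer cost is the $150,000 figure from the bullet above.
# cpu_bound_fraction = 1.0 assumes (worst case) that all paid output
# scales with CPU speed.

dgpu_saving = 100           # one-time saving per workstation, USD
cpu_penalty = 0.05          # 5% less CPU throughput
programmer_cost = 150_000   # fully loaded yearly cost, USD
cpu_bound_fraction = 1.0    # fraction of paid work limited by CPU speed (assumed)

yearly_loss = programmer_cost * cpu_bound_fraction * cpu_penalty
print(f"One-time saving:         ${dgpu_saving}")
print(f"Yearly productivity hit: ${yearly_loss:,.0f}")   # $7,500 with these numbers
```

Lowering cpu_bound_fraction shrinks the hit accordingly; how large that fraction really is becomes the point of contention in the replies below.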

But let's say that the trade-off didn't exist and companies were willing to add an iGPU to a workstation product; what would you deliver to your customers? Not much, really. You would allow them to save roughly 5% on their workstation processors, but that would add costs and/or time (probably both) to your development, and all for a relatively low-volume family of products.

It's a completely different situation from the mainstream market, where cost is a much more fundamental driver than in the workstation market, where the low-end dGPU might be responsible for roughly half the cost of the processor in a given case, and where there are orders of magnitude more units to amortize R&D and validation costs over.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,926
404
126
Let's say that in order to save this $100 I need to give up just 5% of the CPU power (added circuitry and static leakage) I would otherwise have available in the workstation.
Sure it's a trade off. And you do the same trade off on mainstream desktop CPUs too. If you skip or reduce the iGPU, you can get more CPU performance. Except on mainstream desktop it's worse, since the iGPU occupies 50-70% of the die compared to only ~10% on a workstation CPU for the case mentioned earlier.
If you work with an expensive programmer ($150,000+), that means he'll spend 5% more time at the coffee machine waiting for code to compile.
Sorry, that's not how SW developers work. They do not compile code 100% of the time. Most of the time they do not. And while you are compiling, you can continue coding, attend meetings, answer emails and whatever. In reality I'd say the workstation is busy compiling code 10-30% of the work time, so 5% of that is closer to 0.5-1.5%.
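
That arithmetic as a tiny illustrative snippet, using only the 5% penalty and the 10-30% busy-time figures from this exchange:

```python
# Effective slowdown when the 5% CPU penalty only matters while the
# workstation is actually busy compiling (figures from the posts above).
cpu_penalty = 0.05
for busy_fraction in (0.10, 0.30):
    effective = cpu_penalty * busy_fraction
    print(f"busy {busy_fraction:.0%} of the time -> effective slowdown {effective:.1%}")
# busy 10% of the time -> effective slowdown 0.5%
# busy 30% of the time -> effective slowdown 1.5%
```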

Also, for people using their mainstream desktop PC for CPU-intensive stuff, they're in exactly the same situation. Except it's your spare time that is getting hit (or you're getting a worse user experience). But that extra time you could otherwise spend at work getting paid, so it should be valued just as much. Giving up spare time is not free!

Regardless, if you really value your time so much that 5% makes a difference, you should not be worried about the 5%, but about the 100% you lose by not buying a CPU with twice the amount of cores.
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Sorry, that's not how SW developers work. They do not compile code 100% of the time. Most of the time they do not. And while you are compiling, you can continue coding, attend meetings, answer emails and whatever. In reality I'd say the workstation is busy compiling code 10-30% of the work time, so 5% of that is closer to 0.5-1.5%.

As a software engineer who writes programs, compile time is very important to me. When I am working, I attempt to fix some error, then I wait for a recompile (could be 1 min to 2 hours depending on what I need to recompile, but mostly a few minutes), then check the fix, recompile again, etc. I don't need to check emails (I do that first thing) or have time to go to a meeting (which I couldn't schedule around my compilation anyway), nor can I really start fixing something else.

Regardless, if you really value your time so much that 5% makes a difference, you should not be worried about the 5%, but about the 100% you lose by not buying a CPU with twice the amount of cores.

My current work machine has 12 slow CPU cores, as that's what someone in IT thought was best. They were wrong. A lot of the time I am held up by that slow single-threaded speed. Running stuff mostly comes down to single-threaded performance; compilation can on occasion use all the cores, but then you have to do some huge link which is single-threaded again. I would take 6 fast cores over 12 slow ones. What is more key is having a lot of memory, as then Windows caches everything, which saves a lot of disk accesses.
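
The "6 fast cores over 12 slow ones" intuition is basically Amdahl's law. A minimal Python sketch; the serial (link) fraction and the per-core speed ratio below are illustrative assumptions, since the post gives no exact numbers:

```python
# Minimal Amdahl's-law sketch of the build described above: compilation
# parallelizes across cores, but the final link is single-threaded.
# The SERIAL fraction and core speeds are assumptions for illustration.

def build_time(total_work, serial_fraction, cores, core_speed):
    """Time for a build where only (1 - serial_fraction) of the work parallelizes."""
    parallel_work = total_work * (1 - serial_fraction)
    serial_work = total_work * serial_fraction
    return parallel_work / (cores * core_speed) + serial_work / core_speed

WORK = 100.0   # arbitrary units of build work
SERIAL = 0.3   # assume 30% of the build is the single-threaded link

print(f"12 slow cores: {build_time(WORK, SERIAL, cores=12, core_speed=1.0):.1f}")  # ~35.8
print(f" 6 fast cores: {build_time(WORK, SERIAL, cores=6, core_speed=1.5):.1f}")   # ~27.8
```

With those assumptions the six faster cores finish sooner, because the single-threaded link dominates and benefits only from per-core speed.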
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,425
8,388
126
So is there anything at all here about Zen anymore, or are we just discussing AMD's business plans and execution, or lack thereof?
 