Official AMD Ryzen Benchmarks, Reviews, Prices, and Discussion


Head1985

Golden Member
Jul 8, 2014
1,866
699
136
Ryzen 5 1600x has basically made the core i5 7600k irrelevant. Even for gaming we are seeing 1600X match 7600k while crushing it in productivity, content creation and a range of apps.

https://www.computerbase.de/2017-04/amd-ryzen-5-test/2/#abschnitt_anwendungen_windows
https://www.computerbase.de/2017-04/amd-ryzen-5-test/3/
http://www.hardware.fr/articles/959-17/indices-performance.html

I think AMD will sell a lot of these Ryzen 5 CPUs. AMD needs to sell aggressive bundles of Ryzen 5 with B350 and Rx 570 / Rx 580 to OEMs to improve AMD's presence in affordable gaming PCs.
The Ryzen 5 1600 is much better. The 1600X is only for the average Joe who doesn't know how to overclock a CPU.

The best Ryzen CPUs are:
Ryzen 7 1700
Ryzen 5 1600
The rest of the lineup is not that good. The R5 1400 is the worst because it has only 4MB of L3 per CCX and takes a huge performance penalty in games.

Btw, the 7600K has been a dead CPU for a very long time. I am glad AMD finally killed it. The 6700K/7700K is like a 1000x better choice for gaming than the 7600K.
 
Last edited:
Reactions: Face2Face

SpaceBeer

Senior member
Apr 2, 2016
307
100
116
Are you sure R5 1400 has low results because of "only" 8MB of L3 cache and not due to low(er) clocks (compared to R5 1500X)?
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
You can see the specs of my three machines below. What has Ryzen really done? Well, it's allowed me to build an AMD rig without being embarrassed, performance-wise, by high-end Haswell-E and Skylake rigs.

Has it "stomped them to the ground"? Heavens no, but it has made AMD a true competitor in the CPU market.

I wanted the highest end Ryzen 7 and paid for it. From what I read, the real power now is the Ryzen 5. It won't beat the 7700k outright but it sure narrows the playing field and gets the checkmark for a balanced rig between content creation and gaming.

For those dreamers who think Intel is finished, think again. HOWEVER, I do think Ryzen has got Intel's attention.
 

sushukka

Member
Mar 17, 2017
52
39
61
Why are you being so hostile???

I'm not criticizing AMD. I'm just saying that for a certain segment of the HEDT market, and it's not a trivial one given likely profit margins, x99 and Intel is still the way to go and probably will be so for the foreseeable future (barring a new platform from AMD... which I guess is speculated to be "x399"). That isn't criticism of AMD; it's just me saying that this is what I see in my industry and what I see as valid arguments in favor of the x99 platform. If you don't think it's valid then we can discuss that on its merits, but please spare me the hostility.

Well I'll tell you what, buddy: the next time you watch something on TV, just think about what gear produced what you're watching. Because chances are that a ton of that gear was hooked up using TB. Do I care about it? No, not in the least. Will I care about it? If my upcoming work demands devices that require that interconnect, absolutely.
The point of my original "obsolete" post was that Intel's current HEDT repertoire has to change. AMD has raised the bar so much that Intel has to bring its high-end server Xeon cores to the HEDT lineup, or it just cannot justify the current HEDT prices anymore. Your example unfortunately covers only a minor part of the HEDT segment, so I stand behind my "obsolete" opinion. Also, cost is always the dominating factor. Someone might boldly claim "the highest rendering speed no matter the cost" or whatever... but at the end of the day, as the scale increases, it does matter. How about buying three Ryzen systems instead of two Intel ones, or a bigger design team with 50 computers saving 50k €/$ just on CPUs? Moreover, the SLI/CrossFire idea is slowly dying. If you cannot achieve the required performance with a single high-end card (including the professional number crunchers well above a 1080 Ti), then you probably need something more than a single HEDT box sitting on your desk.

Regarding Thunderbolt, the TB3 migration to the USB-C connector was probably a good thing, but it basically makes everything more complicated from the consumer side: you need special cables, you cannot connect TB devices to normal USB 3 ports even though they share the same connector, etc. (read more: http://blog.fosketts.net/2016/10/29/total-nightmare-usb-c-thunderbolt-3/). Also, TB is owned purely by Intel and Apple; you can still use USB 3 devices on Apple hardware, and TB is seen mainly on Apple, so the market is afraid to take the bait, for good reasons. I assume the royalty costs are not small either, which is probably why TB devices are way more expensive than USB ones. USB 3.1 Gen 2 is already on par with TB1 speeds, which is more than enough for all normal usage scenarios. TB2/3 is even faster, but where do you use that? Basically it's an external 4x PCIe port. It probably fits very well with Apple's strategy of keeping devices as slim/pretty as possible and then selling all kinds of darn expensive external peripherals (with Apple/TB you basically externalize the ports away from the computer). However, you don't put a graphics card on this, which was one of the issues you mentioned. Also, in professional computers, having all kinds of dongles hanging off the main unit to provide extra features is just not the way to go. My five cents is that TB is, and will stay, a minor protocol in the Apple ecosystem, where there will be only a few things for which it brings something more than USB 3.x could do. Cough... FireWire... cough. Also, judging by market share and sales, I don't believe there are "tons" of TB gear behind TV production.

But back to Ryzen: the point is that the Intel-invented HEDT product line is now exposed for what it is: profit maximization through artificial product segmentation. Of course business is business, AMD would do the same (though so far it has never been caught), blah blah, but the thing is that when (there are too many sources for it to be an "if" anymore) AMD announces its HEDT version of Naples, the bar will have risen substantially from where Intel has been keeping it so far.
 
Last edited:
Reactions: french toast

sm625

Diamond Member
May 6, 2011
8,172
137
106
You're missing some important things though, things that didn't make Intel's "HEDT line" go "obsolete" at all. Not even close I would say. I'll give you a very concrete example to illustrate the considerations in that segment:

I work in content creation, working with audio for tv/film etc. I need a powerful computer. It just so happens that the r7 is an excellent value for most of what I do. I'm also considering moving to video editing to get a bit more steady work. However, I then face several considerations:

1. Total lanes + lane configurations: With Intel's x99 platform I could start with a 28-lane CPU which would be more lanes than the Ryzen can give me, and by simply swapping a CPU I can get up to 40 lanes. The drawback with Ryzen is that I get x8/x8 on the x370, and the rest is PCIe 2.0. But even if I get another x16 slot it'll run at x4 from what I can see, and it still often shares resources with x1 slots. All of that goes through an x4 connection to the CPU.

Can you show me an example of a benchmark that actually suffers from being forced to run over PCIe 2.0 x4? Because looking at GTX1080 scaling tests it seems to only lose 8%:

[GTX 1080 PCIe scaling chart]
I'm not sure 8% is worth quibbling over. And it is only a 3% loss at PCIe 3.0 x4 compared to x16.
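
For context, the raw link bandwidths work out roughly like this (a quick back-of-the-envelope sketch from the standard PCIe line rates, not from any particular benchmark; the percentage losses in games depend entirely on the title and card):

def pcie_bandwidth_gbs(gen, lanes):
    """Approximate one-direction link bandwidth in GB/s."""
    if gen == 2:
        per_lane = 5.0 * (8 / 10) / 8      # 5 GT/s, 8b/10b encoding -> ~0.5 GB/s per lane
    elif gen == 3:
        per_lane = 8.0 * (128 / 130) / 8   # 8 GT/s, 128b/130b encoding -> ~0.98 GB/s per lane
    else:
        raise ValueError("only PCIe 2.0 and 3.0 modeled here")
    return per_lane * lanes

for gen, lanes in [(2, 4), (3, 4), (3, 8), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_bandwidth_gbs(gen, lanes):.1f} GB/s")
# ~2.0, ~3.9, ~7.9 and ~15.8 GB/s respectively -- games rarely saturate even the
# narrow links, which is why the measured losses stay in the single digits.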
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Can you show me an example of a benchmark that actually suffers from being forced to run over PCIe 2.0 x4? Because looking at GTX1080 scaling tests it seems to only lose 8%:



I'm not sure 8% is worth quibbling over. And it is only a 3% loss at PCIe 3.0 x4 compared to x16.
It's relevant in multi-GPU setups only.
 

zinfamous

No Lifer
Jul 12, 2006
110,819
29,571
146
Yes. AM4 is a dog and needs at least another six months of BIOS updates and possibly hardware mobo revisions. Kaby Lake works out of the box. As for multitasking I have yet to bring this i5 to its knees so all those extra cores and cache are meh.

Obviously AM4 has growing pains, as with any completely new architecture. This should not shock anyone who is a member of websites like this. But by all accounts, it has been a far better launch than most previous brand-new uarchs.

Kaby Lake is, essentially, an architecture that has been around for 6 or more years. Obviously it is incredibly mature. Calling AM4 "a dog" is rather overblown.

You're right about one thing, though--and it's why I had planned to wait until ~May or later to make my first Ryzen purchase. I know there are going to be issues with a new release, so I let the early adopters iron those out, and I wait for multiple mobo revisions before I purchase anything. I'm not about to start flashing BIOSes, because I'm simply not a power user. I have enough past experience dealing with brand-new uarchs.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
He's not a hack, that much is obvious; he's just a person who isn't stoked out of their mind about Ryzen, and he's explained why. He's looking at Ryzen through the metrics and workloads that he personally finds important and is coming to his own conclusions about that. He's not like a forum poster who is able to write off Ryzen's problems (which do exist, and certainly were much worse at launch, when GN hate was at its apex) because of how much of an accomplishment it is for AMD (and it absolutely is); he's looking at it through the lens of an end user who isn't concerned about the implications of a product for a specific company, just as one that's interested in its advantages and disadvantages over other options in the price range for the workloads tested. Personally, that's something I find valuable in a reviewer.
Yeah, Gamers Nexus does very focused testing; basically a PC is a closed gaming box to him where users do nothing else, and he builds according to that. He also focuses completely on the now, with little forethought for the future. It's understandable, as he does mention how workloads are changing in gaming, but he specifically deals with the current situation.

His R5 review must have been very painful for him, as you can tell he basically doesn't want to admit that the i5 is dead as a gaming CPU. It was like pulling teeth reading his conclusion, and even in the end his label is "i5 hangs on with fading grasp"?? "If CPU rendering is your thing..."?? Really? You can tell just how biased he is towards Intel, and that's bad.

I don't like how much attention he puts on BF1 benchmarks when all he does is run the built-in one, which is completely irrelevant for the game. He should not include the game at all if he doesn't want to bother with multiplayer. I can play BF1 and process an H.265 video in HandBrake running in the background; let's see him do that on his beloved 7700K.
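
If anyone wants to try that kind of background encode themselves, here's a rough sketch (the paths and quality target are placeholders, not anything from a review) of launching a HandBrake x265 encode at low priority from Python so the game in the foreground keeps most of the CPU time:

import subprocess

# Hypothetical file paths; adjust to your own library.
cmd = [
    "HandBrakeCLI",
    "-i", r"C:\video\input.mkv",   # source file (placeholder)
    "-o", r"C:\video\output.mkv",  # encoded output (placeholder)
    "-e", "x265",                  # software H.265/HEVC encoder
    "-q", "22",                    # constant-quality target
]

# On Windows, BELOW_NORMAL_PRIORITY_CLASS keeps the encode from starving the game.
proc = subprocess.Popen(cmd, creationflags=subprocess.BELOW_NORMAL_PRIORITY_CLASS)
print("Encoding in the background, PID:", proc.pid)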
 

coercitiv

Diamond Member
Jan 24, 2014
6,403
12,864
136
Can you show me an example of a benchmark that actually suffers from being forced to run over PCIe 2.0 x4? I'm not sure 8% is worth quibbling over. And it is only a 3% loss at PCIe 3.0 x4 compared to x16.
It's relevant in multi-GPU setups only.
It's even more nuanced than single vs. multi-GPU; for professionals it's also application-dependent. See this Puget Systems investigation.

Overall, the results of our testing is pretty mixed. With a single Titan X, we saw a wide range of results between using a PCI-E 3.0 slot at x8 and x16. Some applications (Unigine Heaven Pro and Octane Render) showed no difference, while others (Ashes of the Singularity, GRID Autosport, and Davinci Resolve) showed up to ~5% difference in performance.

With dual GPUs, the results actually got a bit more confusing. Although Unigine Heaven Pro didn't see much of a difference with a single card, with two cards in SLI driving three 4K displays in surround we saw roughly a 15% drop in performance running at x16/x8 and a massive 30% drop in performance running at x8/x8. On the other hand, Ashes of the Singularity only showed minimal differences, and GRID Autosport was actually faster at 1080p when running in x8/x8 - although it was about 8% slower at 4K and 4K surround. On the professional side, Octane Render still didn't show a difference when using two cards but Davinci Resolve did see up to a ~10% drop in performance with both x16/8 and x8/x8.

At this point, we would say that if you are using a high end video card like the Titan X (Pascal) or possibly even a GTX 1080, it is probably a good idea to try to use a PCI-E 3.0 x16 slot - especially in multi GPU configurations.
 

unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
Yeah, Gamers Nexus does very focused testing; basically a PC is a closed gaming box to him where users do nothing else, and he builds according to that. He also focuses completely on the now, with little forethought for the future. It's understandable, as he does mention how workloads are changing in gaming, but he specifically deals with the current situation.

His R5 review must have been very painful for him, as you can tell he basically doesn't want to admit that the i5 is dead as a gaming CPU. It was like pulling teeth reading his conclusion, and even in the end his label is "i5 hangs on with fading grasp"?? "If CPU rendering is your thing..."?? Really? You can tell just how biased he is towards Intel, and that's bad.

I don't like how much attention he puts on BF1 benchmarks when all he does is run the built-in one, which is completely irrelevant for the game. He should not include the game at all if he doesn't want to bother with multiplayer. I can play BF1 and process an H.265 video in HandBrake running in the background; let's see him do that on his beloved 7700K.
Yah, I am done with GN. I used to like his content, because he used 0.1% lows and 1% lows for testing. But now others are picking that up as well, and he almost seems to be abandoning it. He rarely seems to talk about better lows for Ryzen, which was extremely obvious when he tried to compare the 1800x to an i5.

Also, how do you ignore the future as a gamer!? That's not logical... A good CPU can easily last you 5+ years for gaming. It will see multiple generations of GPUs! And then, assuming you bought AMD, you can simply upgrade to the latest generation of AM4. That's what I am going to do, because it nets the best performance for the lowest investment, while not sacrificing power.

His conclusion was yet again cringeworthy. It was almost as bad as the last one, but I give the R7 the edge, as it actually had technical flaws too. His conclusion can be summarized as "8 core CPUs are worthless, not just for gaming, but in general."

I am actually surprised AMD hasn't blacklisted him after that stunt with the phone recordings.
 
Reactions: sushukka

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
I do. But then you are the 0.001% user

0.001% of what? All users? Of the HEDT market? I doubt it's the latter.

which still means Intel HEDT is dead (which was the inital point you replied to).

I think the word "dead" is one of those words that has become popular and is thrown around as if it means something when in fact it no longer seems to mean anything. If it is "dead" then Intel isn't selling those chips any longer, nor are there motherboards supporting it. That's what "dead" means to me. You could maybe say it's "doomed", but that's only a prediction of the future.

But you tell me: What is the current market share for Intel HEDT products and what was it back in January?

If you can answer that then I suppose we can easily see, all of us, if it's a "dead" product line or not.

Besides that show me a benchmark where a GPU running in x8 PCIe 3.0 loses performance. Depending on workload even x4 might be good enough.

Yeah, but you're missing my point above: the point wasn't that this power user needs two GPUs running at x16 PCIe 3.0; the point was that the user in question would need more than two GPUs running at least at x8 PCIe 3.0. On the Ryzen motherboards I've seen, you're done after you hit x8/x8 if you also want to use two or more x1 cards off of the X370. DaVinci Resolve, for example, needs one card for the GUI; it's recommended to use one card for monitor output (i.e. of the video only, not desktops), and then you load up the machine with as much GPU power as you can afford after that. The monitor output card could run at x4, if it weren't for the x1 cards. And with one GPU taking up desktop duties, that only leaves one x8 PCIe slot for GPU computing.
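
To make the lane arithmetic concrete, here's a rough sketch of the CPU-lane budget using the commonly quoted figures (Ryzen exposes x16 for graphics plus x4 for NVMe and x4 to the chipset; Broadwell-E parts offer 28 or 40 lanes). It ignores chipset lanes and board-specific slot wiring, so treat it as an illustration, not a configuration tool:

# GPU-usable CPU lanes per platform (illustrative figures, not board specs).
platforms = {
    "Ryzen / X370 (x16 graphics lanes)": 16,
    "X99 with a 28-lane CPU": 28,
    "X99 with a 40-lane CPU": 40,
}

def max_gpus_at(width, lanes):
    """How many GPUs can be fed at the given link width from CPU lanes alone."""
    return lanes // width

for name, lanes in platforms.items():
    print(f"{name}: {max_gpus_at(8, lanes)} GPUs at x8+, "
          f"{max_gpus_at(16, lanes)} at full x16")
# Ryzen tops out at two GPUs at x8/x8; the X99 parts leave lanes over for a
# third (or fourth) card plus x1 add-ins, which is the headroom being argued for.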
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Who are these people you are talking about? On one side they are looking for affordable 8c/16t systems, on the other side they are looking for "affordable" 16c/32t systems. What systems were they working on prior to the Ryzen launch, and what kind of operations and earnings do their systems provide? Because it seems to me that for those who really need them, 16c CPUs would just pay for themselves.

The people I'm talking about are people who are trying to get machines that perform like the top-end Ryzen does, which in my opinion is in the mid-range of the HEDT class. It's a tricky term to define, of course, and maybe we'd have a different opinion about that. But from what I can see there is a certain 'class' of content creators that 'save' by choosing Ryzen over i7, but without the current ability to 'move up' within the same Ryzen ecosystem. Or to put it differently: if someone can't afford a $1,500 CPU, or even a $500 CPU, but has HEDT needs, an 1800X would work well - but if AMD then comes out 6 months later and says "Oh, btw, here's a $799 12-core part that rivals Intel's $1,500 part," then that's going to be sour grapes if the line was drawn around $800.

Obviously, this is a narrow group of all people that buy HEDT products, but if your objection above is correct you're also agreeing with me that Intel's HEDT isn't dead at all. If it has the performance lead and people can pay for it because it pays itself back (and it does to many individuals and companies) then it isn't a dead platform.

What I really can't understand in your point of view is how you consider R7 to not really be the equivalent of Intel's HEDT due to connectivity issues (debatable but fair enough from a certain perspective), yet you definitely see a conflict between R7 and whatever the X399 platform will bring to the table.

Objectively it isn't the same. I think we can agree on that. As for why you're seeing a discrepancy it's pretty much as I explained it. You can get far on a Ryzen system but the risk is lacking headroom when you need it.

Suppose I decide to delve into video editing and color correction/grading. It wouldn't be in the high-end of the market if I do it on my own system, at least not for the first year since I'd be learning a new skill. Since it's a bit of a wager to embark on that I'd be looking to keep costs low but performance high, at least in the beginning. Therefore, Ryzen makes sense, in the beginning. If I use Premiere I can deal with quirks that put Ryzen behind Intel CPUs in some workloads, and if I use Davinci Resolve I can make do with only one GPU for acceleration. But as soon as I start making money and saving time becomes more important, as well as connectivity, I have to look at something else.

At that point I would be 'miffed' if I hadn't been offered a somewhat pricier x399 from the beginning.

And to top that off, you will believe X399 when you see it, but at the same time you see a timing conflict between AM4 and X399: between a platform in such early stages that you don't even fully believe it exists, and a platform that was already launched.

I don't see a disconnect above. What's the problem exactly? People are assuming x399 exists and is on the way to the market "soon", or "very soon". If that's the case then it's exactly the type of nuisance I'd expect some people would be annoyed by, the group of people who figured "I want to support AMD, and this Ryzen thing is like 90% there, so I'll spend my money on it" only to discover that all of a sudden AMD has exactly what they need for exactly the money they would have been willing to spend.

But really, all I was saying was that Intel's HEDT isn't "dead", that's just wishful thinking, and that the x399 has been super-hyped and I'll believe its existence when I see it. It doesn't mean I don't think it exists, just that people make up all sorts of things for all sorts of reasons, and I typically wait until there's definite evidence of a product before I get all excited about it... - and again, I possibly missed some AMD announcements.....

These people in the market for high throughput performance computing must have some seriously conflicting thoughts about their needs and the future.
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Part of your problem is that R7 (and R5) is not the high end work station part. The 16c/32t part (and cut down 12c/24t part) is for that.

Right. So if the r7 is not HEDT, then Intel's HEDT isn't "dead", correct? It seems we would agree on that.

And it'll basically have 2x everything that the R7 does. Surely you've heard about those parts coming?

I've seen rumors about it. I don't trust rumors. Did you follow the US election?

I wouldn't put too much into AMD not making any public announcements themselves yet. They currently have a lot on their plate already with R7/R5, the upcoming APUs, and Vega. We have to consider that AMD has ~1/10th the resources that Intel does. What they are doing, mixing it up with the big boys (nVidia and Intel) in 2 different hardware segments, is no small endeavor. That's not counting consoles, HBM, various APIs, etc... that they've been involved with.

I agree 100%. What AMD has done is incredible in my opinion, and hopefully they'll do even better. I for one am very excited about what they'll bring to the table in the future, and I hope they'll add a platform with more lanes on it.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
It's even more nuanced than single vs. multi-GPU, for professionals is also application dependent. See this Puget Systems investigation.
Looking at their limited number of applications, I'm not sure the difference is enough to lose sleep over. Other than a 30% loss with Unigine (not a game or application), everything else is of limited gain or loss. I can't say that I'm surprised given the history of PCIe analyses. I think it's really only fair to state that if you're competing to be "the best", even if that means 1%-8%, then x16/x16 is what you want.
 
Reactions: sushukka

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
The point of my original "obsolete" post was that Intel's current HEDT repertoire has to change.

I think that might be true, depending on what consumers choose. I guess time will tell, but currently I don't see how Intel's CPUs are "obsolete". If they were I think we'd have seen cuts in pricing to compete already.

Your example unfortunately covers only a minor part of the HEDT segment, so I stand behind my "obsolete" opinion. Also, cost is always the dominating factor. Someone might boldly claim "the highest rendering speed no matter the cost" or whatever... but at the end of the day, as the scale increases, it does matter. How about buying three Ryzen systems instead of two Intel ones, or a bigger design team with 50 computers saving 50k €/$ just on CPUs? Moreover, the SLI/CrossFire idea is slowly dying. If you cannot achieve the required performance with a single high-end card (including the professional number crunchers well above a 1080 Ti), then you probably need something more than a single HEDT box sitting on your desk.

I think you're not seeing the specific use case I'm talking about. I don't think it scales as you imply above, because you're probably not going to save by adding another computer operator rather than getting a better-performing computer. The tasks are highly different (design by a human versus rendering), and the savings won't translate. SLI isn't really used in editing/color grading at least, and as far as I know, all grading at the high end is done on single computers with multiple GPUs.

I acknowledge that I'm talking about a "minor area", but that's only because that's the area with which I'm familiar and know something about. I actually think there's a good reason to believe there are other fields where the same thing applies.

My five cents is that TB is, and will stay, a minor protocol in the Apple ecosystem, where there will be only a few things for which it brings something more than USB 3.x could do. Cough... FireWire... cough. Also, judging by market share and sales, I don't believe there are "tons" of TB gear behind TV production.

There is, though. There's everything from TB-connected drives to displays to chassis, etc. It's pretty widely used. The issue isn't "is this needed to do this job or could it get done differently"; the issue is: are you going to say "Sorry, no TB" to a client in order to save a few hundred dollars? It is what it is. Even an irrational reason for adopting a technology becomes a rational reason for others to adopt it, if you know what I mean.

But back to Ryzen: the point is that the Intel-invented HEDT product line is now seen for what it really is: profit maximization through artificial product segmentation. Of course business is business, AMD does the same, blah blah, but the thing is that when (there are too many sources for it to be an "if" anymore) AMD announces its HEDT version of Naples, the bar will have risen substantially from where Intel has been keeping it so far.

I agree with your description of the market of course. Intel has done what large corporations do when they have a virtual monopoly on something. So yeah, I agree.

As for "too many sources": Just for the record - my skepticism is based on my experience of very often finding that all of these "many sources" really boil down to the same one or two sources. In other words you go to Anandtech and WCFTCSFTech or whatever it's called and Hardforum or whatever and they all provide links to sources, but often those sources are other blogs/news outlets and eventually it boils down to just one or two "original" sources that are no more than rumors. So the volume of reports is one thing, and the volume of sources is another.
 

Veradun

Senior member
Jul 29, 2016
564
780
136

It is known that dual-rank RAM is faster; the point is: how much can you achieve in terms of MT/s with single- vs dual-rank RAM on Ryzen? Because we all know any B-die SS kit can go up to 3200 MT/s with no BCLK OC. So the choice is between 2667 DR and 3200 SR. I have no answer here, just brainstorming :>

edit: Karnak had the answer ready, apparently :>
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
0.001% of what? All users? Of the HEDT market? I doubt it's the latter.

Of all users, of course. Still way too few to support an extra platform. People who need at least 3x GPUs with x8 connections are very thinly spread, and those who actually do need it do it commercially, and then it is probably worth it to pay $5000 for a 20-core Xeon.

About sales numbers: obviously they aren't detailed enough to show how many HEDT parts were sold. But I'm sure it's less now than in January, due to Ryzen but also because the Skylake-X release is getting closer and closer, so more and more people are just waiting for it. If Intel doesn't adjust its pricing with Skylake-X, it will move much less product.
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
It is known that dual-rank RAM is faster; the point is: how much can you achieve in terms of MT/s with single- vs dual-rank RAM on Ryzen? Because we all know any B-die SS kit can go up to 3200 MT/s with no BCLK OC. So the choice is between 2667 DR and 3200 SR. I have no answer here, just brainstorming :>

edit: Karnak had the answer ready, apparently :>

I kinda doubt that folks know the size of the difference, and if you Google for it, I couldn't find anything just 2 days ago. You find forum threads where folks say the diff is minimal, forget about it, but no actual numbers.
It's also about price and capacity.
Many want 2x16GB, and you can't do that with two single-rank sticks, but knowing that dual rank has better perf makes it easier to accept lower settings.

As an example, I was considering 2x8GB at 3600 but was rather unhappy about going for only 16GB, and the price would be somewhat steep even vs 3200 CL14 kits.
Now, given the perf diff, I'm thinking that 2x16GB at 3200 CL16 will do just fine, if AMD manages to enable it with the May update.
4x8GB SR should have similar perf and limitations to 2x DR, but it's maybe slightly costlier and it blocks the 4x16GB upgrade option.
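
For a rough sense of what the 2667 DR vs 3200 SR choice gives up on paper (my own back-of-the-envelope math, not Karnak's measurements), here is the peak dual-channel bandwidth at a few speeds:

def dual_channel_bandwidth_gbs(mt_per_s):
    """Peak theoretical DDR4 bandwidth for two 64-bit channels, in GB/s."""
    return mt_per_s * 8 * 2 / 1000  # 8 bytes per transfer, per channel

for speed in (2400, 2667, 3200, 3600):
    print(f"DDR4-{speed}: ~{dual_channel_bandwidth_gbs(speed):.1f} GB/s peak")
# DDR4-2667 ~42.7 GB/s vs DDR4-3200 ~51.2 GB/s: roughly a 20% gap on paper that
# the dual-rank interleaving advantage (plus the fabric clock scaling with
# memory clock on Ryzen) has to make up in practice.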
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Can you show me an example of a benchmark that actually suffers from being forced to run over PCIe 2.0 x4? Because looking at GTX1080 scaling tests it seems to only lose 8%:

I'm not sure 8% is worth quibbling over. And it is only a 3% loss at PCIe 3.0 x4 compared to x16.

The people I'm talking about are creating content; they aren't going to benchmark using Grand Theft Auto at 1080p.
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
It's really sad that most people think the i5 7500 (4C/4T) is still better than the Ryzen 5 1500X (4C/8T). Sooner or later they will regret buying a 4C/4T CPU.
 
Reactions: guachi

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Of all users of course.

But then your argument is sort of like saying that Ferraris are obsolete because only a fraction of users buy them and now there are BMWs that are really really good. If people still buy Ferraris and BMWs aren't Ferraris, then Ferraris aren't "obsolete" or "dead".

Still way too few to support an extra platform. People who need at least 3x GPUs with x8 connections are very thinly spread, and those who actually do need it do it commercially, and then it is probably worth it to pay $5000 for a 20-core Xeon.

About sales numbers: obviously they aren't detailed enough to show how many HEDT parts were sold. But I'm sure it's less now than in January, due to Ryzen but also because the Skylake-X release is getting closer and closer, so more and more people are just waiting for it. If Intel doesn't adjust its pricing with Skylake-X, it will move much less product.

If it's too little to support an extra platform then the platform wouldn't exist. It's that simple. These companies are pretty pragmatic about the whole thing.

As for Intel and profitability, it's my understanding that they've traditionally done very well at the very top end of the market. Since that's the case, I would expect them to revise pricing of course, but also to offer better-performing products at the very high end. The professionals and consumers who buy the top end use software and do work that require very high efficiency and output, and it's only a matter of time before increased performance gets caught up by increased demand. In other words: just because you're now able to get everything you're doing today done on a setup that costs 80% of what you'd have spent before doesn't actually mean that those 20% end up being more profit. Over time it just turns into more work anyway. You can look at the last, what, 5-10 years or so and how content has moved from standard def to high def to 4K, all the while with the actual captured data often being higher than that (i.e. shooting content at 4K and then eventually outputting 1080p, or these days moving to above 4K and outputting to 4K).
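
As a rough illustration of how the data keeps growing (plain arithmetic, not measured workloads; assuming 10-bit 4:2:2 footage at 24 fps):

resolutions = {"1080p": (1920, 1080), "UHD 4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    bytes_per_pixel = 2.5                      # 10-bit 4:2:2 is about 20 bits per pixel
    gb_per_min = w * h * bytes_per_pixel * 24 * 60 / 1e9
    print(f"{name}: {w * h / 1e6:.1f} MP, ~{gb_per_min:.0f} GB/min uncompressed")
# Each resolution step roughly quadruples the raw data, so hardware that felt
# like overkill yesterday gets absorbed by tomorrow's formats.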

So, demands catch up with capacity, and that in turn keeps the very high end relevant and not obsolete.
 