Official AMD Ryzen Benchmarks, Reviews, Prices, and Discussion

Page 145

unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
Did you notice I said a 4.7GHz old i5? At that speed it keeps up with new games at 1080p more than fine and beats Ryzen. I know that in future games that don't heavily favor one core I may need an upgrade to more cores. I thought Ryzen would be it, but going from a 4.7GHz i5 to a 3.9GHz Ryzen is a sidegrade at best for gaming NOW.
 
Reactions: looncraz

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
This is anecdotal, but I swear the 1700X feels smoother for me when playing BF1 and Siege than my i7 3770K. Maybe it's placebo, but I sometimes used to get stuttering on my OCed 3770K even when frame rates were high, and Ryzen seems to have eliminated that for me. (I have it on a second PC with a GTX 980 and a new Crosshair mobo from Asus.)
 

unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
This is anecdotal, but I swear the 1700X feels smoother for me when playing BF1 and Siege than my i7 3770K. Maybe it's placebo, but I sometimes used to get stuttering on my OCed 3770K even when frame rates were high, and Ryzen seems to have eliminated that for me. (I have it on a second PC with a GTX 980 and a new Crosshair mobo from Asus.)


Multiplayer in BF1 is nothing like those benchmarks. When you are on a map with 63 other people and explosions going off everywhere, it's another beast entirely. My old 6-core Xeon X5650 @ 185 BCLK is on its knees at times.
 
Last edited:

Agent-47

Senior member
Jan 17, 2017
290
249
76
0. Perform the NOR and we land on the conclusion that you do not care about CPU performance in games. The fact that you use an FX-6300 verifies it, and as such we come to the conclusion that Ryzen is indeed a better deal for you. Logic!

1. Full-core turbo of 4GHz on what CPU? If you really want to know, Turbo Boost Max gives 4GHz on a single core and MCE gives 3.7GHz on all cores, if you were talking about the 6900K. Not 4GHz, as you seem to think. By the way, thanks for reminding me that the BF1 results in the GN review were really not that bad; /r/AMD had me thinking otherwise for some reason.

2. It took Intel 2 months to devise a patch that by some dark sorcery adds AVX support to Blender, 2 months after seeing a Ryzen ES compete with the 6900K in a workload that did not use AVX at the time. I give it 2 years before Intel spends its dollars on getting all the big productivity software to use AVX actively. It won't affect me, since it is borderline impossible to make compilers use it in a meaningful sense. But rendering, photo editing, encoding, hell, even some rare games? It's already started going this way and, hell, it would be more effective than any amount of bribing Intel would otherwise do.

0. Lol, aren't you the Mr. Know-it-all.

1. /r/AMD made you think that because you are blinded by Intel fanboy fever. But please tell us more about these R7 cores you saw hitting 100%. Lol.

2. And we saw no life-altering changes. Thanks for reminding us what an eyewash it is.

It seems rather futile to reason with someone so lost. I give up.
 
Reactions: sirmo

looncraz

Senior member
Sep 12, 2011
722
1,651
136
Don't you start too lol, the HBM APU is absurd.
Some folks want a pink elephant to keep in their bathroom and refuse to accept that it won't work.

Take an 11CU APU at around 2TFLOPS at X TDP vs a 16CU APU + HBM at the same X TDP.
The memory BW is not that limiting, especially if Vega saves on it, so in practice you go wider but at lower clocks, and maybe you gain some perf from the memory, but overall the gain is relatively small. I do expect that you won't agree on how limiting the BW is.
The comparison gets worse from a perf perspective if both are 16CU.
From a cost perspective you more than double the manufacturing costs, and you have very high dev costs.

For an APU with a large GPU, the memory BW is the barrier, so you would think a monolithic APU with HBM would work, but only if you don't consider the alternatives.
They could just use a traditional MCP to pair an existing CPU die and a Vega 11 with minimal costs. There is higher latency between CPU and GPU, but they compete with discrete cards, so the latency here is still much lower.
It would have a TDP penalty with both CPU and GPU in the same package, but there are form factor and design cost advantages for OEMs.

In high-end GPUs it works from the classical perspective of perf, power and cost, but to use it in any other area you need a good reason.
The chiplets angle for lower costs and increased flexibility works if you have cheap enough packaging and preferably relatively affordable memory (the OS being able to take advantage of it and save on DRAM costs would help too). A Si interposer and HBM are not that, and scale doesn't help much.

Aside from that, AMD can't afford to do different implementations in high-end desktop and server: they can't do a native quad in desktop or a native 16-core in server, and they can't do an HBM APU in server either.
They are not gonna invest in a consumer APU that makes no sense.

When they gain access to packaging solutions that make financial sense, they'll use them to pair CPUs, GPUs, memory and so on: the chiplets angle.

HBC != HBM.

HBM would be awesome for performance, but it would have a very small market. An exascale APU, however, may have enough of a market to make it worthwhile.

HBC is specifically designed by AMD to enable bandwidth-restricted scenarios to perform MUCH better (+50% avg, +100% min according to AMD... so realistically more like 20%/40% better).

The XBone does exactly that, but with a strict reliance on software support. If AMD's HBC required software support, then they would be stupid to even develop it; it would be a waste of resources (unless it's just for consoles...).

APUs are insanely limited by memory bandwidth, to the point that the lowest and highest APUs in the same generation perform about the same despite 33% more resources on the higher-end APUs. My A8-7600 with DDR3-2133 CL9 is much better than with DDR3-1600 CL9; it took Hitman: Absolution from laughably unplayable to reasonably playable.
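To put rough numbers on how much headroom that memory swap adds, here is a minimal back-of-the-envelope sketch (my own illustration, not from the post above) of theoretical peak dual-channel DDR3 bandwidth, the pool an APU's iGPU has to share with the CPU cores:

```python
# Rough theoretical peak bandwidth for dual-channel DDR3 (illustrative only).
# Peak GB/s = effective transfer rate (MT/s) * 8 bytes per 64-bit channel * channels.

def ddr_peak_bandwidth_gbps(mt_per_s, channels=2, bytes_per_channel=8):
    """Return theoretical peak bandwidth in GB/s for a DDR configuration."""
    return mt_per_s * 1e6 * bytes_per_channel * channels / 1e9

for speed in (1600, 2133):
    print(f"DDR3-{speed}: ~{ddr_peak_bandwidth_gbps(speed):.1f} GB/s peak, dual channel")

# DDR3-1600: ~25.6 GB/s peak, dual channel
# DDR3-2133: ~34.1 GB/s peak, dual channel
```

That roughly 33% jump in shared bandwidth is consistent with a Kaveri-class iGPU being starved at DDR3-1600, which is the experience described above.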
 

DrMrLordX

Lifer
Apr 27, 2000
21,815
11,171
136
So is that real? I mean, if Intel is going to release a 6-core for the 7700K price, I can wait until August or so.

Nobody knows what the pricing of Skylake-X will be. Bear in mind that Intel HEDT motherboards are quite expensive. You pay to play. Even if they have a 6c Skylake-X for the same price as the 6800K, expect $250 minimum for the board.

If you want "affordable" hex-core action from Intel, Coffee Lake offers that in early 2018.

FWIW I game at 1600x1200 with whatever settings I can get away with. By pixel count, that's very close to 1080p. Though I might get a better monitor someday. Maybe.
 

hotstocks

Member
Jun 20, 2008
81
26
91
I don't care how fast your CPU and monitor are: when you put a game's settings to ULTRA, like HairWorks, shadows, draw distance, etc., even an Nvidia 1080 goes below 60 fps in some games. Now, I agree with you that if you are a professional twitch gamer who turns all the eye candy off so you can get 144 fps with shittier imagery and frag other professionals, then OK. But myself and most normal gamers certainly can tell that 30 fps sucks (whether PC, Xbox One, or PS4) and can tell that a solid 60 fps or higher looks buttery smooth. That is all that matters. I have played DOOM at 200 fps on a 60Hz screen and a 144Hz screen and they look and play the same. So my target is the best possible image at 1080p and 60+ fps, NOT 144+ fps with sacrificed imagery. I like the art and beauty in a game; I don't just run around shooting people with LOW settings to get the edge on them and teabag them. I am not 17 years old, but I get why that niche would like it.

Now, I do realize my i5 will need an upgrade soon, so I was looking at Ryzen, but the loss of clock speed and all the problems right now are holding me back. I have the money, I just don't want the headaches (and I've built over 20 systems). If Ryzen improves in gaming and parts compatibility before Intel releases a new 6- or 8-core or lowers prices significantly, I will go Ryzen; if not, I will go Intel. Either way, my 4.7GHz i5 is more than fast enough with my 1080 to get over 60 fps for now. We will just see if Ryzen delivers sooner than Intel. And anyone thinking they can get over 60 fps at 4K with Ultra settings from an Nvidia 1080, think again. Maybe in old games, but not in nice-looking new ones. That is why I haven't moved to 4K.
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
Nobody knows what the pricing of Skylake-X will be. Bear in mind that Intel HEDT motherboards are quite expensive. You pay to play. Even if they have a 6c Skylake-X for the same price as the 6800K, expect $250 minimum for the board.

If you want "affordable" hex-core action from Intel, Coffee Lake offers that in early 2018.

FWIW I game at 1600x1200 with whatever settings I can get away with. By pixel count, that's very close to 1080p. Though I might get a better monitor someday. Maybe.

Why do you think hexa-core Coffee Lake would be cheap? It still has a GPU, and if they price it at $350 they have to push down all desktop prices, and it's still on 14nm, so that would be a problem for margins. Maybe if it has a very small GPU it could work out for Intel, but that's less than ideal. We shall see how CFL is and how Zen+ is...
 

Mopetar

Diamond Member
Jan 31, 2011
8,024
6,489
136
I have a 4.7GHz i5 that has seen 4 graphics cards and am now on an Nvidia 1080 on the same 1080p monitor.

You should really get a 1440p monitor as a 1080 is seriously overkill for 1080p. Find a nice G-Sync monitor and the framerate dips don't really matter all that much or at least don't seem as jarring.
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
HBC != HBM.

HBM would be awesome for performance, but it would have a very small market. An exascale APU, however, may have enough of a market to make it worthwhile.

HBC is specifically designed by AMD to enable bandwidth-restricted scenarios to perform MUCH better (+50% avg, +100% min according to AMD... so realistically more like 20%/40% better).

The XBone does exactly that, but with a strict reliance on software support. If AMD's HBC required software support, then they would be stupid to even develop it; it would be a waste of resources (unless it's just for consoles...).

APUs are insanely limited by memory bandwidth, to the point that the lowest and highest APUs in the same generation perform about the same despite 33% more resources on the higher-end APUs. My A8-7600 with DDR3-2133 CL9 is much better than with DDR3-1600 CL9; it took Hitman: Absolution from laughably unplayable to reasonably playable.


Not sure where you said HBC and I'm too lazy to look it up lol
Anyway, with DDR4 they would have 30-50GB/s in laptop, and Vega is supposed to have a larger cache plus other changes to use less BW. My expectation is that they can get to Polaris 11 level of perf in gaming. Maybe 11 CUs at 1GHz or so for ~1.5 TFLOPS and greater utilization in gaming than Polaris.
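As a quick sanity check on that figure, here is a minimal sketch (my own arithmetic, using the standard GCN assumption of 64 shaders per CU and 2 FLOPs per shader per clock; the 11 CU / 1GHz numbers are the guesses above, not confirmed specs):

```python
# Theoretical peak FP32 throughput for a GCN-style GPU (illustrative sketch).
# Peak TFLOPS = CUs * 64 shaders per CU * 2 FLOPs per shader per clock * clock (GHz) / 1000.

def gcn_peak_tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000.0

print(f"11 CU @ 1.0 GHz: ~{gcn_peak_tflops(11, 1.0):.2f} TFLOPS")  # ~1.41 TFLOPS
print(f"11 CU @ 1.4 GHz: ~{gcn_peak_tflops(11, 1.4):.2f} TFLOPS")  # ~1.97 TFLOPS
```

At ~1GHz that lands near the ~1.5 TFLOPS guess above, and at higher clocks it approaches the ~2 TFLOPS figure from the earlier quote.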
 
Reactions: CatMerc

unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
I don't care how fast your CPU and monitor are: when you put a game's settings to ULTRA, like HairWorks, shadows, draw distance, etc., even an Nvidia 1080 goes below 60 fps in some games. Now, I agree with you that if you are a professional twitch gamer who turns all the eye candy off so you can get 144 fps with shittier imagery and frag other professionals, then OK. But myself and most normal gamers certainly can tell that 30 fps sucks (whether PC, Xbox One, or PS4) and can tell that a solid 60 fps or higher looks buttery smooth. That is all that matters. I have played DOOM at 200 fps on a 60Hz screen and a 144Hz screen and they look and play the same. So my target is the best possible image at 1080p and 60+ fps, NOT 144+ fps with sacrificed imagery. I like the art and beauty in a game; I don't just run around shooting people with LOW settings to get the edge on them and teabag them. I am not 17 years old, but I get why that niche would like it.

Now, I do realize my i5 will need an upgrade soon, so I was looking at Ryzen, but the loss of clock speed and all the problems right now are holding me back. I have the money, I just don't want the headaches (and I've built over 20 systems). If Ryzen improves in gaming and parts compatibility before Intel releases a new 6- or 8-core or lowers prices significantly, I will go Ryzen; if not, I will go Intel. Either way, my 4.7GHz i5 is more than fast enough with my 1080 to get over 60 fps for now. We will just see if Ryzen delivers sooner than Intel. And anyone thinking they can get over 60 fps at 4K with Ultra settings from an Nvidia 1080, think again. Maybe in old games, but not in nice-looking new ones. That is why I haven't moved to 4K.

Did you watch the video I just linked? The guy was struggling to get over 70 fps because his 4690K wouldn't let him go higher. And that was just with a 1070, and a relatively new i5.

You should probably just buy a 4K monitor then. It would be a lot better than playing at 1080p with extreme multisampling.
 

hotstocks

Member
Jun 20, 2008
81
26
91
I actually have a new Kaby Lake 7700HQ laptop with a 1070 in it, and it is an awesome laptop whether I turn on G-Sync or not; it is always above 60 fps at 1080p, and 60, 90, and 120 appear almost identical. I used to compete semi-pro back in the Halo 3 days, so I know what people are getting at for that little extra edge; hell, I even used a keyboard/mouse attachment on Xbox before I got better with the pad. But now I like to play more single-player games that look nice and play well, like Witcher 3. I have no desire to play multiplayer Witcher 3 where 20 witchers go around killing each other as fast as possible. But I digress; all of this debate is about whether something makes a HUGE difference or a barely noticeable one. When I went from the fastest hard drive (VelociRaptor) to my first SSD, it was a HUGE difference. The laptop I just told you about came with a 256GB Samsung 950 NVMe and a 1TB HDD. Well, I just wanted one large drive for everything and had a 512GB Samsung 850 Pro lying around. I cloned the 950, took out both drives, and installed the re-imaged 850 Pro. Guess what: yeah, I know the 950's synthetic benchmarks blow the 850 Pro out of the water, but there is NO noticeable difference in the user experience. Humans are too slow to tell the difference between those drives (unless you are reading and writing to the same drive continuously). So the 256GB NVMe stick is sitting in the box. Believe me, if it was faster I would be using it; it may be slightly faster, but not worth being half the size. Same thing once you are over 60 fps in normal games (non-pro competition): almost no one can tell 60 from 80, or 120 fps for that matter. But everyone can tell that a game on ULTRA graphics settings looks a hell of a lot prettier than on Normal or Low settings.

P.S. In reply to the previous post, no, I will not get a 4K monitor yet, because as I said, new games, or even graphically intensive ones like The Witcher or anything with shadows and HairWorks, cannot be played at ULTRA settings even with an Nvidia 1080 at 4K; they will drop below 60 fps, and that is where I notice a HUGE difference. Playing at say 47 fps in 4K looks stuttery and like crap, but at 60 fps it is butter smooth. After 60 fps you get diminishing returns, and anything below 60 fps is unacceptable, hence no 4K monitor for me YET.
 
Last edited:

Dante80

Junior Member
Jun 3, 2009
8
14
81
With regards to gaming reviews.

The only way to test a processor for gaming is to provide a (partly unrealistic, OK) CPU bottleneck. This is done by removing AA and lowering the resolution while keeping the other graphical settings up. Thus, your CPU has to work overtime to keep up with the huge number of frames that the GPU is providing in such a setup.

The relevance of such a test for real-life scenarios goes as follows: a CPU that performs markedly worse in such an environment will have reduced longevity for the consumer, since gamers tend to change GPUs faster than CPUs, and a future upgrade together with more demanding games will introduce a bottleneck (and worse real-life performance) sooner.

Having said that, there is also a benefit for the consumer in seeing gaming tests in real-life scenarios, simply because these will be his own experience if he purchases the product. For this reason, I value reviews that show performance in both scenarios. Remember, gaming reviews are in the end trying to extrapolate holistic gaming performance by focusing on the specific item being rated, be that a monitor, a GPU, a CPU, etc. What you get now by using this or that is still valuable info, since PC gamers use many different resolutions, aspect ratios, monitor refresh rates, etc.

At the same time, there has been an argument going around concerning gaming for some time now (I think since the Bulldozer architecture was introduced). It goes like this: "as time passes, game engines and APIs take advantage of more parallelization, thus a good performer right now is not as valid as we might think, unless it has the spare resources (that current games don't utilize and future games will) to be future-proof". This is partly a loaded argument, for two reasons.

1. The guys who really value gaming performance and stay at the enthusiast end of the market closely follow the changes in both software and hardware. The whole idea of futureproofing is moot for someone whose main task is gaming and who is regularly changing a $700 GPU every year or so. Those people will simply buy whatever gives them the best performance at any given moment.

2. The other (non-enthusiast) parts of the gaming market, which btw make up the majority of it as far as software sales are concerned, use much more modest hardware (take a look at the Steam averages for resolution, CPUs, GPUs, etc. to understand what I am talking about). Also, they change hardware much more slowly than the enthusiast segment. The software providers know this very well, and the rate of adoption of tech that destroys their own client base's ability to pay them is, not surprisingly, much slower than the "future is now, your expensive four cores will stutter in a year or so" crowd wants to admit.

Somewhere inside the whole debacle there is an additional argument about the software market being driven by the consoles, which have that many cores, so in the future the games will... but I'm really tired of typing now.
 

Agent-47

Senior member
Jan 17, 2017
290
249
76
1. The guys who really value gaming performance and stay at the enthusiast end of the market closely follow the changes in both software and hardware. The whole idea of futureproofing is moot for someone whose main task is gaming and who is regularly changing a $700 GPU every year or so. Those people will simply buy whatever gives them the best performance at any given moment.

2. The other (non-enthusiast) parts of the gaming market, which btw make up the majority of it as far as software sales are concerned, use much more modest hardware (take a look at the Steam averages for resolution, CPUs, GPUs, etc. to understand what I am talking about). Also, they change hardware much more slowly than the enthusiast segment. The software providers know this very well, and the rate of adoption of tech that destroys their own client base's ability to pay them is, not surprisingly, much slower than the "future is now, your expensive four cores will stutter in a year or so" crowd wants to admit.

.

1. A valid argument for GPUs, yes. Not so much for CPUs, where advancement is slow and GPUs are still the major bottleneck, so future-proofing is easy with a CPU.
2. I will overlook the contradiction between your second point and your first, as I agree with you here. But let me expand on it, as I don't agree with the conclusion. One such segment is console gaming, which is a much larger market powered by weak 8-core CPUs that are not fully utilized. So to keep up with increasing graphics demands, developers will increasingly use more threads, as we are already seeing in all major titles from 2016. This is a case where software development is not constrained by the hardware's threading capability on the modern consoles.
 
Reactions: unseenmorbidity

unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
You make strange purchasing decisions. You have an old i5 and a $700 GTX 1080, and now a Kaby Lake laptop with a 1070.
With regards to gaming reviews.

The only way to test a processor for gaming is to provide a (partly unrealistic, OK) CPU bottleneck. This is done by removing AA and lowering the resolution while keeping the other graphical settings up. Thus, your CPU has to work overtime to keep up with the huge number of frames that the GPU is providing in such a setup.

The relevance of such a test for real-life scenarios goes as follows: a CPU that performs markedly worse in such an environment will have reduced longevity for the consumer, since gamers tend to change GPUs faster than CPUs, and a future upgrade together with more demanding games will introduce a bottleneck (and worse real-life performance) sooner.

Having said that, there is also a benefit for the consumer in seeing gaming tests in real-life scenarios, simply because these will be his own experience if he purchases the product. For this reason, I value reviews that show performance in both scenarios. Remember, gaming reviews are in the end trying to extrapolate holistic gaming performance by focusing on the specific item being rated, be that a monitor, a GPU, a CPU, etc. What you get now by using this or that is still valuable info, since PC gamers use many different resolutions, aspect ratios, monitor refresh rates, etc.

At the same time, there has been an argument going around concerning gaming for some time now (I think since the Bulldozer architecture was introduced). It goes like this: "as time passes, game engines and APIs take advantage of more parallelization, thus a good performer right now is not as valid as we might think, unless it has the spare resources (that current games don't utilize and future games will) to be future-proof". This is partly a loaded argument, for two reasons.

1. The guys who really value gaming performance and stay at the enthusiast end of the market closely follow the changes in both software and hardware. The whole idea of futureproofing is moot for someone whose main task is gaming and who is regularly changing a $700 GPU every year or so. Those people will simply buy whatever gives them the best performance at any given moment.

2. The other (non-enthusiast) parts of the gaming market, which btw make up the majority of it as far as software sales are concerned, use much more modest hardware (take a look at the Steam averages for resolution, CPUs, GPUs, etc. to understand what I am talking about). Also, they change hardware much more slowly than the enthusiast segment. The software providers know this very well, and the rate of adoption of tech that destroys their own client base's ability to pay them is, not surprisingly, much slower than the "future is now, your expensive four cores will stutter in a year or so" crowd wants to admit.

Somewhere inside the whole debacle there is an additional argument about the software market being driven by the consoles, which have that many cores, so in the future the games will... but I'm really tired of typing now.
This is patently false. I have explained this earlier, so I won't go into details, but 720p testing is a draw-call benchmark that is in no way indicative of current or future gaming performance. It would be like pointing to Cinebench and saying it's a good benchmark for gaming performance.

Furthermore, the 7700K is obviously not more futureproof than the 1800X.

If you want to buy a new system every 2 years, then sure, get the 7700K. The average Joe is better off buying a futureproof CPU that will last the better part of a decade and has a solid upgrade path, though.

Go play multiplayer BF1. Your 4-core will stutter now, never mind a couple of years from now.
 
Last edited:

hotstocks

Member
Jun 20, 2008
81
26
91
Not really strange at all. I bring my laptop to work every day, where I have a lot of downtime. I had a 2-year-old laptop with a 970M in it. When I buy a laptop I want the best (within reason), as I know it will have to last 2-4 years. I don't mind spending the money on a 7700HQ/1070 laptop at 1080p because I KNOW it will last at least a few years. I bought 3 different models with the same specs and returned two of them: one died in 2 months, the other ran at over 80C. The best 15" gaming laptop is the MSI Dominator by far; it beats Asus, Gigabyte, and even Clevo/Sagers for running cool and reliable as well as overall build. When going to dinner with the wife is $100 or with the family is $200, well, I would rather skip a few dinners and have something I really want that is not gone in an hour. I also don't go on $2000 vacations that are gone in a week. I would rather have a new laptop and build a new 6- or 8-core desktop; I'm still waiting on Ryzen to not be broken or Intel to come down in price. I am not cheap, but I am not stupid about my purchases; I'm not gonna buy a $1000 CPU unless it was like 10 times as fast as my current one.
 

unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
Not really strange at all. I bring my laptop to work every day, where I have a lot of downtime. I had a 2-year-old laptop with a 970M in it. When I buy a laptop I want the best (within reason), as I know it will have to last 2-4 years. I don't mind spending the money on a 7700HQ/1070 laptop at 1080p because I KNOW it will last at least a few years. I bought 3 different models with the same specs and returned two of them: one died in 2 months, the other ran at over 80C. The best 15" gaming laptop is the MSI Dominator by far; it beats Asus, Gigabyte, and even Clevo/Sagers for running cool and reliable as well as overall build. When going to dinner with the wife is $100 or with the family is $200, well, I would rather skip a few dinners and have something I really want that is not gone in an hour. I also don't go on $2000 vacations that are gone in a week. I would rather have a new laptop and build a new 6- or 8-core desktop; I'm still waiting on Ryzen to not be broken or Intel to come down in price. I am not cheap, but I am not stupid about my purchases; I'm not gonna buy a $1000 CPU unless it was like 10 times as fast as my current one.

No offense, but you don't seem to make wise tech purchasing decisions....
 

sirmo

Golden Member
Oct 10, 2011
1,014
391
136
With regards to gaming reviews.
The only way to test a processor for gaming is to provide a (partly unrealistic, OK) CPU bottleneck. This is done by removing AA and lowering the resolution while keeping the other graphical settings up. Thus, your CPU has to work overtime to keep up with the huge number of frames that the GPU is providing in such a setup.
I am totally fine with this test as a datapoint. Where I have a problem is when it's not contrasted with the benefit of more cores.

The question isn't "[faster theoretical draw calls] or [slower theoretical draw calls]?" when you pit a 7700K vs Ryzen 7. The correct question is: "[faster theoretical draw calls] or [more cores / higher minimum frames / multithreaded future-proofing]?".

Many reviewers have failed to present this choice to the reader/viewer. And I find that disingenuous.
 

DrMrLordX

Lifer
Apr 27, 2000
21,815
11,171
136
Why do you think hexa-core Coffee Lake would be cheap? It still has a GPU, and if they price it at $350 they have to push down all desktop prices, and it's still on 14nm, so that would be a problem for margins. Maybe if it has a very small GPU it could work out for Intel, but that's less than ideal. We shall see how CFL is and how Zen+ is...

Intel will push down all the desktop prices. 14nm cost per transistor should drop as time goes by, mitigating the impact of using more wafers to provide the same amount of product to the consumer.

Intel is basically stuck. They can't do it any other way.
 

guachi

Senior member
Nov 16, 2010
761
415
136
This is anecdotal, but I swear the 1700X feels smoother for me when playing BF1 and Siege than my i7 3770K. Maybe it's placebo, but I sometimes used to get stuttering on my OCed 3770K even when frame rates were high, and Ryzen seems to have eliminated that for me. (I have it on a second PC with a GTX 980 and a new Crosshair mobo from Asus.)

Do you often have other things taking CPU cycles in the background? Might the extra cores be helping you?
 
Reactions: DarkKnightDude

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Testing at 1080p would have caused no controversy if the reviewers had framed it differently (as testing gaming performance at 1080p, instead of trying to 'bottleneck' the CPU). It all started with some YouTuber claiming that AMD requested they test at 4K. This naturally led to an AMD witch hunt, even though the request was only that they include 4K benchmarks to give the full scope of the chip, since there are more than just 1080p gamers out there. And it's been blown out of proportion ever since.
Ryzen is just fine at 1080p, and performance is only going to get better; that's virtually guaranteed. I don't think we can say that about quad cores. There's nowhere to go but wider and smarter. Frequency has peaked and node shrinks will become commercially unviable. AMD landed 8 high-performance cores right into a high-volume market. This is new. Considering devs have extensive experience coding for 8 cores on PS4 and XB1, I'd imagine optimizations will be making it to desktop sooner rather than later. Interestingly, the PS4 also has a split L3, 1 per 4 cores.
 
Last edited:

imported_jjj

Senior member
Feb 14, 2009
660
430
136
With regards to gaming reviews.

The only way to test a processor for gaming is to provide a (partly unrealistic, OK) CPU bottleneck. This is done by removing AA and lowering the resolution while keeping the other graphical settings up. Thus, your CPU has to work overtime to keep up with the huge number of frames that the GPU is providing in such a setup.

The relevance of such a test for real-life scenarios goes as follows: a CPU that performs markedly worse in such an environment will have reduced longevity for the consumer, since gamers tend to change GPUs faster than CPUs, and a future upgrade together with more demanding games will introduce a bottleneck (and worse real-life performance) sooner.

Having said that, there is also a benefit for the consumer in seeing gaming tests in real-life scenarios, simply because these will be his own experience if he purchases the product. For this reason, I value reviews that show performance in both scenarios. Remember, gaming reviews are in the end trying to extrapolate holistic gaming performance by focusing on the specific item being rated, be that a monitor, a GPU, a CPU, etc. What you get now by using this or that is still valuable info, since PC gamers use many different resolutions, aspect ratios, monitor refresh rates, etc.

At the same time, there has been an argument going around concerning gaming for some time now (I think since the Bulldozer architecture was introduced). It goes like this: "as time passes, game engines and APIs take advantage of more parallelization, thus a good performer right now is not as valid as we might think, unless it has the spare resources (that current games don't utilize and future games will) to be future-proof". This is partly a loaded argument, for two reasons.

1. The guys who really value gaming performance and stay at the enthusiast end of the market closely follow the changes in both software and hardware. The whole idea of futureproofing is moot for someone whose main task is gaming and who is regularly changing a $700 GPU every year or so. Those people will simply buy whatever gives them the best performance at any given moment.

2. The other (non-enthusiast) parts of the gaming market, which btw make up the majority of it as far as software sales are concerned, use much more modest hardware (take a look at the Steam averages for resolution, CPUs, GPUs, etc. to understand what I am talking about). Also, they change hardware much more slowly than the enthusiast segment. The software providers know this very well, and the rate of adoption of tech that destroys their own client base's ability to pay them is, not surprisingly, much slower than the "future is now, your expensive four cores will stutter in a year or so" crowd wants to admit.

Somewhere inside the whole debacle there is an additional argument about the software market being driven by the consoles, which have that many cores, so in the future the games will... but I'm really tired of typing now.


Low res favors high clocks; real gaming is shifting, and already the 6900K is better than the 7700K. In real gaming, past a certain point you are not ST-bound, with some exceptions, but you can be MT-bound in games that do scale. The problem is that it's like the 'Android doesn't scale' myth: folks take it as true and can't be bothered to check.

Enthusiasts are not the ones that spend a lot on poor value; that's what research companies think, but it's wrong and insulting. Enthusiasts are the ones that care, and that's about all that is needed, even if they are dirt poor or... rational consumers.

As for future-proofing, ST doesn't scale, so to gain access to more CPU resources devs must go to more cores, and that's nothing new. One of the reasons Intel has limited the number of cores is to upsell gamers and force them to pay more, but that is over.

Anyway, that doesn't mean there isn't something less than ideal with Ryzen for now, but if things get better and perf gets closer to the 6900K, it would become great for gaming too.
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
Intel will push down all the desktop prices. 14nm cost per transistor should drop as time goes by, mitigating the impact of using more wafers to provide the same amount of product to the consumer.

Intel is basically stuck. They can't do it any other way.

I don't think yields are poor enough for cost to get much better.
Cycle time will be reduced a bit, but depreciation (and that's the bulk of the cost) won't come down.
They are also ramping 10nm, and it's already delayed, with volume shipments next year; it remains to be seen if it ramps normally or has problems like 14nm did, but it's a drag on costs.
They will also have lower volumes and revenue if AMD gains any share at all, and since AMD has only a 3% revenue share, they will gain quite a bit and pressure volumes and ASPs in laptop too. The point being that lower volume decreases utilization, so costs go up for Intel.
Maybe they have 14nm++, but no idea how that impacts costs, as there are zero details on such a process, if it exists.

All in all, I believe it will depend on how Zen+ clocks and whether AMD can launch it a year from now, as opposed to 1.5 years from now.
Without that, they might go $399 and $499 for 6 cores, but they won't go lower unless Zen+ forces them to.
They'll focus marketing on ST and try to exploit software that doesn't fit Zen. It will be funny how some synthetic benchmarks change behavior in a fundamental way soon, and it will work, as reviewers are too lazy not to use a lot of synthetic benches. AVX2 will be very popular this year lol
 
Last edited:

Udgnim

Diamond Member
Apr 16, 2008
3,664
111
106
This is anecdotal, but I swear the 1700X feels smoother for me when playing BF1 and Siege than my i7 3770K. Maybe it's placebo, but I sometimes used to get stuttering on my OCed 3770K even when frame rates were high, and Ryzen seems to have eliminated that for me. (I have it on a second PC with a GTX 980 and a new Crosshair mobo from Asus.)

https://www.computerbase.de/2017-03.../#diagramm-battlefield-1-dx11-multiplayer-fps

Too bad most reviewers don't test more relevant CPU-limited scenarios, such as multiplayer in current games, due to the inability to do consistent runs.

Personally, I've been curious whether someone could load up something like BF1 multiplayer on two PCs with different CPUs, just follow each other around, and compare numbers while they are next to each other.
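For what it's worth, here is a minimal sketch of how such a side-by-side run could be scored, assuming each PC captures per-frame times to a CSV with a PresentMon/OCAT-style "MsBetweenPresents" column in milliseconds (the file names and column name are hypothetical placeholders):

```python
# Compare average FPS and 1% lows from two frametime logs (illustrative sketch).
# Assumes each CSV has a column of frame times in milliseconds; adjust the
# column name to whatever your capture tool actually writes.
import csv

def load_frametimes_ms(path, column="MsBetweenPresents"):
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def summarize(frametimes_ms):
    ordered = sorted(frametimes_ms)                               # worst frames last
    avg_fps = 1000.0 / (sum(ordered) / len(ordered))
    one_pct_low_fps = 1000.0 / ordered[int(len(ordered) * 0.99)]  # 99th-percentile frametime
    return avg_fps, one_pct_low_fps

for label, path in (("Ryzen rig", "ryzen_bf1.csv"), ("Intel rig", "intel_bf1.csv")):
    avg, low = summarize(load_frametimes_ms(path))
    print(f"{label}: avg {avg:.1f} fps, 1% low {low:.1f} fps")
```

Even then, two multiplayer runs are never identical, so the numbers would only be indicative, which is exactly the consistency problem mentioned above.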
 