AMD will launch AM4 platform in March 2016 says industry source


AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
DX12 won't do diddly squat for AMD's single core problems in games.

This sounds like the same pie-in-the-sky nonsense as when AMD's APUs were gonna use their GPU to destroy Intel processors in FP applications.

How did that "revolution" work out for AMD?

It has been established with Mantle that, even at default clocks, the FX gains tremendously and closes the gap in games vs the Intel Core CPUs.

DX-12 performance will be very close to Mantle.
 

Abwx

Lifer
Apr 2, 2011
11,172
3,869
136
The FX's main problems in MT (games with just a few threads and DX11) are also a result of its now comparatively low ST performance, since MT just scales that up. The rest gets lost to low power efficiency under a TDP cap.

With games, FX core affinity must be set to 0 2 4 6 1 3 5 7; if it's set to 0 1 2 3 4 5 6 7, performance will collapse by the MT penalty, which is 10-20% in integer apps.

Ideally the main thread should be given a non-shared module as long as the thread count is less than 8; if the main thread is on core 0, then core 1 should be left unused as much as possible.

If this kind of management is not implemented, it's likely that the results will mimic those of PClab...
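For what it's worth, here is a minimal sketch of that kind of pinning on Windows, assuming an 8-core FX where cores {0,1}, {2,3}, {4,5}, {6,7} pair up into shared modules; the masks and the one-heavy-main-thread scenario are purely illustrative, not something a real game ships:

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // Give the heavy main thread core 0 to itself (mask bit 0).
    SetThreadAffinityMask(GetCurrentThread(), DWORD_PTR(1) << 0);

    // Keep the whole process off core 1 (core 0's module partner), so the
    // main thread avoids the shared-module penalty: cores 0,2,3,4,5,6,7.
    DWORD_PTR procMask = 0xFF & ~(DWORD_PTR(1) << 1);
    if (!SetProcessAffinityMask(GetCurrentProcess(), procMask))
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
    return 0;
}
```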
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
HBM1 won't be faster than it is. Overclocking doesn't count. HBM2 will double the clock and still be far behind in latency.

And no, it's still not better latency than DDR2/3/4. You really need to look up latencies on DDR1/2/3/4 for more than just the first word - also the 4th and 8th word in the transfer.
Do you assume that because of the individual memory bus width? I've seen mentions of a 2x128b bus per stacked die. In that case it should be fine. The metric you criticized is BW/channel.

Otherwise - did you see worse numbers than these?

See: http://www.extremetech.com/gaming/204668-amds-upcoming-fiji-gpu-will-feature-new-memory-interface
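To make the first-word vs. 8th-word point concrete, here is a back-of-envelope sketch. DDR3-1600 CL11 is a standard JEDEC bin, but treat the exercise as illustrative rather than a datasheet comparison with HBM (whose timings aren't assumed here):

```cpp
#include <cstdio>

// ns until the Nth word of a read burst arrives: CAS latency (in cycles
// of the I/O clock) plus (N-1) beats at the data rate.
double ns_to_word(double cas_cycles, double io_clock_mhz,
                  double data_rate_mts, int n) {
    double first_word = cas_cycles / io_clock_mhz * 1000.0;
    return first_word + (n - 1) * (1000.0 / data_rate_mts);
}

int main() {
    int words[] = {1, 4, 8};
    for (int n : words)  // ~13.8 ns, ~15.6 ns, ~18.1 ns
        std::printf("DDR3-1600 CL11, word %d: %.2f ns\n",
                    n, ns_to_word(11, 800, 1600, n));
    return 0;
}
```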
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
DX12 doesn't solve the ST issue. Current DX12 tests show the exact same behavior as always.
I have only seen benches of two DX12 games, which is not enough of a sample size to prove anything. So my assumptions are mostly based on the known changes in DX12 MT behaviour.

So if you've seen at least 10 DX12 games, we could probably find some where the bottleneck is one heavy game thread. The small differences between DX11 and DX12 at the different settings on Vishera point in that direction. That same thread might be the limiter on the i3, especially as such a thread would effectively run at ~70% throughput due to HyperThreading.
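That ~70% figure is just SMT arithmetic: if two HyperThreading siblings on one core deliver a combined ~1.4x the throughput of a single thread (a ballpark assumption here, not a measurement), each sibling gets about 70% of a dedicated core:

```cpp
#include <cstdio>

int main() {
    // Assumed, not measured: two HT siblings together deliver ~1.4x the
    // throughput of one thread running alone on the core.
    double smt_combined = 1.4;
    double per_thread = smt_combined / 2.0;  // each sibling's share
    std::printf("per-thread throughput: ~%.0f%% of a dedicated core\n",
                per_thread * 100.0);
    return 0;
}
```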
 
Last edited:

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
With games, FX core affinity must be set to 0 2 4 6 1 3 5 7; if it's set to 0 1 2 3 4 5 6 7, performance will collapse by the MT penalty, which is 10-20% in integer apps.

Ideally the main thread should be given a non-shared module as long as the thread count is less than 8; if the main thread is on core 0, then core 1 should be left unused as much as possible.

If this kind of management is not implemented, it's likely that the results will mimic those of PClab...
Wasn't that one of the important changes in the different Windows schedulers?
 

Abwx

Lifer
Apr 2, 2011
11,172
3,869
136
Wasn't that one of the important changes in the different Windows schedulers?

It has been discussed here by a member who has an FX.
Supposedly the scheduler should manage this kind of issue, but I can't tell you for sure since I do not own this CPU; perhaps some members could bring some clues.
 

DrMrLordX

Lifer
Apr 27, 2000
21,813
11,168
136
Hynix only sells one HBM1 product and has no plans for anything else.

Exactly. They're moving to HBM2.


Yes.

Or does his "disagreement" come with certain requirements?

Talk about vague . . .

He couldn't be more vague.

See above.

But again, he is there to sell AMD products.

HBM/HBM2 are Hynix products, and AMD will not be the only one using them.

So it's not a surprise. When was the last time we could trust such a guy (corporate VP of any company with X position)? The right answer is never.

So lacking any evidence to the contrary, the only recourse is ad hominem?

So not sure where you get your numbers from.

https://www.dropbox.com/s/6xl5j4gduvo64jp/aida64latency.png?dl=0
 

Vortex6700

Member
Apr 12, 2015
107
4
36
(Of course, it could offer twice the performance of Skylake and would still be deemed a disaster by about half the people on this forum).

True.

I can think of a few mods who would be out locking all benchmark result threads as flame bait.

Mod callouts are not allowed.
Markfw900
 
Last edited by a moderator:
Aug 11, 2008
10,451
642
126
It has been established with Mantle that, even at default clocks, the FX gains tremendously and closes the gap in games vs the Intel Core CPUs.

DX-12 performance will be very close to Mantle.

Established how? With 2 DX12 "games" (basically demos)?
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
2 demos. And then devs will use the CPU power for something else, leaving AMD in the hole again.
 

coercitiv

Diamond Member
Jan 24, 2014
6,403
12,864
136
And then devs will use the CPU power for something else, leaving AMD in the hole again.
You get rid of a thread that is limiting performance based on how strong an individual CPU core is, while other cores may or may not be significantly loaded.

The beauty of DX12 (or other similar techs) is that this "monster" thread disappears: CPU cores are loaded evenly as far as graphics rendering is concerned. What developers do afterwards with regards to game engine requirements is another matter entirely, but this is a very important milestone.
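As a rough illustration of how that "monster" thread disappears: each worker records its own command list in parallel, and one cheap submit replaces the serialized driver thread. This sketch uses plain std::thread with placeholder types - not the real D3D12 API:

```cpp
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

struct CommandList { std::vector<int> cmds; };  // stand-in for a GPU command list

// Each worker records draws for its slice of the scene independently.
void record(CommandList& cl, int first, int count) {
    for (int i = 0; i < count; ++i)
        cl.cmds.push_back(first + i);  // stand-in for "record one draw call"
}

int main() {
    const int workers = 4, draws_per_worker = 1000;
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;
    for (int w = 0; w < workers; ++w)
        pool.emplace_back(record, std::ref(lists[w]),
                          w * draws_per_worker, draws_per_worker);
    for (auto& t : pool) t.join();

    // The single "submit" is trivially cheap on the CPU.
    std::size_t total = 0;
    for (const auto& cl : lists) total += cl.cmds.size();
    std::printf("submitted %zu draws recorded on %d threads\n", total, workers);
    return 0;
}
```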

Also, this is vendor agnostic: it benefits Intel just as much in absolute terms. In relative terms we'll see higher gains from CPUs with few strong cores (say Pentium, i3) or more weaker cores (small-core quads from both vendors, construction CPUs with more than 1 module). This is literally a free lunch for everybody, especially in thermally constrained environments (mobile, SFF), yet inherent brand bias compels us to downplay its benefits for the ... "competition".

PS: it's kind of ironic that you and AtenRa agree on something - he argues CPU requirements will actually increase in modern DX12 games, although with some flexibility towards the user (enable/disable features to accommodate a wide range of CPUs).
 
Last edited:

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I agree DX12 may be a good thing, I just think it's hysterical that the ADF is calling it AMD's savior. It's always the same with them: just wait, the next version will be good.

In the meantime my mid-range 6-year-old CPU is as fast as AMD's current mid-range CPU, with a third fewer cores and a third less power consumption.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Established how? With 2 DX12 "games" (basically demos)?

I said established with Mantle. With Mantle, the performance of FX CPUs has substantially closed the gap to their Intel counterparts. The same will happen with DX-12. As coercitiv explained above, the need for a strong single thread will no longer play that big a role in game performance, and the further developers push DX-12's capabilities, the more threads they will need, not higher single-core perf.
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
I agree DX12 may be a good thing, I just think it's hysterical that the ADF is calling it AMD's savior. It's always the same with them: just wait, the next version will be good.
It's not a savior. It's just likely that DX12 is the API AMD had more influence on, due to its work on Mantle and the XB1 API.

AMD's DX12 drivers also work much better than their DX11 counterparts. I'm not convinced that AMD would still put a lot of effort into the older API implementation to improve its MT performance like Nvidia did.

There might also be a different lever here: console versions causing developers to distribute core loads evenly, with an already efficiently distributed API.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
It's not a savior. It's just likely that DX12 is the API AMD had more influence on, due to its work on Mantle and the XB1 API.

AMD's DX12 drivers also work much better than their DX11 counterparts. I'm not convinced that AMD would still put a lot of effort into the older API implementation to improve its MT performance like Nvidia did.

I think DX12 will have much less relevance on the gaming desktop than in notebook and mobile gaming, where DX12 has the potential to make a big impact.

Given that AMD is a nobody in these markets and will be for the foreseeable future, I don't think AMD actually pushed Microsoft in this direction. It would be handing over the market to Intel and Nvidia big time.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
I think DX12 will have much less relevance on the gaming desktop than in notebook and mobile gaming, where DX12 has the potential to make a big impact.

Given that AMD is a nobody in these markets and will be for the foreseeable future, I don't think AMD actually pushed Microsoft in this direction. It would be handing over the market to Intel and Nvidia big time.

Nah, DX12 will have a big impact on desktop, since most gamers are not rocking the latest massively overclocked Core i7s which are common on enthusiast forums like this.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Nah, DX12 will have a big impact on desktop, since most gamers are not rocking the latest massively overclocked Core i7s which are common on enthusiast forums like this.

Not a lot of people are running AMD processors either, and anyone with an SNB i5 or above shouldn't get any massive improvements with DX12.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Not a lot of people are running AMD processors either, and anyone with an SNB i5 or above shouldn't get any massive improvements with DX12.

Lots of people don't overclock - I know more gamers running Core i3 chips or very low-end, low-clocked Core i5 chips and so on than Core i5 K or Core i7 K series chips (plus it's not that uncommon to even see a few people with FX-based rigs here in the UK). Plus there are still plenty of people on older-generation chips too.

Also, don't conflate the US market with the rest of the world, as not everyone has dirt-cheap special deals where they throw in massive rebates and free motherboards.

Plus there are games which are bottlenecked by the single-threaded performance of even SB and IB chips, especially if you don't overclock, which I suspect is most gamers out there. I should know, having a Core i7 3770-level CPU (Xeon E3) myself.

Some games like ARMA3 are ridiculously bottlenecked by a single thread, to the extent that even decent rigs still get rubbish performance, and just throwing hardware at it has not really been practical.

So yes, it is going to make a big difference on desktop eventually, especially with the next two generations of AMD and Nvidia cards, which are no doubt going to be much faster and have far more complete DX12 support.

Edit to post.

People said the same thing about DX11 too - just look at what happened when Blizzard implemented it in WoW. It actually improved framerates on average over DX9.

It happened roughly a year after DX11 was released too:

http://www.tomshardware.co.uk/world-of-warcraft-cataclysm-directx-11-performance,review-32061-7.html

Blizzard managed to add support to an old engine via an update. So DX12 support in a game like WoW would be significant for playability, especially during raiding.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Let's look at the current DX12 game betas. Does an i3 beat an 8-core FX? It certainly does, by far. Is this any different from DX11? It certainly isn't.

Good, now drop the lame DX12 excuse as some kind of savior and move on.

And it's not helping AMD cards either...

The train departed and the ADF was left standing at the station.
 
Last edited:

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Let's look at the current DX12 game betas. Does an i3 beat an 8-core FX? It certainly does, by far. Is this any different from DX11? It certainly isn't.

Good, now drop the lame DX12 excuse as some kind of savior and move on.

And it's not helping AMD cards either...

The train departed and the ADF was left standing at the station.

Umm, plus why are you trying to make this some AMD vs Intel thing in some lame way?? I have not had an AMD CPU in my system for like a decade, and the last AMD card I had was nearly three years ago.

This is why this forum is dying a death now.

E-PEEN. E-PEEN never changes.

Also LMAO, these are the same bugged benchmarks where people were arguing forever about async shaders, with the same dev saying they were still working on improving things.

When the initial benchmarks hit, people said Nvidia was doomed. When the benchmarks and drivers were updated, it was soon AMD that was doomed.

So, if you think DX12 is pointless, are you not going to run any games in DX12 then??

You seem terrified of DX12 actually improving things for PC gamers.

Plus, everything Intel has been talking about is making single-thread bottlenecks less important, but obviously Intel knows less about it than you, and for some reason you want DX12 to fail.

Is it because hardware enthusiasts like you are terrified that once DX12 gets into full swing, someone with a cheaper CPU will be able to match you in games? The E-PEEN factor comes into play, so anything which changes the status quo gets burned with fire and turned into a brand war.

Plus what you are saying massively contradicts what is shown in those graphs - at least on the AMD side of things, this bugged benchmark seems to be improving things, weirdly enough.

Personally I think you will be eating your hat in the next year or two once DX12 games become more common.

But then you have the latest Core i7, so I expect you are hedging your bets on games being more multi-threaded anyway!
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Seems you lost focus in your anger. DX12 won't magically save slow CPUs or fix game logic. And whatever resources are freed up, developers will use just as fast. Do we get better games with DX12? Sure. Do the requirements change? Not at all.

I got my 6700K for the 4GHz base clock since I don't OC, and it's the fastest 1-4 thread CPU there is. What does that tell you?
 
Last edited:

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Also, if anyone looked at those charts, the Nvidia cards are having performance regressions under DX12. At this point it's pointless to take anything from the Ashes benchmark at all.

Next year plenty of DX12 games will be released, and I expect all of them to see improved performance over their DX11 render paths.

I would rather believe what Intel and every other industry source has said about DX12, which is that it reduces CPU bottlenecks.

Seems you lost focus in your anger. DX12 won't magically save slow CPUs or fix game logic. And whatever resources are freed up, developers will use just as fast. Do we get better games with DX12? Sure. Do the requirements change? Not at all.

You lost your focus in your anger - you seem to want DX12 to fail, and then you post some weird graph which shows Nvidia performance degrading under DX12 and AMD cards having 30% jumps in performance with a Core i3, which indicates it's neither here nor there as a benchmark.

I got my 6700K for the 4GHz base clock since I don't OC. What does that tell you?

It tells me that instead of getting a cheap pre-overclocked G3258 bundle, which would not be far off in performance, you went for a Core i7 since it has 8 threads and you hedged your bets on games being better threaded.

All the people detracting from DX12 magically seem to have Core i5s and Core i7s, often running at high clock speeds. The irony.

Plus you could have always got a Core i3 6320. 3.9GHz base clock speed.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Nobody ever claimed it won't reduce the CPU bottleneck for rendering. However, to think developers will just leave that freed-up resource untapped is silly. And we can see they use it. It's no different from all the other times we had API overhead reductions. DX10 gave a 20% reduction, and it was used right away.

It tells me that instead of getting a cheap pre-overclocked G3258 bundle, which would not be far off in performance, you went for a Core i7 since it has 8 threads and you hedged your bets on games being better threaded.

All the people detracting from DX12 magically seem to have Core i5s and Core i7s, often running at high clock speeds. The irony.

And that's coming from an FX user. Ironic indeed. Just buy a proper CPU next time, then you won't feel left behind.
 
Last edited:

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
And that's coming from an FX user. Ironic indeed. Just buy a proper CPU next time, then you won't feel left behind.

I don't have an FX - what are you smoking?? Are you confusing me with someone else??

I bought a Core i3 2120 to replace a dead Q6600-based Shuttle a few years ago as a stop-gap, and then bought a Xeon E3 1230 V2 two years ago. I have not had an AMD-based rig for years, especially being an SFF PC fan. The last ATX/mATX rig I had was a decade ago, an old Athlon-based one. I've only had Shuttles and mini-ITX rigs since then, and AMD sucked for motherboard choice anyway, with lagging chipset implementations, and the TDP of many of their chips meant it would be absurd to even try to plonk them in one.
 
Last edited:

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
Let's look at the current DX12 game betas. Does an i3 beat an 8-core FX? It certainly does, by far. Is this any different from DX11? It certainly isn't.
Moving the goalposts? I know - global warming prevented building a snowman instead. Before jumping to conclusions, it would help to check how many cores the game utilizes and what the next heaviest thread after gfx is doing in terms of CPU utilization.

And it's not helping AMD cards either...
With faster CPUs you can clearly see ~50% and ~80% improvements depending on detail settings. As long as Nvidia's already better-multithreaded drivers don't work on AMD cards, it would be stupid to say that DX12 won't help AMD cards, based on an arbitrarily chosen CPU bottleneck that hides the effect.
 