thesmokingman
Platinum Member
Another implication is that nVidia was probably aware of this the entire time and has designed compute back into Pascal. If so, Pascal will not see the same leap over Maxwell that Polaris will be getting with GCN.
you really don't know what you are talking about.
http://images.techhive.com/images/a..._of_war_19x10_medium_dx12-100647959-large.png
http://images.techhive.com/images/a...e_rate_high_quality_19x10-100647718-large.png
GoW uses only 2 cores. A true DX12 game will use all cores (as far as possible). I think the AotS bench really hurts Nvidia users, because it's faster than the 980 Ti, right? Oh dear.
^^Psst, links are dead.
Hmm, I can see them. Can you see the images at the bottom?
GoW "scales" up to 8 threads at least. I'm losing 30 FPS with two cores in the benchmark, and with 4 cores all of them are nearly at 100%. With 4c/8t I get 10% more frames than with 4c/4t.
So GoW is a "true" DX12 game. :thumbsup:
It says:
Error 403 Forbidden
Alright, then Crysis 3 is a true DX12 game as well. :thumbsup:
Seems like DX12 is broken on Maxwell and Kepler. But never mind, you can upgrade to precious Pascal for "small" money very soon.
It's not just DX12, RTG is pulling ahead in almost all recent games. GCN architecture is finally starting to pay off it seems.
Why is AotS a "true DX12" game? It doesn't even use CR, ROV, or Tiled Resources Tier 3. :\
A true "DX12 game" would be unplayable on any DX11 hardware.
BTW: The Division uses CR for nVidia's HFTS. Guess this makes it a true DX12 game even without the DX12 API, ha?
A true DX12 game doesn't need to be ported from an old engine! And you're funny. Above 4 cores they're all slower than at 4 cores: 82.7 vs 82.9, wow!!!
That's because those are... wait for it... DX11.3 features, not DX12 ones.
Pretty obvious, since, well, it's not using the DX12 API.
DX12's features are all about lower overhead for better engines, such as async compute.
DX12 doesn't have any unique graphical features; it's purely there for optimization. So trying to say the worst-optimized game released in recent history is a hallmark of DX12 is just ignorant.
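The lower-overhead point is the crux of the whole "true DX12" argument: under DX12 every worker thread can record its own command list and the lists are then submitted together, whereas DX11 funnels submission through one driver thread. A toy Python sketch of that recording model (the names `record_commands` and `build_frame` are illustrative stand-ins, not real D3D12 API):

```python
from concurrent.futures import ThreadPoolExecutor

def record_commands(thread_id, draws_per_thread):
    # Stand-in for filling one command list with draw calls.
    # In real D3D12, each thread writes to its own ID3D12GraphicsCommandList.
    return [f"draw(obj={thread_id}:{i})" for i in range(draws_per_thread)]

def build_frame(num_threads, total_draws):
    """Record command lists on several threads, then 'submit' them in order,
    mimicking how ExecuteCommandLists takes an array of pre-recorded lists."""
    per_thread = total_draws // num_threads
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        lists = list(pool.map(lambda t: record_commands(t, per_thread),
                              range(num_threads)))
    # Submission itself is a single, cheap step at the end of the frame.
    return [cmd for command_list in lists for cmd in command_list]

frame = build_frame(num_threads=4, total_draws=800)
print(len(frame))  # 800 draws recorded across 4 threads
```

This is why the core-scaling numbers quoted above matter: the expensive part (recording) parallelizes, so frame rate can keep improving past 2 cores instead of being capped by one render thread.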
What do you mean? The 980 is getting pounded by the 390, and the only way to get team green on top is to use a heavily OC'd 980 Ti.

Another DX12 game not showing the AotS effect, how surprising.
Let's see:
AMD didn't know Hitman was coming, so no drivers
No Async
Gameworks
Physx
Biased review
Or whatever other excuse for why AMD is slower in a Gaming Evolved/DX12/Async game? :sneaky:
What? Gears of War is true DX12. Rendering scales over 8 threads. That a third-person shooter doesn't show a huge improvement should be clear. But this game doesn't have a single "render" thread.
So it is a true DX12 game.
It is available under DX11.3, too. Fact is, it was announced as a new graphics feature for DX12. Microsoft ported it back to DX11.3 so that developers don't need to use a low-level API for those features.
nVidia is using its NVAPI to make it usable under <DX11.2.
It's not the only purpose of it. The fact that Oxide and IO Interactive don't use nVidia features to improve performance makes it clear that those games are not "true" DX12 games either. But for you, "AC" is the only reason to use DX12, right?
What am I missing? 390 is 15% faster than 980. :\
What do you mean? The 980 is getting pounded by the 390, and the only way to get team green on top is to use a heavily OC'd 980 Ti.
Yep, seems like ShintaiDK and Good_fella are referring to earlier, erroneous, benchmark results.
What am I missing? 390 is 15% faster than 980. :\
What? Gears of War is true DX12. Rendering scales over 8 threads. That a third-person shooter doesn't show a huge improvement should be clear. But this game doesn't have a single "render" thread.
So it is a true DX12 game.
It'll be interesting to see where the 390(X) lands. My money's on another "and this is why the AMD fans didn't jump on the Fury in huge numbers" release.
Yes, in an AMD-sponsored game.
Look, a GTX 980 is much faster than a Fury in Gears of War: http://wccftech.com/amd-radeon-crimson-16-3-drivers/
:sneaky:
Sontin, at this point, after everything you have written on every topic, I don't believe anyone takes what you write seriously.
But that is your own fault.