Ashes of the Singularity User Benchmarks Thread

Page 21

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You said Xbone-exclusive; there are no Xbone games currently making use of ACEs. That is changing soon with Tomb Raider - http://gearnuke.com/rise-of-the-tom...breathtaking-volumetric-lighting-on-xbox-one/

There have only been a handful of PS4 games using async compute, none of which use it on PC.

There are no current PC games (except for Ashes of the Singularity) using async compute, because async compute needs DX12, Vulkan or Mantle (or a console API).

That's why there were no Xbone-exclusive games ported over to the PC destroying Nvidia cards. They don't exist yet.

Thanks for the clarification.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
A GTX 980 Ti can handle both compute and graphics commands in parallel. What it cannot handle is asynchronous compute: the ability for independent units (ACEs in GCN, AWSs in Maxwell/2) to function out of order while handling error correction.

Microsoft demonstrated Asynchronous Compute on a nVidia card...
The Multi-Engine demo is running on my GTX980TI just fine.

It's quite simple if you look at the block diagrams of both architectures. The ACEs reside outside of the Shader Engines. They have access to the Global Data Share cache, the L2 R/W cache pools in front of each quad of CUs, as well as the HBM/GDDR5 memory, in order to fetch commands, send commands, perform error checking or synchronize for dependencies.

The AWSs, in Maxwell/2, reside within their respective SMMs. They may have the ability to issue commands to the CUDA cores residing within their respective SMMs, but communicating or issuing commands outside of their respective SMMs would demand sharing a single L2 cache pool. This caching pool has neither the space (sizing) nor the bandwidth to function in this manner.

What? AWS?! Inside and outside of compute units? I don't get it...

Therefore enabling Async Shading results in a noticeable drop in performance, so noticeable that Oxide disabled the feature and worked with NVIDIA to get the most out of Maxwell/2 through shader optimizations.

It's architectural. Maxwell/2 will NEVER have this capability.

And yet you claimed in the last few days that Async Compute is responsible for the performance drop with DX12.
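The overlap claim being argued in this exchange can be put in a toy timeline model. This is my own illustration with invented millisecond figures, not measured GPU behavior: it only shows why a chip that can drain its graphics and compute queues concurrently finishes sooner than one that must serialize them.

```python
# Toy timeline model (not real GPU code): compares total runtime when a GPU
# can overlap its graphics and compute queues vs. when it must serialize them.
# All durations are made-up numbers for illustration only.

def serialized_time(graphics, compute):
    """GPU drains the graphics queue, then the compute queue."""
    return sum(graphics) + sum(compute)

def overlapped_time(graphics, compute):
    """GPU runs both queues concurrently; wall time is the longer queue."""
    return max(sum(graphics), sum(compute))

graphics = [4, 3, 5]   # ms per draw batch (hypothetical)
compute  = [2, 2, 3]   # ms per compute dispatch (hypothetical)

print(serialized_time(graphics, compute))  # 19 ms total
print(overlapped_time(graphics, compute))  # 12 ms total
```

The gap between the two numbers is the whole argument: if compute work hides behind graphics work, it is nearly free; if it has to wait its turn, every dispatch adds to frame time.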
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
I didn't put all my eggs in one basket. Running a GTX980TI with the 5960x and 2 R9 290s in CF with the 4790k!:thumbsup:
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Microsoft demonstrated Asynchronous Compute on a nVidia card...
The Multi-Engine demo is running on my GTX980TI just fine.

What? AWS?! Inside and outside of compute units? I don't get it...

And yet you claimed in the last few days that Async Compute is responsible for the performance drop with DX12.

What NVIDIA demonstrated was the ability to issue graphics commands and compute commands in parallel. This is not the same thing.

As for my mentioning async compute: Anandtech and everyone else claimed NVIDIA could perform it, so I assumed the reason they had low performance, while performing it, was the latency involved (not being able to perform async compute out of order). Turns out that while this is true, no workable async compute is possible, hence the low performance I mentioned in my theory. You can thank Oxide and NVIDIA for working together to find a workaround to this issue.
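The parallel-submission vs. asynchronous-execution distinction drawn above can be sketched as a toy cost model. Everything here is hypothetical (the switch penalty and task durations are invented numbers, not measurements of any GPU): it only illustrates how hardware that pays a fixed latency every time it transitions between graphics and compute work loses time on a finely interleaved workload, and why batching same-type work (as a shader-level optimization might) reduces that cost.

```python
# Toy cost model (invented numbers): issuing work from two queues in parallel
# is not the same as executing it concurrently. A "GPU" that must context-
# switch between work types pays a fixed latency per switch.

SWITCH_COST = 1.0  # ms per graphics<->compute transition (hypothetical)

def time_with_switches(tasks):
    """tasks: list of (kind, duration). Adds SWITCH_COST whenever kind changes."""
    total, prev = 0.0, None
    for kind, dur in tasks:
        if prev is not None and kind != prev:
            total += SWITCH_COST
        total += dur
        prev = kind
    return total

mixed = [("gfx", 2), ("cmp", 1), ("gfx", 2), ("cmp", 1)]
print(time_with_switches(mixed))          # 9.0: 6 ms of work + 3 switches
print(time_with_switches(sorted(mixed)))  # 7.0: same work, batched, 1 switch
```

Under this (hypothetical) model, rearranging the same workload to avoid transitions recovers most of the lost time, which is one way to read the "shader optimizations" workaround mentioned in the thread.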
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Again, I'm not the one you should be angry towards. I didn't misinform you.

I think that NVIDIA ought to respond to this. It's their responsibility. If I were Ryan Smith, I'd go looking towards NVIDIA for answers. If none were forthcoming, I'd publish in order to compel an answer.

We all deserve an answer as consumers.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
There's no use in trying to mitigate damage to one corporation. It isn't your job. Your job is to inform others and help new PC gamers make the correct investments going forward. Your fellow PC gamers are your allies, not your foes. By supporting one another, we place the burden of innovation on the large tech companies (as it should be).

so we should be all out pushing products? That's our job? Seriously?


Want to know what I find suspect?

None of the big tech publications have commented on any of this so far.

So it's ok to quote false marketing but not ok to question the big tech companies...

Hurray for PR! Down with journalism!

Pardon me for sounding so negative but this is a big reason why I am working to create a new website. So many tech publications have turned into nothing more than 3rd party Public Relations firms.

Whoa whoa..... settle down.

You obviously are ready to take this single bit of info and go on a crusade, but perhaps it is you being way too hasty. Ambition can be admirable, but can't you see the possibility that this call to arms may be premature?

This is, after all, a pre-beta benchmark for a game that is a very long way from launching.

And right now, with so very little information, you have your mind made up. You are calling everyone to go out and inform the world. Nvidia has a weakness!!!

I am not saying that nvidia has a hope, but I will say we don't have a single dx12 game right now, much less many, where we can actually see how their cards will perform. I know you are ready to write off Maxwell entirely over a pre-beta (alpha) benchmark that even runs with an AMD logo in it.

Some developer claims this and that, but does anyone here remember how quickly the anandtech forums shut down what a developer from Project Cars or any other nvidia-sponsored game said? I mean, talk is talk. It may very well be true that nvidia asked that asynchronous functions be turned off, but this doesn't mean nvidia is doomed or that they will suck at dx12 games.

I think more info is absolutely necessary on this one.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
My theory also evolved over time as I debated with game developers from Beyond3D, HardOCP and folks at Overclock.net.

The more we discussed, the clearer the picture became. Oxide then came in and confirmed it.

Now it's NVIDIA's turn to explain what's going on.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
so we should be all out pushing products? That's our job? Seriously?




Whoa whoa..... settle down.

You obviously are ready to take this single bit of info and go on a crusade, but perhaps it is you being way too hasty. Ambition can be admirable, but can't you see the possibility that this call to arms may be premature?

This is, after all, a pre-beta benchmark for a game that is a very long way from launching.

And right now, with so very little information, you have your mind made up. You are calling everyone to go out and inform the world. Nvidia has a weakness!!!

I am not saying that nvidia has a hope, but I will say we don't have a single dx12 game right now, much less many, where we can actually see how their cards will perform. I know you are ready to write off Maxwell entirely over a pre-beta (alpha) benchmark that even runs with an AMD logo in it.

Some developer claims this and that, but does anyone here remember how quickly the anandtech forums shut down what a developer from Project Cars or any other nvidia-sponsored game said? I mean, talk is talk. It may very well be true that nvidia asked that asynchronous functions be turned off, but this doesn't mean nvidia is doomed or that they will suck at dx12 games.

I think more info is absolutely necessary on this one.

I'm not pushing products. If this were the other way around I would have done the same.

And yes, it's a journalist's job to ask questions and seek answers. This is more than a tidbit of information.

But I get it, you have your opinions and you're entitled to them.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
PS: sometimes you need to publish in order to compel a response. Not publish everything... just what the Oxide dev stated.

That's it. This ought to compel a response from NVIDIA, thus providing more information. I'd actually prefer to be left out of it.

If you knew the level of hate thrown at me you'd understand my frustrations. Some folks are downright cruel.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91

They said

MSAA is implemented differently on DirectX 12 than DirectX 11. Because it is so new, it has not been optimized yet by us or by the graphics vendors. During benchmarking, we recommend disabling MSAA until we (Oxide/Nvidia/AMD/Microsoft) have had more time to assess best use cases.
Read more at http://www.legitreviews.com/ashes-o...chmark-performance_170787#IoF5upE3hHjelgJU.99

you said:

An Oxide official said that their MSAA implementation under DX12 has problems because Oxide hasn't optimized it.
The ball is in their court

You are ignoring this :

Because it is so new, it has not been optimized yet by us or by the graphics vendors. During benchmarking, we recommend disabling MSAA until we (Oxide/Nvidia/AMD/Microsoft) have had more time to assess best use cases.

It seems all they are saying is that it's different, and that the standard dx12 implementation could be optimized or done some better way. This is not a problem with their game on nvidia hardware, but a statement that the current MSAA may not be representative later on. They are talking about MSAA itself.

I did miss that quote though, since they had said it was doing the same thing on dx11 before.

Fundamentally, the MSAA path is essentially unchanged in DX11 and DX12.

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/
 
Last edited:

Mahigan

Senior member
Aug 22, 2015
573
0
0
Also NVIDIA claimed they could perform this function as well. Ryan Smith published this in his asynchronous compute article.

And no, NVIDIA aren't doomed. I never said they were. In fact, if you read my comments over at overclock.net and hardocp you'll find that I've actually said the opposite. They'll focus on GameWorks imo. I think GameWorks titles will lack this function. Oxide also thinks the same (or alluded to it by mentioning the Unreal Engine).

Again... I'm not the one who needs to answer for anything. Look towards NVIDIA for that.
 

TheELF

Diamond Member
Dec 22, 2012
3,993
744
126
Now that the developers have a taste of what it's like to have easy cross-platform, they'll never want to let it go.

Wait, where did they get this taste from?
Almost all ports done from the moment the PS4/Xbone came out had problems, so how did they get a taste?

As for virtual reality, I've been waiting for that one since the mid-nineties (Virtual Boy, anyone?). I'll believe it when someone brings out and sells a mass-produced unit.
(And actually sells a large quantity.)
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
If you want an idea as to my perspective. I'm a fan of wikileaks, Edward Snowden and Chelsea Manning.

I'm a fan of open source, open standards and journalistic freedom.

So yes... I'm quite "in your face", but at least this perspective of mine has shed new light on what is happening. You can hate me, that's cool, but look at the results...

I've found out why AMD FX performs poorly (a memory bandwidth bottleneck) by going back and forth with Oxide. I've figured out why we see a 290x matching a GTX 980 Ti. I've even obtained open information from closed tests at Oxide.

You have to admit, my personal heroes are onto something. Openness and transparency lead to information and knowledge sharing. We're the better for it. Rather than play green vs red, now we're discussing the specifics.

I'm an investigative journalist at heart. It's in my genes. I'm also a tech buff. Wouldn't you enjoy reading material which isn't a copy-paste of marketing PR? Maybe yes, maybe no.

One thing is for sure. People will have more knowledge on which to base their next purchase. If it compels a response from NVIDIA, I will have done my job.

Peace.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I'm not pushing products. If this were the other way around I would have done the same.

And yes, it's a journalists job to ask questions and seek answers. This is more than a tidbit of information.

But I get it, you have your opinions and you're entitled to them.

What else is that supposed to mean then? You say our job is to inform others and help others make the correct investment? There really is no other way to take that. In the context of this alpha stage benchmark, discussing nvidia's dx12 performance, you say that.

I don't for one second take your instructions or suggestions. That is not my job, it is not why I hang out on tech forums. I visit because PC is my passion and has always been. I love technology and like to share my time with others who have the same passion. We discuss the HW and technology behind them, as well as the companies behind the technology. I am not on a mission to sway others towards what I think they should buy.

Then you say that I have my opinions and you get that. Opinions?
It is a fact that we have very little dx12 to look at. It is a fact that this benchmark is not a game at this point. It is a fact that it is in very early stages, not even beta.

You have very, very little information to go on. That is a fact. You are jumping to conclusions at this point. Perhaps nvidia won't have the ability to compete at all in dx12 games; I could go around saying that without a single dx12 game to base it on. But how is that helping others make an informed purchase? Ultimately, you don't know and neither do I. You might think you do, you might hope, but you really don't know.
 

VR Enthusiast

Member
Jul 5, 2015
133
1
0
Wait, where did they get this taste from?
Almost all ports done from the moment the PS4/Xbone came out had problems, so how did they get a taste?

It's happening now, TheELF. This is happening right now; it hadn't been happening until this point because we only just got DX12.

As for virtual reality, I've been waiting for that one since the mid-nineties (Virtual Boy, anyone?). I'll believe it when someone brings out and sells a mass-produced unit.
(And actually sells a large quantity.)

Oculus has sold 175,000 Rifts. By this time next year that number will have increased ten-fold at least. VR is coming, you'd better believe it.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
What else is that supposed to mean then? You say our job is to inform others and help others make the correct investment? There really is no other way to take that. In the context of this alpha stage benchmark, discussing nvidia's dx12 performance, you say that.

I don't for one second take your instructions or suggestions. That is not my job, it is not why I hang out on tech forums. I visit because PC is my passion and has always been. I love technology and like to share my time with others who have the same passion. We discuss the HW and technology behind them, as well as the companies behind the technology. I am not on a mission to sway others towards what I think they should buy.

Then you say that I have my opinions and you get that. Opinions?
It is a fact that we have very little dx12 to look at. It is a fact that this benchmark is not a game at this point. It is a fact that it is in very early stages, not even beta.

You have very, very little information to go on. That is a fact. You are jumping to conclusions at this point. Perhaps nvidia won't have the ability to compete at all in dx12 games; I could go around saying that without a single dx12 game to base it on. But how is that helping others make an informed purchase? Ultimately, you don't know and neither do I. You might think you do, you might hope, but you really don't know.

This is how you take that...

By challenging a tech company to release information, you end up in a better position to help others make choices as to which GPU to buy. The more information you can obtain, the better.

As for me not knowing. My theory has been confirmed by Oxide. That's pretty compelling. This alone ought to warrant a response from NVIDIA.

I started off with a hypothesis. I tested that hypothesis by communicating and arguing with various CUDA programmers and game developers. I then formulated a theory, and Oxide has confirmed that theory. All that is left is clarification from NVIDIA.

That's it.

I think I've demonstrated that I am quite capable of investigating a problem.

What have you done?
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Imagine if, hypothetically speaking, NVIDIA confirms my theory.

what sort of changes would that bring?

Now imagine nobody had said anything, sat on their asses, didn't investigate. Trusted the PR material. Had "faith" in the issue being driver-related and spent $650 on a GPU. Wouldn't they feel misled?

Everyone told them it supported this feature. Turns out it doesn't?

Like I said... All that's left is a response from NVIDIA.

But in order to get that response... Someone has to poke them.
 

TheELF

Diamond Member
Dec 22, 2012
3,993
744
126
It's happening now, TheELF. This is happening right now, it hasn't been happening until this point because we only just got DX12.



Oculus has sold 175,000 Rifts. By this time next year that number will have increased by ten-fold at least. VR is coming you better believe it.

So they didn't get a taste yet.
Maybe they are getting a taste right now, maybe not; we don't know.

And yes, they sold 175,000 units: developer units, for people who do their own thing with them (driving schools, for example, or universities). That is far from being an omnipresent consumer product.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Imagine if, hypothetically speaking, NVIDIA confirms my theory.

what sort of changes would that bring?

Now imagine nobody has said anything, sat on their asses, didn't investigate. Trusted the PR material. Had "faith" in the issue being driver related and spent $650 on a GPU. Wouldn't they feel misled?

Everyone told them it supported this feature. Turns out it doesn't?

Like I said... All that's left is a response from NVIDIA.

But in order to get that response... Someone has to poke them.

Yup, please keep the "opinions" coming, as I find them informative and, judging by the view count on this thread, probably many others do too. As a Maxwell 2 owner I'm hoping the questions on async compute posed here are clarified by either Nvidia or the media, along with the possible impact on future gaming performance in dx12 games/game engines that use async compute. As a tech enthusiast, I find your posts more interesting than market discussions, that's for sure.
 

smeg1161

Junior Member
Apr 28, 2013
6
0
0
I don't think they're lying. I've had a fruitful debate with Razor1 over at the hard forums. We're both quite knowledgeable, and in the end the conclusion was that, since there's no "checking for errors" on the nVIDIA Asynchronous Warp Schedulers, they cannot perform "out of order" execution. The end result would be a pipeline stall due to dependencies, which results in lower performance. Just as I had stipulated in my original posts (HardOCP and Overclock.net), and what Oxide confirmed.

The entire conversation can be read here: http://hardforum.com/showthread.php?t=1873640
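The out-of-order argument in the quoted post can be illustrated with a toy scheduler. This is my own sketch, not a model of any actual ACE or warp-scheduler hardware: it only shows how an in-order queue accumulates stall time waiting on a task whose dependency isn't ready, while a scheduler that can pick any ready task stays busy.

```python
# Toy scheduler sketch (illustrative only, not NVIDIA's or AMD's design):
# each task takes 1 time unit; ready_at[id] is when its dependency resolves.

def in_order_idle(tasks, ready_at):
    """Run tasks strictly in queue order; return total stall (idle) time."""
    t, idle = 0, 0
    for task in tasks:
        if ready_at[task] > t:       # next task not ready: the queue stalls
            idle += ready_at[task] - t
            t = ready_at[task]
        t += 1                        # execute the task
    return idle

def out_of_order_idle(tasks, ready_at):
    """Same workload, but at each step run any task that is already ready."""
    pending, t, idle = set(tasks), 0, 0
    while pending:
        ready = [x for x in pending if ready_at[x] <= t]
        if ready:
            pending.remove(min(ready))  # run any ready task
            t += 1
        else:                           # nothing ready: unavoidable stall
            nxt = min(ready_at[x] for x in pending)
            idle += nxt - t
            t = nxt
    return idle

ready_at = {"a": 3, "b": 0, "c": 0, "d": 0}
print(in_order_idle(["a", "b", "c", "d"], ready_at))      # 3: stalls behind "a"
print(out_of_order_idle(["a", "b", "c", "d"], ready_at))  # 0: runs b,c,d first
```

The in-order queue wastes three time units waiting on "a" even though other work is ready, which is the pipeline-stall scenario the post describes.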

This prompted Oxide to respond and confirm what I was thinking.

Anandtech should edit their article on Asynchronous Shading and remove the statement which says "nVIDIA can do this too". They don't appear to be able to perform Async Compute without hampering performance.

DX12 will, as I initially thought, turn the tides towards GCN for the time being. As far as going forward. We will see if Pascal comes with improved Async capabilities or if this will come with Volta. As for Greenland, we already have an idea of what to expect in DX12 titles.

On a sidenote....

We, as a PC Gaming community, really need to fight all of this partisanship. We ought to encourage critical thinking rather than accept marketing claims by the large tech Corporations. We should encourage research and scientific queries rather than bash one another over Green vs. Red.

If I was able to deduce this result from a little bit of research on GPU architectures, imagine what we could do as a community?

Peace.

Like most established reviewers, with offices to run and a lot of staff to pay, Anand will do as they are told or suffer the wrath of Nvidia.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If Nvidia attempts to pay the developers to drop async shaders they are acting illegally.

Well, since they "requested" Oxide not activate MSAA and were quite annoyed Oxide didn't, to the point that it's escalated into a bit of a "war of words", trust me: nVidia "partner" games will not take advantage of anything nVidia doesn't support. They'll pay them for something else, a couple more nVidia logos in the game, but the game will be gimped to their liking. That's not going to change all of a sudden.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
IMO price is a very big factor, since nobody wants to lose money on consoles any more. ARM is cheap, so if nvidia cut a deal I could see it, especially since nVidia wants to sell some Denver cores. Also, I don't think x86 is as big a deal on consoles. ARM is a very popular instruction set as well, and if it is powerful enough (which apparently is as low as Jaguar cores...) then it could happen. I don't think it's probable, but it's certainly possible.

Microsoft and Sony specified x86; nVidia wasn't even in it. They blamed it on low margins and not being interested, but they just had nothing to go in with.
 

VR Enthusiast

Member
Jul 5, 2015
133
1
0
Well, since they "requested" Oxide not activate MSAA and were quite annoyed Oxide didn't, to the point that it's escalated into a bit of a "war of words", trust me: nVidia "partner" games will not take advantage of anything nVidia doesn't support. They'll pay them for something else, a couple more nVidia logos in the game, but the game will be gimped to their liking. That's not going to change all of a sudden.

AMD is not a helpless victim. They are more than capable of proving illegal activity and did so in the past. If they are a sitting duck then they should just sell off their entire inventory at $2 a card and take Nvidia with them.

This is a different matter. It is illegal to harm your competitor through financial incentives. This isn't like gameworks where nvidia can pay devs to use their software - in this case they need to pay devs to not use AMD optimisations. That's the illegal part.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
AMD is not a helpless victim. They are more than capable of proving illegal activity and did so in the past. If they are a sitting duck then they should just sell off their entire inventory at $2 a card and take Nvidia with them.

This is a different matter. It is illegal to harm your competitor through financial incentives. This isn't like gameworks where nvidia can pay devs to use their software - in this case they need to pay devs to not use AMD optimisations. That's the illegal part.

They'd sell off their entire inventory at $2, Nvidia would buy it all, resell it, take AMD out and keep themselves going...

First off, dumping your inventory like that is illegal.

Secondly, it's been proven to be a completely horrendous strategy, so I really don't see that happening.

If they were a sitting duck they'd sell off their company... lol...
Companies are in the business of making a profit, not "taking another company with them".
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If you want an idea as to my perspective. I'm a fan of wikileaks, Edward Snowden and Chelsea Manning.

I'm a fan of open source, open standards and journalistic freedom.

So yes... I'm quite "in your face", but at least this perspective of mine has shed new light on what is happening. You can hate me, that's cool, but look at the results...

I've found out why AMD FX performs poorly (a memory bandwidth bottleneck) by going back and forth with Oxide. I've figured out why we see a 290x matching a GTX 980 Ti. I've even obtained open information from closed tests at Oxide.

You have to admit, my personal heroes are onto something. Openness and transparency lead to information and knowledge sharing. We're the better for it. Rather than play green vs red, now we're discussing the specifics.

I'm an investigative journalist at heart. It's in my genes. I'm also a tech buff. Wouldn't you enjoy reading material which isn't a copy-paste of marketing PR? Maybe yes, maybe no.

One thing is for sure. People will have more knowledge on which to base their next purchase. If it compels a response from NVIDIA, I will have done my job.

Peace.

While I agree with you about open source, etc., I think your politics suck. You might want to leave them out of VC&G. Not only is this not the place, but it could bias people's opinions of you, unnecessarily.
 