The AMD Mantle Thread

Status
Not open for further replies.

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
I suspected as much. This is a very draw-call-heavy way to do motion blur because it's applied per object. It's more accurate than screen-based motion blur. It's basically impossible to do this in DX with that many objects by calling on them all individually; you would need to batch them together. That would actually perform better in all cases (but be less accurate).

I think this is a classic example of the sort of problem you get comparing the two APIs. You would never choose this algorithm for DX; it makes no sense to spend 5k draw calls just motion-blurring 5k objects when the impact of the effect is going to be only marginally better than a much cheaper effect. The GPU can do it, but the CPU is going to struggle. It's both a good reason for Mantle and a bad choice of algorithm for a benchmark, because it's obviously not something you would do on DX at all.

I understand the argument, but the Oxide guys said at the Q&A at APU13 that right now, using DX, you are in a situation of e.g. asking your creative artists to stack multiple textures together for no logical reason! That's the reality. Not fancy blur.

We are severely draw-call limited for RTS games. There is no other way around it. How can we actually debate it? We have all known for years it was a serious limitation, the one actually defining what could be done in an RTS. We all agreed...

until Mantle arrived and practically removed this limitation, at the cost of two man-months for Swarm.

Then suddenly there are all these long arguments and whining. For once we could get radically better RTS games, and people cry instead of saying it's great.
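The draw-call economics being argued over here can be sketched with a toy cost model. The per-call and per-instance figures below are illustrative assumptions, not measured DX11 or Mantle overheads; the point is only the shape of the trade-off between one call per object and a single batched (instanced) call:

```python
# Toy cost model of CPU-side submission cost: one draw call per object
# versus a single batched/instanced draw. All figures are hypothetical
# microsecond costs chosen only to illustrate the shape of the trade-off.

DRAW_CALL_OVERHEAD_US = 40.0   # assumed per-call CPU cost under a DX11-style API
PER_INSTANCE_COST_US = 0.5     # assumed cost of appending one instance record

def per_object_cost(num_objects: int) -> float:
    """CPU time (us) if every object gets its own draw call."""
    return num_objects * DRAW_CALL_OVERHEAD_US

def batched_cost(num_objects: int) -> float:
    """CPU time (us) if all objects are folded into one instanced draw."""
    return DRAW_CALL_OVERHEAD_US + num_objects * PER_INSTANCE_COST_US

n = 5000  # the 5k motion-blurred objects from the post
print(f"per-object: {per_object_cost(n) / 1000:.1f} ms")  # 200.0 ms of pure submission cost
print(f"batched:    {batched_cost(n) / 1000:.1f} ms")
```

Under these assumed numbers the per-object path alone blows a 16.7 ms frame budget many times over, which is the "CPU is going to struggle" point: batching wins on CPU cost, while a lower-overhead API shrinks the per-call constant instead.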
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I am in no way against having more draw calls. My issue is the cost of getting there. I don't think the costs are miniscule, ignorable things, but I don't doubt draw calls are a big issue: Arma 3 is definitely draw-call limited (I determined that in a profiler myself), Mechwarrior is also, and Planetside 2 has a lot of draw-call issues, as do many other games I have played. But there is the cost of losing backwards compatibility, the fact that it's AMD GCN only, that it loses a lot of safety in exchange for developer flexibility, and the additional cost of the extra APIs. It's hard to add another API with these characteristics to the market as it stands; it has a lot of negatives based on the way it's been done.

If Intel, AMD and Nvidia had got their engineers in a room, designed a better replacement for DirectX, and were releasing drivers for it with the performance benefits and multithreading support, then it would be awesome. But as it stands, Nvidia largely supports OpenGL with extensions and DirectX with multithreading, Intel makes its own extensions for DX, and AMD has pretty poor OpenGL drivers and limited multithreading support for DX and is now putting its weight behind Mantle on PC. I don't know where all this leads, but it's not good for developers to not know what to write for.
 

Saylick

Diamond Member
Sep 10, 2012
3,513
7,776
136
And the AMD 6 series does not meet the minimum feature level. But what about Kepler?

Who knows. Until AMD releases to the public what that minimum feature level is, it is anyone's guess as to whether or not nVidia/Intel may be able to run Mantle.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Since game developers are showing some of the most excitement about Mantle, I don't share that concern.

Let's say that Nvidia and Intel ignore it regardless of how awesome it appears to be. Both favour a standard they can work with in DirectX and OpenGL. Mantle would then be on about ~5% of cards worldwide (maybe a little more; I don't know the exact numbers), but that's not many in the grand scheme of things. It would thus lack critical market penetration, and hence despite all its benefits it would never take off, and that would be a shame (assuming it does what it's meant to).

I have the same concern about Gsync. I don't want to be stuck on Nvidia's cards; I would quite like a pair of 290Xs, but the only reason I haven't bought them is that I want gsync more than anything else. What I really want is 4k, gsync, Mantle, physX and trueAudio. I see the benefits from every one of the advancements that both parties are making, and it's extremely annoying having to choose between them, because they are all great improvements.

If there is one wish I have for this year, it's that AMD, Nvidia and whoever else start working on making all this stuff work more widely, and if Microsoft is in the way, then do it yourselves by choosing the API together and then working on your own implementations.

The current situation sucks; it potentially kills perfectly good improvements because of limited availability. If it's not on all cards, then developers won't use it much, if at all.

I am stuck between knowing that innovation comes from a company working in isolation, without standards stifling its creativity, and at the same time wanting progress like we had with DX7 -> DX11, where the companies managed to work together to improve graphics without adding too much fragmentation. The current situation just sucks; we can't seem to have it all, and if some of the players don't want to play the Mantle game, it might never be popular enough to be anything more than a basic add-on like PhysX.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Seriously, what's the big draw of G-Sync? Isn't it only really useful in a narrow range of 40-60 fps? While that's an important range, Mantle should have it mostly covered as well, assuming 33%+ gains over DX.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Seriously, what's the big draw of G-Sync? Isn't it only really useful in a narrow range of 40-60 fps? While that's an important range, Mantle should have it mostly covered as well, assuming 33%+ gains over DX.

So Mantle has tearing, stuttering at sub 60 fps framerates, and input lag covered? Interesting.

Or we could wait until 2015 for free-sync, but after OpenCL physics and HD3D being empty promises, free-sync will be the same. But that is outside the scope of this thread.

That's aside from the fact that you brought g-sync up, and g-sync fixes things that Mantle doesn't fix. Unless Mantle magically fixes input lag and tearing somehow......
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Seriously, what's the big draw of G-Sync? Isn't it only really useful in a narrow range of 40-60 fps? While that's an important range, Mantle should have it mostly covered as well, assuming 33%+ gains over DX.

Gsync is great at all framerates from 30-144. You just get into the realm of diminishing returns with it above 60 fps, which is why most demos of it have been from 30-60, where its benefits are most evident.

I will ask it again in reply to backwards compatibility: being backwards compatible has no point. Their GPUs are going to continue to support DirectX for all past, present, and future games. Mantle doesn't need to be backwards compatible; it needs to be forward compatible with upcoming generations of games and GPUs to have staying power.

Get off the backwards compatibility argument; it doesn't apply.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
So Mantle has tearing, stuttering at sub 60 fps framerates, and input lag covered? Interesting.

The point would be that 40 fps on an Nvidia card would be closer to 50-60 fps on an AMD card with Mantle.

That's aside from the fact that you brought g-sync up, and g-sync fixes things that Mantle doesn't fix. Unless Mantle magically fixes input lag and tearing somehow......
http://www.guru3d.com/articles_pages/nvidia_g_sync_review_guide,6.html

I've been using the monitor and have been playing with G-Sync for a couple of days now. My old and everlasting recommendation still stands: everything below 30 FPS on average is crap, 30 to 40 is okay. But once you pass 40 FPS on average, that's where you'll get a wow factor in combo with G-Sync. For reasons I can't explain, passing 40 FPS is precisely the threshold where G-Sync starts to show off like a rock star and amaze.
Sure, it looks to have a "cool" factor, but you still need 40+ fps. Dips are still bad, and dips are what you'll always get with an inferior API and an Nvidia card. Mantle will smooth out the performance, so 40 fps under Mantle will feel like G-Sync at 40 fps in most cases anyway. It's fps dips that cause most of the problems.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Gsync is great at all framerates from 30-144. You just get into the realm of diminishing returns with it above 60 fps, which is why most demos of it have been from 30-60, where its benefits are most evident.

Not according to Hilbert over at Guru3d. He makes it clear that 40fps is the cut-off for G-Sync to work like it should and anything below that is basically much the same.

If you already use Vsync you'll be seriously underwhelmed anyway.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Let's say that Nvidia and Intel ignore it regardless of how awesome it appears to be. Both favour a standard they can work with in DirectX and OpenGL. Mantle would then be on about ~5% of cards worldwide (maybe a little more; I don't know the exact numbers), but that's not many in the grand scheme of things. It would thus lack critical market penetration, and hence despite all its benefits it would never take off, and that would be a shame (assuming it does what it's meant to).

I have the same concern about Gsync. I don't want to be stuck on Nvidia's cards; I would quite like a pair of 290Xs, but the only reason I haven't bought them is that I want gsync more than anything else. What I really want is 4k, gsync, Mantle, physX and trueAudio. I see the benefits from every one of the advancements that both parties are making, and it's extremely annoying having to choose between them, because they are all great improvements.

If there is one wish I have for this year, it's that AMD, Nvidia and whoever else start working on making all this stuff work more widely, and if Microsoft is in the way, then do it yourselves by choosing the API together and then working on your own implementations.

The current situation sucks; it potentially kills perfectly good improvements because of limited availability. If it's not on all cards, then developers won't use it much, if at all.

I am stuck between knowing that innovation comes from a company working in isolation, without standards stifling its creativity, and at the same time wanting progress like we had with DX7 -> DX11, where the companies managed to work together to improve graphics without adding too much fragmentation. The current situation just sucks; we can't seem to have it all, and if some of the players don't want to play the Mantle game, it might never be popular enough to be anything more than a basic add-on like PhysX.

I mostly agree with the above, especially "What I really want is 4k, gsync, Mantle, physX and trueAudio" though for me personally I'll be on 3K (Eyefinity 1080p) for a while.

Anyway, I think there's a slightly better chance that Mantle will take off in that some engine makers are incorporating it into their game engines. Lots of devs license something like Unreal Engine to make games with, so what if someone licenses Oxide's engine? Since Oxide has Mantle support, wouldn't games built on that engine support it by default, too? And if Mantle-supporting engines become popular and licensed the same way Unreal is popular and licensed today, wouldn't that mean that a significant chunk of games will come "Mantle ready"?

Also, the 5% market-share number is today's number (based on Steam Hardware Survey dGPU numbers), but as time goes on that number should increase as more people upgrade to GCN video cards.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
There is absolutely no chance that G-Sync will get near to Mantle's adoption. G-Sync is just the next halo product from Nvidia, that should be obvious to everybody. I'm interested in it and always have been but when I weighed it all up I just couldn't justify the purchase. It'll be another decent PR win for Nvidia though.
 

ams23

Senior member
Feb 18, 2013
907
0
0
Not according to Hilbert over at Guru3d. He makes it clear that 40fps is the cut-off for G-Sync to work like it should and anything below that is basically much the same.

If you already use Vsync you'll be seriously underwhelmed anyway.

You have a very poor understanding of both G-Sync and Mantle. In the current implementation of G-Sync, per Anandtech, "the longest NVIDIA can hold a single frame is 33.3ms (30Hz)", while "the upper bound is limited by the panel/TCON at this point". So on a 60Hz panel, the main benefit will be in the upper 30's to 60 fps range. And in this fps range, using G-Sync will be far superior to using V-Sync. As for Mantle, the main benefits of Mantle in the games that actually support it in the first place will be in CPU-limited scenarios (ie. GPU-limited scenarios will likely show only a relatively small benefit). G-Sync, on the other hand, "just works" with virtually any past, present, or future game when the fps hovers in the targeted range.
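The refresh window described here can be expressed as a small model. The ~33.3 ms hold floor and the panel-limited ceiling come from the quoted Anandtech description; the function itself is an illustrative sketch, not NVIDIA's actual logic:

```python
# Sketch of the variable-refresh window described in the quote: G-Sync can
# hold a single frame for at most ~33.3 ms (a 30 Hz floor), and the shortest
# interval is set by the panel's maximum refresh rate. The figures come from
# the quoted Anandtech description; the function is an illustrative model.

MAX_HOLD_MS = 1000.0 / 30.0  # ~33.3 ms: longest a single frame can be held

def effective_interval_ms(frame_time_ms: float, panel_hz: float = 60.0) -> float:
    """Display interval the panel would use for a frame with this render time."""
    min_interval = 1000.0 / panel_hz   # panel/TCON upper refresh bound
    if frame_time_ms <= min_interval:
        return min_interval            # capped at the panel's max refresh
    if frame_time_ms <= MAX_HOLD_MS:
        return frame_time_ms           # matched 1:1 -- the G-Sync sweet spot
    return MAX_HOLD_MS                 # below 30 fps the panel must redraw

# On a 60 Hz panel only the ~16.7-33.3 ms range (roughly 30-60 fps) is matched
# 1:1, which is why the main benefit lands in the upper 30s to 60 fps.
```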
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
So Mantle has tearing, stuttering at sub 60 fps framerates, and input lag covered? Interesting.

Or we could wait until 2015 for free-sync, but after OpenCL physics and HD3D being empty promises, free-sync will be the same. But that is outside the scope of this thread.

That's aside from the fact that you brought g-sync up, and g-sync fixes things that Mantle doesn't fix. Unless Mantle magically fixes input lag and tearing somehow......
He asked a simple question. No need for the "I'm on a witch hunt!" response.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Not according to Hilbert over at Guru3d. He makes it clear that 40fps is the cut-off for G-Sync to work like it should and anything below that is basically much the same.

If you already use Vsync you'll be seriously underwhelmed anyway.

Even if 40 FPS is the cut-off point, that is a huge improvement. Many of these cutting-edge games have you dipping below 60 all the time; most dip to about 40 FPS. It's also great at 60 FPS, because it avoids the added latency of V-sync. It's great all the way up to 144 Hz, but up to 80 FPS appears to be the sweet spot of greatest improvement. As a 120 Hz monitor user, I find the improvement all the way up to 80 FPS to be very evident. With smooth rendering and no tearing, 40-80 FPS is a nice range of great improvement.

Another thing you seem not to realize is that G-Sync does more than just fix tearing and latency. It allows the GPU to display frames with the same display time as the game's internal clock. If the game starts frames at variable times, the sequence of events gets off when frames are forced onto steady time intervals by the display. With G-Sync, the time intervals between the start of each frame are only altered by rendering times; the display does not interfere. This results in smoother, more accurate time sequences.
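The timing argument above can be illustrated with a simplified model (vsync-on fixed refresh, no triple buffering; the hold limit is the ~33.3 ms G-Sync floor mentioned earlier). It shows how a fixed-refresh display quantizes on-screen frame times while a variable-refresh display preserves them:

```python
import math

# Simplified model of the timing argument: a fixed-refresh display (vsync on,
# no triple buffering) quantizes each frame's on-screen time to a whole number
# of refresh intervals, while a variable-refresh display shows each frame for
# its actual render time (up to the ~33.3 ms hold limit). Not a panel spec.

def fixed_refresh_display_times(frame_times_ms, refresh_ms=1000.0 / 60.0):
    """Each frame is held for the next whole number of refresh intervals."""
    return [math.ceil(t / refresh_ms) * refresh_ms for t in frame_times_ms]

def variable_refresh_display_times(frame_times_ms, max_hold_ms=1000.0 / 30.0):
    """Each frame is shown for its own render time, up to the hold limit."""
    return [min(t, max_hold_ms) for t in frame_times_ms]

frames = [14.0, 18.0, 25.0, 16.0]  # hypothetical per-frame render times (ms)
print(fixed_refresh_display_times(frames))    # 18 ms and 25 ms both round up to ~33.3 ms
print(variable_refresh_display_times(frames)) # render times preserved exactly
```

In the fixed-refresh case an 18 ms frame and a 25 ms frame get the same 33.3 ms on screen, which is the "sequence of events gets off" effect described above.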
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
You have a very poor understanding of both G-Sync and Mantle. In the current implementation of G-Sync, per Anandtech, "the longest NVIDIA can hold a single frame is 33.3ms (30Hz)", while "the upper bound is limited by the panel/TCON at this point". So on a 60Hz panel, the main benefit will be in the 40-60fps range. And in this fps range, using G-Sync will be far superior to using V-Sync. As for Mantle, the main benefits of Mantle in the games that actually support it in the first place will be in CPU-limited scenarios (ie. GPU-limited scenarios will likely show little benefit). G-Sync, on the other hand, "just works" with virtually any past or present game when the fps hovers in the targeted range.

You have a very poor understanding of G-Sync. As mentioned by everybody, it's only really worth it in the 40-60 fps range. Sure, it might have a tiny benefit above that, and suck marginally less below, but the real workable range is still 40-60 fps. By all means wax lyrical about how amazing it is in that range, but that's the fps range where it's truly operating.

G-Sync does not increase fps; in fact it lowers fps slightly. G-Sync does not help with DX overhead, stutters, or anything that is outside of the GPU's ability. In fact it is completely dependent on the game and what other hardware you have: pretty far from "just works".

As far as your Mantle point goes, ams, I stopped listening to anyone who thought it was only useful for CPU-limited situations a long, long time ago.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You have a very poor understanding of G-Sync. As mentioned by everybody, it's only really worth it in the 40-60 fps range. Sure, it might have a tiny benefit above that, and suck marginally less below, but the real workable range is still 40-60 fps. By all means wax lyrical about how amazing it is in that range, but that's the fps range where it's truly operating.

G-Sync does not increase fps; in fact it lowers fps slightly. G-Sync does not help with DX overhead, stutters, or anything that is outside of the GPU's ability. In fact it is completely dependent on the game and what other hardware you have: pretty far from "just works".

As far as your Mantle point goes, ams, I stopped listening to anyone who thought it was only useful for CPU-limited situations a long, long time ago.
You must be very focused on one or two reviews. Most of the hands-on reviews were extremely excited by the results, and they were excited at much higher FPS than 60 as well. I don't know why you seem to think 60 is the cut-off point for usefulness. Heck, at 60 FPS I'm still getting nauseated.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Even if 40 FPS is the cut-off point, that is a huge improvement. Many of these cutting-edge games have you dipping below 60 all the time; most dip to about 40 FPS.

Do you have proof of "most" games dipping to 40 fps? If so you might have a case; somehow I have my doubts, though, and what will actually be the case is smooth gameplay between 40 and 60 fps, while still dipping into the 30s and below in many cases.

Mantle will behave much more in line with consoles, and even with lower fps it will seem much smoother because the dips are far less obvious. G-Sync does nothing at all to prevent the dips, which are the real issue with PC gaming imo.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'd love to look for you, but I have to head out. However, on reviews that show minimums, it is quite common that these minimum FPS are in the 30-40 range.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You must be very focused on one or two reviews. Most of the hands-on reviews were extremely excited by the results, and they were excited at much higher FPS than 60 as well. I don't know why you seem to think 60 is the cut-off point for usefulness. Heck, at 60 FPS I'm still getting nauseated.

SW assumes that everyone uses vsync. That isn't the case.

Even with the magical 60 fps "cut-off" that he suggests, G-sync allows you to add more image quality without the corresponding stuttering which WOULD happen when framerates dip below 60. When framerates dip below 60 at all, you get stuttering and tearing. Or you can use vsync and get input lag. Or you can turn vsync off and just get tearing and stuttering anyway.

G-sync fixes these problems. Mantle does something entirely different, and doesn't provide a magical fixed 60 fps at every resolution and image quality level for every piece of hardware that supports Mantle. On some hardware? Maybe. But if you use vsync? Mantle doesn't fix input lag. So I'm entirely confused as to why we're comparing two fundamentally different things. Anyway, it is cool that we're talking about G-sync in the Mantle thread; gotta correct the misinformation. I figured SW would want to talk about Mantle more than G-sync, I dunno though.
 