The AMD Mantle Thread

Status
Not open for further replies.

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
I just feel that artificially creating CPU bottlenecks so that Mantle can show gains is not the best plan.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Every GPU bound setup? What sample size do you have? 1? So I am now upset because I refute your baseless speculation?

PC Perspective has shown 4 benchmarks:


Dice provided 3 benchmarks in their blog:


If other benchmarks exist, I would be happy to see them.

But of what is known to be available, only one of these benchmarks can conclusively be deemed GPU bound. Considering that none of the benchmarks show how performance changes with quality settings, on what basis are you drawing conclusions about performance for the question at hand?


Giving blind advice is just as ignorant as willfully accepting opinion as fact. :whistle:

I'll wait for the drivers to be released before I take this discussion further.
Baseless? Your post showed exactly what I said. You get small increases with a fast CPU and a GPU that isn't too fast, and you get large increases with a slow CPU and a fast GPU. You do realize that going with Crossfire doubles the GPU power, making the CPU the bottleneck again?

Your benchmarks show exactly what I said.

And seriously, take a chill pill. I gave no advice. I only gave an opinion that is backed up by your post: that he probably won't see much improvement with Mantle on his setup, using a 4K monitor.
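
To picture where the gains show up, here's a back-of-the-envelope model with made-up frame times (nothing from the actual benchmarks): a frame can't finish faster than the slower of the CPU work and the GPU work, and Mantle mostly trims the CPU side.

[CODE]
# Back-of-the-envelope bottleneck model; all numbers are hypothetical.
# A frame can't finish faster than the slower of its CPU and GPU work.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

MANTLE_CPU_CUT = 0.6  # assume Mantle trims CPU-side time by ~40% (made-up figure)

setups = [
    ("fast CPU + midrange GPU", 6.0, 12.0),   # (label, cpu_ms, gpu_ms)
    ("slow CPU + fast GPU",     14.0, 7.0),
]

for label, cpu, gpu in setups:
    before = fps(cpu, gpu)
    after = fps(cpu * MANTLE_CPU_CUT, gpu)
    print(f"{label}: {before:.0f} -> {after:.0f} fps ({after / before - 1:+.0%})")
[/CODE]

Same idea with Crossfire: doubling GPU throughput roughly halves the GPU milliseconds, which pushes the bottleneck back onto the CPU and makes the Mantle gain show up again.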
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
No, it's the possibility Mantle opens up for people to shell out even more cash on the GPU than ever.

This only proves those CPUs aren't weak; rather, the API we are all using right now (DX) is weaksauce. Mantle might open the eyes of the people in charge of improving that weak API (MS).

PS: and please don't even bring up DX's dumb brother. If it were actually half as good as some people claim, OpenGL wouldn't have such a hard time competing against DX.

I think you missed his point. Why would someone in a GPU-limited situation get a weak CPU just for the heck of it, for one game? At last check there are 3, possibly 4, games confirmed for a 2014 release using Mantle. Actually, 2 of those aren't confirmed. Star Citizen won't have all its modules (won't be a complete game) until next year; in other words, Star Citizen won't be done until 2015. Then there's Thief, and a couple of others.

So BF4 and Thief are known. The rest are "maybe", so you have a maximum of 4 games POTENTIALLY using Mantle in 2014. NFS Rivals was supposed to have Mantle, but as it turns out it isn't getting it, despite using Frostbite 3. Anyway, in terms of games using Mantle in 2014, what we know for sure is BF4 and Thief. PC Gamer's preview of Thief pinned it as quite a turd of a game, by the way; they had a preview up yesterday and were underwhelmed. Anyway.....

So you're buying a bargain-bin crap CPU for $50-100 for just those games. To pair with a 290X? Who are we kidding here? I think anyone blowing their load on a high-end 290X GPU for $700 in the States ($600-650 if they're lucky) would not pair it with a cheap junk CPU. Because PC gamers tend to play a lot of games, not just one game. I'm sure there are exceptions, but the gamers I know play a lot of stuff, especially since PC games are very cheap with Steam sales and whatnot.

A Haswell 4670K/4770K chip gets you a great experience in all games. A bargain-bin AMD APU might give you a semi-okay experience in one game. Maybe. Big if. But probably not. So maybe one game, but all games? Definitely a big no. So why would anyone do that? It's beyond ridiculous to pair a $500-700 GPU with a cheap CPU. Or then there's the 8350. Maybe it will give you a good experience in one game. Why go cheap for one game? Most PC gamers will buy a Haswell or IB-E for a good experience in all games. And it's not like Haswell is expensive either. A 4670K sets you back around 200 bucks or less at Microcenter, while the 7850K costs $175... so I can't see someone blowing $600-700 on a GPU and then trying to nickel-and-dime on a CPU for 25 bucks, when the Haswell is clearly the better experience in all games, not just 1-2 games.
 
Last edited:

james1701

Golden Member
Sep 14, 2007
1,873
59
91
Do you think Futuremark might incorporate this at some point? Since they did PhysX for Nvidia, could they do this for AMD?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126

Thanks for that graph. That is actually interesting because of the middle benchmark using the FX-8350 @ 4.0GHz and a 7970 at 1080p resolution on ultra for Battlefield 4 multiplayer. That is a GPU limited situation and shows a 25% gain in performance.

Here you can see the same CPU @ 4.0GHz in BF4 multiplayer, same resolution and settings (gamegpu refers to ultra as 'VHQ'). The only better performing CPUs are Intel hexacore SB-E chips, which is expected as BF is one of the few games where SB-E/IB-E are the best performing chips, or SB/Haswell i7 chips that support Hyper-Threading.




I would guess Intel i5 users (the majority of gamers most likely) and higher end FX AMD users are going to see a good performance boost with Mantle.
 
Last edited:

MutantGith

Member
Aug 3, 2010
53
0
0
So you're saying MS is self-sabotaging with DX?

Nope, I think the implication is that artificially creating a huge overload on the CPU to create a situation where Mantle shines isn't necessarily the best course.

Ham-fisted tessellation added onto Crysis 2 with a massive tessellation factor crippled AMD cards back when the hi-res patch was released, too. The solution wasn't to try and shame AMD into writing a better driver or redoing DX11; it was to realize that turning up tessellation that high made NVIDIA cards look better by comparison, but for no real image quality bump in most scenarios.

While NVIDIA cards were better at tessellation and so were less impacted, and AMD was able to add a custom tessellation factor option in CCC to alleviate the bottleneck if necessary, the proper approach really would have been to fix the code before release with a more nuanced and intelligent application of tessellation.

While you can definitely code a game/app/benchmark in ways that massively bottleneck any system under DX, why would you? There needs to be a balance between the benefit of a feature and its impact on system resources.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Thanks for that graph. That is actually interesting because of the middle benchmark using the FX-8350 @ 4.0GHz and a 7970 at 1080p resolution on ultra for Battlefield 4 multiplayer. That is a GPU limited situation and shows a 25% gain in performance.

Here you can see the same CPU @ 4.0GHz in BF4 multiplayer, same resolution and settings (gamegpu refers to ultra as 'VHQ'). The only better performing CPUs are Intel hexacore SB-E chips, which is expected as BF is one of the few games where SB-E/IB-E are the best performing chips, or SB/Haswell i7 chips that support Hyper-Threading.

I thought that middle graph was a mix, a more balanced situation between CPU bound and GPU bound. They turned off 4x AA for it at 1080p, and they are running a middle-of-the-road CPU.

It is likely hitting spots where it is CPU bound and others where it is GPU bound, and the last one is clearly CPU bound.
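
To illustrate the mix I mean, here's a rough sketch with invented per-frame numbers (not data from any of those benchmarks): classify each frame by whichever side took longer.

[CODE]
# Rough sketch with invented per-frame timings: a single benchmark run can spend
# part of its time CPU bound and part GPU bound, so an average gain hides a mix.
frames = [  # (cpu_ms, gpu_ms) per frame, purely hypothetical
    (9.5, 12.0), (10.1, 11.8), (15.3, 11.9),
    (16.0, 12.1), (9.8, 12.2), (14.7, 12.0),
]

cpu_bound = sum(1 for cpu, gpu in frames if cpu > gpu)
print(f"CPU-bound frames: {cpu_bound}, GPU-bound frames: {len(frames) - cpu_bound}")
[/CODE]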
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
^ I hope the suggestion above that people should recommend weak CPUs is just misguided enthusiasm.

How many people here need i7/i5 power for things other than gaming?

I verified it by comparing Windows response times on my Pentium G620 versus an i5 3570.
 

MutantGith

Member
Aug 3, 2010
53
0
0
Is Mantle running slower than the D3D version of the same game?

Nope... sorry, this goes back a couple of nested quotes. I think that exaggerating the CPU limitations of the DX build of the same game, or, similarly, intentionally recommending a lower-capability CPU, might not be the best strategy.

Of course, I could have misinterpreted. The dizzying rate of quick, single-line responses is a little hard to keep track of sometimes.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
That is a GPU limited situation and shows a 25% gain in performance.

The performance is balanced under DX, and with Mantle it becomes totally CPU bottlenecked. The CPU is pushing 15.08 ms frame times (66 FPS) while the GPU is pushing 11.61 ms (86 FPS).
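
(Converting between frame time and FPS is just 1000 divided by the milliseconds; the figures above check out.)

[CODE]
# Frame time (ms) -> FPS is 1000 / ms; checking the numbers quoted above.
for side, ms in [("CPU", 15.08), ("GPU", 11.61)]:
    print(f"{side}: {ms} ms -> {1000 / ms:.0f} FPS")
# CPU: 15.08 ms -> 66 FPS
# GPU: 11.61 ms -> 86 FPS
[/CODE]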
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
I thought that middle graph was a mix, a more balanced situation between CPU bound and GPU bound. They turned off 4x AA for it at 1080p, and they are running a middle-of-the-road CPU.

It is likely hitting spots where it is CPU bound and others where it is GPU bound, and the last one is clearly CPU bound.

Yeah, basically your average system for BF4: a 7970 and an FX-8350, which performs similarly to a modern 4670K i5 in this game. A 25% gain in performance in 64-player multiplayer from Mantle.

Looks really good :thumbsup:

It will be interesting to see if Intel i5s gain similarly from running Mantle in that setup. I would guess they likely will, maybe even more so, since the FX-8350 does so well in BF4 because it is a properly multithreaded game.
 

Tmf

Senior member
Jan 15, 2014
247
1
81
I really wish they would release the driver so we can cool down this thread, it's getting a little hot in here!
 

Irenicus

Member
Jul 10, 2008
94
0
0
Too many fragmented results released so far. We need some results with more consistent test variables.

AMD 8350 with:
280x
290(x)
cf 290(x)
@ 1080p / 1440p/1600p


Repeat with the Kaveri A10, i5, quad-core i7, hex-core i7, and later Haswell-E.


How do things vary depending on the scene? I really want to see how these different configurations handle things like increased unit counts and draw distances. I suppose since the Oxide demo is live we can at least see extreme examples of that as well once the Mantle patch goes live. But we really need better context for performance.


If Mantle really does wipe away the CPU bottlenecks, shouldn't we see the gap close considerably between an 8-core 8350 and a 4-core i7 with Hyper-Threading?
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
Errr... so if what I am reading in the notes is true, the only people who benefit are people with a 290 and above? Apparently they are going to optimize for every other GPU in a future release? I am confused... does this mean Mantle has to be tailored to each GPU type independently?
 

rancherlee

Senior member
Jul 9, 2000
707
18
81
My rig will be a good "test" for Mantle. My old Phenom II X6 can feed my 7950 to 99% for the most part, but there are situations where my GPU drops down to ~70% usage, like the Shanghai building collapse. Mine seems to be an ideal system for Mantle to help, and I might just be able to run 1080p Ultra @ 60fps with V-sync! Right now I average 55-65fps over an entire round, and lower graphics settings make me CPU limited.
 

Ertaz

Senior member
Jul 26, 2004
599
25
81
What if I get a couple of 290Xs to go with my i5-2500 running 1440p? Is Mantle supposed to help me much then?
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
Why is everyone so anxious?

Probably because people have spent the last 200 pages of this thread hating on Mantle, saying it was fake/useless/vaporware/etc., and are just now being proven wrong.


ITT: A lot of haters, and a few people who actually want to see what Mantle can do.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
So you only play one game? Gotcha.

hahahahaha.

Top-class game engines can optimize for Mantle during development, and then it will be the developer's choice to add in-game support for the tech.


I think you missed his point. Why would someone in a GPU-limited situation get a weak CPU just for the heck of it, for one game?

I agree. I said it here too.

Nope, I think the implication is that artificially creating a huge overload on the CPU to create a situation where Mantle shines isn't necessarily the best course.


If it were artificial, why weren't the performance gains over 300%?


While you can definitely code a game/app/benchmark in ways that massively bottleneck any system under DX, why would you? There needs to be a balance between the benefit of a feature and its impact on system resources.


What proof is there that AMD/DICE is doing that in BF4? If you're referring to the 300% gains Oxide stated about Mantle performance in their demo...


So you're buying a bargain-bin crap CPU for $50-100 for just those games. To pair with a 290X? Who are we kidding here? I think anyone blowing their load on a high-end 290X GPU for $700 in the States ($600-650 if they're lucky) would not pair it with a cheap junk CPU.


If I had explained my arguments further, you would understand my point.

Most users don't need i5-level CPU power for anything other than gaming. Powerful processors became a necessity for those of us who want to drive powerful GPUs in our rigs.


A bargain-bin AMD APU might give you a semi-okay experience in one game. Maybe. Big if. But probably not. So maybe one game, but all games? Definitely a big no.


If this were true, Sony would not have chosen an 8-core low-power CPU to pair with $150 of graphics horsepower in their system. And we all know how games were 4-thread limited until 2013...
Sony was aware of the growth of rival gaming solutions (PC gaming included) and needed an affordable solution that could still deliver next-gen graphics quality. And Sony engineers have worked on gaming systems since 1995, so they know what they're doing.


So why would anyone do that? It's beyond ridiculous to pair a $500-700 GPU with a cheap CPU.


Deeper than the bottleneck question is the bad frame time performance of low-end processors paired with strong GPUs. And Mantle seems to reduce that heavily.


Or then there's the 8350.

This is where I should have explained more of what I write; it makes it seem like I'm just biased by my own convictions.
Since many newer games can make good use of 6 to 8 threads, a heavily threaded processor can help in situations where per-core performance is not that good, as with the PS4 CPU.
And here enter the 8350 (or the 8320, which is a MUCH better buy).


Maybe it will give you a good experience in one game. Why go cheap for one game? Most PC gamers will buy a Haswell or IB-E for a good experience in all games. And it's not like Haswell is expensive either.

No, it's not, but with some careful shopping you can "upgrade" the planned GPU with the 90 dollars you save going from an i5 to the FX-6300.


A 4670K sets you back around 200 bucks or less at Microcenter, while the 7850K costs $175... so I can't see someone blowing $600-700 on a GPU and then trying to nickel-and-dime on a CPU for 25 bucks, when the Haswell is clearly the better experience in all games, not just 1-2 games.

The FX-6300 ($110 at Newegg) is the processor that really needs to be considered in this discussion. For applications that use many cores, it is the best-value processor.
For limited-budget systems the 6300 is a great deal paired with GPUs up to a 270X.


It has good overclocking headroom too, and with all cores at 4.8GHz it has great CPU power (for what you can get out of a $110 processor) to drive most of today's GPUs.



If Mantle heavily reduces the performance gap between weak and strong processors like the graph shows, what will be the GPU limit for a 4.8GHz 6300?

The only processor that holds more overall value than the FX-6300 is the Athlon 750K (and I don't believe that one will make a good pair with a 290/290X).

My rig will be a good "test" for Mantle.

That will be the rig for Mantle. But it needs one more 7950 in there to make the perfect test.

EDIT: One last flight of imagination (I picked the lowest prices because it's all hypothetical):

A55 mainboard: $43
AMD quad-core processor: $75
4GB DDR3 RAM: $48
320GB HDD: $30
Cheap PSU: $25
Cheap SFF case: $45
R7 260X: $139


This is a PS4-level performer at ~$400.
With a slightly cheaper GPU-less quad-core processor and a slightly better 250W PSU at the same price, we would be able to pick a 7870 for the config and make a real PS4 killer for about the same price as a PS4.
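
Adding the listed parts up as a quick sanity check (prices exactly as quoted above, all hypothetical):

[CODE]
# Quick tally of the hypothetical build above (prices as listed, USD).
parts = {
    "A55 mainboard": 43,
    "AMD quad-core processor": 75,
    "4GB DDR3 RAM": 48,
    "320GB HDD": 30,
    "Cheap PSU": 25,
    "Cheap SFF case": 45,
    "R7 260X": 139,
}
print(f"Total: ${sum(parts.values())}")  # Total: $405, i.e. the ~$400 claimed
[/CODE]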
 
Last edited: