Slomo4shO
Senior member
- Nov 17, 2008
Looks like those who got 7950X2s when they were under $200 will really get their money's worth.
I sure did, sold all 4 of mine for $375 each.
Looks like those who got 7950X2s when they were under $200 will really get their money's worth.
Baseless? Your post showed exactly what I said: you get small increases with a fast CPU and a not-too-fast GPU, and large increases with a slow CPU and a fast GPU. You do realize that going Crossfire doubles the GPU power, making the CPU the bottleneck again. Every GPU-bound setup? What sample size do you have? One? So I'm now upset because I refuted your baseless speculation?
PC Perspective has shown 4 benchmarks:
DICE provided 3 benchmarks in their blog:
If other benchmarks exist, I would be happy to see them.
But of what is known to be available, only one of these benchmarks can conclusively be deemed GPU bound. Considering that none of the benchmarks show performance details across quality changes, on what basis are you drawing conclusions about the question at hand?
Giving blind advice is just as ignorant as willfully accepting opinion as fact. :whistle:
I'll wait for the drivers to be released before I continue this discussion.
I just feel that artificially creating CPU bottlenecks so that Mantle can shine is not the best-laid plan.
No, it's the possibility Mantle opens up for people to shell out even more cash on the GPU than ever.
This only proves those CPUs aren't weak; rather, the API we're all using right now (DX) is weaksauce. Mantle might open the eyes of the people in charge of that weak API (MS) and push them to make it better.
PS: And please don't even bring up DX's dumb brother. If OGL were actually half as good as some people claim, it wouldn't have such a hard time competing against DX.
So you're saying MS is self-sabotaging with DX?
I just feel that artificially creating CPU bottlenecks so that Mantle can shine is not the best-laid plan.
Thanks for that graph. That is actually interesting because the middle benchmark uses the FX-8350 @ 4.0GHz and a 7970 at 1080p resolution on ultra for Battlefield 4 multiplayer. That is a GPU limited situation and shows a 25% gain in performance.
Here you can see the same CPU @ 4.0GHz in BF4 multiplayer, same resolution and settings (gamegpu refers to ultra as 'VHQ'). The only better-performing CPUs are the Intel hexacore SB-E parts, which is expected since BF is one of the few games where SB-E/IB-E are the best-performing chips, or SB/Haswell i7 chips that support Hyper-Threading.
Is Mantle running slower than the D3D version of the same game?
Nope, I think the implication is that artificially creating a huge overload on the CPU to create a situation where Mantle shines isn't necessarily the best course.
^ I hope the suggestion above that people should recommend weak CPUs is just misguided enthusiasm.
How many people here need i7/i5 power for things other than gaming?
I verified it by comparing Windows response times between my Pentium G620 and an i5 3570.
Is Mantle running slower than the D3D version of the same game?
That is a GPU limited situation and shows a 25% gain in performance.
I thought that middle graph was a mix, a more balanced situation between CPU bound and GPU bound. They turned off 4x AA for it at 1080p, and they are running a middle-of-the-road CPU.
It is likely finding spots where it is CPU bound and others where it is GPU bound, and the last one is clearly CPU bound.
Why is everyone so anxious?
So you only play one game? Gotcha.
I think you missed his point: why would someone in a GPU-limited situation get a weak CPU just for the heck of it, just for one game?
Nope, I think the implication is that artificially creating a huge overload on the CPU to create a situation where Mantle shines isn't necessarily the best course.
While you can definitely code a game/app/benchmark to incredibly bottleneck any system under DX, why would you? There needs to be a balance between the benefit of a feature and its impact on system resources.
So you're buying a bargain-bin crap CPU for $50-100 just for those games, to pair with a 290X? Who are we kidding here? I think anyone blowing their load on a high-end 290X GPU for $700 in the States ($600-650 if they're lucky) would not pair it with a cheap junk CPU.
A bargain-bin AMD APU might give you a semi-okay experience in one game. Maybe. Big if. But probably not. So maybe one game, but all games? Definitely a big no.
So why would anyone do that? It's beyond ridiculous to pair a $500-700 GPU with a cheap CPU.
Or then there's the 8350.
Maybe it will give you a good experience in one game. But why go cheap for one game? Most PC gamers will buy a Haswell or IB-E for a good experience in all games. And it's not like Haswell is expensive either.
A 4670K sets you back around 200 bucks or less at Microcenter, while the 7850K costs $175... so I can't see someone blowing $600-700 on a GPU and then trying to nickel-and-dime on a CPU to save 25 bucks, when the Haswell is clearly the better experience in all games, not just 1-2 games.
My rig will be good "test" for Mantle.