[PcGameshardware] The Witcher 3 Benchmark

Page 5

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Interesting points:

nVidia desires to protect their work, which is understandable -- addressed in one of my points, which is "do no harm" -- and the performance hit on AMD is due to tessellation, and AMD can still optimize with the binary.

First, I have no clue what you mean by "optimize with binary," as that makes no sense from a developer's point of view. Windows drivers are written in C or Assembly.

And you're ignoring the elephant in the room: Kepler cards performing worse than AMD cards.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Yeeaah, as a 780 SLI owner, the way Kepler owners have been treated with recent drivers has ensured that I'm going with AMD next time around.

I'm feeling the same way, but I need to see the performance for myself before I make that call (I love my Lightning). I guess I'll know tomorrow when I fire her up.

I plan on playing a lot of this game, like I did with Skyrim, which was the last major game that caused me to update my entire system (graphical mods). I have a feeling TW3 will have the same impact.

I guess if I want to stay with NVIDIA, I could always upgrade to a GTX 960... :whistle:

Sincerely,

A ButtHurt GTX 780 owner
 
Last edited:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Generally, AMD's architectures have been more forward-looking than Nvidia's. Nvidia optimizes for the games that are out now and coming in the near future, and their proprietary effects work around the strengths of their latest architecture.
Kepler sucked at compute. GameWorks' effects are all compute-based now, leveraging Maxwell's strength there. AMD is pretty good at compute too, so they're not as badly impacted as Kepler.

When Pascal comes out, I'm sure Nvidia will create effects optimized for that architecture, and Maxwell will fall short in some new way. The Witcher 3 and other games make use of DirectCompute.
Take a look at this and look at the DirectCompute results:
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/20

Does the rough performance of the Nvidia cards look familiar? The little-Kepler results are all well behind the early GCN cards; only big Kepler is able to beat the 7970.
Nvidia just doesn't build forward-looking cards. They have enough influence on games to dictate when new features become prevalent, and they heavily optimize their drivers around the deficiencies of their cards.

I wouldn't be surprised if GCN eventually edges out Maxwell as well, despite losing in current games. It's been a trend between ATI and Nvidia since the start of programmable cards. The Radeon 8500 was slower than the GeForce 3/4 but did better in later games (and DirectX 8.1 made a very noticeable visual difference; the games that used it looked closer to DX9 than DX8). The Radeon 9700 Pro started out faster than the GeForce FX and completely destroyed it once FP24/FP32 shaders were used. The Radeon X1800/X1900 series held up better than the GeForce 7 series, and now GCN is beating Kepler.

:thumbsup:
Good post.

At least you are trying to think about things on a deeper level. That seems ultra rare these days. Like, totally nonexistent.

I commend you, sir. A million times, as it is great to see that there are still people out there that I feel I can have a meaningful discussion with. I am telling you, I have been catching up on the forums this morning and almost lost all faith... wondering why I even bother. But then I read your post, which I am sure will be completely ignored by most... wait, I am just being negative now, but as I stated already, I kind of lost all my faith.

Anyway, I just want to add that I think it is a little more complicated than that. Obviously, though, you are right for the most part. With the consoles going GCN, and the fact that AMD's designs were very strong in DirectCompute (as well as OpenCL), Kepler is at a huge disadvantage. But it is not only compute; it is actually bigger than that. Others have stated (but been ignored) that fundamentally Maxwell and GCN are much closer to each other than either is to Kepler. Looking at the breakdown block by block, they are.

But why try to be logical about any of this? It is so much easier to invent and mudsling, you know, go nuts. It's so much fun.

But back to my point. We could say that AMD's architecture was more forward-thinking; AMD intended GCN to last for many years. The industry seemed to go that route, and Nvidia did seem to adjust their Maxwell architecture to end up much more GCN-like.

That is how it ended up. But I think this goes back to Fermi, which is where Kepler evolved from. Fermi was very forward-thinking for its time, in the days of the 4890 and 5870. Fermi was a revolution. But obviously it was far from perfect. AMD's take, their shot, was GCN. Obviously, there was a lot more to look at by the time GCN showed up. We were moving in a new direction, and it wasn't into the dark. The future was visible, so to speak.

Fermi evolves into Kepler and AMD launches GCN. Then Nvidia launches a more GCN-like architecture called Maxwell.

There are other factors that may be in play, like CUDA, and whether there was any benefit specifically related to the layout of the Fermi/Kepler SM. It really doesn't matter though, because ultimately we have Maxwell and GCN much more alike than Maxwell and Kepler. Without these changes, AMD should/could have had a huge advantage because of GCN being in the consoles.

We can say that Nvidia didn't look far enough out, that AMD was more forward-thinking. I could say that as architectures evolved, a better path emerged. AMD was in a unique position and could objectively decide on the route they wanted to take. Nvidia followed through with the Fermi-to-Kepler transition and ultimately adjusted with their Maxwell architecture.

I just feel like CUDA could have had a role in this, because we see no Maxwell double-precision CUDA monster. It may have nothing to do with it, but I do find it interesting.

Anyway, just to end this: Nvidia's DirectCompute performance was so poor with Kepler that there is no way we would have seen GameWorks using DirectCompute like we see today. It would have all been CUDA-driven, like the water in Just Cause 2. (For what a DirectCompute pass actually looks like, see the sketch at the end of this post.)

I mean, there is so much more we could be talking about. You know, on a PC tech forum. In my opinion, all of our discussions get reduced to worthless and childish rubbish. It is unfortunate. Terribly so.

Obviously there are deeper things happening, and true underlying reasons on a technical level, that we could be discussing! I think it would be so much more worthwhile. And hopefully, people will take the time and have deeper discussions than we typically have.
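
To make "compute-based effect" concrete, here is roughly what a DirectCompute pass looks like at the D3D11 API level. A minimal sketch only; the 8x8 thread-group size and the AO-style framing are my own assumptions, not actual GameWorks code:

```cpp
// Minimal D3D11 DirectCompute dispatch: bind a compute shader, bind its
// input/output views, and launch one thread group per 8x8 pixel tile.
// Illustrative sketch; the shader blob and views are assumed to exist.
#include <d3d11.h>

void RunComputePass(ID3D11Device* device, ID3D11DeviceContext* ctx,
                    const void* csBytecode, SIZE_T csSize,   // compiled HLSL blob
                    ID3D11ShaderResourceView* inputSRV,      // e.g. depth buffer
                    ID3D11UnorderedAccessView* outputUAV,    // e.g. AO texture
                    UINT width, UINT height)
{
    // Create the compute shader from precompiled bytecode.
    ID3D11ComputeShader* cs = nullptr;
    device->CreateComputeShader(csBytecode, csSize, nullptr, &cs);

    // Bind and dispatch; assumes the shader declares [numthreads(8, 8, 1)].
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &inputSRV);
    ctx->CSSetUnorderedAccessViews(0, 1, &outputUAV, nullptr);
    ctx->Dispatch((width + 7) / 8, (height + 7) / 8, 1);

    // Unbind the UAV so the result can be sampled in a later pass.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);
    cs->Release();
}
```

The same Dispatch() call lands on very different compute hardware on Kepler than on GCN or Maxwell, and that difference is exactly the gap the AnandTech DirectCompute numbers show.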
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Another 780 SLI owner here that is not happy. There is no reason for a 960 to beat a 780 under any circumstance: 1024 cores/2GB vs. 2304 cores/3GB. I have the luxury of having 290s as well, so I'll still get to enjoy the title. However, while I've always been vendor-neutral, I did lean green all else being equal. This has me strongly leaning Radeon for the next upgrade. Not sure if it will be Fiji or Arctic Islands at this point, as the 290s still do a great job with "the mod".
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Getting a GTX 970 for a friend's PC was a good decision. My paltry GTX 670, on the other hand...

That aside, any CPU benchmarks available anywhere?
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
So what does the 960 have over it, then? Fewer cores, less bandwidth, fewer ROPs, less VRAM.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
:thumbsup:
Good post.

At least you are trying to think about things on a deeper level. [...]

Very well said. +1
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,331
251
126
If NV doesn't fix Kepler performance in something that has potential to be GOTY, I won't be buying NV cards for a while. Do they think it's going to make me upgrade sooner if my GPUs stop performing? Sure, but I'll buy team red.

It's not that NV is nerfing Kepler - it's just that it seems like they aren't even trying. But that's fine as long as Maxwell tops the charts. Everything else is irrelevant. Now I expect the same to happen with Maxwell, so why would I upgrade to that?

But I guess that's what happens after posting large profits, seeing stock prices surge, and taking most of the market share. There's just less incentive for them to try.
 
Last edited:

lilltesaito

Member
Aug 3, 2010
110
0
0
If NV doesn't fix Kepler performance in something that has potential to be GOTY, I won't be buying NV cards for a while. Do they think it's going to make me upgrade sooner if my GPUs stop performing? Sure, but I'll buy team red.

It's not that NV is nerfing Kepler - it's just that it seems like they aren't even trying. But that's fine as long as Maxwell tops the charts. Everything else is irrelevant. Now I expect the same to happen with Maxwell, so why would I upgrade to that?

But I guess that's what happens after posting large profits, seeing stock prices surge, and taking most of the market share. There's just less incentive for them to try.

This is how I am feeling. I figured I would have my card longer, like the 470 I had. Since I was happy with the 470, I spent more to get the 780 Ti. Now I feel like I have been screwed over and wish I had gone with the AMD 290 instead (which I was looking at, but because of the crazy Bitcoin stuff, I did not go that way).
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
First, I have no clue what you mean by "optimize with binary," as that makes no sense from a developer's point of view. Windows drivers are written in C or Assembly.

And you're ignoring the elephant in the room: Kepler cards performing worse than AMD cards.



They may be talking about the driver optimizing via binary translation or recompilation?
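
Something like this, perhaps; a purely hypothetical sketch of a driver substituting a hand-tuned shader at compile time (every name below is invented for illustration, not taken from any real driver):

```cpp
// Hypothetical sketch of "optimizing with the binary": the driver hashes
// a game's shader bytecode at create time and, if it recognizes it, swaps
// in a hand-tuned replacement shipped with the driver. Invented names;
// not real driver code.
#include <cstdint>
#include <unordered_map>
#include <vector>

using Blob = std::vector<std::uint8_t>;

struct TunedShaderTable {
    // Hash of the game's original bytecode -> tuned blob from the driver.
    std::unordered_map<std::uint64_t, Blob> tuned;

    // Called when the game creates a shader: substitute a tuned version if
    // one exists, otherwise compile the game's own bytecode unchanged.
    const Blob& Resolve(std::uint64_t bytecodeHash, const Blob& original) const {
        auto it = tuned.find(bytecodeHash);
        return (it != tuned.end()) ? it->second : original;
    }
};
```

If that is how it works, it would explain why per-game performance can change from one driver release to the next with no game patch, and also why only the architectures the driver team is actively tuning for see the benefit.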
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Obviously there are deeper things happening, and true underlying reasons on a technical level, that we could be discussing! I think it would be so much more worthwhile. And hopefully, people will take the time and have deeper discussions than we typically have.

I'm all for learning and objective discussions, but the GTX 960 only has 1024 CUDA cores and a 128-bit bus, and it defeating a GTX 780 is very odd.
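
Back-of-the-envelope, from the reference specs (assuming 7.0 and 6.0 GT/s effective memory data rates for the 960 and 780 respectively):

```cpp
// Peak memory bandwidth = bus width in bytes * effective data rate (GT/s).
// Reference-card figures; factory-overclocked boards will differ a little.
#include <cstdio>

int main() {
    double gtx960 = (128.0 / 8.0) * 7.0;  // 128-bit @ 7.0 GT/s -> 112 GB/s
    double gtx780 = (384.0 / 8.0) * 6.0;  // 384-bit @ 6.0 GT/s -> 288 GB/s
    // The 780 also has 2304 CUDA cores and 48 ROPs vs. the 960's 1024 and 32.
    std::printf("GTX 960: %.0f GB/s, GTX 780: %.0f GB/s\n", gtx960, gtx780);
    return 0;
}
```

That is roughly 2.5x the bandwidth on the 780, so if a 960 is winning, it looks like a software story rather than a hardware one.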
 

Riceninja

Golden Member
May 21, 2008
1,841
3
81
Kepler got screwed so hard, but what's even more pressing is that you need a Titan X to run it at Ultra at 1080p/60 fps. That means this game is 2x more demanding than even GTA V. I would have been OK had the graphics looked like the 2013 in-game footage, but the current downgraded graphics don't justify it. Poor showing from a developer that made its name on PC.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I'm all for learning and objective discussions, but the GTX 960 only has 1024 CUDA cores and a 128-bit bus, and it defeating a GTX 780 is very odd.

More than that, if there are serious flaws in the Kepler architecture, why are they most pronounced in games with nvidia code in them? It would lead toward the conclusion that either nvidia actively wants Kepler performance to suffer, or wants GCN performance to suffer relative to Maxwell and doesn't care about what that means for Kepler. The best case I can think of is that nvidia is merely totally apathetic about how their code performs for anything they're not actively selling.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
This is how I am feeling. I figured I would have my card longer, like the 470 I had. Since I was happy with the 470, I spent more to get the 780 Ti. Now I feel like I have been screwed over and wish I had gone with the AMD 290 instead (which I was looking at, but because of the crazy Bitcoin stuff, I did not go that way).

That's how it goes when there's a new architecture.

I'd bet that Nvidia stopped optimizing for its Tesla-architecture GPUs when Fermi was released. Kepler was an evolved Fermi, so optimizations could trickle down.
Nvidia didn't want games to support DX10.1 because the 200 series didn't have it, but once Fermi came and introduced DX11 support for Nvidia, they obviously didn't care about the 200 series' disadvantage anymore.

Likewise, AMD probably stopped optimizing for the HD 5000/6000 when GCN was released. The same year GCN was released, AMD even moved the HD 2000-4000 series to legacy support.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,362
5,026
136
My 290X is not looking so bad...

It'll be interesting to see where the 390X lands.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
More than that, if there are serious flaws in the Kepler architecture, why are they most pronounced in games with nvidia code in them? It would lead toward the conclusion that either nvidia actively wants Kepler performance to suffer, or wants GCN performance to suffer relative to Maxwell and doesn't care about what that means for Kepler. The best case I can think of is that nvidia is merely totally apathetic about how their code performs for anything they're not actively selling.
This.
Game Ready drivers are Maxwell-only now; if you happen to own one or two, you're good. Pays to spend $1k on NV cards every year.
For the rest of NV's 80% market share: too bad, we have your money already.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
That's how it goes when there's a new architecture.

I'd bet that Nvidia stopped optimizing for its Tesla-architecture GPUs when Fermi was released. Kepler was an evolved Fermi, so optimizations could trickle down.
Nvidia didn't want games to support DX10.1 because the 200 series didn't have it, but once Fermi came and introduced DX11 support for Nvidia, they obviously didn't care about the 200 series' disadvantage anymore.

Likewise, AMD probably stopped optimizing for the HD 5000/6000 when GCN was released. The same year GCN was released, AMD even moved the HD 2000-4000 series to legacy support.

Not the case. I was using quadfire 5970s a year and a half after GCN was released and the performance was great; it was only because the VRAM on one of the cards was failing that I stopped using them.
 
Last edited:

Spjut

Senior member
Apr 9, 2011
928
149
106
Not the case. I was using quadfire 5970s a year and a half after GCN was released and the performance was great; it was only because the VRAM on one of the cards was failing that I stopped using them.

The HD 5000/6000 series did continue to get bug fixes and Crossfire profiles.
 