GameGPU shows GTX 750 Ti faster than GTX 760 in Doom (Vulkan)?

hsjj3

Member
May 22, 2016
127
0
36


As you can see, the 750 Ti is at 37fps and the 760 is at 31fps. I used to have the 750 and it was definitely comfortably behind a 760.

Is the above result a mistake, or is it real? The only explanation I can think of is that the 760 is Kepler while the 750 Ti is Maxwell... but I'm having trouble making sense of this.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
232
106
I believe Kepler isn't optimized for this game.

zlatan said:
Kepler is generally a strong architecture when the shaders are optimised for ILP. In that case all the cores can be used in any Kepler product. The problem is when you don't optimise your shaders for independent workloads: Kepler will lose 33 percent of its theoretical performance. This is the case with GameWorks. These shaders only use 67 percent of a Kepler GPU, so 33 percent of your shader cores will always be idle.
Optimising for Kepler is really easy. There might be some workloads where it is not practical to find independent operations for the idle cores because of register pressure, but most of the time this is not a problem.
The problem is the licences. With some specific code NV won't allow the devs to optimise the shaders. They can see the source, but they are not able to change it, and the original code will hurt Kepler. This is why some games don't run well on these products. Basically the licences won't allow the performance optimisation.

Highly likely Kepler has no proper Vulkan code path in Doom; remember, Vulkan and DX12 optimisation is architecture-specific.
That might be it as well. Although, look here: supported.
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,873
1,527
136
Highly likely Kepler has no proper Vulkan code path in Doom; remember, Vulkan and DX12 optimisation is architecture-specific.
 

hsjj3

Member
May 22, 2016
127
0
36
Well, so that 33% performance loss... it has to be on Nvidia, right? I mean, by all accounts the GTX 770, 780, 780 Ti and Titan should all still be decent GPUs.

It's incredible how AMD's old GPUs are making almost 50% gains while Nvidia's older GPUs are simultaneously losing 33% of their performance!
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
I suppose it's theoretically possible to optimize for Kepler, but no developer is going to do that at this point. Nvidia will probably not be very supportive either; they'd rather have developers optimize for cards that are actually on sale.

The people running that old stuff are probably not buying games at full price anyway.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
I suppose it's theoretically possible to optimize for Kepler, but no developer is going to do that at this point. Nvidia will probably not be very supportive either; they'd rather have developers optimize for cards that are actually on sale.

The people running that old stuff are probably not buying games at full price anyway.

What you're saying seems to be the common sentiment, but what about the possibility that the architecture itself is just no good and not designed for modern code?
 

Ares202

Senior member
Jun 3, 2007
331
0
71
The people running that old stuff are probably not buying games at full price anyway.

That seems like a very baseless statement. Lots of people I know have 5000/6000 or 7000 series cards or the AMD equivalent and still buy new games. Some people just like to play games and aren't really interested in graphics settings; they only care about it running smoothly.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
I suppose it's theoretically possible to optimize for Kepler, but no developer is going to do that at this point. Nvidia will probably not be very supportive either; they'd rather have developers optimize for cards that are actually on sale.

The people running that old stuff are probably not buying games at full price anyway.

People running 780 Tis don't buy games, OK.


The thing is, GCN 1.0, which is older than Kepler, is running it REALLY well...

As for the OP, the problem is not the 760 but the whole Kepler line; if you check the charts, even the 7850 OC (R7 370) is beating the 780 Ti.

Something is seriously broken, because Kepler is not that inefficient and has the specs to perform well on this thing.

It also might be related to VRAM; GameGPU is testing a 370 and a 750 Ti with 4GB, and not the more popular 2GB models!?
 
Last edited:

cytg111

Lifer
Mar 17, 2008
23,524
13,098
136
780 (non-Ti) owner right here.
Buuuuurned.

I am getting owned by an R7 370 here
 

Shivansps

Diamond Member
Sep 11, 2013
3,873
1,527
136
I don't think there is anything Nvidia can do about it; on DX12 and Vulkan, everything is in the devs' hands to optimise. Does this also happen on OpenGL?
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
I don't think there is anything Nvidia can do about it; on DX12 and Vulkan, everything is in the devs' hands to optimise. Does this also happen on OpenGL?
Nvidia could suggest things to make Kepler perform better, but they've pretty much abandoned it when it comes to performance optimization, though they will still fix bugs.

In OpenGL it's not that bad, since the driver is pretty decent at running it even though no game-specific optimizations are added anymore.

 
Feb 19, 2009
10,457
10
76
Well, so that 33% performance loss... it has to be on Nvidia, right? I mean, by all accounts the GTX 770, 780, 780 Ti and Titan should all still be decent GPUs.

It's incredible how AMD's old GPUs are making almost 50% gains while Nvidia's older GPUs are simultaneously losing 33% of their performance!

I made posts on this topic a while back. Kepler, when running code optimized for Maxwell or GCN, would lose 1/3 of its performance. This is exactly what happens in many GameWorks titles. I didn't know zlatan had said the same thing.

Pascal, in theory, should run GCN-optimized code better than Maxwell does, but I am not sure about GP104 due to its different arrangement from GP100, and the Pascal whitepaper is mostly GP100-based.

It's interesting that zlatan brings up non-modifiable GameWorks libraries; in their default NV-supplied state, they do gimp Kepler.
 

BlitzWulf

Member
Mar 3, 2016
165
73
101
I made posts on this topic a while back. Kepler, when running code optimized for Maxwell or GCN, would lose 1/3 of its performance. This is exactly what happens in many GameWorks titles. I didn't know zlatan had said the same thing.

Pascal, in theory, should run GCN-optimized code better than Maxwell does, but I am not sure about GP104 due to its different arrangement from GP100, and the Pascal whitepaper is mostly GP100-based.

It's interesting that zlatan brings up non-modifiable GameWorks libraries; in their default NV-supplied state, they do gimp Kepler.


Ouch, it's hard to read that with a 980 Ti sitting in my rig right now. The idea of Nvidia purposefully holding back my performance to make newer cards look better frankly turns my stomach. Big Vega can't come soon enough.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
It's because the driver engineers at Nvidia can't afford to keep optimizing the shader compiler for the Kepler microarchitecture ...

Both AMD and Nvidia use an intermediate GPU language, AMDIL and NVPTX respectively, that the generated HLSL/GLSL/SPIR-V bytecode gets converted into, but the big difference between the two IHVs is that AMD doesn't change its ISA as aggressively ...

That's one more reason console manufacturers like AMD's current GPU microarchitectures: they can bet that AMD is going to stay on GCN and its derivatives, and can keep them backwards compatible with the console APIs ...
 
Last edited:

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
The people running that old stuff are probably not buying games at full price anyway.

"Old stuff?!" Today we are only two years removed from a $700 780 Ti (the 980 was released in fall 2014). Only in smartphone land is two-year-old $700 hardware considered obsolete.

Heck, the 7970 is from 2011!!!! And yet it still gains in new games.

If Nvidia isn't going to fully support two-year-old $700 hardware, then they need to give up the pretense and lease GPUs instead of selling them.




Actually never mind I don't want to give them "founder's edition" level ideas.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Kepler was announced in 2010, though.

Cards came out in early 2012.

So Kepler was likely designed in 2010.
 

Yakk

Golden Member
May 28, 2016
1,574
275
81
It's because the driver engineers at Nvidia can't afford to keep optimizing the shader compiler for the Kepler microarchitecture ...

Both AMD and Nvidia use an intermediate GPU language, AMDIL and NVPTX respectively, that the generated HLSL/GLSL/SPIR-V bytecode gets converted into, but the big difference between the two IHVs is that AMD doesn't change its ISA as aggressively ...

That's one more reason console manufacturers like AMD's current GPU microarchitectures: they can bet that AMD is going to stay on GCN and its derivatives, and can keep them backwards compatible with the console APIs ...

What would be the reason(s) for Nvidia changing all their hardware-level code with each architecture release instead of evolving it the way AMD does? Was it not built to be modular at all, like GCN has been for the last half decade? And still isn't, I guess?
 

Shivansps

Diamond Member
Sep 11, 2013
3,873
1,527
136
"Old stuff?!" Today we are only two years removed from a $700 780 Ti (the 980 was released in fall 2014). Only in smartphone land is two-year-old $700 hardware considered obsolete.

Heck, the 7970 is from 2011!!!! And yet it still gains in new games.

If Nvidia isn't going to fully support two-year-old $700 hardware, then they need to give up the pretense and lease GPUs instead of selling them.




Actually never mind I don't want to give them "founder's edition" level ideas.

To be fair, Doom on OpenGL is kind of the worst game ever for Kepler, and that is a non-GameWorks title.

Then id Tech goes forward and makes things even worse for Kepler in Vulkan; that's an achievement. I'm pretty sure there are no Kepler optimizations in Doom.

The main difference between GCN 1.0 and Kepler is that GCN 1.0 is still sold today as a mainstream card, while Kepler is long gone; that makes a difference when it comes to writing arch-specific code. That's the drawback of DX12/Vulkan: we depend on devs to do the work for old cards :S
 