[computerbase] Project CARS benchmarks


DarkKnightDude

Senior member
Mar 10, 2011
981
44
91


Conditions:
24 AI cars
Weather set to Clear
Map: Nordschleife
Pagani Zonda R
3:00PM

FRAPS benchmark log for one lap each. Though the max seems off.

My settings:
http://i.imgur.com/58QGrhy.png

Everything maxed except motion blur (off) and SMAA (left on low). Honestly, the game felt smooth both ways, but that doesn't excuse the fps loss.
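For what it's worth, an inflated max is a common FRAPS quirk: a single near-zero frametime turns into a huge instantaneous FPS reading. A sketch of recomputing the stats straight from the frametimes log (the exact column layout of the CSV is assumed; adjust to the actual file):

```python
# Sketch: recompute FPS stats from a FRAPS "frametimes" log instead of
# trusting the built-in max, which one near-zero frametime can inflate.
# FRAPS logs one cumulative timestamp (in ms) per rendered frame.

def fps_stats(timestamps_ms):
    """min/avg/max FPS plus a spike-resistant 99th-percentile figure."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    fps = sorted(1000.0 / d for d in deltas if d > 0)
    n = len(fps)
    total_s = sum(1.0 / f for f in fps)        # total elapsed time, seconds
    return {
        "min": fps[0],
        "avg": n / total_s,                    # frames / elapsed time
        "max": fps[-1],                        # the suspect number
        "p99": fps[min(n - 1, int(n * 0.99))], # robust alternative to max
    }
```

If the p99 value is far below the reported max, the max was a one-off spike rather than real performance.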
 
Feb 19, 2009
10,457
10
76
Set your weather to rain. I have a strong suspicion that rain is GPU PhysX accelerated. Alpha backers have mentioned on their forums, when they added GameWorks weather effects, performance on AMD was destroyed.

Also smoke particles are too. Test a non-stop burnout session with lots of smoke with CPU vs GPU PhysX.

Thanks!

Also CPU/GPU load via afterburner would be very useful.
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,714
316
126
I see you've been making your presence known over on the Steam forums, Silverforce11...

And at least give RS credit for his nicknames for the game.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Read his post above again:

"The game runs at 24-28 FPS with CPU Physx forced on through control panel.

With default GPU PhysX settings, it seems to be about 36ish-40 FPS. There's a noticeable difference there for me."

GPU PhysX: 36-40 fps
CPU PhysX: 24-28 fps

Quite a big difference.
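A quick back-of-the-envelope check of that gap, using the midpoints of the two quoted ranges (illustrative numbers only):

```python
# Midpoints of the quoted ranges; works out to roughly a 46% advantage
# for GPU PhysX over CPU PhysX in this particular report.
gpu_fps = (36 + 40) / 2          # 38.0
cpu_fps = (24 + 28) / 2          # 26.0
speedup = gpu_fps / cpu_fps - 1  # relative gain from GPU PhysX
print(f"GPU PhysX advantage: ~{speedup:.0%}")
```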

Looks like I didn't catch that and looked at the wrong graphs in another post.



Conditions:
24 AI cars
Weather set to Clear
Map: Nordschleife
Pagani Zonda R
3:00PM

FRAPS benchmark log for one lap each. Though the max seems off.

My settings:
http://i.imgur.com/58QGrhy.png

Everything maxed except motion blur (off) and SMAA (left on low). Honestly, the game felt smooth both ways, but that doesn't excuse the fps loss.

Thanks for that work/data! :thumbsup:

Well, that almost settles it - if the game has PhysX, and there is no way for AMD cards to run it without offloading it to the CPU, it's going to be very hard for AMD to improve performance to NV level here. I am sure AMD will improve performance to some extent but how are they going to deal with PhysX?

This is actually a great opportunity for AMD to dive deep into Project CARS, talk to professional review sites about exploring this topic, and seek feedback from the PC gaming community on what they think about this type of PC game development. It would be an amazing PR piece that would open a lot of eyes for the average Joe PC gamer who may have been unaware of what goes on behind the scenes of AAA game development.
 
Last edited:
Feb 19, 2009
10,457
10
76
@96Firebird
Just doing my part to warn potential AMD owners who didn't know about this GameWorks title and buy it on Steam, wasting their $. Wouldn't want even more AMD gamers raging about it, the AMD threads are getting big as it is. Feel free to do your part, chime in and defend the developers & NV.

@RS Thanks for the "PhysX CARS" & "Project nVidia" names (http://steamcommunity.com/app/234630/discussions/0/613957600538780982/). They truly represent this disgraceful game.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
@RS Thanks for the "PhysX CARS" & "Project nVidia" names. They truly represent this disgraceful game.

haha, no worries.

I guess I didn't follow this game closely enough:

October 17, 2013
"Project CARS will include GPU accelerated PhysX"



"Project CARS will officially support DX11 and Nvidia’s PhysX + Turbulence"
"Since Project CARS will support Turbulence, this could very well mean that there will be GPU hardware support for the PhysX effects that will be implemented. Given also the fact that the title was mentioned by Nvidia"

Well, I guess we never paid attention, but the use of PhysX was apparently well known to Project CARS users/journalists who followed the game.

That begs the question: if it has been publicly disclosed since October 17, 2013 that Project CARS would use PhysX, and that PhysX would be GPU-accelerated, what good would offering AMD 20 game keys for the title do? Even if they offered AMD 1 billion CD keys, it's not as if AMD can run PhysX code on the GPU. I am not getting SMS's point of view here. The developer basically sided with NV 1.5 years ago and became NV's 100% partner in developing this game. AMD, as well as AMD GPU users, became nobody at that point.
 
Last edited:

el etro

Golden Member
Jul 21, 2013
1,581
14
81
Man, GameWorks is an idiotic idea...


Why not simply optimize the game to the max and then just release the source code to the competitor 2 weeks prior to the game's launch?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Up until now, PhysX was always used for extra eye candy. AMD users knew it and could just decide not to run it, or turn it down. In this game, though, it's actually part of the gameplay. This would be the equivalent of AMD boosting draw calls in BF to where it would only run well with Mantle, and nVidia users being forced to run the DX11 pathway and bottleneck the CPU.

I'll bet that not one person defending nVidia's implementation of PhysX in this game, or blaming AMD's lack of dev relations would have been singing the same tune had it been done to them.

What made the PC the dominant platform it is today was compatibility: compatibility between hardware from different companies, and with software as well. Taken to its logical conclusion, this could kill PC gaming.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Man, GameWorks is an idiotic idea...


Why not simply optimize the game to the max and then just release the source code to the competitor 2 weeks prior to the game's launch?

This is a different situation. Unless nVidia allows AMD to run PhysX on their GPUs, AMD can't fix this. They might be able to do driver optimizations in other areas to reduce the effect, but they can't fix CPU PhysX vs. GPU PhysX.
 
Feb 19, 2009
10,457
10
76
Up until now, PhysX was always used for extra eye candy. AMD users knew it and could just decide not to run it, or turn it down. In this game, though, it's actually part of the gameplay. This would be the equivalent of AMD boosting draw calls in BF to where it would only run well with Mantle, and nVidia users being forced to run the DX11 pathway and bottleneck the CPU.

I'll bet that not one person defending nVidia's implementation of PhysX in this game, or blaming AMD's lack of dev relations would have been singing the same tune had it been done to them.

What made the PC the dominant platform it is today was compatibility: compatibility between hardware from different companies, and with software as well. Taken to its logical conclusion, this could kill PC gaming.

No, it won't kill PC gaming. It will kill AMD PC gaming.

If NV can buy out the next round of AAA games for 2015, it doesn't matter how good AMD's next-gen GPUs are; they will run like shat in all the big-time games, the ones that are the benchmarks.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Up until now, PhysX was always used for extra eye candy. AMD users knew it and could just decide not to run it, or turn it down. In this game, though, it's actually part of the gameplay.

I gotta admit, this is pretty nuts. I don't even know what to say to that because it's not as if NV allows me to go out and buy a dedicated budget GPU for PhysX because I can't pair an AMD rendering GPU with a secondary NV card. This is the most egregious example of a game developer favouring an AIB, while alienating 85% of the PC gaming market comprised of Intel+AMD GPU owners. What do we do at this point? Boycott the game to show that this is simply unacceptable? Is this the future of AAA game development now that NV has almost 80% market share? We are slowly reaching a critical point where NV is simply getting way too much power and influence over game developers and even AIBs (for example shutting the lights out on much of GPU Vcore overclocking starting with Maxwell, and then trying to follow-up by neutering mobile dGPU overclocking).

The very fact that NV worked directly with SMS from October 2013 (at least) and they KNEW that PhysX would be the main physics engine driving the game shows that NV was fully aware and onboard with purposely crippling gaming performance for ALL AMD and Intel GPU users. That's pretty telling if you ask me because I don't recall anyone developing or helping to develop a PC game knowing from day 1 that such dire performance consequences are a virtual guarantee for every non-NV user.

No, it won't kill PC gaming. It will kill AMD PC gaming.

If NV can buy out the next round of AAA games for 2015, it doesn't matter how good AMD's next-gen GPUs are; they will run like shat in all the big-time games, the ones that are the benchmarks.

I think the media needs to step up. The voices of individual forum members won't be heard. This is a straight-up unfair competitive advantage and an anti-competitive business practice. If this is brushed aside, NV can have every game built from the ground up directly for its GPU architectures, in turn making PC AAA game development akin to console game development, where the developer optimizes for specific hardware (i.e., NV's only). This would more or less wipe out AMD unless they can make GPUs 2-3X faster just to compensate for the game code bias.
 
Last edited:
Feb 19, 2009
10,457
10
76
Are you saying PC gaming will thrive with only one GPU vendor? I'd really hate to see GPU development slow to a crawl like we now have on the CPU side.

PC gaming has thrived with only one CPU vendor.

GPU development would slow, sure, but it'll make life easier for developers who do cross-platform games. They can just build to console quality; they won't have to worry about pushing the PC's limits.

PC gaming will be just fine, but PC gamers may have to pay more for NV GPUs, and it won't be as interesting, that's for sure. Either way, GameWorks IS a masterstroke from NV! (Thanks Desperado!) It's the killing blow.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,980
595
126
PC gaming has thrived with only one CPU vendor.

GPU development would slow, sure, but it'll make life easier for developers who do cross-platform games. They can just build to console quality; they won't have to worry about pushing the PC's limits.

PC gaming will be just fine, but PC gamers may have to pay more for NV GPUs, and it won't be as interesting, that's for sure. Either way, GameWorks IS a masterstroke from NV! (Thanks Desperado!) It's the killing blow.
I did not think you were okay with a monopoly, but okay. As for "PC gamers may have to pay more", there is no "may" about it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Are you saying PC gaming will thrive with only one GPU vendor? I'd really hate to see GPU development slow to a crawl like we now have on the CPU side.

I know how to make it work. We would buy NV stock and if NV is a monopoly, PC gamers would have no choice but to pay $500+ for mid-range cards and $1000+ for flagships. NV's gross margins would explode even further since all of AMD's customer base would become NV's customer base and NV's earnings would rise. We would then use the dividends and capital gains from NV's share ownership to finance future GPU purchases. But, I'd rather NV be split into 2 firms because I much prefer competition and investing funds into technology firms has its own risks as this industry sector is very susceptible to recessions/economic downturns due to the nature of luxury/non-essential consumer goods.

Let's see what happens though. I have a feeling this forum will become even more hilarious once Titan X and other NV owners start downplaying the excellent price/performance of R9 390/390X cards by constantly talking about their 4GB of VRAM "limits", and trying to get us to understand that price/performance is an outdated metric and that we should instead pay $700-1000 for flagship cards to 'future-proof'. LOL
 
Last edited:
Feb 19, 2009
10,457
10
76
I did not think you were okay with a monopoly, but okay. As for "PC gamers may have to pay more", there is no "may" about it.

Didn't say I was OK with it. Just that that's what's going to happen if NV continues to push GW, buying out more AAA titles in development. Guaranteed.

All it takes to reverse the trend is for major journalists with a spine to call them out, and for game reviewers to include a simple statement like in Overclock.net's reviews, akin to "This is an NV game, so it's no surprise AMD performs poorly." NV needs signals to know that what they are doing is unacceptable. If there are no such signals, they won't give a damn and will drive AMD into the ground with more GW shenanigans.

I'm still waiting for [H] & BrentJ to do a Project Cars review and blast AMD for poor drivers again though. That's the trendy thing.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
^ A lot of hardware reviewers discounted Dirt Showdown automatically for AMD bias. I expect all of them to not use Project CARS in their testing as a result of PhysX underlying the gameplay/game's engine. This would be a good litmus test to see who is truly biased in the industry and gets special favours from NV (early review samples, various presents/gifts/trip perks, early access interviews to NV's engineers/PR, etc.)

I think we've reached a point where AMD needs to just drop the gloves and make EVERY AMD GE game with closed-source DirectCompute to take full advantage of GCN.



Lulz, HD6970 with similar frame times to a GTX680 AMP!



Make TressFX 3.0 closed-source, and create new closed-source graphical features for lighting, shadows and other aspects of games. I can't believe I am advocating such craziness, but I don't know how else AMD can survive the extreme bias that's starting to exist in game development today.

The other alternative I see is for AMD to work with 95% of AAA game developers and continue to make AAA games open-source -- this way both NV and AMD users get a great experience, but unfortunately AMD doesn't have the money to block NV from partnering with as many AAA game developers as NV can afford.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,980
595
126
Didn't say I was OK with it. Just that that's what's going to happen if NV continues to push GW, buying out more AAA titles in development. Guaranteed.
We are well on our way if you ask me. Nvidia is a near monopoly right now so it won't take much for it to fully happen. 3xx for AMD is do or die IMO.
All it takes to reverse the trend is for major journalists with a spine to call them out, and for game reviewers to include a simple statement like in Overclock.net's reviews, akin to "This is an NV game, so it's no surprise AMD performs poorly." NV needs signals to know that what they are doing is unacceptable. If there are no such signals, they won't give a damn and will drive AMD into the ground with more GW shenanigans.
What review sites are going to do this? I'd be surprised if we see more than one, heck I'd be surprised if even one said anything along these lines.
I'm still waiting for [H] & BrentJ to do a Project Cars review and blast AMD for poor drivers again though. That's the trendy thing.
I don't know how [H] will approach it, I honestly don't. I guess what I'm saying is I'll be dumbfounded if they do smack AMD around with the blame stick for poor performance.
Let's see what happens though. I have a feeling this forum will become even more hilarious once Titan X and other NV owners start downplaying the excellent price/performance of R9 390/390X cards by constantly talking about their 4GB of VRAM "limits", and trying to get us to understand that price/performance is an outdated metric and that we should instead pay $700-1000 for flagship cards to 'future-proof'. LOL
To me this outcome is all but assured if the 390 is a 4GB card.
Make TressFX 3.0 closed-source, and create new closed-source graphical features for lighting, shadows and other aspects of games. I can't believe I am advocating such craziness, but I don't know how else AMD can survive the extreme bias that's starting to exist in game development today.
AMD should have kept Mantle to themselves; this is one of the biggest blunders I have ever seen. Or licensed it out, but not outright given it away. Now Nvidia will profit from Mantle and AMD will get next to zero advantage, same as with the other tech you mention. I don't advocate closed ecosystems either, but how AMD approaches propagating its tech is a train wreck as far as the financial side goes.
 
Last edited:
Feb 19, 2009
10,457
10
76
It's a money game, RS. NV has more carrot-wielding power to bait corrupt rabbit devs.

If it's a play of dirty tactics, bribing devs or reviewers, AMD will lose eventually; they may delay it, but they can't match NV's money-throwing abilities.

What AMD should have done when they won the next console contract is make libraries for devs to use and keep them closed source. If they had done this from the start, NV would have had no chance, because every major game is cross-platform with consoles as the focus.
 

jj109

Senior member
Dec 17, 2013
391
59
91
Up until now PhysX was always used for extra eyecandy. AMD users knew it and could just decide not to run it, or turn it down. This game though it's actually part of the gameplay. This would be the equivalent of AMD boosting draw calls in BF to where it would only run well with Mantle and nVidia users being forced to run the DX11 pathway and bottlenecking the CPU.

The PhysX SDK is used as the gameplay physics engine in Unreal Engine 4 and several other engines, and that interactive physics portion always runs on the CPU. This is often confused with GPU-accelerated PhysX effects being forced to run on the CPU. AFAIK, GPU PhysX can't really be core gameplay because it's only one-way interactive (unless they're using it to create visual obstructions, I guess).

It would be pretty dumb if the Project CARS devs tied GPU PhysX to a bunch of effect settings instead of having one big on/off switch/slider à la Borderlands and Metro.
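That "one-way" coupling can be sketched in a few lines (illustrative Python, not the PhysX API): the effects layer reads gameplay state every tick but never writes back, so removing or downgrading it changes the visuals, not the simulation.

```python
# Illustrative only -- not the PhysX API. Gameplay physics is authoritative;
# the effects layer samples it read-only, so it can be skipped entirely
# without changing the race outcome.

class GameplayPhysics:                     # always CPU-side, deterministic
    def __init__(self):
        self.car_x = 0.0
    def step(self):
        self.car_x += 1.0                  # the "real" simulation state

class EffectPhysics:                       # GPU-accelerated eye candy
    def __init__(self):
        self.smoke = []
    def step(self, gameplay):
        self.smoke.append(gameplay.car_x)  # reads state, never mutates it

game, fx = GameplayPhysics(), EffectPhysics()
for _ in range(3):
    game.step()
    fx.step(game)                          # one-way: gameplay -> effects

print(game.car_x)   # identical with or without the effects layer
print(fx.smoke)     # effects trail the gameplay state
```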
 

SPBHM

Diamond Member
Sep 12, 2012
5,059
413
126
It would be interesting to see an in-depth comparison with the consoles regarding the PhysX effects present in the console and PC versions, and also CPU scaling tests with AMD GPUs vs. Nvidia GPUs.

I noticed an abnormally high CPU load and temperature on my i5 while playing this game; I guess using a GeForce would lower this load...
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Make TressFX 3.0 closed-source, and create new graphical features for lightning, shadow and other aspects of games all closed source. I can't believe I am advocating such craziness but I don't know how else AMD can survive the extreme bias that's starting to exist in game development today.

The other alternative I see if AMD works with 95% of AAA game developers and continues to make AAA games open-source -- this way NV and AMD users get great experience, but unfortunately AMD doesn't have the money to block NV from partnering with as many AAA game developers as they can afford.

That's why I respect AMD's GPU division: they genuinely seem to have good corporate ethics, which have been pretty consistent for years, so they won't resort to underhanded tactics. Corporations tend to change their ethical standards over time (in my opinion, mostly due to management changes), but AMD has bucked that trend.

Maybe a more reasonable and attainable option is to request graphics benchmarks from game devs, so AMD can send them to consumer test groups (to cover a range of hardware specs) to see whether performance or features (e.g. broken CrossFire) are lacking while the game is still in development. That way AMD can allocate resources wisely when deciding whether to support a game dev prior to release.
 

DiogoDX

Senior member
Oct 11, 2012
747
279
136
So Project CARS is the first game to truly fragment the PC user base.

Funny, that's exactly what most of the Nvidia users in here said would happen with Mantle.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
No, it won't kill PC gaming. It will kill AMD PC gaming.

IF NV can buyout the next round of AAA games for 2015, it doesn't matter how good AMD's next-gen GPUs are, it will run like shat in all the big time games, ones that are the benchmark.

Sorry, that is very shortsighted. If nVidia did this, then I think it would be perfectly acceptable for Intel to stop supporting PCIe too. It's not like they get anything out of it; they don't sell GPUs.
 
Feb 19, 2009
10,457
10
76
Sorry, that is very shortsighted. If nVidia did this, then I think it would be perfectly acceptable for Intel to stop supporting PCIe too. It's not like they get anything out of it; they don't sell GPUs.

That won't work, cos if Intel doesn't support it natively, motherboard makers will just add a chip to enable it. The same motherboard makers as the ones selling dGPUs. ;p

And Intel has no reason to do that, because they have nothing to gain. NV has reasons to destroy AMD GPU performance.
 