New Mafia II PhysX ON/OFF video.


tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Well, he's right in a way. An *ironic* way. In the same way that AMD is 'holding back' Nvidia PhysX by not licensing it from them, Nvidia held back tessellation by not getting on board with ATI's proprietary TruForm tech way back in the day, and then again during the 2900 XT to HD 4890 era. Not that I blame either company for not jumping aboard their competitor's proprietary tech when the options are 1. adopt it and give their competitor a huge advantage, or 2. don't adopt it because it will die without support from both companies.

Tessellation only took off after it became an industry standard. GPU physics will be the same way, and it's about time someone (*cough* Microsoft) went through with it already.

These guys forget it ALWAYS goes both ways. They blame ATI for not adopting PhysX, but others can blame Nvidia for not adopting tessellation way back then.


I agree that if all hardware (regardless of vendor) supports the same features, then games would more quickly take advantage of said features, but there were ZERO titles that I am aware of with tessellation prior to DX11, while there have been a dozen or so games with GPU-accelerated physics. Why was there not even one game with tessellation before DX11 came about?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I must be blind, because I can't tell much difference between PhysX on and off in the videos. I'm not a game-player, though, so that is probably why.

http://www.youtube.com/watch?v=-x9B_4qBAkk

Here is a video summary of many of the games that use GPU PhysX. Some of it looks genuinely pretty good, some of it looks good in a subtle way, and some of it I am indifferent about.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,064
7,490
136
I think Dguy had the right overarching original idea: that it's not PhysX that should be hated on, but the spartan environment of the "non-PhysX" game, when other non-PhysX games have done so much better on the CPU alone.

Golem showed up and subverted that idea, rightly, with the following thought: for this specific game, if the original console version is no different (in the physics department) from the ATI PC version, then nothing was lost, and only what Nvidia paid for was gained for Nvidia card owners, and that's all there really is to it (in this case).

So I guess the next question is: Is there a difference between the console version and the ATI PC version?

If there is, we're one step closer to the "Nvidia is a crapbag" side of the argument; if not, then Nvidia is free to use its dollars to benefit its customers as it sees fit.

Can someone confirm yea or nay on the console issue?
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I agree that if all hardware (regardless of vendor) supports the same features, then games would more quickly take advantage of said features, but there were ZERO titles that I am aware of with tessellation prior to DX11, while there have been a dozen or so games with GPU-accelerated physics. Why was there not even one game with tessellation before DX11 came about?
Because you hadn't even Googled it. There have been quite a few games that used it. But, just like PhysX, it was one vendor only, and it wasn't typically implemented in a way that was a slam dunk for game improvement.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,726
1,342
136
I agree that if all hardware (regardless of vendor) supports the same features, then games would more quickly take advantage of said features, but there were ZERO titles that I am aware of with tessellation prior to DX11, while there have been a dozen or so games with GPU-accelerated physics. Why was there not even one game with tessellation before DX11 came about?

The original Unreal Tournament (as well as the games based off that engine, at least by editing the config file) had TruForm support. Counter-Strike did too, I think, and probably a few others. I think there were actually quite a few, but it wasn't heavily marketed. I only know of TF2 using the later tessellator (via a command line), although there may be some other obscure ones out there.

But really, the answer is that Nvidia spent more money and marketed more heavily. It doesn't really matter in the long run, though; games that used the ATI tessellator are a drop in the bucket compared to games which use PhysX, which is again a drop in the bucket compared to the kind of market penetration Glide had... and we all know how that ended. As soon as a strong industry standard emerges that everyone can use, the few developers who use PhysX will dump it en masse.
 

finbarqs

Diamond Member
Feb 16, 2005
4,057
2
81
View post # 281 in this thread.

That didn't really work, because it worked for him and not for me. Mine actually got slower when I used my 8800 GTX as the PhysX processor, as opposed to taking out the PhysX card and enabling PhysX on the single GPU.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Keys,

The reason people are so hostile towards PhysX is because it is being used as a marketing tool. It hasn't been providing anything but some additional eye candy, such as debris (which doesn't look realistic anyway) or more realistic cloth movement. Minor improvements in visual quality which end up costing a ton in performance.

PhysX really should be focusing on world interaction, ballistics, and cause and effect versus scripted animations. It should make you say, "Wow! The game feels (not looks) so much more realistic, I just have to play with PhysX on," rather than "WTF, I lost 40% of my frame rate for some lame debris and cloth?"

Nvidia is using it the wrong way: they are using it for eye candy, which is the easiest thing to flaunt in the eyes of the customer.

I agree with everything you said in this post.
Also, as some have mentioned:
1. You could have debris without PhysX.
2. The debris don't really add much to the game (and are over the top, in an unrealistic manner).

Also, they had to use special angle shots; what they were showing is nothing like what it looks like when you actually play the game. I played the entire game (on a GTX 260, PhysX on). It's an FPS, and I recognized all those "scenes" from the game, but when I played it I saw them from a first-person perspective as I killed enemies. They had to put a camera NEAR the enemies at special angles, all zoomed in, so you can see all that extra debris (which would otherwise be too small to notice from the first-person perspective of the character).
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
I must be blind, because I can't tell much difference between PhysX on and off in the videos. I'm not a game-player, though, so that is probably why.

You're not blind; the difference is minimal.

The only in-your-face effect that you can't miss with PhysX is how it cuts your framerate in half :thumbsdown:
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You're not blind; the difference is minimal.

The only in-your-face effect that you can't miss with PhysX is how it cuts your framerate in half :thumbsdown:

I basically agree, but there is another obvious in-your-face effect that you didn't mention...
A game with PhysX off will be completely missing features that exist in other games via CPU acceleration. E.g., cloth will magically disappear, or in lesser cases lose all animation (instead of simply having a predetermined animation); particles and destructible animations will just no longer exist, instead of existing in predetermined form (or in a lesser amount) done on the CPU in other games, etc.
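
To be clear about what I mean, here's a hypothetical sketch (my own illustration, not code from any actual game) of the fallback decision other games make on the CPU and that these PhysX titles skip:

Code:
// Hypothetical feature-fallback logic -- illustrative only, not from any real game.
enum ClothMode { CLOTH_GPU_SIM,      // full per-vertex simulation (GPU PhysX on)
                 CLOTH_CANNED_ANIM,  // cheap precomputed animation on the CPU
                 CLOTH_REMOVED };    // effect dropped entirely

ClothMode pickClothMode(bool gpuPhysXAvailable)
{
    if (gpuPhysXAvailable)
        return CLOTH_GPU_SIM;
    // What most games do without GPU physics: fall back to a canned animation.
    return CLOTH_CANNED_ANIM;
    // What some PhysX showcase titles effectively do instead: CLOTH_REMOVED,
    // so the cloth just vanishes when PhysX is off.
}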
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
That didn't really work, because it worked for him and not for me. Mine actually got slower when I used my 8800 GTX as the PhysX processor, as opposed to taking out the PhysX card and enabling PhysX on the single GPU.

Right, which should surely indicate something isn't right. I had a substantial gain using the 480 + 8800GTS 512 in Mafia II.

Can you list your system specs for us, please? CPU, RAM, mobo, power supply, OS and version (32- or 64-bit).

What you can try in the meantime is:

Make sure your motherboard drivers are all up to date. Go to your motherboard manufacturer's site and download all the latest drivers: chipset, audio, LAN, etc. Update them.

Download the latest NV driver for your card/OS.

In control panel, uninstall any drivers pertaining to Nvidia Display and Nvidia PhysX. Also any stereoscopic drivers if you have them installed.

Reboot.

Cancel any Windows "Autodetect new hardware" prompts. Install the Nvidia driver.

Reboot.

Go into your Nvidia control panel and make sure PhysX is set to run on the 8800GTX in the dropdown selection.

Run Mafia II.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
There is supposed to be a new driver (v260) coming out today that will have a "clean install" checkbox... he could use that.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
There is supposed to be a new driver (v260) coming out today that will have a "clean install" checkbox... he could use that.

Yep. He could do that too. He should still make sure his mobo drivers are up to date and installed in the meantime.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The original Unreal Tournament (as well as the games based off that engine, at least by editing the config file) had TruForm support. Counter-Strike did too, I think, and probably a few others. I think there were actually quite a few, but it wasn't heavily marketed. I only know of TF2 using the later tessellator (via a command line), although there may be some other obscure ones out there.

But really, the answer is that Nvidia spent more money and marketed more heavily. It doesn't really matter in the long run, though; games that used the ATI tessellator are a drop in the bucket compared to games which use PhysX, which is again a drop in the bucket compared to the kind of market penetration Glide had... and we all know how that ended. As soon as a strong industry standard emerges that everyone can use, the few developers who use PhysX will dump it en masse.


You seem to draw your "conclusion" from a confusion of terms.
Glide was an API.
So are DirectX, OpenCL, CUDA, DirectCompute, etc.

PhysX is physics middleware,
just like Havok or Bullet.

But unlike Havok/Bullet, which only run on the CPU (empty PR means nothing), PhysX runs on both the CPU... and the relevant APIs for the GPU:
CUDA
OpenCL
DirectCompute.

There is nothing stopping PhysX from running on OpenCL... except why should NVIDIA do that right now?
They have full control over CUDA, and both OpenCL and DirectCompute are far back in the rear-view mirror in terms of usage, tool-set, and general "environment".

So PhysX runs via CUDA for now (on the GPU), and the old SDK ran on AGEIA's old code, which they got from NovodeX... one of the best CPU physics middlewares of that time.


Some don't understand this and post garbage like this:
http://www.realworldtech.com/page.cfm?ArticleID=RWT070510142143

Which even had the not-so-flattering fate of being debunked at Beyond3D... an ATI haven... so sad, really.
But that is what happens when drinking buddies (DKanter and Char-lie) goof up *shrugs*

In the new PhysX SDK it's no longer up to the developer to make everything multithreaded; it's done automatically now... so I wonder what the excuse will be when people find out the CPU cannot compete with the GPU in SIMD calculations?

PhysX isn't going anywhere; in fact, its market share has been growing ever since the AGEIA days... facts don't lie.
Stating that PhysX will die because it's "proprietary" is like saying OpenGL will kill DirectX because OpenGL is open and free and DirectX is proprietary.

That's ignoring market share, API features, and developer relations.

So even if Bullet (a physics middleware) came out on OpenCL... not much would change.

Intel is hugging Havok tight... they won't let AMD get in that way.
Bullet... well, there has been a lot of TALK.
Like ATI has TALKED about GPU physics since 2006.

Keyword: TALK.

The only one that has brought something to the table is NVIDIA.
And PhysX could run on AMD GPUs too... if AMD would do so... but they won't.
They would rather TALK and TALK... did I mention TALK?
But deliver nothing.

So PhysX isn't going anywhere; quite the opposite, in fact...

PS. I have heard this:
"PHYSX WILL DIE AS SOON AS..." since 2006.
PhysX has never been more used than it is today.

Find a new fail-mantra and move on... please?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
LOL, talking about physics market share growing.

Stanley Tools is the world's #1 weapons dealer!

I don't know what world you live in, but in this reality PhysX has gone from 0% market share in Jan 2006 to surpassing Havok's market share this year *shrugs*
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Actually it was AGEIA; nVidia simply bought them out.
But you are correct on most of what you said.

AGEIA's SDK only supported the CPU and PPU.
NVIDIA developed the GPU support.
(AFAIR it was with PhysX SDK 2.73.)

So in fact I was correct about that part too
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
AGEIA's SDK only supported the CPU and PPU.
NVIDIA developed the GPU support.
(AFAIR it was with PhysX SDK 2.73.)

So in fact I was correct about that part too

No, Nvidia recompiled the PhysX C code that was written for the PPU with their CUDA compiler. The entire porting process (minor modifications were needed) supposedly took them under a month (although it took longer before they actually started integrating it into their driver).
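To give a rough idea of what that kind of port involves, here's a toy sketch (my own illustration, not actual PhysX source) of a plain C physics loop and its CUDA equivalent:

Code:
// Toy example only -- not actual PhysX code. The CPU version:
//   for (int i = 0; i < n; ++i) {
//       vel[i] += g * dt;
//       pos[i] += vel[i] * dt;
//   }
// becomes a CUDA kernel in which each thread handles one body:
__global__ void integrate(float3* pos, float3* vel, float3 g, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;  // guard the last, partially filled block
    vel[i].x += g.x * dt;      vel[i].y += g.y * dt;      vel[i].z += g.z * dt;
    pos[i].x += vel[i].x * dt; pos[i].y += vel[i].y * dt; pos[i].z += vel[i].z * dt;
}
// Launched with one thread per body instead of one loop iteration:
// integrate<<<(n + 255) / 256, 256>>>(d_pos, d_vel, make_float3(0.f, -9.8f, 0.f), dt, n);

The loop body barely changes; the real work of the port is in moving the data to the GPU and wiring it into the driver.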
And apparently it's simple enough to hack the driver to allow it to run on an ATI card as well: http://www.tomshardware.com/news/nvidia-physx-ati,5764.html
But both Nvidia AND AMD were hostile to the idea.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
No, Nvidia recompiled the PhysX C code that was written for the PPU with their CUDA compiler. The entire porting process (minor modifications were needed) supposedly took them under a month (although it took longer before they actually started integrating it into their driver).

What about PhysX SDK 3.0?

Not just a simple "recompile".


And apparently it's simple enough to hack the driver to allow it to run on an ATI card as well: http://www.tomshardware.com/news/nvidia-physx-ati,5764.html
But both Nvidia AND AMD were hostile to the idea.

Yup, I know.
AMD knows.
NVIDIA knows.

But do you seriously expect AMD to put an NVIDIA PhysX sticker on their boxes?
And admit they failed (since ATI promised GPU physics in 2006 and nothing has happened since) and can't do what NVIDIA does?

Not to mention that a lot of fanboys on this site (and others) would have a stroke...


But I don't worry... sometime come 2015, when AMD gets its head out of its ass and delivers hardware physics, trust me, the red side will hail it as the second coming...


I have enjoyed hardware physics since 2006... and you all will soon too... no matter the empty rhetoric being used today
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I think Dguy had the right overarching original idea: that it's not PhysX that should be hated on, but the spartan environment of the "non-PhysX" game, when other non-PhysX games have done so much better on the CPU alone.

Golem showed up and subverted that idea, rightly, with the following thought: for this specific game, if the original console version is no different (in the physics department) from the ATI PC version, then nothing was lost, and only what Nvidia paid for was gained for Nvidia card owners, and that's all there really is to it (in this case).

So I guess the next question is: Is there a difference between the console version and the ATI PC version?

If there is, we're one step closer to the "Nvidia is a crapbag" side of the argument; if not, then Nvidia is free to use its dollars to benefit its customers as it sees fit.

Can someone confirm yea or nay on the console issue?

This is EXACTLY what it all boils down to. You hit the nail on the head.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Maybe someday PhysX will add something valuable to gameplay. But I don't see it happening anytime soon.

It seems to me that something is wrong with either the code or the hardware at this time. If adding some stupid effects can drop framerates substantially on Nvidia's flagship cards, it kinda makes a person wonder... Maybe Nvidia makes it that way so even if you purchase flagship cards, you still gotta get a dedicated PhysX card to take advantage of the feature you already purchased anyway!
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Maybe someday PhysX will add something valuable to gameplay. But I don't see it happening anytime soon.

Define "valuable"... or you've just made a poor, useless, subjective fallacy.

It seems to me that something is wrong with either the code or the hardware at this time. If adding some stupid effects can drop framerates substantially on Nvidia's flagship cards, it kinda makes a person wonder... Maybe Nvidia makes it that way so even if you purchase flagship cards, you still gotta get a dedicated PhysX card to take advantage of the feature you already purchased anyway!

Did you sleep through physics lessons?
Why do you think supercomputers are needed for fission calculations?
Do you whine about more NPCs taking up resources?
Or shaders?
Or AA?
Or AF?

Nothing demands more calculation than physics.
Fact of life.
Get over it.


You make me wonder why people post about topics they are apparently utterly clueless about.

/whine "Why does having 1000x more interactive objects moving around tax my PC worse than a scripted animation with 100 objects in it?"
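
To put rough numbers on it (my own back-of-the-envelope math): a naive broad phase tests every pair of objects, so 100 objects means 100*99/2 = 4,950 pair checks per step, while 1,000 objects means 1,000*999/2 = 499,500. That's roughly 100x the work for 10x the objects, before a single contact is even resolved.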
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Define "valuable"... or you've just made a poor, useless, subjective fallacy.

Hello to Green Team Denmark! How's the weather over there today/tonight/tomorrow or whatever it is?

How's this for a definition... seems to fit best:

Of considerable use, service, or importance!

Currently the only real use is for marketing purposes. If you think it can be used to a greater extent, more power to you! It's up to the developers to use it or not. Seems like with a divided market it's not gonna be of much use to them.

Now what Nvidia should do is PhysX done right. Maybe Nvidia should purchase a game developer and use PhysX to its true potential, the way it's supposed to be played, or whatever the saying is... Oh wait, they wouldn't do that, as it would cut off half of the customer base for the game(s).

When and if Nvidia can make PhysX work without substantial framerate loss, and without the need to purchase a dedicated PhysX card even when running their best cards in SLI, then maybe they'll have something... My theory is it won't happen anytime soon! Hell, it might not even be possible at all.

/whine "Why does having 1000x more interactive objects moving around tax my PC worse than a scripted animation with 100 objects in it?"

Maybe you're into the dust particles and shell casings, along with the mostly unrealistic effects, and like PhysX in its current implementation... but for the most part others are not. It's not just those running an AMD/ATI card; it's your fellow Nvidia users at the same time.

I'm not sure, but it seems like some posters here have some kinda obligation to defend Nvidia's honor... or they just have so much vested in Nvidia that it's their way of justifying the purchase of their products in the past, present, and future.

I guess a reply without fresh content to debate is kinda useless.

Why do you think Nvidia nuked the drivers so PhysX wouldn't work if an AMD/ATI card is detected as primary? Couldn't be because they are getting a free ride, as they'd have to pay to play. Couldn't be to protect the IP, as it would be done on an Nvidia card. Could it be that even Nvidia knows that for PhysX to work at its best, the PhysX card needs to be independent and handling its job only? I'm sure Nvidia nuked PhysX for a reason, and it doesn't involve IP, taking a free ride, or the money!
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Even a little protest toward PhysX is too much.


Everyone has an opinion, and I'd completely accept any sort of take-it-or-leave-it stance, but to outright bash it makes zero sense.
/2cents
 
Last edited: