Batman: Arkham City, no physics at all if you don't use PhysX?

Page 14 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Spjut

Senior member
Apr 9, 2011
928
149
106
Thankfully, there are third parties that may help you out there. Sadly, nVidia doesn't want to officially support hybrid modes. Ideally, from a gamer's point of view, I'd like to see nVidia rethink this, since they're still nVidia GPUs, but they have offered their reasoning a few times and the latest was here:



Zogrim at PhysXInfo sure did have some impressive questions:

http://physxinfo.com/news/6419/exclusive-nvidia-talks-present-and-future-of-physx-technology/

Thanks for the interview

They could just put up a warning when it's used with an AMD card, though.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
It's free in the sense that you don't have to use it at all, and it doesn't have to be enabled. Your choice in enabling it and then complaining is sort of like running people over on purpose and complaining that you got in trouble for it. It's something you choose to do, and then you act as if you're butt-hurt over doing it. It's a free choice for you to make, and this is why I don't understand why you seem to have an extreme hatred for it.

Don't turn on PhysX and live a little longer.

BTW, you could also sell your nVidia cards already and buy some AMD hardware. Your constant hate for this company is bound to have shortened your life by years by now, lol.

All this time you could have been living stress-free, in an AMD wet dream come true. Or is it that you always have to have something to complain about?

Really? That's a whole lot of vitriol packed into one post. I'm sorry you're so ill at ease over the 'hatred' you perceive in my posts.

Judging from the first bit of this rage-filled, nasty post, it seems you missed the explanation I gave for my opinion.

You are free to feel however you do about PhysX; I'm not interested in changing your mind or hating on you for it. I explained why I feel the way I do, and let's face it, no matter what I said or say, you'll attack it with more of your rants.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
There ya go, there's your starting point for some respect!

I'd be all for a game where I could literally break through any wall, destroy any object, and make use of and interact with the pieces left behind. A true-to-life physics engine that allowed full and complete destruction would be amazing.

If GPU PhysX brings that to us, and I do believe it will be a GPU or a chip with similar characteristics that does, I am all for it and on board, even if it requires an expensive dedicated card.

It's just so far off from that currently, and I'm not comfortable with the performance it takes for what it does right now. I think we are better served by some of the other options out there.

Realistically, I think we are probably very far off from hardware that can deliver physics like I hoped for.

Playing games like Crysis, which is already over four years old, I think we are closer to photo-realistic graphics in games than we are to lifelike physics.
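For what it's worth, the core loop such a destruction system needs isn't mysterious. Here is a toy sketch in Python (all names and numbers are invented for illustration, not any real engine's API): shatter a wall into fragments, then integrate them under gravity each frame:

```python
import random

GRAVITY = -9.81          # m/s^2
DT = 1.0 / 60.0          # one frame at 60 fps

def shatter_wall(n_fragments, seed=42):
    """Break a wall into fragments with random outward velocities."""
    rng = random.Random(seed)
    return [{"pos": [rng.uniform(0.0, 5.0), rng.uniform(0.0, 3.0), 0.0],
             "vel": [rng.uniform(-2.0, 2.0), rng.uniform(0.0, 3.0),
                     rng.uniform(1.0, 4.0)]}
            for _ in range(n_fragments)]

def step(fragments):
    """One explicit-Euler step; every fragment updates independently."""
    for f in fragments:
        f["vel"][1] += GRAVITY * DT              # gravity pulls on y
        for axis in range(3):
            f["pos"][axis] += f["vel"][axis] * DT
        if f["pos"][1] < 0.0:                    # crude floor collision
            f["pos"][1] = 0.0
            f["vel"][1] *= -0.3                  # lossy bounce
    return fragments

debris = shatter_wall(1000)
for _ in range(120):                             # simulate two seconds
    step(debris)
```

Each fragment updates independently of the others, which is exactly why this kind of workload maps well onto a massively parallel GPU; the hard part is doing it for millions of fragments, with inter-fragment collisions, at 60 fps.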
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Who writes DirectX? Is it Microsoft? I'm sure Nvidia would fight them if they tried to implement a GPU physics API in future DX versions. Maybe it's too difficult to write in, or impossible; I don't know. I always think a baseline standard is better than anything proprietary. I also don't like the performance hit: even with my GTX 295 installed, PhysX gives an FPS hit.
Do you think DirectX is free? It comes with the Windows operating system. Why doesn't DX10/11 work under XP? It isn't because it can't be done; Microsoft doesn't want it done because they want to sell Vista and 7. On top of that, AMD and Nvidia need to pay MS for DX, as do the studios that make games using it.

OpenGL is an open standard, but the game engines built on top of it aren't. Average users will think OpenGL is better because it is cross-platform. What is not so well known is that it is cross-platform because the video drivers made it so. Without drivers, neither OpenGL nor DirectX will work. Nothing is magical; those are just APIs, function specifications that need to be implemented. The MS version of GPU computing is called DirectCompute, but is it going anywhere? No. Why? Because there is a better, more robust solution: CUDA.
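The point that OpenGL, DirectX and DirectCompute are just function specifications implemented by drivers can be sketched in a few lines of Python (the class and driver names here are invented for illustration):

```python
from abc import ABC, abstractmethod

# The "API" is only a contract: a set of function signatures.
# By itself it draws and computes nothing.
class ComputeAPI(ABC):
    @abstractmethod
    def dispatch(self, kernel, data):
        """Run `kernel` over every element of `data`."""

# Each vendor's "driver" supplies the actual implementation.
class VendorADriver(ComputeAPI):
    def dispatch(self, kernel, data):
        return [kernel(x) for x in data]      # one vendor's code path

class VendorBDriver(ComputeAPI):
    def dispatch(self, kernel, data):
        return list(map(kernel, data))        # same contract, different code

# The game is written against the API, not the driver, which is
# why the same title can run on different hardware.
def run_app(api):
    return api.dispatch(lambda x: x * x, [1, 2, 3])

print(run_app(VendorADriver()))  # [1, 4, 9]
print(run_app(VendorBDriver()))  # [1, 4, 9]
```

The application gets identical results from either "driver"; whether the spec is open (OpenGL) or proprietary (DirectX, CUDA) says nothing about who writes the code that actually runs.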

You may think that a physics engine that can utilize any video card will change the world as we know it, but it won't. It will run worse than PhysX, if it doesn't crash at all. It needs to work with two distinct sets of hardware. Havok FX died for this reason.

If there is already a wheel that works, then the only reason to reinvent it is if the new one works better. Funny that you have no problem with both AMD and Nvidia needing to pay MS for DirectX, but have a problem with AMD needing to pay Nvidia for PhysX.

If you don't want a performance hit, then play at the lowest settings; that way the hit will be smallest. Or you can buy, tweak and tune the biggest, baddest hardware so you can conquer games at their maximum settings. Computer gaming has been like that since the beginning. Yes, this is actually part of computer gaming.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I'd be all for a game where I could literally break through any wall, destroy any object, and make use of and interact with the pieces left behind. A true-to-life physics engine that allowed full and complete destruction would be amazing.

If GPU PhysX brings that to us, and I do believe it will be a GPU or a chip with similar characteristics that does, I am all for it and on board, even if it requires an expensive dedicated card.

It's just so far off from that currently, and I'm not comfortable with the performance it takes for what it does right now. I think we are better served by some of the other options out there.

Realistically, I think we are probably very far off from hardware that can deliver physics like I hoped for.

Playing games like Crysis, which is already over four years old, I think we are closer to photo-realistic graphics in games than we are to lifelike physics.


If you're not comfortable, well, you're not comfortable! I don't have any problem whatsoever with a person's subjective view or opinion. For me, I am pleased to see the ball rolling so the tech can mature and evolve.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
I saw this video today showing Batman: Arkham City with vs. without PhysX.

http://www.youtube.com/watch?v=trq6B4anzjM&feature=youtu.be&hd=1

Looking at this video, it seems that if you don't use PhysX there will be no physics effects at all. Yet everything I see done there with PhysX I've seen in other games that don't use it.

Also, this is obviously not everything in the game, but having recently played the BF3 beta, I don't see anything here I haven't seen done on the CPU.

Is it a wise choice to mandate PhysX in order to have physics effects in a game? Especially considering the rather large performance hit GPU PhysX has historically had versus CPU-implemented physics.

I'll assume there will be an option to run this on the CPU, though in the past that was ridiculously intensive (Mafia 2), bordering on unplayable. I do wonder, though, since this comparison says 'with GTX card' vs. 'without GTX card'. The video is obviously there to help EVGA sell cards, of course; perhaps there will be a CPU option.

There is an option in the nVidia control panel to make the CPU the PhysX processor and take the load off the video card.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I'd be all for a game where I could literally break through any wall, destroy any object, and make use of and interact with the pieces left behind. A true-to-life physics engine that allowed full and complete destruction would be amazing.

If GPU PhysX brings that to us, and I do believe it will be a GPU or a chip with similar characteristics that does, I am all for it and on board, even if it requires an expensive dedicated card.

It's just so far off from that currently, and I'm not comfortable with the performance it takes for what it does right now. I think we are better served by some of the other options out there.

Realistically, I think we are probably very far off from hardware that can deliver physics like I hoped for.

Playing games like Crysis, which is already over four years old, I think we are closer to photo-realistic graphics in games than we are to lifelike physics.

Here I can agree with you. We aren't anywhere close to where it could be. PhysX isn't complete; Nvidia knows this. It's far from finished, but it is a step in the right direction, and these are steps that we get to see and experience. It's not at some OMG point where it changes everything. Where I differ is that I can give them credit for their efforts. I understand it is a work in progress. I really don't even have a card to spare for PhysX; my 460 cannot both run the game and handle PhysX. I played Mafia 2 with PhysX on medium on the CPU, and I found it enjoyable. It's nothing but added eye candy, but I found joy in playing Mafia 2 on PC because of the added PhysX. It made it more interesting on PC than on a console.

Yes, there are developers who give us Crysis and BF3. But how many games have developers who spend that much time and effort on them? You've pretty much named all the games that do; a tiny few throughout the years. For this reason, I respect anyone who puts effort into pushing PC gaming further. Nvidia only added to these games, which would otherwise have been nothing but straight console ports. If you don't like the effects, what harm is there in not enabling the feature? It's really simple.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Why doesn't DX10/11 work under XP? It isn't because it can't be done; Microsoft doesn't want it done because they want to sell Vista and 7.
Actually that’s not true at all. XP lacks the kernel infrastructure and driver framework needed for DX10/DX11, and attempting to back-port said features into XP would result in many legacy applications breaking.

On top of that, AMD and Nvidia need to pay MS for DX, as do the studios that make games using it.
Can you please provide evidence of this claim? Thanks.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Actually that’s not true at all. XP lacks the kernel infrastructure and driver framework needed for DX10/DX11, and attempting to back-port said features into XP would result in many legacy applications breaking.


I see no reason why DX11/DX10 won't work in XP, with only three things keeping Microsoft from doing it: the will, the time and the $$$. There is nothing preventing it except Microsoft. Even if it's not easy, it is possible. It's just not an investment that benefits M$. Actually the reverse: DX10 was the only reason some people went to Vista.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I see no reason why DX11/DX10 won't work in XP, with only three things keeping Microsoft from doing it: the will, the time and the $$$.

If you spent the time, money and effort to modify WinXP to make it able to run DX10/11, you would end up with a kernel that's surprisingly similar to Vista/Win7.

MS did spend the time, effort and money, and they called the resulting product Windows Vista.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
If you spent the time, money and effort to modify WinXP to make it able to run DX10/11, you would end up with a kernel that's surprisingly similar to Vista/Win7.

MS did spend the time, effort and money, and they called the resulting product Windows Vista.

Yep

Vista/Win7 have a lot more to offer, but your point is a good one. I fully understand why it's the way it is; I have no beef with it. I am just saying that with care and time, DX11/10 could have been on XP. I feel we are much better off with Win7 now, though Vista was a tough ride for some. I really think we needed a better operating system once systems were using large amounts of RAM. And I am so glad XP 64 is gone!
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I see no reason why DX11/DX10 won't work in XP, with only three things keeping Microsoft from doing it: the will, the time and the $$$. There is nothing preventing it except Microsoft. Even if it's not easy, it is possible. It's just not an investment that benefits M$. Actually the reverse: DX10 was the only reason some people went to Vista.

You should read this piece:
http://beyond3d.com/content/articles/55/
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Do you think DirectX is free? It comes with the Windows operating system. Why doesn't DX10/11 work under XP? It isn't because it can't be done; Microsoft doesn't want it done because they want to sell Vista and 7. On top of that, AMD and Nvidia need to pay MS for DX, as do the studios that make games using it.

OpenGL is an open standard, but the game engines built on top of it aren't. Average users will think OpenGL is better because it is cross-platform. What is not so well known is that it is cross-platform because the video drivers made it so. Without drivers, neither OpenGL nor DirectX will work. Nothing is magical; those are just APIs, function specifications that need to be implemented. The MS version of GPU computing is called DirectCompute, but is it going anywhere? No. Why? Because there is a better, more robust solution: CUDA.

You may think that a physics engine that can utilize any video card will change the world as we know it, but it won't. It will run worse than PhysX, if it doesn't crash at all. It needs to work with two distinct sets of hardware. Havok FX died for this reason.

If there is already a wheel that works, then the only reason to reinvent it is if the new one works better. Funny that you have no problem with both AMD and Nvidia needing to pay MS for DirectX, but have a problem with AMD needing to pay Nvidia for PhysX.

If you don't want a performance hit, then play at the lowest settings; that way the hit will be smallest. Or you can buy, tweak and tune the biggest, baddest hardware so you can conquer games at their maximum settings. Computer gaming has been like that since the beginning. Yes, this is actually part of computer gaming.

If you want to do anything remotely related to gaming on your PC, you install Windows. So saying it's not free is a bad argument. You're going to install Windows no matter what.

Also, look on the back of Skyrim's case. Notice that logo... HAVOK. Havok does more than physics, so I can't specifically tie the physics side of their software to Skyrim, but there are a lot of games with pretty realistic rain, smoke and other physics going on without Nvidia's proprietary PhysX. They also look just as good and have zero performance hit, since they don't steal GPU resources.

Nobody pays for access to DirectX, to my knowledge. Maybe you could back that up with a link to enlighten us?

I fail to see where you get the idea that a more standardized GPU-based physics API would crash while PhysX wouldn't. It would be up to the GPU manufacturers to develop drivers that work properly and don't crash. It doesn't matter if it uses two different hardware sets; DirectX does too, but nobody blames DirectX when a game crashes due to drivers.

Tell people to turn settings down to their lowest? WTF!? I can run Battlefield on Ultra and magically there is no 20 fps performance hit from some smoke and environmental destruction. I'm of the mindset that PhysX is garbage because of the performance hit. It does nothing you can't do without the GPU side of it. Will I use it? Maybe, if it's there and I have the hardware. Do I support its use? No way. Despite what you may think or say, you are allowed to dislike something yet still use it grudgingly.

So your logic is that developers already have these things and Nvidia pays them to lock them out unless you have a specific card installed...

That's hogwash. Completely backwards. You're just making up garbage when you say crap like this. It's total bull crap, and you pass it off so seriously.

Are you really going to get that upset over something I say online? No wonder today's society is going downhill; everything is taken personally.

I do believe that Nvidia pushes PhysX in order to sell cards. "Oh look, if you don't have our stuff you are missing features." The average Joe would say "Oh damn, I need that." He wouldn't even realize those same effects are used in other games without any proprietary API or specific hardware.

Stop taking everything personally and relax a bit. What I say shouldn't ruin your day, lol. I just wonder how you and others can disregard any opinion other than your own. "Oh, you don't like it? Turn it off and cry somewhere else" is what I hear too often. That is backwards.

Last thing I will say: nothing proprietary outside of Apple has taken off significantly in recent years. Everything that is successful is that way because it uses an open standard. Do you think we would have 3 GHz+ multi-core CPUs if there were only one choice out there? You could say yes, but then could you buy one for $300? Not likely, and you'd probably still be on a single core because of the pricing.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Actually that’s not true at all. XP lacks the kernel infrastructure and driver framework needed for DX10/DX11, and attempting to back-port said features into XP would result in many legacy applications breaking.
Are you saying that Windows 7 breaks many legacy applications? It is just programming. Ocre and taltamir see the reasoning, which is really trivial: $ and time. Pay them enough and they will make Windows run on phones. Oh wait...


Can you please provide evidence of this claim? Thanks.
Look up Nvidia ION DirectX10.
 

Kr@n

Member
Feb 25, 2010
44
0
0
Because scripted inert debris that magically disappears into thin air over and over and over again tells my brain that this isn't real... not even close.
And there is no need for CPU physics today.
Would you run graphics on the CPU today?

If not, then you know why physics doesn't belong on the CPU anno 2011.

And even GPU physics uses "clever tricks"... it's not like any game simulates physics down to the atomic level; it's just that a CPU is a joke compared to a GPU for physics.

You seemed to imply that scripted, biased physics was significantly worse than "true" physics (citing GPU PhysX as an example). I merely responded to that assertion, saying that scripted physics isn't necessarily a bad thing (be it on CPU or GPU). No mention whatsoever of CPU graphics (you may argue that I mentioned RT, and that most RT engines are CPU-based, but the analogy with ray tracing was just that: an analogy).

I agree with you that GPUs are more suited to massive physics computations, due to their inherent parallelism. However, most current GPUs aren't up to the task when hammered with physics on top of graphics. Given those low/mid-end GPUs, CPU physics still makes sense IMO, because there seems to be more spare CPU power than GPU power in current low/mid-end configs with current games...

I also agree GPU physics uses clever tricks (it was even one of my previous points, if I'm not mistaken). It's just a matter of scale: GPU physics is generally more precise (less biased physics laws, etc.) and runs at a larger scale (more particles, lengthier animations, computed objects that last longer, etc.). That is more expensive to compute (more parallel), and thus more suited to a GPU, that's all...

TL;DR: GPU physics is good and all, but CPU physics currently serves a purpose and surely isn't horrid just because it is computed on a CPU (barring borked implementations like the ones you mention)... There is no such thing as "infamous CPU physics" vs. "holy-grail OMG GPU physics".
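To make the scripted-vs-simulated distinction concrete, here is a minimal Python sketch (invented numbers, no real engine): a canned debris arc played back frame by frame, versus the same kind of arc produced by integration:

```python
# Scripted "physics": a canned trajectory baked ahead of time.
# Playback is nearly free and looks identical every run.
SCRIPTED_ARC = [(t * 0.1, 2.0 - 4.9 * (t * 0.1) ** 2) for t in range(10)]

def play_scripted(frame):
    # Just index into the baked data, clamping at the last keyframe.
    return SCRIPTED_ARC[min(frame, len(SCRIPTED_ARC) - 1)]

# Simulated physics: a similar arc produced by integrating every frame,
# which lets the object react to wind, collisions, changing mass, etc.
def simulate(frames, dt=0.1, g=-9.8):
    x, y, vx, vy = 0.0, 2.0, 1.0, 0.0
    trajectory = []
    for _ in range(frames):
        trajectory.append((x, y))
        vy += g * dt          # gravity changes velocity...
        x += vx * dt          # ...and velocity moves the object
        y += vy * dt
    return trajectory
```

The scripted playback costs almost nothing and looks identical every run; the simulated version costs a little math per frame but can respond to the world, which is the trade-off being argued about in this thread.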
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
If you want to do anything remotely related to gaming on your PC, you install Windows. So saying it's not free is a bad argument. You're going to install Windows no matter what.
You must not have Ubuntu or a Mac. There was a time when games were multi-platform because they were written on OpenGL. Microsoft threw a whole lot of resources at DirectX to compete with it: DirectX 7, 8, 9, 9a, 9b and 9c. DirectX 9.0c won the war, as cap bits are impossible for programmers to manage. DirectX 10 kills all of the cap bits (and then creates new ones) in the name of manageability, but in practice they want to control what can be done. Nvidia never liked DirectX 10 due to its lack of support for GPGPU, but was forced to comply. ATI didn't have this problem and created the 4xxx series based on the DX10 specification. That is why Nvidia kept TWIMTBP while AMD ditched GITG. The result? You know it.

TWIMTBP obviously made games better, and not surprisingly, more so on Nvidia gear than on others. A sea of QQ left and right; sabotage, bribery, you name it. But what about the parts that wouldn't otherwise be there and that also work on ATI cards? To some, there aren't any. In Batman: AA, MSAA wouldn't have been there at all if it weren't for TWIMTBP. Yes, it is only in the GOTY version for AMD users, but if Nvidia had never put it there in the first place, it wouldn't be there. Why was it so difficult? It is all because of the limitations of DX9. See how the pieces fall together?

Also, look on the back of Skyrim's case. Notice that logo... HAVOK. Havok does more than physics, so I can't specifically tie the physics side of their software to Skyrim, but there are a lot of games with pretty realistic rain, smoke and other physics going on without Nvidia's proprietary PhysX. They also look just as good and have zero performance hit, since they don't steal GPU resources.
Really? Haven't you seen the number of complaints about how GPUs are undersaturated in Skyrim, and how CPU speed is the dominant factor in performance even though the CPU is hardly under load? Someone in the Skyrim forum posted a challenge asking for nothing more than a screenshot with an FPS indicator at a specific spot. The problem is that even a 2600K overclocked to over 4 GHz still can't max out the graphics with good FPS. Call me picky, but under 40 FPS while looking at a Skyrim town top-down with an overclocked 2600K, an SSD and 3x 580 SLI isn't what I'd call good performance, especially when neither the CPU nor the GPUs are under heavy load.

As for its physics, it isn't bad, but it is surely buggy. I killed myself a few times simply by walking into a pot. Weird bouncing sounds come from objects which are supposed to be stationary. CPU physics isn't perfect, just like GPU physics. The fact is, physics is getting better and better, on both the CPU and the GPU.

Nobody pays for access to DirectX, to my knowledge. Maybe you could back that up with a link to enlighten us?
Where do you think MS makes money from? Trees?
http://www.microsoft.com/about/legal/en/us/IntellectualProperty/IPLicensing/Policy.aspx

I fail to see where you get the idea that a more standardized GPU-based physics API would crash while PhysX wouldn't. It would be up to the GPU manufacturers to develop drivers that work properly and don't crash. It doesn't matter if it uses two different hardware sets; DirectX does too, but nobody blames DirectX when a game crashes due to drivers.
Rome wasn't built in a day. PhysX has a long history. Ageia acquired and modified NovodeX, which was a working physics engine to begin with. The modification made it utilize a PPU when one is present. Without a PPU, NovodeX is just another physics engine that runs on the CPU. It was renamed PhysX and later acquired by Nvidia. In other words, PhysX is nothing more than a physics engine that runs on the CPU by default when no Nvidia video card is present. Havok had the potential to allow GPU acceleration too, but MS is not going to waste resources on that, as they do not benefit from it.

The only reason PhysX is good is that Nvidia controls both the API and how those APIs are implemented. Havok FX couldn't do that, as it had to rely on the GPU vendors. MS can dictate the API, but only if they have a clue first. Only Nvidia has a clue, because they have PhysX, an existing engine that offloads work to the GPU. AMD is trying to catch up by hiring the person who built PhysX, Manju Hegde. As for MS? They would have to learn to build video cards first. Again, a physics engine can be built on top of DirectCompute, but like anything new, there will be lots of bugs and compatibility issues. Compared with something that has been working for eight years, it really isn't easy.

The performance hit you talk about doesn't come out of nowhere. If you think the CPU can handle it, then there is no problem having the CPU handle all the effects from PhysX without an Nvidia video card. You believe that PhysX is a tool Nvidia uses to deliberately hammer those who are not their customers. Don't worry, you are not the first and won't be the last. There was a claim that PhysX cripples CPU performance by using an old instruction set (x87) where it should use SSE. Guess what? The PhysX 3.x SDK enables SSE, but CPU performance is still inferior. CPU multi-thread support? Yes, but still inferior.
http://www.geeks3d.com/20100711/cpu-physx-x87-sse-and-physx-sdk-3-0/
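For context on the x87-vs-SSE claim: x87 issues one floating-point operation at a time through a register stack, while SSE packs four 32-bit floats into one register and operates on all four lanes per instruction. This rough Python analogy only models the lane count (real SIMD exists only in compiled code; the function names are invented):

```python
def scalar_add(a, b):
    # x87-style: one floating-point add per "instruction",
    # so n floats cost n "instructions".
    return [x + y for x, y in zip(a, b)]

def packed_add(a, b, lanes=4):
    # SSE-style: one "instruction" adds a whole 4-float lane at once,
    # so n floats cost roughly n/4 "instructions".
    out = []
    for i in range(0, len(a), lanes):
        out.extend(x + y for x, y in zip(a[i:i + lanes], b[i:i + lanes]))
    return out

a = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
b = [0.5] * 8
assert scalar_add(a, b) == packed_add(a, b)
```

Both paths produce identical results; the difference is the number of instructions issued, which is why the old x87 code path was blamed for poor CPU PhysX performance.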

Tell people to turn settings down to their lowest? WTF!? I can run Battlefield on Ultra and magically there is no 20 fps performance hit from some smoke and environmental destruction. I'm of the mindset that PhysX is garbage because of the performance hit. It does nothing you can't do without the GPU side of it. Will I use it? Maybe, if it's there and I have the hardware. Do I support its use? No way. Despite what you may think or say, you are allowed to dislike something yet still use it grudgingly.
Many believe that console gaming is holding PC gaming back, but it isn't. The true problem is the diversity of hardware combinations, and as a user you actually showed why. You don't want a performance hit, and you are not the only one who doesn't. However, there are people like me who want to see our systems put to work, justifying their cost. How will you balance this?

As for B:AC, there is a DX11 performance problem, but no one is saying it is a PhysX performance problem, so what exactly is your problem? This is not personal; a sea of users are complaining about DX11 while seeing no difference playing in DX9, except that the performance issues are gone. Guess what: you can turn the level of eye candy up or down depending on your setup. I don't know what you want, but if you don't want headaches, you should side with console gaming so everyone plays at the same level. I love new technologies. I want to see games that drive my computer to its knees so I have reasons to upgrade. PhysX is good because it is really an add-on to the game.

I too want to see games made in 64-bit that require 16 GB of RAM, max out 8-core CPUs and only run in DX11. Unfortunately, there are none, because such a product has a very narrow market. Dynamic tessellation is the MS way of solving this scaling problem, but it is only good in name, as you really can't scale things up and down. Playing a DX11 game on DX10 hardware would make everything look like deflated balloons, so devs basically make the game without DX11 tessellation and retrofit it as overinflated balloons. You may think this is good, but not me.

Were you asking why I said it will run worse if it doesn't crash? Look at the current DX11 implementations: Crysis 2, DA2, Dirt 3 and Batman: AC. You would need to try really hard to say there are no problems with DirectX.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
You must not have Ubuntu and Mac. There were a time where games are multi-platform as they are written upon OpenGL. Microsoft throw a whole lot of resources in DirectX to compete with it. DirectX7,8,9,9a,9b and 9c. DirectX 9.0c has won the war as CAP bits are impossible to manage by programmers. DirectX10 kills all of the CAP bits (and then create new ones) in the name of management, but in practice they want to take control on what can be done. Nvidia never liked Directx 10 due to its lack of support on GPGPU, but are forced to comply. ATI didn't have this problem and created 4xxx series based upon Dx10 specification. That is why Nvidia kept TWIMTBP while AMD ditched GITG. The result? You knew.

TWIMTBP obviously made games better, and not surprisingly, more so with Nvidia gear than others. Sea of QQs left and right, sabotage, bribe, you name it. However, what about the part that probably won't be there and also worked on ATI cards? To some, there ain't any. In Batman:AA, MSAA won't be there for sure it it isn't because of TWIMTBP. Yes, it is only in the GOTY version for AMD user, but if Nvidia never put it there the first place, then it wouldn't be there. Why was it so difficult? Well, it is all because of the limitation of Dx9. See how the pieces falls together?


Really? Haven't you see the amount of complains about how GPUs are undersaturated in skyrim and how CPU speed is the domainate factors on performance yet it is hardly under load? Someone posted a challenge in the skyrim forum to others asking nothing more than a screenshot with FPS indicator at a specific spot. The problem is even 2600k overclocked to over 4Ghz still can't max out the graphics along with good FPS. Picky you may say I am, but under 40FPS while looking at the town of skyrim top down with a overclocked 2600k, SSD, 3x580SLI isn't what I will call good performance, specially when neither CPU or GPUs is under heavy load.

As to its physics, it isn't bad, but it is surely buggy. I killed myself by simply working into a pot a few times. Weird bounding sound from objects which are suppose to be stationary. CPU physics isn't perfect, just like GPU physics. The fact is, physics are getting better and better, with both CPU and GPU.


Where do you think MS makes money from? Trees?
http://www.microsoft.com/about/legal/en/us/IntellectualProperty/IPLicensing/Policy.aspx


Rome wasn't built in a day. PhysX has a long history. Ageia acquired and modified NovodeX, which by itself is a working physics engine to begin with. The modification is so that it utilize a PPU when it is present. Without PPU, NovodeX is just another physics engine that runs on CPU. It was renamed to PhysX and where later acquired by Nvidia. In another words, PhysX is nothing more than a physics engine that runs on CPU by default if Nvidia video card is not present. Havok had the potental of allowing GPU acceleration too, but MS are not going to waste resources on that as they do not benefit from it.

The only reason why PhysX is good as because Nvidia controls both the API and how these APIs are implemented. Havok FX can't do that as it must rely on GPU vendors. MS can dictate the API, but that is if they have a clue first. Only Nvidia has the clue as they have PhysX, an existing engine that offloads things to GPU. AMD is trying to catch up by hiring the person who built physx, Manju Hegde. As to MS? They must learn to build video cards first. Again, physics engine can be build on top of DirectCompute, but like any new things, there will be lots of bugs and compatability issues. Compare it with something that is working for 8 years, it really isn't easy.

The performance hit you talk about doesn't come out of no where. If you think CPU can handle it, then there are no problems having CPU handle all affects from PhysX without a nvidia video card. You believed that PhysX is a tool used by Nvidia to deliberately hammer those that are not their customers. Don't worry, you are not the first and won't be the last. There were a claim stating that PhysX cripple CPU performance by using old instruction set (FP87) where it should use SSE. Guess what? PhysX 3.x SDK enables SSE, but CPU performace is still inferior. CPU Multi-thread support? Yes, but still inferior.
http://www.geeks3d.com/20100711/cpu-physx-x87-sse-and-physx-sdk-3-0/
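For readers unfamiliar with the x87-vs-SSE argument: scalar x87 code processes one float per instruction, while SSE processes four-wide batches. The toy sketch below (made-up names and numbers; the actual PhysX measurements are in the linked geeks3d article, not here) shows the two shapes of the same velocity update, and that they compute identical results:

```python
# Toy illustration of scalar (x87-style) vs batched (SSE-style) math.
# Plain Python stands in for both; a real SIMD build would process
# 4 floats per instruction in the batched form.

def update_scalar(vels, accel, dt):
    # One value at a time -- the pattern scalar x87 code compiles to.
    return [v + accel * dt for v in vels]

def update_batched(vels, accel, dt):
    # Process in lanes of 4, the way data maps onto 4-wide SSE registers.
    out = list(vels)
    for i in range(0, len(out), 4):
        for lane in range(i, min(i + 4, len(out))):
            out[lane] += accel * dt
    return out

vels = [0.0, 1.0, 2.0, 3.0, 4.0]
assert update_scalar(vels, -9.8, 0.1) == update_batched(vels, -9.8, 0.1)
```

The point of the article, and of the paragraph above, is that even once the batched form is used, a CPU's handful of 4-wide lanes still loses to the thousands of parallel threads a GPU throws at the same update.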

Many believe that it is console gaming that is holding PC gaming back, but it isn't. The true problem is the diversity of hardware combinations, and you actually showed the reason as a user. You don't want a performance hit, and you are not the only one who doesn't. However, there are people like me who want to see our systems put to work, justifying their cost. How will you balance this?

As to B:AC, there is a DX11 performance problem, but no one is saying that this is a PhysX performance problem, so what exactly is your problem? This is not personal, as a sea of users are complaining about DX11 while seeing no difference playing in DX9, except that the performance issues are gone. Guess what, you can turn the level of eye candy up or down depending on your setup. I don't know what you want, but if you don't want headaches then you should side with console gaming so everyone plays at the same level. I love new technologies. I want to see games that will drive my computer to its knees so I have reasons to upgrade. PhysX is good as it is really an add-on to the game.

I too want to see games made in 64-bit that require 16GB of RAM, max out 8-core CPUs and only run in DX11. Unfortunately, there are none. There are none because such a product has a very narrow market. The MS way of solving this scaling problem is dynamic tessellation, but dynamic tessellation is only good in name, as you really can't scale things up and down. Playing DX11 content on DX10 hardware would make everything look like deflated balloons, so devs basically make the game without the need for DX11 tessellation, and retrofit it back as overinflated balloons. You may think this is good, but not me.

Were you asking why I said it will run worse if it doesn't crash? Look at the current DX11 implementations: Crysis 2, DA2, Dirt 3, and Batman: AC. You need to try really hard to say that there are no problems with DirectX.


After reading this post I can see where you're coming from. Yeah, DX has issues, but most software has issues on PCs in some way these days it seems. I would just like for games to actually run well on hardware that is more expensive than a console. I should not get console-like performance when I'm running 3-4x the hardware specs, if not more. I don't think there's a good reason for a game using DX9, which is two generations old, to run at 30fps just because of a few smoke and particle effects. For an example of what I mean: when I run the benchmark for Batman: AC with no PhysX I am seeing FPS that averages well over 50, right up to my 60Hz refresh rate if I have vsync on. When I turn PhysX on I am getting 34fps averages and minimums that run down to 15fps. That's terrible, and it's not just people running hybrid systems either. There's a lot of NV users getting similar results. Why do you need SLI to play a game with some smoke and stuff? Am I asking for too much? Remember I'm talking about the DX9 version here, as I know all about the DX11 problems Rocksteady is having.

What I have a hard time coming to terms with is the smoke and that stuff not looking all that impressive to begin with, but once you turn it on... it cripples performance. It's a little crazy. I even have my GTX 295 installed to run PhysX for Batman: AC and the benchmark results are the same, even when I delete the PhysxDevice.dll from the game directory.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Are you saying that Windows 7 breaks many legacy applications?
No, because DX9 on Vista/7 runs in a similar environment as it did on XP, for precisely this reason. DX9 is separate from DX10/DX11 on said platforms.

It is just programming.
Heh. You going to call out AMD/nVidia for not implementing the kernel changes at the driver level? How about Khronos for not implementing an OpenGL wrapper? They could all do it without Microsoft making any kernel changes to XP. It’s all just programming, right?

Look up Nvidia ION DirectX10.
I’m not searching for your evidence; the burden of proof is with you. Please provide evidence of licensing costs associated with DirectX on Windows. A link to a reputable page mentioning the costs will suffice.

Where do you think MS makes money from? Trees?
They make money from selling software like Windows and Office.

The DirectX SDK is available as a free download: http://www.microsoft.com/download/en/details.aspx?id=6812

The DirectX runtime is also free to download and is free to distribute: http://www.microsoft.com/download/en/details.aspx?id=35

So please, show us these licensing costs for DirectX on Windows, or retract your claims. Thanks.

Can you please highlight on that page where it says DirectX on Windows has licensing fees? Thanks.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
btw, the people that claim MS is making companies pay for directX? I think you are a bit confused. DirectX is free.
MS makes the customer pay for windows.
MS makes the video card makers pay for WHQL certification and digital signatures (which are worthless, but Windows will reject any driver that is installed without one).
MS takes a cut out of every Xbox 360 game sale.

They've got their fingers in many pots. DirectX licensing is actually NOT one of them.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
btw, the people that claim MS is making companies pay for directX? I think you are a bit confused. DirectX is free.
Yep. On top of that, his statement that AMD and Nvidia need to pay MS for DX, as do the studios that make games using it, is comically wrong. I'm just waiting to see if he'll ever retract it. AMD/nVidia are not paying Microsoft for DirectX, and neither are developers making commercial games with it.

Furthermore, any IHV is free to implement a hardware-accelerated back-end for DirectX on their hardware without any licensing or limitation. This is unlike PhysX, where nVidia is in complete control of the hardware acceleration and only allows it on their hardware, and only if no other IHV's GPU is active in the system.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
After reading this post I can see where you're coming from. Yeah DX has issues, but most software has issues on PCs in some way these days it seems. I would just like for games to actually run well on hardware that is more expensive than a console.
It does run well, it really does, but that isn't what you really want. What you really want is a game custom-built for your particular hardware configuration, which is not cutting edge. You see, that is what consoles are for to begin with: a custom-built system that never has compatibility issues and is often maxed out by games.

As to PC gaming, what you want doesn't really exist. It never has. Instead of waiting for others to max out our PCs, we find ways to max them out ourselves. It has always been like that. We don't blindly push all bars to the right and turn on all the eye candy. Well, we do, but then we turn off the options that hurt FPS the most with the least visual impact until we get the FPS we like.

I keep saying PhysX is not the reason for your performance hit. You are the reason for the performance hit. If you are willing to lower settings, then you can play at good FPS too. What more can I say?

For an example to what I mean when I run the benchmark for Batman: AC with no Physx I am seeing FPS that averages well over 50, right up to my 60hz refresh rate if I have vsync on. When I turn physx on I am getting 34fps averages and minimums that run down to 15fps.
You see, if 50FPS is what you are aiming for, then you already know the settings you should play at, but somehow you still want to turn on PhysX and then complain about the performance hit.

That's terrible and it's not just people running Hybrid systems either. There's a lot of NV users getting similar results. Why do you need SLI to play a game with some smoke and stuff? Am I asking for too much? Remember I'm talking about the DX9 version here as I know all about the DX11 problems Rocksteady is having.
No, you are not asking for too much, you simply don't know what you are asking for. You want games to utilize a monster setup, you want games to max out your not-so-monster setup, and you don't want console graphics. If I were you, I'd simply buy a pair of GTX 580s and play.

Seriously, consoles are cheap and lots of great games don't have a PC version. It is plug and play; why do you hate it so much?

What I have a hard time coming to terms with is the smoke and that stuff not looking all that impressive to begin with but once you turn it on...cripples performance. It's a little crazy. I even have my GTX 295 installed to run Physx for Batman: AC and the benchmark results are the same, even when I delete the PhysxDevice.dll from the game directory.
If it is not impressive and hurts performance, then turn it off! God! It isn't easy to simulate interactive smoke, which is really not the same as scripted smoke/fog. Yes, it requires a lot of resources, but that is the point! If it didn't, then it would already be in the console version. Gee.
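The scripted-vs-interactive distinction is where the cost difference lives. A scripted puff just replays pre-baked positions; interactive smoke must react to the player every frame, so its cost scales with particle count. A toy contrast (entirely illustrative made-up code, not how PhysX implements smoke):

```python
# Toy contrast between "scripted" and "interactive" smoke.
# Illustrative only -- not PhysX code.
import math

def scripted_smoke(frame, baked_path):
    # Scripted: replay pre-baked positions. O(1) per particle per frame.
    return baked_path[frame % len(baked_path)]

def interactive_smoke(particles, player_pos, dt):
    # Interactive: every particle reacts to the player every frame, so
    # cost scales with particle count (and real solvers also compute
    # particle-to-particle interactions, which is far more expensive).
    out = []
    for (px, py), (vx, vy) in particles:
        dx, dy = px - player_pos[0], py - player_pos[1]
        dist = math.hypot(dx, dy) or 1e-6
        push = 1.0 / (dist * dist)              # inverse-square shove
        vx += push * dx / dist * dt
        vy += push * dy / dist * dt
        out.append(((px + vx * dt, py + vy * dt), (vx, vy)))
    return out

baked = [(0.0, 0.0), (0.5, 1.0)]
puff = scripted_smoke(3, baked)                  # frame 3 wraps to index 1
moved = interactive_smoke([((1.0, 0.0), (0.0, 0.0))], (0.0, 0.0), 0.1)
```

The scripted path costs the same no matter what the player does; the interactive one does per-particle math every frame, which is exactly the work the console versions leave out.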

A single 6970 can handle 2560x1600 with no additional PhysX effects and AA under very high settings at around 70FPS. A single 580 can obtain a 60FPS minimum at those settings. What is so wrong with PhysX? Stick with 1080p, low PhysX, no AA and high settings, and go play the game. If DX11 is so important to you, you may as well go buy a new video card, as it will also impact FPS.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
I’m not searching for your evidence; the burden of proof is with you. Please provide evidence of licensing costs associated with DirectX on Windows. A link to a reputable page mentioning the costs will suffice.
You are good with words, but not really good at reading or remembering what you asked for. Let me remind you.

Can you please provide evidence of this claim? Thanks.

If you aren't searching for my evidence, then why did you ask me to provide it to begin with?

About IP licensing and IP licensing policy.
http://www.microsoft.com/about/legal...ng/Policy.aspx

Official enough?

Show me the part where it is explicitly stated that no additional rights are required when it comes to game development. If you can't find it, then it isn't free. The ball is in your court, BFG.
 