Batman Arkham City: no physics at all if you don't use PhysX?

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
That's pretty much exactly what I'm saying.

I mean, think about it: Nvidia can't add destruction to environments with GPU PhysX without making it available to all players. How would it go over in (insert FPS shooter here) if some guy with PhysX starts blowing holes in stuff and goes on a killing spree, while the ATI users have to run around finding better angles for shots without being able to destroy the environment? It wouldn't really be fair.

That's why I said that until we have a GPU physics API compatible across ALL systems and consoles industry-wide, we haven't seen what GPU physics can really do.

Sure you can add destruction, because that is a content modification and has nothing to do with GPU PhysX.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Read Virge's post above: "PhysX effects are usually done to the point of ridiculousness"

I've been trying to tell you this on Rage3D as well. If GPU physics ever takes off I'll be surprised. It looks as though it's dying: implemented in one or two games a year, including crappy games like Dark Void, etc.


CPUs are getting better and better. GPU PhysX is an extremely niche-market feature.

To not try to take advantage of the strengths of GPU processing is insane, which is why nVidia is investing and why AMD likes the potential of GPU processing for physics -- as did ATI, and as did Havok with HavokFX.

There are real benefits from GPU processing, which is the key. And GPUs will get more and more powerful, too. It's not just about this second; it's about building a foundation for the future with GPU processing.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
To not take advantage of the strengths of GPU processing is insane, which is why nVidia is investing and why AMD likes the potential of GPU processing for physics -- as did ATI, and as did Havok with HavokFX.

There are real benefits from GPU processing, which is the key. And GPUs will get more and more powerful, too. It's not just about this second; it's about building a foundation for the future with GPU processing.

I think it makes more sense to leverage the embedded GPUs now shipping with more and more CPUs from Intel and AMD. In a few years, so many people will have that GPU in their rig (whether they go with a dedicated card or not) that the install base will likely be much higher than that of dual-card users.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
At least nVidia is doing instead of talking. Is talking instead of doing now the good thing or the smart thing?

Well no, both are a waste of energy imo. My point is that just because nvidia is doing physx doesn't make it a plus versus AMD not doing anything about gpu physics, currently.

Because what they've been and are doing has proven to be worthless as yet. The very, very low usage of gpu physx by game developers, I think, speaks to the currently lackluster possibilities that physx presents. This is why nvidia basically has to chase down game developers, hammer on their door with a sledgehammer, purchase a truckload of copies to bundle with their games and send in their own coders; just to get one or two game houses out of thousands to even use it. I think they're so invested into it, having gone so far as to purchase Ageia, that they are not letting it go.

It's gone nowhere in over four years now and there are CPU implementations that do it better and don't incur the serious framerate hit that gpu physx does.

Something is not a positive just because it's there ready to be used, it can actually be a negative, and physx is a negative imo. Wasted resources.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Well no, both are a waste of energy imo. My point is that just because nvidia is doing physx doesn't make it a plus versus AMD not doing anything about gpu physics, currently.

Because what they've been and are doing has proven to be worthless as yet. The very, very low usage of gpu physx by game developers, I think, speaks to the currently lackluster possibilities that physx presents. This is why nvidia basically has to chase down game developers, hammer on their door with a sledgehammer, purchase a truckload of copies to bundle with their games and send in their own coders; just to get one or two game houses out of thousands to even use it. I think they're so invested into it, having gone so far as to purchase Ageia, that they are not letting it go.

It's gone nowhere in over four years now and there are CPU implementations that do it better and don't incur the serious framerate hit that gpu physx does.

Something is not a positive just because it's there ready to be used, it can actually be a negative, and physx is a negative imo. Wasted resources.

I disagree. What I see is actual leadership and being pro-active in an area that the company feels is important. Physics is important, may be the next frontier, and nVidia has positioned themselves well by creating CUDA, innovating with PhysX, and offering more for their customers and more choice for developers across many platforms, including x86, GPU, and ARM. They don't have to count on others or keep their customers waiting.

Sure, I'd like to see more content; I personally expected 6-12 AAA titles a year and am disappointed. Sure, I'd like to see the GPU component of PhysX offer OpenCL flexibility, to help with adoption, maybe. But I want to see innovation with physics, and I don't want to settle for only what the CPU can muster considering the raw computational prowess of GPU processing -- it's a waste not to try to innovate, and a disservice to your customers.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
I disagree. What I see is actual leadership and being pro-active in an area that the company feels is important.

Out of curiosity, where have you seen PhysX display leadership in its application over the alternatives out there?

I generally define leadership as setting an example of what is best and worth striving for, or providing a method to follow to achieve excellence. Perhaps I have missed where PhysX has done anything like that in terms of in-game physics?

I've seen companies like DICE and Crytek show leadership and excellence in terms of in-game physics, setting the bar higher than anyone else has achieved yet. I would say their work on in-game physics is where the true leadership is and where nvidia should be aiming for and trying to catch up to.

Their work also really puts GPU PhysX squarely behind the eight ball, because they're doing it with no performance hit on the GPU and doing it better. It makes PhysX look all the more unappealing. Nvidia does a lot of things very well, but too often this leads to the thinking that everything they do is great.
 
Last edited:

Rifter

Lifer
Oct 9, 1999
11,522
751
126
Sure you can add destruction. Because this is a content modification and has nothing to do with GPU-PhysX.

Not if it uses GPU physics for the destruction effects. Which is what I meant, and what I thought we were all talking about.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Not if it uses GPU physics for the destruction effects. Which is what I meant, and what I thought we were all talking about.

And this can run on x86, but there is a performance problem: all effects must be calculated even on a low-end GPU. That's the reason why we see scripted animations instead of real physics.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Out of curiosity, where have you seen PhysX display leadership in its application over the alternatives out there?

PhysX shows leadership by creating the foundations, getting content in there for their customers, improving the gaming experience beyond what the developer may have intended, improving multi-core support, having a GPU component, and putting resources where their mouth is.

I generally define leadership as setting an example of what is best and worth striving for, or providing a method to follow to achieve excellence. Perhaps I have missed where PhysX has done anything like that in terms of in-game physics?

You made your point clear - it's a waste of resources.

I've seen companies like DICE and Crytek show leadership and excellence in terms of in-game physics, setting the bar higher than anyone else has achieved yet. I would say their work on in-game physics is where the true leadership is and where nvidia should be aiming for and trying to catch up to.

How do DICE's and Crytek's in-house physics help the developers of Batman? How does DICE's or Crytek's work improve the gaming experience of Batman? PhysX is middleware.

Their work also really puts gpu physx squarely behind the eight ball because they're doing it with no performance hit on the GPU and doing it better. It makes physx look all the more unappealing. Nvidia does a lot of things very well, but too often this leads to the thinking that this means everything they do is great.

More work, adoption and innovation would be very welcomed.
 

jordanecmusic

Senior member
Jun 24, 2011
265
0
0
We need a more optimized Euphoria engine to implement alongside Havok. GTA4 (which could have been better optimized on PC) is proof that Havok has lots of potential against PhysX. Havok also runs on all platforms with no problems: PS3, 360, Mac, and PC.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
We need a more optimized Euphoria engine to implement alongside Havok. GTA4 (which could have been better optimized on PC) is proof that Havok has lots of potential against PhysX. Havok also runs on all platforms with no problems: PS3, 360, Mac, and PC.

PhysX works on the consoles as well. Havok will be hampered by the fact that a CPU company owns it. And until Intel makes GPUs, Havok won't run on a GPU, which lowers the potential performance of the API.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
nVidia is doing more for the CPU part of PhysX than AMD and Intel are doing for Havok over OpenCL, or for Bullet over OpenCL.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
People are missing the point of PhysX and the way it is implemented. PhysX can be used to create a destructible world just like any other physics engine, and it runs as well as any other physics engine on the CPU. The special feature of PhysX is that it can be offloaded to the GPU, which no other physics engine can do.

Now the question is: should a game be balanced assuming there is a GPU to offload to, knowing that only systems with an Nvidia video card have that privilege? Game devs won't be stupid enough to exclude all AMD video card users, so games won't require an Nvidia video card to run. In fact, I don't think any consoles use an Nvidia video card, and Batman AA runs perfectly fine on consoles.

The one thing people hate about console ports is that they never max out PC hardware, since consoles are assembled from older hardware. To fix this we have skin packs, but that is all. Are skin packs the only tool that can be used to soak up all that extra horsepower in a PC? They can only go so far, as most PC gamers are still at 1080p.

Rage tried to utilize that horsepower with huge texture files, then compressed them as JPEGs, removing detail from the original art to allow the game to run. It can't be allowed to look like garbage on consoles, so the compression is tuned to look good there; but what happens on PC? There is visually no difference between 8K and 16K on a normal display, and 8K can't fully utilize the hardware. Because of this, critics are calling it another poor console port. Wait, id Tech 5 is supposed to utilize multi-core CPUs, but it uses those extra cores to convert JPEGs into a format the GPU understands. Wait, what happens on systems with older-gen CPUs? That's fine, if they have an Nvidia video card: transcoding can be done on the GPU. Wait, what about AMD users? Tough luck!

In comparison, the way PhysX is implemented is a much better idea: the more compute power, the more debris flies around. High-end CPU? No problem. Low-end CPU plus a high-end Nvidia video card? No problem. Low-end CPU plus an AMD card? Well, they see less debris flying around, which is not a big deal.
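
The scaling idea in that paragraph can be sketched in a few lines of Python. Everything here (the function, the base counts, the cap) is purely illustrative, not anything from the actual PhysX SDK:

```python
def debris_count(gflops, base=50, per_gflop=2, cap=5000):
    """Scale the number of simulated debris particles with the
    available compute throughput (GFLOPS), clamped to a maximum."""
    return min(cap, base + int(gflops * per_gflop))

print(debris_count(220))  # a fast CPU's worth of debris
print(debris_count(705))  # a dedicated PhysX card's worth
```

The point is only that the effect density degrades gracefully with hardware instead of gating the game itself.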

This is only one way of implementing PhysX. PhysX could be implemented to need 150 GTX 580s just to produce one frame a minute, or to run at 120 FPS on a GTX 580 SLI with a 2600K. But AMD users would not buy a game that is unplayable without an Nvidia card, which would seriously hurt sales, not to mention consoles. Note that it isn't about the detail level, since users could set everything as low as possible and still get under 10 FPS with their 2600K @ 5 GHz and four overclocked 6970s. This is the real reason why game mechanics won't change with or without an Nvidia card, not because PhysX cannot handle game mechanics.

id Tech 5 also tried to port GPU transcoding to OpenGL, but it ran poorly. Why? Because neither Nvidia nor AMD cares. Nvidia's decision is logical, as they have CUDA; AMD simply doesn't care at all. Why do vendors need to be involved in programming? Because they are the ones responsible for how their hardware interfaces with those OpenGL APIs. Code calling OpenGL can be perfectly optimized and yet still run terribly because the underlying code behind those APIs is poorly implemented. This is why driver updates often increase performance in games: they don't change the game code that calls those APIs, but optimize the underlying code behind them, resulting in better performance.

People who blindly complain about proprietary technology can keep asking for open standards without knowing what needs to be done behind the scenes. Doesn't Rage show the problems with OpenGL? The smart AA feature is completely missing because the underlying code behind those OpenGL calls is missing from both vendors. People blame the game without knowing that it is really driver bugs. What is the point of porting everything to an open standard when the vendors fail to support it?

PhysX works, and it can be used to soak up some of that extra horsepower without having to revamp the whole game; so does MegaTexture, but users do not see it this way, nor do we even care. The definition of "works" depends on our setup, not on how it is coded. If it runs without crashing, at 60/120 FPS, looks good, and the GPU/CPU monitor shows 90+% usage, then it is "working", which is a valid definition since we are the users. Once it works, we may take an interest in how it is done. Otherwise, it is simply a piece of junk code.
 

AdamK47

Lifer
Oct 9, 1999
15,507
3,210
136
It's better that PhysX is there than not at all. That's how I look at it. If you don't like it, turn it off and pretend it never existed in the game. The game would be the same as if it were never implemented.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's gone nowhere in over four years now and there are CPU implementations that do it better and don't incur the serious framerate hit that gpu physx does.

That's like comparing apples to oranges, in a way. How can you say CPU implementations "do it better" when there are currently NO CPU implementations that do purely real-time physics the way hardware PhysX does?

And there's a good reason for this. There is a very large to humongous performance delta between CPUs and GPUs in parallel computing performance.

A Core i7 990X, the most expensive and fastest consumer desktop processor, can only do 220 GFLOPS (single precision) at peak throughput, which is slightly less than the PS3 Cell processor's max of 230.

My dedicated PhysX card, a GTS 250 that cost me 100 USD, has a GFLOPS rating of 705. High-end cards like the GTX 580 have well over a teraflop of raw computational power.

You could probably do a fairly realistic, 100% real-time approximation of a large-scale explosion on a single GTX 580, I'd wager; something no current CPU could attempt because they're just not powerful enough.
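
The ratios behind that claim are easy to check in Python. The GTX 580's ~1581 GFLOPS figure is my assumption from published specs; the other numbers are the ones quoted above:

```python
# Peak single-precision throughput figures (GFLOPS)
cpu_990x = 220   # Core i7 990X, as quoted above
ps3_cell = 230   # PS3 Cell, as quoted above
gts_250  = 705   # GTS 250 dedicated PhysX card, as quoted above
gtx_580  = 1581  # GTX 580: assumed (~512 cores * 1544 MHz * 2 FLOPs/cycle)

print(f"GTS 250 vs 990X: {gts_250 / cpu_990x:.1f}x")  # ~3.2x
print(f"GTX 580 vs 990X: {gtx_580 / cpu_990x:.1f}x")  # ~7.2x
```

So even a $100 cast-off GPU has several times the parallel throughput of the fastest desktop CPU of the day, which is the whole argument for offloading physics.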

Something is not a positive just because it's there ready to be used, it can actually be a negative, and physx is a negative imo. Wasted resources.

One of the premises behind hardware accelerated PhysX (and hardware acceleration in general) was that as modern GPUs become more and more powerful, they will have more and more unused cycles that you can devote to other tasks.

From the PhysX trailer, it appears that the owner of a GTX 560 Ti (a mid-range GPU) will be able to play Batman AC with high visual fidelity and enable the PhysX effects to boot, with playable frames.....all on the same card.

That's the power of modern GPUs!
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
From the PhysX trailer, it appears that the owner of a GTX 560 Ti (a mid-range GPU) will be able to play Batman AC with high visual fidelity and enable the PhysX effects to boot.....all on the same card.

That's the power of modern GPUs!

You sure? The performance hit for hardware PhysX in even Batman: AA is ridiculously huge. A drop of 100 fps or more is not uncommon at the highest setting; thank god the game runs at 250 fps normally, huh!? I don't mind a 100 fps drop if it's from 200 to 100. But who knows if Batman: AC will be like that?!
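
It's worth putting that in frame-time terms. A quick Python sketch (the fps numbers are hypothetical, in the spirit of the ones above) shows why a "100 fps drop" is not one fixed cost:

```python
def frame_ms(fps):
    """Per-frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# The same "100 fps drop" costs very different amounts of frame time:
cheap  = frame_ms(150) - frame_ms(250)   # 250 -> 150 fps: ~2.7 ms extra per frame
costly = frame_ms(100) - frame_ms(200)   # 200 -> 100 fps: 5.0 ms extra per frame
print(f"{cheap:.1f} ms vs {costly:.1f} ms added per frame")
```

The lower your starting frame rate, the more real time each "fps" of drop represents, which is why the same effects load feels much worse on a slower card.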

I'm not quite sure a 560 Ti will be up to the task; we'll see when the game's released.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You sure? The performance hit for hardware PhysX in even Batman: AA is ridiculously huge. A drop of 100 fps or more is not uncommon at the highest setting; thank god the game runs at 250 fps normally, huh!? I don't mind a 100 fps drop if it's from 200 to 100. But who knows if Batman: AC will be like that?!

I'm not quite sure a 560 Ti will be up to the task; we'll see when the game's released.

Well, that's how GeForce.com is describing it.

If you click on the link, they show you side-by-side pics with PhysX disabled and enabled. The PhysX-disabled pic uses an 8800 GT, whilst the PhysX-enabled one uses a 560 Ti.

Of course, it doesn't tell you the resolution or other IQ settings, but I'm assuming it's on the High setting....which isn't the highest if Batman AA is used as an example.

Batman AA had a Very High setting as well. Also, I'm sure the version of PhysX used in this game (2.8) performs better than the one used in Batman AA, due to better optimization.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
That's like comparing apples to oranges, in a way. How can you say CPU implementations "do it better" when there are currently NO CPU implementations that do purely real-time physics the way hardware PhysX does?

Maybe there are no CPU implementations that perform physics the same as GPU physics, but that is not what matters to me as a gamer. What I care about is what I see on the screen, not what my hardware is doing in the background to put it there.

Game engines use all sorts of tricks and methods to cut the workload needed to display the imagery you experience. When there are games out there using the CPU to perform physics, and doing it in more impressive, immersive and game-changing ways than GPU PhysX does, I don't care about all the fancy calculations PhysX is doing anymore. It's not as good as what I'm seeing done on the CPU, but it's costing me performance the CPU physics are not. It's a double negative for gamers.

Will it be possible with future hardware for GPU PhysX to do more than it can now, putting it on par with or ahead of what we see from CPUs? Maybe, but it has had four years already, the hardware is still nowhere near there, and only 15 games have made use of it in that time. As time goes on, game engines will demand more performance just to render graphics improvements, so who knows.

Right now GPU PhysX mostly looks to me like a way for nvidia to sell more hardware, because of the unreasonable performance hit, plus an extra logo to stamp on the box.

Sure, you can turn it off in games that use it, but I'd rather those games just make use of the superior CPU physics without the performance hit. Fortunately, there have only been about four games worth playing that make use of GPU PhysX. For it to be worth anything to me, the performance hit would have to be eliminated, or they would need to incorporate an additional onboard chip to perform the calculations at no added cost.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Game engines use all sorts of tricks and methods to cut the workload needed to display the imagery you experience. When there are games out there using the CPU to perform physics, and doing it in more impressive, immersive and game-changing ways than GPU PhysX does, I don't care about all the fancy calculations PhysX is doing anymore. It's not as good as what I'm seeing done on the CPU, but it's costing me performance the CPU physics are not. It's a double negative for gamers.

So you're fine with "tricks" as long as they look good? Because that's what a lot of CPU "physics" is these days.....tricks, or smoke and mirrors, and not real physics.

Here's a good example. In BC2, you have huge, massive explosions. But unless you're standing in or very near the immediate blast radius of these explosions, they're not really dangerous.....or nowhere near as dangerous as you'd expect explosions of that magnitude to be.

Compare that to the much smaller grenade explosions in Metro 2033 which are [very] dangerous, because the entire explosion is done in real time, so it's more akin to a real explosion. Throw a grenade in Metro 2033, and the resulting shockwave will send surrounding objects, debris and shrapnel flying outward in a violent manner, killing or injuring the PC and anyone else unlucky enough to be in their path....which could be as far as 20 or more feet away, well out of range of the actual blast.....like a real explosion.

It took me a while to get used to how deadly grenades could be in Metro 2033, because so many games use "tricks" when it comes to physics effects like explosions. Usually the trick of choice is a canned animation, with a bit of real time physics to make it look real. Sure it may "look" nice, but it's nothing like a real explosion.
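
The difference between a canned animation and a real-time blast comes down to actually computing forces on surrounding objects. A toy Python sketch of a point-explosion impulse (a made-up function and constants purely for illustration, not anything from PhysX or Metro's engine):

```python
import math

def blast_impulse(obj_pos, blast_pos, strength=400.0):
    """Outward impulse from a point explosion: direction away from
    the blast, magnitude falling off with inverse-square distance."""
    dx = obj_pos[0] - blast_pos[0]
    dy = obj_pos[1] - blast_pos[1]
    dist = math.hypot(dx, dy) or 1e-6  # avoid division by zero at the center
    mag = strength / dist ** 2
    return (mag * dx / dist, mag * dy / dist)

# Debris 2 m from the blast is hit 25x harder than debris 10 m away,
# but the far debris still gets a push -- unlike a fixed "blast radius".
near = blast_impulse((2.0, 0.0), (0.0, 0.0))
far  = blast_impulse((10.0, 0.0), (0.0, 0.0))
print(near[0] / far[0])  # 25.0
```

With a real falloff like this, shrapnel and shockwave effects extend smoothly well beyond the visual fireball, which is exactly the behavior being described.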

Anyway, my point is, that realism is one of the cornerstones of gaming. The lighting effects in DX11 are far more realistic than how lighting effects were under DX7 several years ago, but this realism comes at a price, because it requires more processing power.

Physics is the same way. More realistic physics, requires more computational power, and since GPUs have a lot more computational power than CPUs, it makes sense that they would be capable of more realistic physics than CPUs.

And that's what hardware PhysX attempts to do: full real-time physics processing, rather than the smoke-and-mirrors tactics of CPU-powered physics APIs.

So in light of this, it boggles my mind why you would be against hardware accelerated physics....unless you're against enhanced realism in gaming?

No profanity please.
-ViRGE
 
Last edited by a moderator:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
At least to me, this is new news: confirmation the game is DX11.

With a graphics card capable of powering the PhysX effects you can also enable a number of other enhancements, chief amongst which is DirectX 11 tessellation. Adding extra detail to objects and scenes, tessellation is one of the hottest new properties in game engine programming and will be the focus of a future Arkham City article that also examines the other enhancements and additions to the PC version of the game.
http://www.geforce.com/News/articles/exclusive-physx-in-batman-arkham-city-a-first-look
Catwoman strikes a pose in one of Batman: Arkham City's tessellated environments.
 

Makaveli

Diamond Member
Feb 8, 2002
4,800
1,264
136
Question.

Is it still possible to run an NV card just for PhysX with an AMD card as the primary GPU?

If so what limitations are there to this setup?

Having to wait for drivers?

Additional load and heat?

What would be the lowest end NV card you can use to get acceptable performance with physx on?

I've been contemplating adding a secondary card to my system but need these questions answered first; I figure a few of you are running a system with this config.
 