New Mafia II PhysX ON/OFF video.

Sep 9, 2010
No, but anyone on this forum is capable of judging his opinion based on the merits of the evidence he brought forth. That aside, I don't think he was addressing you when he referred to CellFactor, a title very conveniently ignored for the purpose of broadening the generalization that no title has ever made use of PhysX in gameplay; moreover, your having played it doesn't refute his point that people keep painting PhysX with an overly broad brush and seem to have no problem using misinformation as long as it suits their views.

PhysX has only been used for eye candy, with no interaction with the gameplay.

False dichotomy. Sticking with DX9 hardly prevents anyone from innovating in technology. You can do both, either, or neither. In fact, I'd like to hear how any DX11 features have more potential to revolutionize gameplay than hardware-accelerated physics... Again, one has nothing to do with the other in this context, but so far DX11 has failed to impress me far more often than hardware physics did three years ago.

It seems that you don't have a DX11-capable card. Tessellation and DirectCompute have shown more in less time on the market than PhysX ever has. Besides Batman: AA, I haven't seen a game that really impresses with its PhysX effects. Half-Life 2 is a 6+ year old game that uses CPU physics, and yet, to this day, not much has changed.

As for nvidia (or any company of this caliber at this point) not innovating, I'll only point out that if you were to take away all the technologies that nvidia took from proprietary tech demo to industry standard, you'd be left with a very pale carcass of what games look like today.

I really doubt that. Besides PhysX, most of the new technologies incorporated into DX11 were ideas contributed by AMD; nVidia was the one who screwed up DX10 by asking Microsoft to loosen it up a bit. DX10.1 features like multisample readback came from AMD, Tessellation came from AMD, the BC6 and BC7 compression schemes came from the ideas behind AMD's 3Dc compression, and Gather4 came from the Fetch4 implementation back in the X1k era. nVidia has put a lot of effort into the OpenGL arena, but in DX, as of late, nothing has come from nVidia. So even without nVidia, nothing in the graphics quality landscape would have changed.

Define what you would call immersive. Lest we move on to the flavor of the day, the no-true-Scotsman fallacy, you're going to have one hell of a time squarely disproving a subjective view held by someone else. I happen to agree that some of the Mafia 2 effects were lackluster and pointlessly resource-intensive (other than the destructible and interactive objects; those were quite promising), but that doesn't make his preference any less valid.

If you want immersiveness, play the Scarecrow level in Batman: AA. While everyone has a different opinion of what immersiveness is, Batman: AA can indulge you in terms of physics effects.

If his stance is really as fragile as you claim, then it should be easy for you to convince everyone without making this weird.

I don't have to convince anybody. This thread is about the Mafia 2 PhysX On/Off video, and it's up to the gamer whether to use its effects or not, and whether they make a difference. Besides, the idea of democracy and a forum is to post, share opinions and have a good time, not to convince and brainwash people.
 

taltamir

Lifer
Mar 21, 2004
If I were nvidia, I would have ensured interoperability of a secondary PhysX card (that is, an nvidia GPU) with a primary ATI card, then paid some game makers to make games with great first-order physics improvements that require a dedicated G92-class card for physics. This could have started a trend where people have an nvidia OR ATI main card and a secondary, older and weaker nvidia card dedicated to PhysX, which could only have grown in nvidia's favor over time.
 

happy medium

Lifer
Jun 8, 2003
I don't have to convince anybody. This thread is about the Mafia 2 PhysX On/Off video, and it's up to the gamer whether to use its effects or not, and whether they make a difference. Besides, the idea of democracy and a forum is to post, share opinions and have a good time, not to convince and brainwash people.

This thread could have ended 467 posts ago, with this paragraph.
Good job.
 

taserbro

Senior member
Jun 3, 2010
PhysX has only been used for eye candy, with no interaction with the gameplay.

But you've just discounted CellFactor without providing a reason again, not to mention ignored all the factors that keep devs from being able to use PhysX in a way that would make it mandatory. Repeating it won't bring it any closer to being true.

It seems that you don't have a DX11-capable card. Tessellation and DirectCompute have shown more in less time on the market than PhysX ever has. Besides Batman: AA, I haven't seen a game that really impresses with its PhysX effects. Half-Life 2 is a 6+ year old game that uses CPU physics, and yet, to this day, not much has changed.

I do, actually. I have a 5870 in a gaming computer and a GTX 470 in a poor man's workstation, both of which occasionally run modern titles as a sort of checking in with the times. I've played Mafia 2 as well as some recent DX11 titles such as BC2 and DiRT 2. I still maintain that the physics seen in PhysX demos and some PhysX titles, including CellFactor, are much more impressive than any tangible difference from switching from a DX9 to a DX11 rendering method in all the games I've seen so far.

Even if I weren't equipped to judge DX11 titles, I still fail to see how it's impossible to innovate technology-wise just by sticking to a single generation of DirectX, as you asserted nvidia is guilty of doing. I see modern DX9 titles that look much better than early ones, some DX9 titles that look much better than some early DX11 titles, and regardless of DX version, I notice the most apparent change in visual quality when a chip maker enables a new technology rather than when Microsoft ships a new release of their API.

I really doubt that. Besides PhysX, most of the new technologies incorporated into DX11 were ideas contributed by AMD; nVidia was the one who screwed up DX10 by asking Microsoft to loosen it up a bit. DX10.1 features like multisample readback came from AMD, Tessellation came from AMD, the BC6 and BC7 compression schemes came from the ideas behind AMD's 3Dc compression, and Gather4 came from the Fetch4 implementation back in the X1k era. nVidia has put a lot of effort into the OpenGL arena, but in DX, as of late, nothing has come from nVidia. So even without nVidia, nothing in the graphics quality landscape would have changed.

T&L, programmable shaders, GPU crossbar memory controllers, real-time multi-pass bump mapping, hardware frustum culling, advanced MSAA, 3D Vision, and I'm probably not even close to scratching the surface here, were from nvidia. Some of those technologies were invented in-house. And while ATI did innovate big time with their first implementation of tessellation (I'd know, I still have a Radeon 8500 and I wrote a paragraph about that a while ago in this thread), they did very little to get it adopted, unlike nvidia, who are doing everything they can to plug PhysX short of a move that would directly come back to bite them in the derriere.

Note that I'm not going to discount ATI's contributions here at all; they were the first to do a single-PCB dual-chip card, they also did much better DVD decoding support, and again, plenty of other things I can't recall or haven't heard of yet. However, anyone can Wikipedia up a laundry list of such techs from almost every company that was ever in a competitive position.

I've never been one to excuse poor behavior like nvidia's (if some evidence is brought up; so far I see none), but saying, and I quote, "even without nVidia, nothing in the graphics quality landscape would have changed", just leads me to believe you made up your mind to put them in a bad light no matter what, because even the little I listed above constituted a significant progression in the look and possibilities of gaming in its time.

If you want immersiveness, play the Scarecrow level in Batman: AA. While everyone has a different opinion of what immersiveness is, Batman: AA can indulge you in terms of physics effects.

Fair enough; that's your opinion, which I never argued against. It's also my opinion that some (not all) effects in Mafia 2 were well done and satisfy enough of the requisites to be called immersive as well. That was what you called lonbjerg out for.

I don't have to convince anybody. This thread is about the Mafia 2 PhysX On/Off video, and it's up to the gamer whether to use its effects or not, and whether they make a difference. Besides, the idea of democracy and a forum is to post, share opinions and have a good time, not to convince and brainwash people.

Actually, you don't necessarily have an obligation to convince me of much, even though that would help the discussion, since that's just about the point of replying to me and making it sound like I'm wrong. But in this case you do need to convince people that it's justified to call someone a 15-year-old kid and report his posts for pointing out that, in fact, there exists a title that showed the viability of PhysX in creating revolutionary physics in gameplay, despite the repeated assertions otherwise. And by that, I mean back up your claims with facts you candidly believe in.

It's pretty easy to fall back to a safe stance that paints us as reasonable, but there's more to forums than just stating your favorite colors and calling others names; many, such as myself, come here hoping to learn something, often changing our views on subjects we weren't previously familiar with thanks to people with experience and insight we don't have access to. It would have pleased me to have my mind changed by learning new things and being presented with evidence I wasn't aware of (I don't think anyone's been brainwashed by being shown a cogent argument yet), but fewer and fewer folks are interested in that rather than in just gathering more unqualified corroboration. If, as you claim, you never intended to debate here on the equal footing of being open to changing your stance, then all that remains of your post is simply insulting someone for disagreeing with you, which in my book isn't exactly consistent with the spirit of "sharing opinions and having a good time", however admirable that sounds.
 

Scali

Banned
Dec 3, 2004
The only way PhysX will become the standard is if they port it to OpenCL (or have two versions: CUDA for GeForce, OpenCL for everything else). That way it will run on AMD GPUs whether AMD likes it or not. That would require a giant (and welcome) shift in Nvidia's decision-making process, though.

My theory is that nVidia already has an OpenCL implementation of PhysX... or at least has done enough of the preparation to be able to release one at short notice.
They just won't release it until their hand is forced. Currently they have a monopoly on accelerated physics. When a competitor starts offering accelerated physics as well, PhysX could be in danger (or possibly when developers decide to stick with alternative CPU-only APIs, but so far PhysX' market share has only increased).
Then nVidia can make PhysX more compelling by enabling OpenCL support. This way they can protect their PhysX investment and market share, and continue to have some control over the physics market.
 

Scali

Banned
Dec 3, 2004
If Nvidia really wanted to be smart, here's what they'd do, imho:

1. Let anyone support PhysX; open it up on the hardware-support side (i.e., AMD could implement it on their graphics cards at no cost, Intel could optimize it for their CPUs at no cost, etc.).

2. Let games use the PhysX API at no cost.

3. Develop really, really exceptional tools and libraries for implementing PhysX-based effects in games and other programs. Charge for these; they'd be licensed, i.e., if you were a game developer you'd use Nvidia's tools and they'd get a cut of every game sold (or you'd pay a larger up-front fee, or whatever, like licensing a game engine). Develop the best physics tools for consoles, cell phones, etc. Tons of money could be made, they'd get good PR, etc. ... win/win.

They're doing 2 and 3 already (APEX is a fine example of how committed nVidia is to 3).
And as I said, I *think* nVidia is going to do 1, at least as far as supporting OpenCL goes, once the time is right. I doubt they'd actually give AMD and Intel access to the source code; that is very, VERY unusual in the software world. Protecting IP, trade secrets and that sort of thing (don't forget, Intel owns Havok, which is PhysX' biggest competitor).
 

Scali

Banned
Dec 3, 2004
This. For all the talk PhysX gets, it still hasn't even delivered what Half-Life 2 showcased almost 6 years ago (as far as I can tell? Please correct me if I am wrong).

You might want to take a look at the game I-Fluid.

Okay, it isn't a fancy 'A' title like Half-Life 2, and no, it doesn't even use PPU/GPU acceleration... Just a simple CPU will do fine.
However, it DOES use quite sophisticated gameplay physics, comparable to what Half-Life 2/Portal do... making puzzles with physics.
And it uses PhysX for it.
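
For those wondering what gameplay physics on a plain CPU looks like in code, here's a minimal sketch along the lines of the PhysX 2.x (NxPhysics) SDK a title like that would ship with. I'm writing the descriptor and function names from memory, so treat the exact signatures as approximate:

[CODE]
// Minimal CPU-only rigid-body scene with the PhysX 2.x SDK -- no PPU or GPU needed.
// Names are recalled from the 2.8-era API, so treat them as approximate.
#include <NxPhysics.h>

int main()
{
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return 1;

    // The scene defaults to software simulation, i.e. it runs entirely on the CPU.
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    NxScene* scene = sdk->createScene(sceneDesc);

    // Static ground plane.
    NxPlaneShapeDesc planeShape;
    NxActorDesc planeActor;
    planeActor.shapes.pushBack(&planeShape);
    scene->createActor(planeActor);

    // A dynamic crate the player can push around -- gameplay physics, not a particle effect.
    NxBoxShapeDesc boxShape;
    boxShape.dimensions = NxVec3(0.5f, 0.5f, 0.5f);
    NxBodyDesc body;
    NxActorDesc crate;
    crate.shapes.pushBack(&boxShape);
    crate.body = &body;
    crate.density = 10.0f;
    crate.globalPose.t = NxVec3(0.0f, 5.0f, 0.0f);
    scene->createActor(crate);

    // Fixed-step simulation loop; a real game would read back the poses here to drive the renderer.
    for (int i = 0; i < 300; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->flushStream();
        scene->fetchResults(NX_RIGID_BODY_FINISHED, true);
    }

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}
[/CODE]

The point being: a rigid-body scene like this steps fine on any CPU; the GPU/PPU path only starts to matter once you pile on thousands of particles, fluid or cloth vertices.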
 

Scali

Banned
Dec 3, 2004
I really doubt that. Besides PhysX, most of the new technologies incorporated into DX11 were ideas contributed by AMD; nVidia was the one who screwed up DX10 by asking Microsoft to loosen it up a bit. DX10.1 features like multisample readback came from AMD, Tessellation came from AMD, the BC6 and BC7 compression schemes came from the ideas behind AMD's 3Dc compression, and Gather4 came from the Fetch4 implementation back in the X1k era.

Are we still on this FUD?
That was disproved ages ago.
nVidia's hardware actually DOES support multisample readback, and it can be enabled through an NVAPI driver extension.
In fact, this is how they implement AA in games like Batman: AA, Far Cry 2, and a few other titles which use deferred rendering, and still enable AA (multisample readback is pretty much a requirement for that).

So how could nVidia want Microsoft to loosen up the DX10 specs by removing features that nVidia supports?
Another thing is that DX10.1 requires at least 4xAA, whereas DX10 doesn't require AA at all.
Again, nVidia wanting Microsoft to loosen up the DX10 specs? Even though nVidia has supported 4xAA and beyond for years?

Nope... if there is any truth to Microsoft loosening up the DX10 specs to favour an IHV, then that IHV must be Intel. THEY are the ones who do not support the 'missing' features like AA and multisample readback. Not nVidia. Because if it were nVidia, then the DX10 specs would be exactly the G80 specs, and those include a number of DX10.1 features (and also DirectCompute... but AMD cannot do that on their early DX10/10.1 hardware... did AMD ask Microsoft to remove it?)
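
For anyone wondering what 'multisample readback' actually means on the API side: the render target is created as a multisampled texture that can also be bound as a shader resource, so a later pass (e.g. the lighting pass of a deferred renderer) can read the individual samples instead of a pre-resolved copy. Here's a rough D3D10-style sketch; which formats and pipeline stages allow this differs between DX10, DX10.1 and vendor extensions like NVAPI, so take it purely as an illustration:

[CODE]
// Illustrative only: create a 4x MSAA render target that can also be bound as a
// Texture2DMS shader resource, so a later pass can fetch each sample individually.
#include <d3d10.h>

ID3D10ShaderResourceView* CreateMsaaGBufferSRV(ID3D10Device* device, UINT width, UINT height)
{
    D3D10_TEXTURE2D_DESC texDesc = {};
    texDesc.Width            = width;
    texDesc.Height           = height;
    texDesc.MipLevels        = 1;
    texDesc.ArraySize        = 1;
    texDesc.Format           = DXGI_FORMAT_R16G16B16A16_FLOAT;
    texDesc.SampleDesc.Count = 4;                        // 4x MSAA G-buffer
    texDesc.Usage            = D3D10_USAGE_DEFAULT;
    texDesc.BindFlags        = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* gbuffer = NULL;
    if (FAILED(device->CreateTexture2D(&texDesc, NULL, &gbuffer)))
        return NULL;

    D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format        = texDesc.Format;
    srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;  // per-sample readback view

    ID3D10ShaderResourceView* srv = NULL;
    device->CreateShaderResourceView(gbuffer, &srvDesc, &srv);
    gbuffer->Release();                                  // the SRV keeps its own reference

    // The HLSL side then declares Texture2DMS<float4> and fetches samples with
    // Load(pixelCoord, sampleIndex), which is what makes MSAA workable with
    // deferred rendering.
    return srv;
}
[/CODE]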
 