How to enable Nvidia PhysX on ATI cards in Batman: AA


MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Originally posted by: error8
But what is all the hype for? AA works in Batman, you just have to force it from CCC. I don't see any problem with this. I tried both in 4X and 8X and it's working fine. I'm getting terrible slowdowns at 8X so I know AA is there. And I see nothing wrong with it, it takes away all the stair-step effect so it's working normally. About PhysX, I consider it useless anyway, so even if it's disabled automatically when a non-Nvidia card is detected, it's not going to take away my gaming experience. Foul play or not, PhysX is Nvidia's and they can do whatever they want with it, even if it's just a software feature.
You didn't read the thread? The reason people are disappointed is because you get the slowdowns due to AA not being implemented properly on ATI's side, but, amazingly, with a quick hack and a card ID change, you get it working perfectly on a pseudo-NVIDIA card. It seems like it was a dirty trick. I mean, AA was never an "exclusive" feature, hell, it's a standard. But what kind of precedent does this set? Now a company has to pay developers if they want AA to work on their cards in a game? Would this continue on to ALL advanced rendering features? Like I said, it's a road you don't want to go down.

 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
It works just fine with 4X AA on my 4870. What, all games should work with 8X now?
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: MrK6
Originally posted by: error8
But what is all the hype for? AA works in Batman, you just have to force it from CCC. I don't see any problem with this. I tried both in 4X and 8X and it's working fine. I'm getting terrible slowdowns at 8X so I know AA is there. And I see nothing wrong with it, it takes away all the stair-step effect so it's working normally. About PhysX, I consider it useless anyway, so even if it's disabled automatically when a non-Nvidia card is detected, it's not going to take away my gaming experience. Foul play or not, PhysX is Nvidia's and they can do whatever they want with it, even if it's just a software feature.
You didn't read the thread? The reason people are disappointed is because you get the slowdowns due to AA not being implemented properly on ATI's side, but, amazingly, with a quick hack and a card ID change, you get it working perfectly on a pseudo-NVIDIA card. It seems like it was a dirty trick. I mean, AA was never an "exclusive" feature, hell, it's a standard. But what kind of precedent does this set? Now a company has to pay developers if they want AA to work on their cards in a game? Would this continue on to ALL advanced rendering features? Like I said, it's a road you don't want to go down.




ditto this is wicked ghey
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: MrK6
You didn't read the thread? The reason people are disappointed is because you get the slowdowns due to AA not being implemented properly on ATI's side,
Because AMD did not work with the developer to implement it. (Even though they could have).
but, amazingly, with a quick hack and a card ID change, you get it working perfectly on a pseudo-NVIDIA card.
According to a blog post by an AMD marketing guy.

It seems like it was a dirty trick. I mean, AA was never an "exclusive" feature, hell, it's a standard.
It's not a standard on the Unreal 3 engine.
But what kind of precedent does this set? Now a company has to pay developers if they want AA to work on their cards in a game?
NVIDIA didn't pay the developer, they just helped them to get it working correctly on their card. AMD could have done the same thing, but chose not to.
Would this continue on to ALL advanced rendering features? Like I said, it's a road you don't want to go down.
Pure sensationalism. You ignore a lot of facts to get to that tin foil hat theory.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
Originally posted by: Wreckage

Because AMD did not work with the developer to implement it. (Even though they could have).

So you have access to the email of both the devs and AMD?

And you have the devs asking AMD "Hey do you want to implement AA for your cards in our game?" and then the answer of AMD saying "No go fuck yourselves?".

Or is your source some post in a blog from an nVidia worker?


 

JSt0rm

Lifer
Sep 5, 2000
27,399
3,947
126
Originally posted by: GaiaHunter
Originally posted by: Wreckage

Because AMD did not work with the developer to implement it. (Even though they could have).

So you have access to the email of both the devs and AMD?

And you have the devs asking AMD "Hey do you want to implement AA for your cards in our game?" and then the answer of AMD saying "No go fuck yourselves?".

Or is your source some post in a blog from an nVidia worker?

why bother dude...
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
I can confirm this works guys.

I'm playing Batman AA on my new 5870 with PhysX on and it's running quite smoothly!

End user innovation FTW!
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Originally posted by: Wreckage
Originally posted by: MrK6
You didn't read the thread? The reason people are disappointed is because you get the slowdowns due to AA not being implemented properly on ATI's side,
Because AMD did not work with the developer to implement it. (Even though they could have).
but, amazingly, with a quick hack and a card ID change, you get it working perfectly on a pseudo-NVIDIA card.
According to a blog post by an AMD marketing guy.
It seems like it was a dirty trick. I mean, AA was never an "exclusive" feature, hell, it's a standard.
It's not a standard on the Unreal 3 engine.
But what kind of precedent does this set? Now a company has to pay developers if they want AA to work on their cards in a game?
NVIDIA didn't pay the developer, they just helped them to get it working correctly on their card. AMD could have done the same thing, but chose not to.
Would this continue on to ALL advanced rendering features? Like I said, it's a road you don't want to go down.
Pure sensationalism. You ignore a lot of facts to get to that tin foil hat theory.
What facts am I ignoring? Changing the vendor ID to mimic an NVIDIA card allows you to use AA properly in the game. It takes 5 minutes, but the developer didn't want to enable better performance of its product across all hardware configurations? Are you serious? Everything posted is true, there are examples of this working, yet you refuse to admit it because you don't want to put NVIDIA in a bad light. Will you self-destruct if I continue to back you into a corner? I mean, your post is the equivalent of you sticking your fingers in your ears and yelling "NAH NAH NAH I CAN'T HEAR YOU"; you have no counterargument.
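For anyone wondering how a "card ID change" can fool the game at all, below is a minimal sketch of the general idea: a wrapper d3d9.dll intercepts IDirect3D9::GetAdapterIdentifier and reports NVIDIA's PCI vendor ID instead of ATI's. This is only an illustration of the concept; the SpoofedGetAdapterIdentifier helper and the placeholder device ID are hypothetical and not the actual tool people in this thread used.

```cpp
// Hypothetical sketch of a vendor-ID spoof for a D3D9 game. A real wrapper
// d3d9.dll must proxy the whole IDirect3D9 interface; only the one method
// such a check cares about is shown here.
#include <d3d9.h>

static const DWORD NVIDIA_VENDOR_ID  = 0x10DE;  // NVIDIA's PCI vendor ID (ATI/AMD is 0x1002)
static const DWORD SPOOFED_DEVICE_ID = 0x0000;  // placeholder; a real hack would report
                                                // the device ID of some GeForce card

// Called in place of the game's IDirect3D9::GetAdapterIdentifier.
HRESULT SpoofedGetAdapterIdentifier(IDirect3D9* real, UINT adapter, DWORD flags,
                                    D3DADAPTER_IDENTIFIER9* ident)
{
    // Ask the real runtime for the adapter info first.
    HRESULT hr = real->GetAdapterIdentifier(adapter, flags, ident);
    if (SUCCEEDED(hr)) {
        // If the game only checks these fields to decide whether to expose
        // in-game AA / GPU PhysX, reporting NVIDIA IDs defeats that check.
        ident->VendorId = NVIDIA_VENDOR_ID;
        ident->DeviceId = SPOOFED_DEVICE_ID;
    }
    return hr;
}
```

If spoofing the ID is really all it takes, that says nothing about who implemented the block, but it does suggest the check is a simple vendor-ID test rather than anything tied to NVIDIA hardware.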
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: MrK6

What facts am I ignoring?

All of them.

Show me where NVIDIA paid them.

Show me a screenshot of the AA working just fine.

Show me where AMD was prevented from working with the developer.

Show me where AA is standard in the Unreal engine, especially using DX9.

AMD did nothing and expects to be rewarded for it. Send them some food stamps.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: Wreckage
Originally posted by: MrK6

What facts am I ignoring?

All of them.

Show me where NVIDIA paid them.

Show me a screenshot of the AA working just fine.

Show me where AMD was prevented from working with the developer.

Show me where AA is standard in the Unreal engine, especially using DX9.

AMD did nothing and expects to be rewarded for it. Send them some food stamps.

So naturally you agree, then, that any DX10.1 or DX11 games should be disabled on Nvidia's upcoming cards since they don't have anything that supports those features now. Any DX10.1 or DX11 games that are available now or coming out soon must have been developed on AMD hardware. Nvidia should not be 'rewarded' for the trails blazed by AMD, right?
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: SlowSpyder
Originally posted by: Wreckage
Originally posted by: MrK6

What facts am I ignoring?

All of them.

Show me where NVIDIA paid them.

Show me a screenshot of the AA working just fine.

Show me where AMD was prevented from working with the developer.

Show me where AA is standard in the Unreal engine, especially using DX9.

AMD did nothing and expects to be rewarded for it. Send them some food stamps.

So naturally you agree, then, that any DX10.1 or DX11 games should be disabled on Nvidia's upcoming cards since they don't have anything that supports those features now. Any DX10.1 or DX11 games that are available now or coming out soon must have been developed on AMD hardware. Nvidia should not be 'rewarded' for the trails blazed by AMD, right?

Of course he agrees. Anything less would be hypocrisy.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SlowSpyder


So naturally you agree, then, that any DX10.1 or DX11 games should be disabled on Nvidia's upcoming cards since they don't have anything that supports those features now. Any DX10.1 or DX11 games that are available now or coming out soon must have been developed on AMD hardware. Nvidia should not be 'rewarded' for the trails blazed by AMD, right?

DirectX is a Microsoft product. :roll: AMD did not write DirectX. Horrible attempt at a comparison by the way.

It's funny how you think AMD should be given stuff because they can't or won't do it for themselves.

Video Card Socialism. :laugh:
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
The only thing that's funny is you think that ATI should pay a company to get anti-aliasing. Without Nvidia or ATI paying a game company any money at all, most games have full anti-aliasing support for both cards. With this ONE game, Nvidia happens to give them money. With this ONE game, AA happens to be supported on Nvidia's card and not ATI's. Why is it now okay (required) for a GPU manufacturer to pay a game developer money to have AA support? Just because you say so? Are you going to claim that anti-aliasing is an Nvidia product? Last time I checked, it is NOT ATI or Nvidia's job to develop games. There is no way anyone can believe this wasn't a very deliberate lockout. There's no way anyone can seriously doubt Nvidia would do this, given their history.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: dguy6789
The only thing that's funny is you think that ATI should pay a company to get anti-aliasing. Without Nvidia or ATI paying a game company any money at all, most games have full anti-aliasing support for both cards. With this ONE game, Nvidia happens to give them money. With this ONE game, AA happens to be supported on Nvidia's card and not ATI's. Why is it now okay (required) for a GPU manufacturer to pay a game developer money to have AA support? Just because you say so? Are you going to claim that anti-aliasing is an Nvidia product? Last time I checked, it is NOT ATI or Nvidia's job to develop games. There is no way anyone can believe this wasn't a very deliberate lockout. There's no way anyone can seriously doubt Nvidia would do this, given their history.

Who said NVIDIA paid them anything?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: Compddd
I can confirm this works guys.

I'm playing Batman AA on my new 5870 with PhysX on and it's running quite smoothly!

End user innovation FTW!

Do you have MSAA working? How does your framerate compare to reviews showing Batman benchmarks?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: Shaq

Nvidia scratched their back so they got theirs scratched in return. I guess in gaming, companies aren't entitled to make money, since a lot of gamers are apparently very immature and complain if they don't get the same features as someone else got. ATI can pay devs and get any added features they want. This is most likely the game of the year and Nvidia chose to invest some money in it. If they got a "bonus" from the devs, it's not Nvidia's fault, yet that is who almost everybody is blaming.
Mine was a rhetorical question. The purpose of it was to point out that if there's no performance difference between driver AA and in-game, that means there's absolutely nothing special about it given ATi is already doing the same thing in their drivers. That means the in-game setting is just a marketing tool.

In-game AA is easier to set as you don't have to go into the drivers and set it.
I don't agree there; I prefer to force everything through the driver whenever possible because it's much easier to control things from one central point. The only time I use in-game settings is for compatibility reasons and/or if there's a clear performance advantage for doing so.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: Wreckage

Because AMD did not work with the developer to implement it. (Even though they could have).
nVidia didn't work with GSC to implement DX10.1/DX11 renderers in their games either.

According to a blog post by an AMD marketing guy.
And according to a blog post by an nVidia marketing guy we're led to believe nVidia had no part in blocking ATi out, and they didn't pay the developer anything.

For the record I actually believe nVidia; I'm just pointing out the double standard in your logic.

My stance on the issue based on information so far: I believe nVidia only provided the resources to help Batman's developer implement in-game AA for DX9. I don't believe money changed hands, and I don't think nVidia had a part in implementing the AA block.

The AA implementation also appears to be a generic implementation for the UT3 engine to work around its deferred renderer under DX9, but the implementation provides no performance advantage over driver AA, nor does it require nVidia specific features.

For this reason, I blame the developer for the vendor block. Given this appears to be generic code that works perfectly on ATi cards when the block is defeated, I can see absolutely no legitimate reason why it was locked out.

Again, if anyone argues the point that it was implemented on nVidia hardware and wasn't tested on ATi's cards, then exactly the same thing applies to current DX10.1/DX11 render paths in other games. Should those also be blocked off to future nVidia cards?

DirectX is a Microsoft product. AMD did not write DirectX
Irrelevant. Since you still have issues understanding my point, let me make two sentences for you.

Wreckage states: Batman's developer expended effort into implementing in-game AA. Because ATi had no part in that and because it was only tested on nVidia's cards, Wreckage believes ATi should be locked out.

BFG10K states: Stalker's developer expended effort into implementing in-game DX10.1/DX11. Because nVidia had no part in that and because it was only tested on ATi's cards, BFG10K is asking Wreckage whether he believes nVidia should be locked out in the same way.

This is really quite simple; you either disagree with both sentences or you agree with both. Anything else and your arguments are a double standard. I disagree with both statements because vendor-lock, when done for no legitimate reason, only hurts consumers.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Originally posted by: BFG10K
Originally posted by: Wreckage

Because AMD did not work with the developer to implement it. (Even though they could have).
nVidia didn't work with GSC to implement DX10.1/DX11 renderers in their games either.

According to a blog post by an AMD marketing guy.
And according to a blog post by an nVidia marketing guy we're led to believe nVidia had no part in blocking ATi out, and they didn't pay the developer anything.

For the record I actually believe nVidia; I'm just pointing out the double standard in your logic.

My stance on the issue based on information so far: I believe nVidia only provided the resources to help Batman's developer implement in-game AA for DX9. I don't believe money changed hands, and I don't think nVidia had a part in implementing the AA block.

The AA implementation also appears to be a generic implementation for the UT3 engine to work around its deferred renderer under DX9, but the implementation provides no performance advantage over driver AA, nor does it require nVidia specific features.

For this reason, I blame the developer for the vendor block. Given this appears to be generic code that works perfectly on ATi cards when the block is defeated, I can see absolutely no legitimate reason why it was locked out.

Again, if anyone argues the point that it was implemented on nVidia hardware and wasn't tested on ATi's cards, then exactly the same thing applies to current DX10.1/DX11 render paths in other games. Should those also be blocked off to future nVidia cards?

DirectX is a Microsoft product. AMD did not write DirectX
Irrelevant. Since you still have issues understanding my point, let me make two sentences for you.

Wreckage states: Batman's developer expended effort into implementing in-game AA. Because ATi had no part in that and because it was only tested on nVidia's cards, Wreckage believes ATi should be locked out.

BFG10K states: Stalker's developer expended effort into implementing in-game DX10.1/DX11. Because nVidia had no part in that and because it was only tested on ATi's cards, BFG10K is asking Wreckage whether he believes nVidia should be locked out in the same way.

This is really quite simple; you either disagree with both sentences or you agree with both. Anything else and your arguments are a double standard. I disagree with both statements because vendor-lock, when done for no legitimate reason, only hurts consumers.
Very well said. There does seem to be quite the double standard present.

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: golem

Sorry it should be, "one UE3 game that actually had in game MSAA under DX9...". I've asked this question multiple times. But it SEEMS that the only UE3 engine game that had in game MSAA at all is Gears of War and that was only under DX10. No one seems to be able to name any UE3 games before Arkham that had ingame MSAA in DX9.
My apologies for misunderstanding you. Yes, you're right about your UT3 observations.

The issue here is the UT3 engine's (partial) use of deferred rendering, which is incompatible with MSAA because said implementation relies on traditional forward rendering. DX10 makes it easier to implement MSAA with deferred rendering because it's possible to bind an MSAA render target as a texture, and each sub-sample can be accessed individually.

With DX9 you can still implement MSAA (as both vendors have done at the driver level for most UT3-based games), but in theory the implementation is slower than DX10's. I say in theory because DX10 has yet to deliver any of the performance improvements it promises in theory, but I digress.

Anyway, Batman's in-game AA appears to work fine on ATi cards when the lock is defeated, and appears to run the same speed too, so there appears to be nothing special with its implementation over what nVidia/ATi already do at the driver level. I could understand the block if it ran faster and/or required nVidia-specific features, but neither appears to be the case.
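To make the DX10 point above concrete, here is a minimal sketch (C++ against the D3D10 API; not code from the game or either vendor's driver) of creating a multisampled render target that can also be bound as a shader resource, which is what lets a deferred renderer read individual MSAA sub-samples itself. The CreateMsaaColorTarget helper, the 4x sample count, and the RGBA8 format are illustrative assumptions.

```cpp
// Minimal sketch: a 4x MSAA color target that a deferred renderer can both
// draw into and later read per-sample from a shader (DX10 and up only).
#include <d3d10.h>

ID3D10ShaderResourceView* CreateMsaaColorTarget(ID3D10Device* dev,
                                                UINT width, UINT height,
                                                ID3D10Texture2D** texOut)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width              = width;
    desc.Height             = height;
    desc.MipLevels          = 1;
    desc.ArraySize          = 1;
    desc.Format             = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count   = 4;                          // 4x MSAA
    desc.SampleDesc.Quality = 0;
    desc.Usage              = D3D10_USAGE_DEFAULT;
    desc.BindFlags          = D3D10_BIND_RENDER_TARGET    // written by the G-buffer pass
                            | D3D10_BIND_SHADER_RESOURCE; // then read per-sample later
    if (FAILED(dev->CreateTexture2D(&desc, nullptr, texOut)))
        return nullptr;

    D3D10_SHADER_RESOURCE_VIEW_DESC srv = {};
    srv.Format        = desc.Format;
    srv.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;  // exposes individual sub-samples
    ID3D10ShaderResourceView* view = nullptr;
    if (FAILED(dev->CreateShaderResourceView(*texOut, &srv, &view)))
        return nullptr;

    // In HLSL the resource is declared as Texture2DMS<float4> and each
    // sub-sample is read with tex.Load(pixelCoord, sampleIndex).
    return view;
}
```

DX9 has no equivalent way to read back individual sub-samples of an MSAA surface, which is why the UT3-engine games rely on driver-level workarounds there.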
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
BFG, if that is the case, why would Anand say in the HD5870 review that benchmarking ATI's CCC AA vs. in-game NV AA produces meaningless results, and that the in-game method is "much faster"?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K


Wreckage states: Batman's developer expended effort into implementing in-game AA. Because ATi had no part in that and because it was only tested on nVidia's cards, Wreckage believes ATi should be locked out.

BFG10K states: Stalker's developer expended effort into implementing in-game DX10.1/DX11. Because nVidia had no part in that and because it was only tested on ATi's cards, BFG10K is asking Wreckage whether he believes nVidia should be locked out in the same way.

Well since everyone loves to throw the word "standard" around.

Is DX10.1/11 a standard?

Is AA standard in the Unreal 3 Engine?

I ask because you make it sound like AMD had to write the DX10.1/DX11 code. As if Microsoft did not create and own the product.

Basically it is truly an incredibly weak comparison and nothing more than grasping at straws to back up your already thin argument.

BFG... Do you think AMD could have worked with the developer to get in-game AA?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K

Anyway, Batman's in-game AA appears to work fine on ATi cards when the lock is defeated, and appears to run the same speed too,

Have you personally tested this?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Should people stop buying nVidia's cards because they don't support games "correctly" by not having DX10.1/DX11?

There seems to be a larger difference in Batman using AA than there is in every single DX10.1 and DX11 vs DX10 game combined atm. You could probably even make the argument that that would hold up versus DX9.

Another note: outside of maybe Wreckage, you aren't likely to see a bunch of people who run nV hardware whining like a four-year-old about DX10.1/DX11 being added into games. No one running nV hardware atm can use it, yet there aren't a bunch of threads crying about it. That is a rather staggering distinction between those who run red versus green, it would seem.

nV owners knew when they bought their hardware that they wouldn't be able to use those features.

ATi users knew when they bought their hardware that nVidia would pay developers to get exclusive support for nV hardware.

No one walked into this without knowing their choice would have pros and cons. It seems at this point a whole bunch of people are just throwing a fit because they guessed wrong on which was going to have a bigger impact on games people wanted to play in the lifetime of the hardware.

Remember, this is your logic, not mine. You're happy for nVidia to be rewarded for doing nothing, but not for ATi to do the same. Why?

Let's see if Batman still won't run AA on ATi hardware when nV has DX11 hardware out; until then that point doesn't work very well.
 