Originally posted by: Laminator
As it stands now, AA only works in BioShock's DX9 mode, and only with GeForce hardware. We tried forcing AA with the Radeon HD 2900 XT under both Windows Vista and WinXP with all four custom filter AA modes (including edge detect) and couldn't get AA to work properly in BioShock. We asked AMD if they plan on adding AA support to BioShock in a future Catalyst driver revision but couldn't get a direct answer. The only way to enable AA under BioShock with Radeon cards is to rename the BioShock executable from "bioshock.exe" to "Oblivion.exe". Keep in mind by doing this though, AMD's driver-level optimizations for the game are automatically disabled, and as a result, performance suffers: we recorded a frame rate of just 26.7 fps for the 2900 XT with the executable renamed to Oblivion.exe running our manual walkthrough sequence under 4xAA at 1600x1200. (Surprisingly enough, the Radeon X1950 Pro delivered a frame rate of 20.9 fps under the exact same scenario.) Because of this, AMD's Dave Baumann told us flat out not to run the game with the renamed executable.
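For anyone trying it, the workaround amounts to making a copy of the executable under the Oblivion name. A minimal sketch, with the install path assumed (adjust to your own):

```python
# Sketch of the rename workaround described above (the path is an assumption).
# Copying instead of renaming keeps the original exe intact.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Program Files\2K Games\BioShock\Builds\Release")  # assumed
src = game_dir / "bioshock.exe"
dst = game_dir / "Oblivion.exe"

if src.exists() and not dst.exists():
    # The driver keys its per-game profile on the file name, so launching this
    # copy picks up Oblivion's AA profile (and drops BioShock's optimizations).
    shutil.copy2(src, dst)
    print(f"Created {dst}; launch this copy to force AA on Radeon cards.")
```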
Originally posted by: Ackmed
HardOCP used AA, seemingly without issues. They renamed it to Oblivion, though.
Originally posted by: Sonikku
It reminds me of the time Half Life 2 suspiciously ran poorly on Nvidia hardware at release when compared to ATI, only to see the gap narrow considerably when Nvidia fans found workarounds to Valve's engine. It seems this time around Nvidia has done something similar. It's things like this that really make me hate the PC industry. -_-
Originally posted by: cmdrdredd
This game was paid for by Nvidia. It shows. It's not that the Radeon cards can't do it, far from it. It's that the game is specifically made so that it will skew results in favor of Nvidia. It is a TWIMTBP game after all :roll:
Originally posted by: quattro1
Originally posted by: cmdrdredd
This game was paid for by Nvidia. It shows. It's not that the Radeon cards can't do it, far from it. It's that the game is specifically made so that it will skew results in favor of Nvidia. It is a TWIMTBP game after all :roll:
lol, pure comedy. I wonder why you say that. Because you have an ATI card? Bioshock's engine does not support AA. Any AA applied to the game has to be done in the driver (correctly), by either NVIDIA or ATI. There's nothing keeping either one from adding that support in their own driver; it's completely independent of the game.
Originally posted by: PhatoseAlpha
Seems pretty cut and dried to me. You know Ati's drivers are keying on the executable file name and altering the way they do things. They've said so. Suspect #1 here is that Ati's driver optimizations are what's killing the AA.
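To make the exe-name keying concrete, here is a hypothetical sketch of the kind of per-application profile lookup a driver might do. The profile names and flags are invented for illustration; real Catalyst/ForceWare profile data isn't public.

```python
# Hypothetical driver-side profiles keyed on the executable's file name.
APP_PROFILES = {
    "oblivion.exe": {"force_aa_compat": True, "game_optimizations": "oblivion"},
    "bioshock.exe": {"force_aa_compat": False, "game_optimizations": "bioshock"},
}

def profile_for(exe_name: str) -> dict:
    # The lookup sees only the file name, which is why renaming the exe swaps
    # in Oblivion's AA compatibility flag and drops BioShock's optimizations.
    return APP_PROFILES.get(exe_name.lower(), {})

print(profile_for("bioshock.exe"))  # BioShock optimizations, no AA compat flag
print(profile_for("Oblivion.exe"))  # AA path active, but optimizations no longer match
```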
Originally posted by: BFG10K
"They deliberately made it difficult for ATI cards to function 100%."
That's not happening here because AA is forced through the driver, not through the application. Also it uses nVidia's R6 Vegas flag, which nVidia had working a while ago, but AFAIK ATi still don't have AA working in that game.
What ATi needs is a Chuck patch for Bioshock.
"The Unreal3 engine does support AA though"
Only through DX10, according to the developer, which again has nothing to do with nVidia's driver-level implementation.
"How else do you explain being able to rename the .exe and then enable AA?"
Because it activates the AA flag that Oblivion uses (the "Chuck patch"). Again, that has nothing to do with the application; it's at the driver level.
Having said that, it's rather strange they're getting AA, since the Chuck patch doesn't account for the deferred rendering Bioshock uses. I'd wager they're getting a performance hit but no actual AA.
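For anyone wondering why deferred rendering breaks forced AA, a rough sketch follows; plain Python stands in for the real pipeline, and every name here is illustrative:

```python
# Why driver-forced MSAA does little for a deferred renderer: edges are decided
# in single-sample G-buffer passes, so multisampling only the final output adds
# cost without smoothing anything.

def render_gbuffer(samples: int) -> dict:
    # Deferred engines draw normals/depth/albedo into ordinary textures;
    # a driver "force AA" hook typically only multisamples the backbuffer.
    return {"normals": "...", "depth": "...", "albedo": "...", "samples": samples}

def lighting_pass(gbuffer: dict) -> str:
    # Full-screen pass shading from the (still aliased) G-buffer contents.
    return f"image shaded from a {gbuffer['samples']}-sample G-buffer"

def deferred_frame(force_msaa: bool) -> str:
    gbuffer = render_gbuffer(samples=1)          # stays 1x whatever the driver says
    image = lighting_pass(gbuffer)
    backbuffer_samples = 4 if force_msaa else 1  # the forced-AA cost lands here...
    return f"{image}, presented at {backbuffer_samples}x"  # ...edges already baked in

print(deferred_frame(force_msaa=True))
```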
Originally posted by: BroadbandGamer
I'm still kind of pissed I can't enable AA on Vista with an 8800 GTX.:|
Originally posted by: keysplayr2003
Originally posted by: BroadbandGamer
I'm still kind of pissed I can't enable AA on Vista with an 8800 GTX.:|
What happens when you enable AA in the drivers? Or is that option greyed out in the Bioshock profile in Vista? I haven't tried it yet. AA in the drivers works nicely in XP though (sorry). Hopefully, 2K will release some game patches to address these issues.
Originally posted by: n7
Originally posted by: keysplayr2003
Originally posted by: BroadbandGamer
I'm still kind of pissed I can't enable AA on Vista with an 8800 GTX.:|
What happens when you enable AA in the drivers? Or is that option greyed out in the Bioshock profile in Vista? I haven't tried it yet. AA in the drivers works nicely in XP though (sorry). Hopefully, 2K will release some game patches to address these issues.
Nothing happens.
It's not greyed out in the profiles, but then again, profiles haven't worked on the last few driver sets IIRC, so I force things globally.
You can set 16x AA & it does nothing.
Originally posted by: BFG10K
Originally posted by: Sonikku
It reminds me of the time Half Life 2 suspiciously ran poorly on Nvidia hardware at release when compared to ATI,
HL2 ran poorly because NV3x shader performance sucked balls. That wasn't Valve's fault, nor was it their fault they had to implement a mixed mode path to get acceptable performance.
Originally posted by: Sonikku
only to see the gap narrow considerably when Nvidia fans found workarounds to Valve's engine.
Details?
Originally posted by: n7
Originally posted by: keysplayr2003
And it's set to "override game options"?
Yeah.
But it doesn't matter.
None of the AA modes work. At all.
It's all a pretty Biojaggieshock.
Originally posted by: Cookie Monster
It's called marketing. Get used to the term. And unless you have any evidence backing up your bold statement, stop making yourself look like an ATi fanboy.
Do you want me to start a thread on COJ and the removal of hardware-based AA that caused at least a 15% performance drop on the G80s?