Funny thing was, in one of those benches, using HQ for the GX2 actually improved performance. Kind of strange, but something about it makes me giggle.
Originally posted by: Wreckage
Originally posted by: Ackmed
Some of us did, some didnt want to accept it. As I have said, ATi has better IQ. And 8x+ AA from NV is hardly playable in current games. Even with a GX2. I got flamed for saying it, its nice to have a review say the same. Even if its x-bit.
They also stated that NVIDIA's AA was better, the GX2 was faster and HDR+AA is poorly implemented. I know you have your red filter enabled. :roll:
* Nvidia's Transparent AA Multi-Sampling hardly does any job at all.
* ATI's "Performance" Adaptive AA is clearly better than Nvidia's Transparent AA multi-sampling, but is obviously not as good as "Quality" Adaptive AA.
* Nvidia's Transparent AA Super-Sampling is a little bit more accurate than ATI's "Quality" Adaptive AA.
Originally posted by: Ackmed
About HDR+AA, yes they did say that. I dont think its hard at all. You download the fix, install it, and force the AA while selecting HDR in the game. Obviously its not that hard. Their opinion is their opinion.
Originally posted by: Ackmed
It is if you've got a manly enough PSU to run any card.
They had to set NV's to high quality to even get close to ATI's non-shimmering levels.
"And 8x+ AA from NV is hardly playable in current games. Even with a GX2."

What the hell are you talking about? It's plenty playable at 1600x1200 and even 1920x1200 if you use 8xSLI in HL2. It's also faster than the Radeon's 6xAA in Oblivion indoors, and neither card can do 8x/6x outdoors, so that's a moot point.
"Looks like they give a 2 to 1 edge to ATi."

Err, transparent multi-sampling doesn't actually work in HL2, so their comments on that are meaningless. In games where it does work it significantly reduces vegetation shimmer (probably even better than raw 4xSSAA) with literally zero performance hit. It's amazing for games like Far Cry and Call of Duty 2.
"About HDR+AA, yes they did say that. I dont think its hard at all. You download the fix, install it, and force the AA while selecting HDR in the game."

Yes, that isn't so bad. The problem is ATi's shocking Crossfire implementation, where you had to resort to renaming application executables in the hope that you'd randomly stumble onto the right one. Even now all you can do is force AFR in Direct3D games and hope for the best.
"And 8x+ AA from NV is hardly playable in current games. Even with a GX2. I got flamed for saying it, its nice to have a review say the same."

They also said that ATI's HDR+AA in Oblivion was difficult to implement, and you were quick to jump on that inaccuracy rather than their talk about 8xAA.
"About HDR+AA, yes they did say that. I dont think its hard at all."

When Xbit discredits an ATI feature it doesn't seem true to your ears, yet when they apply the same technique to Nvidia's features you applaud in agreement.
"That being said, the discussion I had with others was not about that, it was about ATi having better IQ, and 8x+ AA not being very playable in newer games."

No, it was about you claiming that "X" amount of frames was playable with ATI hardware in HDR+AA situations but turning around and saying that that same "X" amount of frames wasn't playable on Nvidia hardware in 8xAA situations. I'm convinced you have a reading comprehension disorder.
"This article somewhat validates my argument."

How? By saying that HDR+AA is a big hassall? You nit-pick the parts you think validate it and chop up the fluidity of the review by concentrating on the red trespasses. Drop the selective reading.
"And yet, the GPUs manage to suck down less power than ATI's one, so it's not a win-win situation."

This is true, but if I'm spending the amount for an X1950XTX, I wouldn't be doing so unless I knew I could run it. Besides, the difference in power consumption between the two is not as great as you think. In fact, the reference figure that Nvidia gave Xbit was higher than the X1950XTX's.
"These cards should have comparable power consumption, though. By our estimate, the dual-chip solution from Nvidia consumes about 110-120W. Not yet having any accurate data about the power consumption of the GeForce 7950 GX2, we have to name the Radeon X1950 XTX the most voracious premium-class graphics card of today. Anyway, one such card can be easily fed by any high-quality 450W ATX 2.0 power supply."

While it is impressive that it takes two 7-series GPUs to reach about the same power consumption as one X1k, when we consider how much more complicated a GPU like the X19 series is, it becomes more understandable. Their GPUs can handle more diverse instructions than current games require (i.e. better physics, superior floating-point calculations, and dynamic branching for Folding@Home). Not to mention, people who want G80s will need PSUs about as good as, if not better than, the ones that power the current X1k cards, so those who invested in a good PSU for the X1k series may have saved themselves a little money for when the DX10 cards hit.
Originally posted by: Trevelyan
Well since the x1950xt is faster in Source engine games, I'll probably get it.
Originally posted by: Matt2
Originally posted by: Trevelyan
Well since the x1950xt is faster in Source engine games, I'll probably get it.
Umm... are u guys pulling these numbers out of your butts?
Look at the HL2: E1 benchmark.
1600x1200
FSAA 4x + Aniso 16x
Radeon X1950XTX | HQAF---------77.1
7950GX2 | High Quality------------87.9
Originally posted by: Wreckage
Originally posted by: Ackmed
About HDR+AA, yes they did say that. I dont think its hard at all. You download the fix, install it, and force the AA while selecting HDR in the game. Obviously its not that hard. Their opinion is their opinion.
So in other words you only support their opinion when it's your opinion..... LOL! You have gone from one sided to just plain comedy.
Just pick the points from any given article that supports your view and ignore and or refute the rest. Bash a site like Hardocp until they come out with a pro ATI article and then say how good it is, etc.
Yawn...
While I would hate to see AMD remove ATI from the high end market as it would ruin competition and pricing, at least it would render you null and void on most forums.
Originally posted by: josh6079
"And 8x+ AA from NV is hardly playable in current games. Even with a GX2. I got flamed for saying it, its nice to have a review say the same."

They also said that ATI's HDR+AA in Oblivion was difficult to implement, and you were quick to jump on that inaccuracy rather than their talk about 8xAA.
Example:
"About HDR+AA, yes they did say that. I dont think its hard at all."

When Xbit discredits an ATI feature it doesn't seem true to your ears, yet when they apply the same technique to Nvidia's features you applaud in agreement.
"That being said, the discussion I had with others was not about that, it was about ATi having better IQ, and 8x+ AA not being very playable in newer games."

No, it was about you claiming that "X" amount of frames was playable with ATI hardware in HDR+AA situations but turning around and saying that that same "X" amount of frames wasn't playable on Nvidia hardware in 8xAA situations. I'm convinced you have a reading comprehension disorder.
"This article somewhat validates my argument."

How? By saying that HDR+AA is a big hassall? You nit-pick the parts you think validate it and chop up the fluidity of the review by concentrating on the red trespasses. Drop the selective reading.
ATI Radeon X1950 XTX produces higher image quality under our settings due to high-quality anisotropic filtering;
Nvidia GeForce 7950 GX2 could produce higher quality antialiasing modes than the Radeon X1950 XTX, but they are hardly useful for modern games;
"I discussed it. I dont see why they think its hard. You turn off Cat A.I. and force AA. Two changes in the drivers, and its done."

Yes Ackmed, I did see that you discussed the technicalities. Good job there, bub, you get a sticker for the day. The point is that you discredit one of their claims faulting ATI and credit the claims faulting Nvidia.
"I cant comprehend? Why the continued insults?"

It's not an insult, just an observation that you continually prove to be accurate.
"Sorry, that would be you."

:roll: Funny, I just saw this same tactic from a child the other day. **I know you are but what am I?**
"I said it very plainly, and many times in the last thread."

And I made a very clear cliff-note for you in that thread as well, and you still don't understand why you were wrong.
"The two games talked about were Oblivion, and Quake 4. I said that getting around 30 frames is ok for Oblivion with HDR+AA, because its a slow paced game, and not multiplayer. I said around 30 fps wasnt good enough for Quake 4 with 8xAA (its actually even lower than that), because its a twitch FPS shooter, and has multiplayer."

No, you said that 54 frames was borderline playable, only to narrow that claim to online shooters after BFG10K proved you wrong with your own preferences. The only reason you started talking about online games like Q4 is because the single-player games with FP16 HDR+AA got lower than 54 frames. BFG10K then gave links to Far Cry (a game with an online multiplayer that you haven't played) and showed that it still got less than 54 frames. You then claimed that the online multiplayer in a first-person shooter you never played "isn't a twitch shooter".
"There is no bias in that statement."

Except you conveniently leave out the fact that someone more credible than yourself gets playable frames in Q4 with 8xS and doesn't have a GX2. But that would be showing both sides of the argument instead of one, wouldn't it...
"Got hasslehoff on your mind?"

How old are you?
"I didnt nit pick anything. They made two points, that agreed with what I said. Lets look at them again"

They made more than two; that is just how many you picked.
"Its just another site that agrees with me, in that ATi has better IQ, and 8xAA+ is not very useful in new games."

They also said that HDR+AA isn't very useful in games, but you objected to that claim. This is what I mean by selective reading.
Originally posted by: josh6079
"I discussed it. I dont see why they think its hard. You turn off Cat A.I. and force AA. Two changes in the drivers, and its done."

Yes Ackmed, I did see that you discussed the technicalities. Good job there, bub, you get a sticker for the day. The point is that you discredit one of their claims faulting ATI and credit the claims faulting Nvidia.
"I cant comprehend? Why the continued insults?"

It's not an insult, just an observation that you continually prove to be accurate.
"Sorry, that would be you."

:roll: Funny, I just saw this same tactic from a child the other day. **I know you are but what am I?**
"I said it very plainly, and many times in the last thread."

And I made a very clear cliff-note for you in that thread as well, and you still don't understand why you were wrong.
"The two games talked about were Oblivion, and Quake 4. I said that getting around 30 frames is ok for Oblivion with HDR+AA, because its a slow paced game, and not multiplayer. I said around 30 fps wasnt good enough for Quake 4 with 8xAA (its actually even lower than that), because its a twitch FPS shooter, and has multiplayer."

No, you said that 54 frames was borderline playable, only to narrow that claim to online shooters after BFG10K proved you wrong with your own preferences. The only reason you started talking about online games like Q4 is because the single-player games with FP16 HDR+AA got lower than 54 frames. BFG10K then gave links to Far Cry (a game with an online multiplayer that you haven't played) and showed that it still got less than 54 frames. You then claimed that the online multiplayer in a first-person shooter you never played "isn't a twitch shooter".
There is no bias in that statement.
Except you conveniently leave out the fact that someone more credible than yourself gets playable frames in Q4 with 8xS and doesn't have a GX2. But that would be showing both sides of the argument instead of one, wouldn't it...
Got hasslehoff on your mind?
How old are you?
I didnt nit pick anything. They made two points, that agreed with what I said. Lets look at them again
They made more than two, that is just how many you picked.
"Its just another site that agrees with me, in that ATi has better IQ, and 8xAA+ is not very useful in new games."

They also said that HDR+AA isn't very useful in games, but you haven't objected to that claim. This is what I mean by selective reading.
As you can see, Elder Scrolls: Oblivion looks remarkable with FSAA and HDR turned on. Unfortunately, we could not obtain the benchmark numbers for you to see, as performance of Oblivion with HDR+AA varies substantially, which does not allow us to present validated benchmark result: every new manual test run brings a new number.
What we can say about current state of HDR+AA support is that it is here, it provides some additional eye-candy and is generally a nice feature. However, given all the difficulties with enabling the capability, gamers should really think twice before considering this feature seriously.
ATI Radeon X1950 XTX theoretically can enable FSAA with FP16 HDR, which is an advantage, however, it does not have the feature supported flawlessly at the moment and it may not provide sufficient performance;
"The point is, I said they have their own opinion, and nothing is wrong with that. I also said that myself, and others in this very thread said enabling HDR+AA wasnt hard. Do you think it is?"

No, I don't think it is difficult to enable, just like I don't think using 8xAA is difficult for a 7950GX2. The difference is that your opinion is irrational, since you only disregard inaccuracies against ATI instead of both vendors.
"Except not, I proved you were wrong again later in the post."

Where? Where have you demonstrated that you don't have a reading comprehension disorder? Every "point" you think they made isn't the entire point but rather a fragment of it that you've picked out to try and glorify ATI. Like I said, they were wrong about HDR+AA being complicated to enable, but I took it a step further than your biased outlook would and saw that they were incorrect about the 8xAA claims as well.
"You try to call me childish, when you name call?"

Once again you misread. Where did I say you were childish? I said your trolling is childish, since you refuted my claim with a paraphrased "I know you are but what am I" statement.
"The reason why I started to talk about Quake 4, was because he provided a link of his own work, and said Quake 4 was playable if you dropped the res to 1600x1200."

Exactly. You then went on a tangent claiming that Q4 with 8xAA wasn't universally playable because it wasn't playable at 19x12, your specific resolution. I'm willing to bet there are more people with resolutions of 16x12 or lower than there are of 19x12.
"Yes I did say that Farcry isnt a twitch shooter, its not. There is a LOT of sneaking around, and little running and gunning."

You never played the multiplayer. If you really want to be non-biased in this statement, why not say "for me" again, since you're only going off of what you experienced? Trying to get the first shot on someone was all about twitch and speed.
"And yes, that is borderline playable for an online shooter."

Using X1900 or greater CrossFire platforms, at least. This is why BFG10K, I, and many others have told you that the performance hit of HDR+AA is very similar to the performance hit with 8xAA.
"Next to nobody plays Farcry multi anymore, or ever did."

Proof? I'm not doubting you, but that doesn't detract from the fact that ATI could barely do it. Are we now determining the playability of a game at certain settings off of its popularity? :roll:
"And at what res? Ive said many times, for me I couldnt get playable frames. And several reviews couldnt as well. What res, and what frames? I dont like slide shows, I guess they do."

Once again, your reading couldn't comprehend what he said. BFG10K claimed that he got ~40fps at 16x12 with 8xAA in Q4. 40fps is not a slideshow, even in a first-person shooter. When utilizing vsync and triple buffering, the frames like to hover around that mark anyway.
"Almost 32, why? You made a funny typo, and I made a joke."

How is the typo "hassall" indicative of the name Hassalehoff? If anything, these names would be closer. Why are you the one thinking of the male lifeguard?
"You claimed I didnt have a sense of humor the other day. Now you're flip-flopping and acting as if its a bad thing now. Good job."

No, having a sense of humor is fine, and I commend you for having it. It's just now clear that your sense of humor is immature instead of witty. Better than nothing though :thumbsup:
"Yes, they did. I picked those two, because those are the two some people didnt agree with. Now a review backs me up."

To say that a whole review "backs you up" would mean that the whole review supported what you said. It did not. You only picked the parts that did. It's the same thing that you did here:
Did they? Here is what they really said:
As you can see, Elder Scrolls: Oblivion looks remarkable with FSAA and HDR turned on. Unfortunately, we could not obtain the benchmark numbers for you to see, as performance of Oblivion with HDR+AA varies substantially, which does not allow us to present validated benchmark result: every new manual test run brings a new number.
What we can say about current state of HDR+AA support is that it is here, it provides some additional eye-candy and is generally a nice feature. However, given all the difficulties with enabling the capability, gamers should really think twice before considering this feature seriously.
"Unfortunately, we could not obtain the benchmark numbers for you to see, as performance of Oblivion with HDR+AA varies substantially, which does not allow us to present validated benchmark result..."

Yet you did not bold that part, because it would discredit the ATI feature of HDR+AA.
"Where did they say its not very useful?"

The part that you quoted: "...may not provide sufficient performance"
"The only negative thing they said, was that its hard to enable."

Once again, you misread. They also said that it "...may not provide sufficient performance".
"I did not troll, that would be you and your pals."

So if more than one person disagrees with you, they're automatically pals?
"Me childish? Who calls who names again?"

Once again you misread. I said you're trolling, and the way you are doing it is childish.
"Again, that would be you, and thats acting childish."

:roll: So your argument is, "Nuh-uh! You are!"
"But hey, Ill be gone next week."

Promise?
"Then who will you troll after?"

Whoever else can't think rationally.
Originally posted by: Ackmed
But hey, Ill be gone next week. Then who will troll for me?
Originally posted by: BFG10K
"And 8x+ AA from NV is hardly playable in current games. Even with a GX2."

What the hell are you talking about? It's plenty playable at 1600x1200 and even 1920x1200 if you use 8xSLI in HL2. It's also faster than the Radeon's 6xAA in Oblivion indoors, and neither card can do 8x/6x outdoors, so that's a moot point.
Of course Oblivion is vastly more demanding than most other games where 8xS is usable on even single cards.
"Looks like they give a 2 to 1 edge to ATi."

Err, transparent multi-sampling doesn't actually work in HL2, so their comments on that are meaningless. In games where it does work it significantly reduces vegetation shimmer (probably even better than raw 4xSSAA) with literally zero performance hit. It's amazing for games like Far Cry and Call of Duty 2.
As for their super-sampling/quality comparison, their comments may be true for their particular screenshot of a zoomed wire fence, but it's certainly not true overall.
ATi's AAA is inferior to nVidia's TrAA.
ATi's vegetation still shimmers sometimes while nVidia's is literally perfect; also AAA doesn't affect the same distances that TrAA does. Additionally AAA has compatibility glitches in many Direct3D games while TrAA works perfectly in every Direct3D game I throw at it.
"About HDR+AA, yes they did say that. I dont think its hard at all. You download the fix, install it, and force the AA while selecting HDR in the game."

Yes, that isn't so bad. The problem is ATi's shocking Crossfire implementation, where you had to resort to renaming application executables in the hope that you'd randomly stumble onto the right one. Even now all you can do is force AFR in Direct3D games and hope for the best.
How do we force Crossfire into OpenGL games Ackmed? How do we enable scissors or super-tiling mode into Direct3D games Ackmed? We've had 1 year of Crossfire and it's still a total joke.
nVidia's profile system is vastly superior.
No, I don't think it is difficult to enable, just like I don't think using 8xAA is difficult for a 7950GX2. The difference is that your opinion is irrational, since you only disregard inaccuracies against ATI instead of both vendors.
Where? Where have you demonstrated that you don't have a reading comprehension disorder? Every "point" you think they made isn't the entire "point" but rather a fragment of it that you've picked out to try and glorify ATI. Like I said, they were wrong about the HDR+AA being complicated to enable but I took it a step further than your biased outlook would see and saw that they were incorrect about the 8xAA claims as well.
Once again you misread. Where did I say you were childish? I said your trolling is childish since you refuted my claim with a paraphrased, "I know you are but what am I" statement.
You never played the multi-player. If you're really wanting to be non-biased in this statement, why not say, "for me" again since you're only going off of what you experienced. Trying to get the first shot on someone was all about twitch and speed.
Using X1900 or greater CrossFire platforms at least. This is why BFG10K, me and many others have told you that the performance hit of HDR+AA is very similar to the performance hit with 8xAA.
Proof? I'm not doubting you but that doesn't detract from the fact that ATI could barely do it. Are we now determining the playability of a game with certain settings off of its popularity?
Once again, your reading couldn't comprehend what he said. BFG10K claimed that he got ~40fps at 16x12 with 8xAA in Q4. 40fps is not a slide show, even in a first person shooter. When utilizing vsync and triple buffering the frames like to hover around that mark anyways.
How is the typo "hassall" indicative of the name Hassalehoff? If anything these names would be closer. Why are you the one thinking of the male life guard?
Yet you did not bold that part because it would discredit the ATI feature of HDR+AA.
The part that you quoted: "...may not provide sufficient performance"
Once again, you misread. They also said that it "...may not provide sufficient performance"
Originally posted by: josh6079
"But hey, Ill be gone next week."

Promise?
Originally posted by: josh6079
"Then who will you troll after?"

Whoever else can't think rationally.
"So you agree with me, that its not hard at all to enable HDR+AA for ATi. Yet you want to claim that I am wrong in saying so at the same time... nice."

You are constantly missing the points of sentences. I never said that you were "wrong" for saying that HDR+AA is easy to enable. I agree with you as far as implementing it goes, and it is indeed easy to do. However, you stopped there. You didn't take into consideration that since they made the mistake about HDR+AA being difficult, they made the mistake about 8xAA not being playable as well.
"I made two points, rather clearly. And they were backed up by this article. ATi has better IQ, and NV's 8xAA+ is too slow to use in newer games most of the time."

The atmosphere your sentences project is different from what the articles were saying, however. Your sentences leave out some of their other points. HDR+AA was too slow to use in their benches and isn't common in a lot of popular, newer games. You even said here that it is too slow for you to use currently. Therefore it is "too slow to use in newer games most of the time", making it about as useful as 8xAA, an idea that you couldn't fathom in that thread. It is out there, and there are some very fun games that use FP16 HDR+AA, but compared to how many games can implement 8xAA it isn't even close.
"I didnt misread anything. A week or so ago, you resorted to name calling again."

Where? This is another lie from you. Put a link up if you think I did.
"But I guess that doesnt count now? I didnt troll anything."

:roll:
"You, and your pals quoted and replied to me... and you and your pals trolled. Pretty simple."

This is amusing. You think that if someone quotes you and replies to you, they're automatically trolling. Heaven forbid we question Ackmed... that would be trolling... :roll:
"There was no tangent. Lets try and stick to facts. I didnt say it wasnt universally playable."

You disregarded the fact that someone found it playable at a resolution that wasn't yours here:
"Even a link from someone else trying to say it was playable... And it was at 1600x1200, not my res of 1920x1200."

If it couldn't do it at your uncommon resolution, you didn't care if someone else found it playable at 16x12.
"Even at 1600x1200, reviews do not show it playable. Even bfg10k doesnt show newer games playable. He does show older games (10+ years) playable, which agrees with what I said."

This is what I mean by poor reading comprehension. He clearly said that he gets ~40fps in Q4 at 16x12, and Q4 isn't 10+ years old.
"I played the multiplayer. I said I didnt like it..."

You also said that you didn't play it here:
"I didn't play Farcry's multi after it first came out..."

Flip-flopping now?
"There is much more sneaking around in Farcry, than running and gunning."

Once again, are you implying that ATI's FP16 HDR+AA is only useful for X1900(50) CrossFire setups and slow-paced games?
"8xAA takes more of a performance hit than HDR+AA does."

Considering you have to have a CrossFire setup and slow-paced games before you can use it, I don't know how you can think so. If Q4 could do HDR+AA, I don't think it would be pretty. However, it seems to handle 8xAA just fine for people with the right cards.
"So you agree with me again, and just want to argue for the sake of it. Nice..."

BF2 has also lost some multiplayer numbers because it has been out so long and people have gotten tired of it. Multiplayer games lose popularity over time; all games do, in fact. I wasn't saying so just to argue, but you were trying to detract from the playability of Far Cry's HDR+AA by claiming that it wasn't popular anymore, when the popularity of a game has nothing to do with how it performs under certain settings. Considering a 1.4 patch just came out for it, obviously people still play it.
"The numbers from Farcry that he submitted were at 1920x1200, at 54fps."

With two X1950XTXs in CrossFire. You can barely find an X1950XTX CrossFire master card anywhere at the moment, and even if you could, that is ~$800 of graphics hardware pushing a low 54 frames.
"Earlier you said that people could run 8xAA at lower resolutions, and get better frames. Well guess what, it works both ways. You can run at lower resolutions to get better frames in Farcry as well."

I never said that you couldn't do that. My point was that Nvidia cards that cost as much as these X1900(50)XTX CF setups can use 8xAA almost flawlessly in a wider range of games than ATI's HDR+AA.
"I dont recall this. With an average of 40fps, thats not going to get it done for me in multiplayer."

Did I say I cared whether it would get the job done for you? Instead of being narcissistic and thinking that everyone cares about what is playable for Ackmed, you should consider that some don't like screen tearing. When enabling vsync plus triple buffering, the highest frame rate you can get is 60. Since 60 is on the verge of your minimum requirement, I guess you don't mind graphics that tear all over your input-lagging monitor.
"I also doubt his numbers came from playing a multiplayer game. With an average of 40, minimum is much, much lower."

Or the highest is lower. If the highest frame rate is capped at 60, of course your average is going to be less. I play with an average of 40fps in my multiplayer games and do fine. In fact, my minimum frames are sometimes better with vsync and triple buffering enabled than without. Obviously you can't play with ~40fps, and that's completely fine; it's your own preference. That doesn't mean that others who were more practical in their choice of monitors can't play it.
"Also, other reviews disagree, and claim its not playable."

As do they with HDR+AA. The same review this thread is discussing, one that you have quoted claiming it supports your statements, also said that HDR+AA may not give sufficient performance and isn't a common feature among a lot of titles.
"Once again, your flip-flopping is astounding. Because it sounds the same? You had a typo of hassal, instead of hassle."

Having a sense of humor is fine; I just noted that yours is an immature one rather than a witty one. That's not flip-flopping.
"Why would I bold that?"

Hmmmmm... I don't know. Why wouldn't Ackmed bold the part that says HDR+AA may not provide sufficient performance and is difficult to test because of the varying frame rates...
"They gave no numbers, and claimed they couldnt get any."

How does that help your mantra of it being a good feature if they couldn't get it to work?
"Every other review has no problems with it."

Every other review hardly does it.
"Their comment isnt negative, its neutral."

How the hell is claiming that a certain feature isn't easy to implement and is heavy on the performance hit a "neutral" claim? It's clear that they had problems with it, if not in getting it working then while it was working.
"That does not say what you claimed it said, "not very useful"."

What are you on? If it "may not provide sufficient performance" then it is "not very useful". Also, the fact that you have to have two ATI cards and a slow-paced game to use it at all further establishes that it is "not very useful". Compared to 8xAA, a feature that can be used in almost any game with cards that cost the same as X1900 or greater CrossFire setups, the ability to do HDR+AA is so far a minor one.
"You rewording it to fit your agenda, doesnt make it correct."

What agenda? Before you claim that I'm trying to reword things to fit it, maybe you should define what my agenda is. I simply disagree with you. You blow things out of proportion and think that I'm "trolling after you" or trying to accomplish some hidden "agenda". I'm quoting you, not "rewording" you.
"Once again, thats not negative, its neutral."

So is the fact that you are saying that 8xAA isn't playable a "neutral" statement, since others besides you do play with it? You have now tried to say that a negative claim against ATI's HDR+AA is just a "neutral" one while a negative claim against Nvidia's 8xAA is still a negative one. Oh, don't forget to remind us that ATI can do HDR+AA where Nvidia cannot...
"Yes, I can guarantee you will not see a single post by me until at least Jan 6th. The only outside contact I will have, is via snail mail. Sorry, I wont be mailing you any rebuttals."

Seems convenient that you're "leaving" right when redbox is returning. Tell me, where were those insults he sent you via PM again?
"Thanks for admitting you troll after me."

If "trolling" is pointing out biased hypocrisies, then I guess I'm guilty.
"I do think HDR+AA is more usable overall..."

Because you can use it in a handful of games?
"Add that and the fact that HQ AF looks better, and ATi has less shimmering, and I and several reviews/articles come to the conclusion that ATi has better IQ. In fact, I have not seen a single review say that NV has better IQ than ATi overall."

Is this an advertisement? When was I debating that ATI had inferior IQ? Who are you arguing with here?
Thanks Sherlock. What does that have to do with your inaccurate opinions concerning the superiority of HDR+AA compared to 8xAA?And all of this is likely to be moot in a few months.
What card do you think I have? What features can you currently do that I can't? It takes two to argue John Nash, you're not a red saint.Except for the people who like to stand on the sidelines with old cards not even capable of todays features, and just like to argue... not that there are any of those around here.
Originally posted by: josh6079
Originally posted by: Ackmed
So you agree with me that it's not hard at all to enable HDR+AA for ATi, yet you want to claim that I am wrong for saying so at the same time... nice.

You are constantly missing the point. I never said that you were "wrong" about HDR+AA being easy to enable; I agree with you that implementing it is easy. However, you stopped there. You didn't consider that since they were mistaken about HDR+AA being difficult, they could also be mistaken about 8xAA not being playable.

Originally posted by: Ackmed
I made two points, rather clearly, and they were backed up by this article: ATi has better IQ, and NV's 8xAA+ is too slow to use in newer games most of the time.

The picture your sentences paint is different from what the articles said, because you leave out their other points. HDR+AA was too slow to use in their benches and isn't common in a lot of popular, newer games. You even said here that it is currently too slow for you to use. Therefore it is also "too slow to use in newer games most of the time", making it about as useful as 8xAA... an idea that you couldn't fathom in that thread. There are some very fun games that use FP16 HDR+AA, but compared to how many games can use 8xAA it isn't even close.

Originally posted by: Ackmed
I didn't misread anything. A week or so ago, you resorted to name calling again.

Where? This is another lie from you. Put up a link if you think I did.

Originally posted by: Ackmed
You and your pals quoted and replied to me... and you and your pals trolled. Pretty simple.

This is amusing. You think that if someone quotes you and replies to you they're automatically trolling. Heaven forbid we question Ackmed... that would be trolling... :roll:
Again, where can you demonstrate that the people who question your posts are "pals"? Put up proof for these claims or don't post them at all.
Originally posted by: Ackmed
There was no tangent. Let's try and stick to facts. I didn't say it wasn't universally playable.

You disregarded the fact that someone found it playable at a resolution that wasn't yours here:

Originally posted by: Ackmed
Even a link from someone else trying to say it was playable... And it was at 1600x1200, not my res of 1920x1200.

If it couldn't do it at your uncommon resolution, you didn't care whether someone else found it playable at 16x12.

Originally posted by: Ackmed
Even at 1600x1200, reviews do not show it playable. Even bfg10k doesn't show newer games playable. He does show older games (10+ years) playable, which agrees with what I said.

This is what I mean by poor reading comprehension. He clearly said that he gets ~40fps in Q4 at 16x12, and Q4 isn't 10+ years old.
Originally posted by: Ackmed
I played the multiplayer. I said I didn't like it...

You also said that you didn't play it here:

Originally posted by: Ackmed
I didn't play Far Cry's multi after it first came out...

Flip-flopping now?

Originally posted by: Ackmed
There is much more sneaking around in Far Cry than running and gunning.

Once again, are you implying that ATI's FP16 HDR+AA is only useful for X1900(50) CrossFire setups and slow-paced games?

Originally posted by: Ackmed
8xAA takes more of a performance hit than HDR+AA does.

Considering you have to have a CrossFire setup and a slow-paced game before you can use it, I don't know how you figure. If Q4 could do HDR+AA I don't think it would be pretty. However, it seems to handle 8xAA just fine for people with the right cards.
Originally posted by: Ackmed
So you agree with me again, and just want to argue for the sake of it. Nice...

BF2 has also lost some multiplayer numbers because it has been out so long and people have gotten tired of it. All multiplayer games lose popularity over time. I wasn't saying so "just to argue": you were trying to detract from the playability of Far Cry's HDR+AA by claiming the game wasn't popular anymore, when the popularity of a game has nothing to do with how it performs under certain settings. Considering a 1.4 patch just came out for it, obviously people still play it.

Originally posted by: Ackmed
The numbers from Far Cry that he submitted were at 1920x1200, at 54fps.

With two X1950XTXs in CrossFire. You can barely find an X1950XTX CrossFire master card anywhere at the moment, and even if you could, that is ~$800 of graphics hardware pushing a low 54 frames.

Originally posted by: Ackmed
Earlier you said that people could run 8xAA at lower resolutions and get better frames. Well guess what, it works both ways. You can run at lower resolutions to get better frames in Far Cry as well.

I never said that you couldn't. My point was that Nvidia cards costing as much as these X1900(50)XTX CF setups can use 8xAA almost flawlessly in a wider range of games than ATI's HDR+AA.

Originally posted by: Ackmed
I don't recall this. With an average of 40fps, that's not going to get it done for me in multiplayer.

Did I say I cared whether it would get the job done for you? Instead of being narcissistic and thinking everyone cares about what is playable for Ackmed, you should consider that some people don't like screen tearing. With vsync + triple buffering enabled on a 60Hz display, the highest framerate you can get is 60. Since 60 is on the verge of your minimum requirement, I guess you don't mind graphics that tear all over your input-lagging monitor.

Originally posted by: Ackmed
I also doubt his numbers came from playing a multiplayer game. With an average of 40, the minimum is much, much lower.

Or the highest is lower. If the framerate is capped at 60, of course your average is going to be less. I play my multiplayers at an average of 40fps and do fine. In fact, my minimum frames are sometimes better with vsync + triple buffering enabled than without. Obviously you can't play at ~40fps, and that's completely fine; it's your own preference. That doesn't mean that others, who were more practical in their choice of monitors, can't.
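The effect of a vsync cap on a reported average is simple arithmetic. A minimal sketch (the per-second fps samples are made up purely for illustration):

```python
def average_fps(samples, cap=None):
    """Average a list of per-second fps samples, optionally clamping
    each sample to a vsync cap (e.g. 60 on a 60Hz monitor)."""
    capped = [min(s, cap) if cap is not None else s for s in samples]
    return sum(capped) / len(capped)

# Hypothetical uncapped samples: peaks well above 60, one dip to 40.
samples = [90, 80, 70, 60, 40]

print(average_fps(samples))          # uncapped average: 68.0
print(average_fps(samples, cap=60))  # vsync-capped average: 56.0
```

The minimum sample (40) is untouched, yet the cap alone pulls the average from 68 down to 56, which is the point being argued: a ~40fps average recorded under vsync does not imply a proportionally lower minimum.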
Originally posted by: Ackmed
Also, other reviews disagree, and claim it's not playable.

As they do with HDR+AA. The same review this thread is discussing, one that you quoted claiming it supports your statements, also said that HDR+AA may not give substantial performance and isn't a common feature among a lot of titles.

Originally posted by: Ackmed
Once again, your flip-flopping is astounding. Because it sounds the same? You had a typo of hassal, instead of hassle.

Having a sense of humor is fine; I just noted that yours is an immature one rather than a witty one. That's not flip-flopping.

Originally posted by: Ackmed
Why would I bold that?

Hmmmm... I don't know, why wouldn't Ackmed bold the part that says HDR+AA may not provide substantial performance and is difficult to test because of the varying frames...

Originally posted by: Ackmed
They gave no numbers, and claimed they couldn't get any.

How does that help your mantra of it being a good feature if they couldn't get it to work?

Originally posted by: Ackmed
Every other review has no problems with it.

Every other review hardly tests it.

Originally posted by: Ackmed
Their comment isn't negative, it's neutral.

How the hell is claiming that a feature isn't easy to implement and is heavy on the performance hit a "neutral" claim? It's clear they had problems with it, if not getting it working then once it was working.
Originally posted by: Ackmed
That does not say what you claimed it said, "not very useful".

What are you on? If it "may not provide sufficient performance" then it is "not very useful". Also, the fact that you yourself need two ATI cards and a slow-paced game to use it only reestablishes that it is "not very useful". Compared to 8xAA, a feature that can be used in almost any game on cards that cost the same as an X1900-or-greater CrossFire setup, the ability to do HDR+AA is so far a minor one.

Originally posted by: Ackmed
You rewording it to fit your agenda doesn't make it correct.

What agenda? Before you claim that I'm rewording things to fit it, maybe you should define what my agenda is. I simply disagree with you. You blow things out of proportion and think that I'm "trolling after you" or pursuing a hidden "agenda". I'm quoting you, not "rewording" you.

Originally posted by: Ackmed
Once again, that's not negative, it's neutral.

So is the fact that you are saying 8xAA isn't playable a "neutral" statement, since others besides you do play with it? You have now tried to say that a negative claim against ATI's HDR+AA is just a "neutral" one while a negative claim against Nvidia's 8xAA is still a negative one. Oh, and don't forget to remind us that ATI can do HDR+AA where Nvidia cannot...

Originally posted by: Ackmed
Yes, I can guarantee you will not see a single post by me until at least Jan 6th. The only outside contact I will have is via snail mail. Sorry, I won't be mailing you any rebuttals.

Seems convenient that you're "leaving" right when redbox is returning. Tell me, where were those insults he sent you via PM again?

Originally posted by: Ackmed
Thanks for admitting you troll after me.

If "trolling" is pointing out biased hypocrisies, then I guess I'm guilty.

Originally posted by: Ackmed
I do think HDR+AA is more usable overall...

Because you can use it in a handful of games?

Originally posted by: Ackmed
Add that to the fact that HQ AF looks better and ATi has less shimmering, and I, along with several reviews/articles, come to the conclusion that ATi has better IQ. In fact, I have not seen a single review say that NV has better IQ than ATi overall.

Is this an advertisement? When was I debating that ATI had inferior IQ? Who are you arguing with here?

Originally posted by: Ackmed
And all of this is likely to be moot in a few months.

Thanks, Sherlock. What does that have to do with your inaccurate opinions about the superiority of HDR+AA compared to 8xAA?

Originally posted by: Ackmed
Except for the people who like to stand on the sidelines with old cards not even capable of today's features, and just like to argue... not that there are any of those around here.

What card do you think I have? What features can you currently use that I can't? It takes two to argue, John Nash; you're not a red saint.
Originally posted by: Ackmed
I don't recall the thread.

You made the post "you're a fanboy if you continue to make posts like this." We discussed it several times in the thread. You know you said it, as do I.