The AA/AF levels between the R300 and NV30 aren't directly comparable. The NV30 looks like ass compared to the R300 at the same levels, so you have to pump up the AA/AF on the NV30, which drops the performance.
Originally posted by: Rollo
Originally posted by: crazySOB297
I don't think so. I'm sorry, but the performance of that card was way too low for anyone. Most nVidia fanboys would even concede defeat that generation. The fan was loud, but the performance was just flat-out horrible.
The 5800U can't compare on modern games because of its shader limitations. On the games of its time it was much more competitive:
http://www.anandtech.com/showdoc.aspx?i=1821
You are correct that the useful life of the 5800U was one year due to the shader limitations. I never keep a card more than a year, so for anyone who upgrades annually, the 5800U wasn't as bad as it looks now.
Originally posted by: Chadder007
Umm...you forgot the Matrox Parhelia....or whatever it's called.
Originally posted by: Chadder007
Originally posted by: Chadder007
Umm...you forgot the Matrox Parhelia....or whatever it's called.
....looking throughout the whole thread, I'm surprised no one else has mentioned this.
Originally posted by: Chadder007
Originally posted by: Chadder007
Umm...you forgot the Matrox Parhelia....or whatever it's called.
....looking throughout the whole thread, I'm surprised no one else has mentioned this.
What's new?
Originally posted by: rbV5
Typical overstated, exaggerated nonsense hype from Rollo.
SLI didn't even exist at that time and still didn't exist when you picked up a vanilla 6800. Likewise, you deemed titles like HL2 and Far Cry irrelevant until nVidia started winning thanks to SLI. SLI wasn't even a factor; you just changed your tune when it arrived.
I bought the X800XT PE late last year knowing I would likely be replacing it with SLI,
Originally posted by: M0RPH
R520 and up cards will not have the resolution limitation in Crossfire
Not quite the spectacular failure Rollo is making it out to be, is it?
lol. They COULD have done it. Pretty darn irrelevant. Of course they could have done it. They could simply have released a whole new X8xx line or something. The point is they didn't, and never will.
In discussions with ATI regarding this drawback to the X8xx Crossfire Edition cards, they stressed that they could have made the maximum resolution and refresh rates higher but chose not to in order to support current customers.
So basically ATi is saying that they want people to use an R520 together with an R420 in Crossfire? I just don't see that working. Just imagine all the driver problems that would cause: running older hardware (and, if we believe ATi, a completely different architecture) together with brand-new, faster, SM3.0 hardware. I rather think ATi just could not do it, or didn't bother implementing the (much better, IMO) internal link that nVidia uses.
We took the opportunity to ask ATI about the reason for implementing the DVI connection instead of an internal hardware link like the nVidia product. The company stated that an internal connection would have alienated existing X8xx series customers: if the new cards had such a link, then the existing cards would not have a way to interface with the new cards. Going forward, ATI felt that a DVI connection was the "best solution for all existing and future card owners."
Originally posted by: ronnn
Yes, the Crossfire mobo is looking very good - for a run at the speed crown. I guess if it wins the big 3DMark derby, it will be hard to call it a failure. But no doubt many people will say the speed crown is unimportant (myself being one, actually).
edit: Does anyone really care if the X800 line is Crossfire compatible? Why put in two cards when one will do?
Originally posted by: 5150Joker
5800 U was pretty bad but at this point I'd say Crossfire takes the cake.
Originally posted by: Rollo
Wrong again BFG.
I bought my X800XT PE on Dec. 14, 2004, long after SLI had been reviewed and was on sale. Just looked up my PayPal receipt for the date.
You only post to me to flame me for BS reasons.
You're running two-year-old games at high res? Big deal - they're still two-year-old games. It angers you for some reason that I play current games at 19x14 with 4X/8X? Oh noes! I bet they still look better than they do at the lower settings you have to run on your hardware.
If I turned on 16X AF, you'd just switch to something else to mindlessly flame. We don't all have to run things at exactly the same settings you would to be gamers, BFG.
You might drop the BS "how many games have you finished lately" babble as well.
I game daily, whether it's hopping into a UT2004 online match for half an hour or playing a level of one of my many games. You'll have to forgive me for having a life outside of this, but I have a son to raise, other hobbies, a career, a wife, and a home to maintain. I don't have hours-long stretches of time to sit down and finish games.
I feel sorry for you.
That article made no sense because they ended it like this....
Originally posted by: M0RPH
R520 and up cards will not have the resolution limitation in Crossfire
Not quite the spectacular failure Rollo is making it out to be, is it?
According to ATI, this current maximum is necessary to support as many current customers as possible, and the dongle (external cable) would not be replaced any time soon.
Further proof that THG sucks.
Originally posted by: Wreckage
That article made no sense because they ended it like this....
Originally posted by: M0RPH
R520 and up cards will not have the resolution limitation in Crossfire
Not quite the spectacular failure Rollo is making it out to be, is it?
According to ATI, this current maximum is necessary to support as many current customers as possible, and the dongle (external cable) would not be replaced any time soon.
Further proof that THG sucks.