Originally posted by: Lonyo
I've never had a problem with ATi drivers, maybe little niggles, but the same can be said for nVidia drivers as well.
Also, doesn't Vista have a PS2.0 requirement, which means cards older than the R300, which don't support PS2.0, can't work in Vista anyway?
As a result, it was implied, there will be two levels of Vista-supporting hardware: one for DX9, the other for DX10.
The ATI Crossfire AMD has every option a serious overclocker could wish for. This extends from CPU voltages that will excite even water-cooling and phase-change enthusiasts, to memory voltages with enough overclocking reserve for the most demanding OCZ VX and Mushkin Redline memory. In between are voltage adjustments for the chipset, HT Link, PCIe 1.2V, and PCIe 1.8V. Add a CPU clock frequency adjustment range of 200 to 500MHz, PCIe from 100 to 200MHz, a slew of memory settings from DDR200 to DDR500, and every memory tweak known to exist, and you have an incredibly serious board for the overclocker and computer hobbyist.
In case the message is not crystal clear: it doesn't matter whether you want ATI Crossfire or not when you are considering buying an ATI chipset motherboard. Crossfire is slated for mainstream pricing, so it is definitely worthwhile to consider an ATI Crossfire AMD motherboard to drive your nVidia 7800GTX or 6800 Ultra, your ATI X850, or the upcoming ATI X1800. They will all perform very well with any Athlon 64 processor on the ATI Crossfire AMD motherboard.
In the end, the ATI Crossfire AMD is without a doubt the best enthusiast-oriented Reference Board that we have ever seen - with performance to match.
Originally posted by: ElFenix
meh, the V5 wasn't that big a failure. it was at least competitive at some things, and the AA really moved the industry forward (although nvidia managed to do the same thing in drivers by the time the V5 was released). they had an extra featureset, it just wasn't what the market (being large OEMs trying to sell checkbox features to the average idiot at best buy) wanted.
the disappointment was that 3dfx was a terribly run company
Originally posted by: Rollo
Originally posted by: SolMiester
'Scuse me, but I believe every generation of nVidia card was a step up, thank you - more than can be said for ATI! The 5800U wasn't the best nVidia produced, but the next generation has still to be bettered by ATI, let alone by nVidia's current generation.
I didn't add the 5800U because it was a bad card; I added it because it was only released in very limited numbers, gained a lot of notoriety for noise, and nV probably lost money on them (they built all the cards themselves and distributed them to OEMs, and each card was a 3lb brick sh*thouse of high-quality components).
I loved my 5800U. It was a big failure though, mostly due to the noise. I think people would have got over the second place performance if not for the fan noise.
Originally posted by: Rollo
Originally posted by: ElFenix
meh, the V5 wasn't that big a failure. it was at least competitive at some things, and the AA really moved the industry forward (although nvidia managed to do the same thing in drivers by the time the V5 was released). they had an extra featureset, it just wasn't what the market (being large OEMs trying to sell checkbox features to the average idiot at best buy) wanted.
the disappointment was that 3dfx was a terribly run company
The 5800U was competitive at everything; in most games it would have been indiscernible from a 9700 Pro at any setting that was playable.
It was loud though, and costly to produce.
Originally posted by: BFG10K
I think people would have got over the second place performance if not for the fan noise.
Second place? Hang on, I thought you wanted the best possible hardware?
The word "bias" doesn't do you justice.
Originally posted by: jazzboy
Well, I voted 5800U - they really did seem a bit pants at the time.
But what about an option for the Rage Fury Maxx?
Originally posted by: crazydingo
Nothing can top NV30. :laugh:
Originally posted by: crazySOB297
I don't think so. I'm sorry, but the performance of that card was way too low for anyone. Most nVidia fanboys would even concede defeat that generation. The fan was loud, but the performance was just flat-out horrible.
I never said the 5800U was the best hardware at the time, only that it was comparable.
I never said you did. The problem is that for the last year you've been blasting the R420 for everything under the sun, when it was not only comparable to a single NV40, it was actually faster in many situations.
You've been told this time and time again, yet you ignore the fact that I have no brand loyalty and will pretty much try anyone's product.
I ignore it because it's a big steaming pile. Your reason for trashing the R420 was because you "only supported the vendor with the best hardware" and because you "refuse to pay for the same thing over and over again because you feel ripped off". Funny, that didn't stop you purchasing a 5800U three times and "kicking ass at 1024x768 in UT2003".
I bought the X800XT PE late last year knowing I would likely be replacing it with SLI,
SLI didn't even exist at that time and still didn't exist when you picked up a vanilla 6800. Likewise you deemed titles like HL2 and Far Cry irrelevant until nVidia started winning thanks to SLI. SLI wasn't even a factor; you just changed your tune when it arrived.
As I've said, the nV30's downside to the user was the tiny hairdryer sound.
Tiny? Hahaha. Perhaps if you're deaf. XBit summed it up well:
This fan is the main noise source of the working GeForce FX 5800 Ultra. The card has already earned nicknames like "hair-drier", "vacuum-cleaner" and the like for this loud and irritating noise. It is truly the most unpleasant issue about it. The fact that the fan only starts up in 3D applications doesn't save the day: people buy these cards exactly for 3D, not for work in office applications. By the way, this noise from the working card is nothing compared to what you hear when the fan slows down. You see, the control circuitry doesn't slow down the fan smoothly by reducing its rotation speed to a halt. It seems that the card doesn't reduce the voltage sent to the fan little by little, but sends recurrent pulses of full voltage, changing its on-off time ratio step by step. As a result, the rotation frequency of the fan collides with the frequency of the power impulse sequence, beats and pulses arise, and the card sets up a rich sound performance that could have woken 3dfx back to life.
Damage Control, eh?
Originally posted by: Rollo
Originally posted by: crazydingo
Nothing can top NV30. :laugh:
As I've said, the nV30's downside to the user was the tiny hairdryer sound. I'd take that over looking at 60Hz any day. 60Hz causes actual pain; the tiny hairdryer sound was mildly annoying. (And if you had some big OCer fans it sort of blended in.)
Originally posted by: BFG10K
I never said the 5800U was the best hardware at the time, only that it was comparable.
I never said you did. The problem is that for the last year you've been blasting the R420 for everything under the sun, when it was not only comparable to a single NV40, it was actually faster in many situations.
You've been told this time and time again, yet you ignore the fact that I have no brand loyalty and will pretty much try anyone's product.
I ignore it because it's a big steaming pile. Your reason for trashing the R420 was because you "only supported the vendor with the best hardware" and because you "refuse to pay for the same thing over and over again because you feel ripped off". Funny, that didn't stop you purchasing a 5800U three times and "kicking ass at 1024x768 in UT2003".
Incidentally, take a look at these UT2003 results, run at the same settings at which you claimed to see no difference between the 5800 and 9700 Pro. You were running 1024x768 with 8xAF and 4xAA, weren't you?
5800U: 45 FPS.
9700 Pro: 74.9 FPS.
LINK? And not one to Rage3D, if you don't mind.
Yup "no difference there". "When it matters, there's no difference". "I'm kicking ass at 1024x768 and there's no difference between the cards".
-Rollo, the GPU "collector".
I bought the X800XT PE late last year knowing I would likely be replacing it with SLI,
SLI didn't even exist at that time and still didn't exist when you picked up a vanilla 6800. Likewise you deemed titles like HL2 and Far Cry irrelevant until nVidia started winning thanks to SLI. SLI wasn't even a factor; you just changed your tune when it arrived.
As I've said, the nV30's downside to the user was the tiny hairdryer sound.
Tiny? Hahaha. Perhaps if you're deaf. XBit summed it up well:
This fan is the main noise source of the working GeForce FX 5800 Ultra. LOL!! As if any other part of the card made noise!! The card has already earned nicknames like "hair-drier", "vacuum-cleaner" and the like for this loud and irritating noise. It is truly the most unpleasant issue about it. The fact that the fan only starts up in 3D applications doesn't save the day: people buy these cards exactly for 3D, not for work in office applications. By the way, this noise from the working card is nothing compared to what you hear when the fan slows down. You see, the control circuitry doesn't slow down the fan smoothly by reducing its rotation speed to a halt. It seems that the card doesn't reduce the voltage sent to the fan little by little, but sends recurrent pulses of full voltage, changing its on-off time ratio step by step. As a result, the rotation frequency of the fan collides with the frequency of the power impulse sequence, beats and pulses arise, and the card sets up a rich sound performance that could have woken 3dfx back to life.
Of course I imagine a deaf "collector" would have no problems at all with that sound.
Well, I think the fact that some people did buy the FX 5900 is proof that his statement is true. I actually replaced a 9700 Pro with an FX 5900, which was the last ATI card I have ever owned (I've built rigs for others with ATI cards since, though). You can call it inherent bias or whatever you want, but I had my reasons. Primarily, I got tired of ATI making me look like an ass for spending top dollar on a card that barely worked with Linux; I just didn't feel like I was getting my money's worth. Look at it this way: even if you get 50% more performance but it works on 100% fewer OSes, you aren't coming out ahead. Basically, if you only look at the "two horse race", as you call it, from one perspective it may seem like a no-brainer, but when you look at the entire package there are more angles to consider than just DX9 shader performance under Windows.
If you're talking about Linux, then it would be a different horse race altogether; I think most of us were talking about Windows shader performance.
It needed 2 cores to be somewhat competitive, and because of this couldn't yield them much profit.
You make it sound like the VSA-100 sucked and they had to double it up for it to be competitive, but I doubt it. It was the plan all along to have two cores on the V5; that was the scaling plan. The reason they failed is that they were late to the show and came in with an expensive product. If they had arrived on time, they would have survived.
Originally posted by: Acanthus
Not to mention the Voodoo 5 5500 brought Antialiasing to the mainstream :thumbsup:
Originally posted by: southpawuni
Without doubt, Crossfire. Many old video cards can display above 16x12@60Hz.
For me, even 1600x1200 is useless if it can't go above 60Hz. Too low, a flicker fest.
All those other solutions are long in the past. Crossfire is not only history's greatest tragedy, it's history in the making.
I think an option needs to be added to the list, though, that would top them all:
ATI Video Drivers. Why? Because they are the greatest tragedy - a continuing let-down.
The rest, including Crossfire, are one-shot failures (hopefully).
While NV supports everything all the way back to the TNT in their drivers, the same can't be said for ATI. Now ATI is dropping support for all their products older than the R300 in Vista.