MDE
Lifer
- Jul 17, 2003
Originally posted by: Nebor
I'm gonna have to buy a 5900 just to settle this.
lol...Just ignore the moron. IMO, he needs to be silenced .
Originally posted by: MonkeyDriveExpress
Originally posted by: Nebor
I'm gonna have to buy a 5900 just to settle this.
lol...Just ignore the moron. IMO, he needs to be silenced .
Shady06,
So he speaks the truth, huh? Care to back that up? Please list the games, other than HL2 itself, that are coming out this year based on the HL2 engine. Let me help you: 0! I'll go even farther: list one game, other than HL2, that will use the HL2 engine and ship within the next YEAR. Now compare that to the games that just came out or are coming soon on the UT 2003 engine (UT 2004, Deus Ex 2, Splinter Cell, Unreal 2, etc.) or on the modified Q3 engine (that list is too long). Gee, I'd better hurry up and sell my 5900. It won't run anything.
If you base your video card buying off of one game (HL2), then you have serious problems. You realize the same can be said for the 9800 and the Doom III engine. If you follow every ATIdiot's theory on this message board and you plan on playing Doom III, then your 9800 is junk. Benchmarks specifically show Nvidia spanking the 9800 in Doom III, and guess what? Doom III is optimized to run better on the NV3x series. So all you 9800 owners got ripped off. Sounds kind of stupid, huh? Getting the picture?
Note I own BOTH ATI and Nvidia, so calling me an Nvidiot doesn't hold water. I base my buying decisions on 10 years of experience building and repairing PCs and working with the ACTUAL hardware. So I have no worries that when HL2 finally comes out, the 5900 will run it very well and look just as good as the 9800. Definitely close enough that it's not an issue, whatever the ATIdiots would lead you to believe.
Alkaline5,
What makes you so confident HL2 or any DX9 game won't run fine once the Det 52.x drivers come out? If I took your perspective with the 9700, then nobody would have bought that card.
Correct me if I'm wrong, but hasn't it been noted that the DX9 problems with Nvidia are hardware related, not software?
I guess my standards in hardware are higher than yours. When I spend $300+ on a card, I expect it to run ALL the games, including HL2, especially since many people built gaming rigs for the particular purpose of playing HL2. And there will be more games based on the HL2 engine. Doom III, huh? That game is so far from release that God knows what will happen between now and then. And I don't see how your years of experience buying hardware lead you to believe the 5900 will run HL2 just fine when it comes out. Not logical at all.
Instigator, I love seeing guys like you who stand up for their opinion.
It's your choice; you bought it, and no one here is forcing you to change it.
They express their opinion like you do, some with a bigger dose of enthusiasm, but you can't do anything about that.
Though I see you are quite fanatical too...
Benchmarks, as you know, are not demos. They are programs measuring the performance of hardware.
So it's not useless to rely on them, and on that point I think you are being really optimistic if you don't believe
that Nvidia has major problems with the NV35 right now in DX9 performance.
HL2 is not JUST A GAME, and if you deny that, I think you are seeing things from a very different perspective than most people do.
Also, you have to realize that drivers are no magic hat from which FPS pop out.
I owned an FX 5800 and was pleased with it, but I love HL and I'm really looking forward to the miracle called HL2.
It will be a revolution in gaming history, and I can't risk having a card that cannot supply adequate performance.
So if you want to stay with your FX 5900, that's fine with me and with most of the guys in here.
It's your opinion, as I said, and the future will make things clear.
I'm gonna have to buy a 5900 just to settle this.
When Doom III comes out (ETA 3/15/2004), the $300 ATI card you bought isn't going to be nearly as good as the NV35.
Originally posted by: Zepper
If you can actually SEE the difference between 170 and 176 fps, I'll eat an ATI video card... Get it thru your heads, 5% is NOTHING - 10% is NEARLY NOTHING in real world use.
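Zepper's point holds up arithmetically: at high frame rates, a few fps correspond to a tiny per-frame time difference. A quick sketch of that conversion (the 170 vs. 176 fps figures are his, the function is just illustration):

```python
# Convert frames-per-second into milliseconds-per-frame to see how
# small the gap between 170 and 176 fps actually is per frame.
def frame_time_ms(fps):
    return 1000.0 / fps

delta = frame_time_ms(170) - frame_time_ms(176)
print(round(frame_time_ms(170), 2))  # 5.88 ms per frame
print(round(frame_time_ms(176), 2))  # 5.68 ms per frame
print(round(delta, 2))               # 0.2 ms difference per frame
```

A 0.2 ms-per-frame gap is far below anything a player could perceive, which is the "5% is NOTHING" argument in numbers.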
:moon:
Instigator, you're missing one key fact: the NV35 core is built around 16-bit registers (not DX9 spec) and gets its 32-bit registers for DX9 by combining the 16-bit ones. The performance boost being seen on the NV cards comes from the drivers being forced to run 16-bit in some areas of the game and 32-bit in others. So it's losing image quality for speed. If you don't believe me, maybe this will convince you, here. Plain and simple, the FX series was marketed as a DX9 part. If you want less than DX9, then go with a much cheaper GF4 Ti 4600/4800; price-per-performance it's the best, better than the FX series! The only reason to buy an FX would be for DX9, and since it doesn't do that well, I would never recommend it!
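The 16-bit vs. 32-bit precision trade-off being argued here is easy to demonstrate. This is a toy sketch, not shader code: Python's `struct` module can round a value through IEEE 754 half precision (FP16), the format the "partial precision" path uses, showing the rounding error that shows up as image quality loss:

```python
# Rough illustration of why forcing FP16 ("partial precision") instead
# of FP32 in pixel shader math can cost image quality: FP16 has only a
# 10-bit mantissa, so many values a shader computes get rounded.
import struct

def to_fp16(x):
    """Round a Python float through IEEE 754 half precision ('e' format)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

value = 0.1                 # e.g. a lighting or color term a shader computes
print(to_fp16(value))       # 0.0999755859375 -- already off in the 4th digit
print(abs(to_fp16(value) - value))
```

One rounded term is invisible, but shaders chain many such operations, so the error accumulates; that accumulation is the visual difference reviewers were arguing about.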
Goose77,
What I'm telling you is the difference will not be that large, and that picture isn't even of HL2. Tell ya what: when HL2 comes out, I'll set up two identical systems, the only difference being the video cards (9800 and 5900). I'll put $50 on you not being able to tell which machine has which card. You can run around staring at walls and water all you want. You game? Also, the quote about the only reason to buy an FX being DX9 is simply stupid. Damn, then I'd better quit playing DX7 and DX8 games right now. Ever think one of the main reasons to get the card would be to play the other killer new DX8 games at 4x FSAA and 8x AF (UT 2003, UT 2004, BF 1942, Deus Ex 2, Splinter Cell)? Try that on a Ti 4600/4800 and you will be watching a slide show!
My reasoning is, if you are going to drop that much money on a DX9 card, it had better be able to run DX9 games. Obviously you aren't going to stop playing DX8 games, but you don't want to have to change video cards to play DX9 either.
We will be taking a much closer look at image quality very soon, but until then, it looks like ATI and NVIDIA have equal footing in the Aquamark3 arena and we are left to find more useful information about their differences elsewhere.
To illustrate the seriousness of the problem, he then shared some benchmarks with us from Half-Life 2 running on current graphics hardware. The test config was like so:
The results are striking, as you can see. I believe the results were recorded at 1024x768 in 32-bit color. This is with only standard trilinear filtering and no texture or edge antialiasing.
NVIDIA's NV3x-derived chips are way off the pace set by the ATI DirectX 9-class cards. The low-end GeForce FX 5200 Ultra and mid-range GeForce FX 5600 Ultra are wholly unplayable. The high-end GeForce FX 5900 Ultra ekes out just over 30 fps, well behind ATI's mid-range Radeon 9600 Pro (and yes, that score is from a 9600 Pro, not a stock 9600; the slide was mislabeled). The Radeon 9800 Pro is twice the speed of the GeForce FX 5900 Ultra. NOTICE: that's DX9, not the HL2 game engine.
Valve even ginned up a value-for-money slide to illustrate the problem with the current price/performance proposition for NVIDIA hardware.
However, NVIDIA has claimed the NV3x architecture would benefit greatly from properly optimized code, so Newell detailed Valve's sojourn down that path. The company developed a special codepath for the NV3x chips, distinct from its general DirectX codepath, which included everything from partial-precision hints (telling the chip to use 16-bit floating-point precision rather than the default 32-bit in calculating pixel shader programs) to hand-optimized pixel shader code.
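The codepath split Newell describes can be sketched roughly as follows. The chip and vendor names come from the article; every function and constant name here is hypothetical illustration, not Valve's actual code:

```python
# Hypothetical sketch of a renderer picking a shader codepath per GPU,
# in the spirit of the "mixed mode" NV3x path described above.
# All identifiers are illustrative, not from the Source engine.

GENERIC_DX9 = "generic_dx9"   # full FP32 precision, standard DirectX 9 path
NV3X_MIXED = "nv3x_mixed"     # partial-precision hints + hand-tuned shaders
DX8_FALLBACK = "dx8"          # treat the card as DX8-class hardware

NV3X_CHIPS = {"NV30", "NV31", "NV34", "NV35"}

def pick_codepath(chip, allow_mixed=True):
    """Full DX9 by default; NV3x gets mixed mode, or the DX8 fallback."""
    if chip in NV3X_CHIPS:
        return NV3X_MIXED if allow_mixed else DX8_FALLBACK
    return GENERIC_DX9

print(pick_codepath("R350"))                      # generic_dx9
print(pick_codepath("NV35"))                      # nv3x_mixed
print(pick_codepath("NV34", allow_mixed=False))   # dx8
```

The point of Newell's complaint is visible even in this toy: one vendor's hardware needs its own branch, and someone has to write and maintain everything behind it.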
The "mixed mode" NV3x codepath yielded mixed results, with a fairly decent performance gain on the FX 5900 Ultra, but not near enough of a boost on the FX 5200 Ultra and FX 5600 Ultra.
Newell also expressed skepticism about the payoffs for NV3x-specific optimizations, noting that the optimization process was arduous, expensive, and less likely to produce performance gains as shader techniques advance. What's more, he said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem.
He suggested one way of dealing with the issue would be to treat all NV3x hardware as DirectX 8-class hardware, which would cut down significantly on eye candy and new graphics features, but which could yield more acceptable performance on NV3x chips. Obviously, he noted, one could always cut down visual quality in order to achieve higher performance, but in the case of Half-Life 2, falling back to DX8 will require tangible sacrifices.
Oddly enough, even using the DX8 codepath, the previous-generation GeForce Ti 4600 outperformed the brand-new GeForce FX 5600 Ultra.
Instigator, get this through your thick head.
1) The ENTIRE FX series is crippled when it comes to DX9. It's either sacrifice speed or image quality. We all know what they chose.
2) HL2 based games are coming. Only Valve knows when, but they are coming and people want their $200-500 video cards to rip through those games down the road, and it looks like the closest thing an FX card will come to ripping through those games is ripping some huge @$$ when you try running anything more complex than HL2\Doom III.
3) You're a nuisance. Please leave.
4) The reason a Ti4600 can't do 4x FSAA\8x AF is because Nvidia decided, "Hey, let's anti-alias EVERY edge in the scene. That'll look GREAT!" Too bad the GF4 Ti series doesn't have the memory bandwidth to deal with that.
5) See #3.
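The memory-bandwidth point in #4 above can be ballparked with some crude arithmetic. This is a deliberately simplified model (it ignores color/Z compression, texture reads, and real overdraw patterns); the Ti 4600's roughly 10.4 GB/s figure is its published memory bandwidth spec, and the pass count is an assumption for illustration:

```python
# Crude back-of-envelope: framebuffer traffic at 1024x768 with 4x AA.
# Each covered pixel stores 4 samples of 8 bytes (32-bit color + 32-bit Z),
# and we assume ~3 read/write passes per pixel on average (hypothetical).
def framebuffer_traffic_gb_s(w, h, samples, bytes_per_sample, fps, passes):
    bytes_per_frame = w * h * samples * bytes_per_sample * passes
    return bytes_per_frame * fps / 1e9

traffic = framebuffer_traffic_gb_s(1024, 768, 4, 8, 60, 3)
print(round(traffic, 1))  # 4.5 GB/s just for the framebuffer at 60 fps
```

Even under these generous assumptions, nearly half the Ti 4600's ~10.4 GB/s goes to the framebuffer alone before a single texture is fetched, which is why 4x FSAA + 8x AF buries that card.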
Originally posted by: MonkeyDriveExpress
It's a great game. IDK about slowdowns being that big of an issue, I run it on my GF4 Ti4600 with all the eye candy on @ 800x600 no AA\AF and it's very smooth. Co-op would be very hard to do on a PC, and also be kinda weird with two people playing on a 17-19" screen, and one mouse\keyboard, then a gamepad for the other. I think having the extra weapons, maps, and vehicles make up for not having co-op.
Originally posted by: MercenaryForHire
There's this thing called "The Internet" ... you can play games over it now, you know.
- M4H
"We used ATI's publicly available Catalyst 3.7 drivers and in order to support the NV38 we used NVIDIA's forthcoming 52.14 drivers. The 52.14 drivers apparently have issues in two games, neither of which are featured in our test suite (Half-Life 2 & Gunmetal)."
Conclusion
Well, the performance difference between the NVIDIA GeForce FX 5900, the ATI RADEON 9800, and their faster analogues turned out not to be very high: 10-15% in the heaviest modes. Meanwhile, the prices of the GeForce FX 5900 Ultra and RADEON 9800 Pro exceed those of the non-Ultra and non-Pro versions by much more than 10-15%.
Therefore, if you are looking not for the "world's very best" graphics card, but just for a good high-end product with the potential to last a while, why not take a closer look at the NVIDIA GeForce FX 5900 and ATI RADEON 9800? These solutions are the most interesting today because, when overclocked, they are mostly faster than their more expensive fellows.
The upcoming launch of the NVIDIA NV38 and ATI R360, the "overclocked versions" of the NV35 and R350 scheduled for this September-October, will not hurt the attractiveness of the GeForce FX 5900 and RADEON 9800. On the contrary, the new chip announcements will make cards based on them less expensive, while the overclocking potential will remain the same.
Which of the two cards is better, the NVIDIA GeForce FX 5900 or the ATI RADEON 9800, seems to be a more difficult question.
On the one hand, the GeForce FX 5900, with its better anisotropic filtering algorithm, on the whole performs about as fast as the RADEON 9800, or maybe even a little faster in some cases.
On the other hand, DirectX 8 shaders were the most our benchmarks today could really exercise, while with DirectX 9 shaders the RADEON 9800 will definitely be much faster. As a result, it will do really great in modern games using DirectX 9 shaders. Finally, the price of graphics cards based on ATI chips is considerably lower than that of NVIDIA-based solutions. So, from a pricing point of view, the GeForce FX 5900 is more likely to compete not with the ATI RADEON 9800, but with the more powerful ATI RADEON 9800 Pro.
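The review's price/performance reasoning (a 10-15% speed gap versus a much larger price gap) can be made concrete with fps-per-dollar. The numbers below are HYPOTHETICAL, chosen only to match the shape of the argument, not actual 2003 benchmarks or prices:

```python
# Value comparison: a card 12% faster but 50% pricier delivers
# noticeably fewer frames per dollar. All figures are made up
# to illustrate the review's reasoning, not measured data.
def fps_per_dollar(fps, price):
    return fps / price

non_ultra = fps_per_dollar(100.0, 300.0)   # hypothetical baseline card
ultra = fps_per_dollar(112.0, 450.0)       # ~12% faster, ~50% more expensive
print(round(non_ultra, 3))  # 0.333 fps per dollar
print(round(ultra, 3))      # 0.249 fps per dollar
```

That gap is the review's whole point: unless you need the absolute fastest card, the non-Ultra/non-Pro versions are the better buy.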
MonkeyDriveExpress,
1) Read the Anandtech article. If you can. See the bar graph that shows the 5900 running DX9 games.
2) Ok sparky, list the HL2 based games that are coming out. Put your money where your mouth is. I'm already laughing that you brought up Doom III and ATI in the same sentence.
3) Go back to the Rage3D forums. You've already displayed your stupidity with this remark from the Video Card forum:
Originally posted by: MonkeyDriveExpress
It's a great game. IDK about slowdowns being that big of an issue, I run it on my GF4 Ti4600 with all the eye candy on @ 800x600 no AA\AF and it's very smooth. Co-op would be very hard to do on a PC, and also be kinda weird with two people playing on a 17-19" screen, and one mouse\keyboard, then a gamepad for the other. I think having the extra weapons, maps, and vehicles make up for not having co-op.
Originally posted by: MercenaryForHire
There's this thing called "The Internet" ... you can play games over it now, you know.
- M4H
You're a waste of space after that answer. You've been here since July, and I'm guessing that's how many months' worth of hardware knowledge you have. You've also posted 1,200 times since then. Get a life.
Me, a waste of space? Don't think so. I don't get my kicks by single-handedly fueling flame wars. I may not know as much as some people about PC hardware, but I can tell I sure as hell know more than you. It's no coincidence that you're the only one on your side of the fence in this discussion. If my 15 posts a day is having no life, what about Shady's 50 posts a day?
Originally posted by: Rollo
LOL
Arguing so strenuously about gaming performance in a few games that aren't even out yet. You'd think you were arguing an issue that actually mattered, but no, it comes down to "Waaaahhhhhh, the fog isn't rendered properly! Allow me to post 9 links wherein my card of choice performs 4 fps better and it's noted the colors are more vibrant!"
You'd think you guys actually designed the things, or had more at stake than a few hundred bucks the way you fight amongst one another.