Originally posted by: ketchup79
Originally posted by: DillonStorm
I just hope GFFX's clocked at 500/1000 from board partners who've designed quieter cooling solutions (like Gainward) still make it to market. A quiet FX is still 20% faster than a 9700, and some of us are willing to spend the premium to get the fastest thing out there, just not willing to put up with the noise of the original design.
I don't think you get it. The quiet fan was just the quieter-in-2D version that HardOCP talked about. There are no magical NV30s at the end of the rainbow. If Nvidia could make a quieter solution, don't you think they would have? Or do you really think Gainward has better *fan engineers* than Nvidia...
And that's total nonsense about the GFFX Ultra being 20% faster. Didn't you read Anand's review? Or HardOCP's? When you equalize the IQ (as best you can; the NV30's FSAA is a joke), the NV30 went down like this:
-2x FSAA on the R300 is 20 FPS faster than GFFX 4x
-4x FSAA on the R300 is 30 FPS faster than GFFX 6xs
-6x FSAA on the R300 is 40 FPS faster than GFFX 8xs
Just go look at the UT2003 pics at HardOCP and see for yourself.
I don't think you get it. Gainward's press release said they will have a card with a fan that makes about as much noise as a human heartbeat; in other words, you can't hear it, in 2D or in 3D. The card HardOCP got was a reference card, and the fan didn't run at all in 2D, so the noise from that would be zero.
Also, there is no reason to equalize the IQ, as nVidia's FSAA should work just fine once the card is actually released. It was sent out for review with beta drivers just because nVidia wanted to give everyone an idea of how this card will perform. Can you imagine how bad ATI's flagship card with mature drivers would look if it lost to the GeForce FX on beta drivers?
First off, nVidia has had a long time to work on their drivers for the FX; it is unlikely that performance will magically jump 10-20%, and I seriously doubt they are deliberately hiding the boards' true performance by holding them back with their drivers.
Second, Gainward is claiming 7 dB with their cooling system, yet they have yet to explain how. They'd need either something absurd (a huge, huge heatsink), magic, or very expensive technology to get cooling that quiet; watercooling is louder than that, and no fan can be that quiet while still pushing enough air to cool the chip. Maybe they've built the card inside a "soundproof" box... This issue has been discussed at length, and I think most of us have serious doubts about the claim. Sure, it would be great if Gainward found a way to do it, but many of us don't see how it is reasonably possible.
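For perspective on why 7 dB sounds so implausible, remember that the decibel scale is logarithmic: every 10 dB is a tenfold change in sound intensity. A quick sketch of the arithmetic (the 30 dB figure for a typical "quiet" case fan is my own ballpark assumption, not a measured spec for any card):

```python
# Decibel arithmetic: sound intensity scales as 10^(dB/10).
# The 30 dB "quiet fan" level below is an assumed ballpark figure,
# not a measurement of any actual card.

def intensity_ratio(db_a: float, db_b: float) -> float:
    """How many times more intense a db_a source is than a db_b source."""
    return 10 ** ((db_a - db_b) / 10)

quiet_fan_db = 30  # assumed level for a typical "quiet" case fan
gainward_db = 7    # Gainward's claimed figure

print(intensity_ratio(quiet_fan_db, gainward_db))  # roughly 200x
```

In other words, Gainward is claiming a cooler roughly two hundred times lower in sound intensity than an ordinary quiet fan, which is why so many of us are skeptical.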
"Can you imagine how bad ATI's flagship card with mature drivers would look if it lost out to the Geforce FX with beta drivers?"
How would it look bad? Beta drivers aren't always crappy, and if the FX were as great as it was hyped up to be, I don't think anyone would have been shocked if it won in every aspect. I remember the 9700 completely blowing the Ti 4600 out of the water when it was in beta form, and the drivers for the 9700 back then weren't terrible...
Originally posted by: Xenon14
I think NVIDIA is doing what ATI did with the 9700. While NVIDIA was going about its business with the GeForce4, ATI didn't release competitive products until the 9700. I think NVIDIA released the FX simply as a ploy; the NV35 is probably what they have been concentrating on this whole time, which explains the delay for the FX and why its performance isn't shocking at all.
edit: I expect NVIDIA's next generation card to blow everything out of the water, which will thus allow me to purchase an ATI card for a lot less $$.
What about the 8500? Is that card not competitive? nVidia released the GF4 to grab the lead, since they had three solutions that were all faster than the 8500. Until then ATI was putting pressure on the GeForce 3 line, starting to pull away from the Ti 500. What would have happened if ATI had held the performance lead with the 8500 all the way up to the 9700? The 9700 is merely the next logical step for ATI. Their R200 core was very, very impressive, perhaps too impressive, as to this day it sounds better on paper than it really is (although it really is a nice product; I use it and love it). The R300 is even more impressive and not a surprising technological advancement over the R200. nVidia simply fell behind; you don't release a product as a "fake", it just doesn't make any sense.
Do you think ATI would bother with an R350 if nVidia were just going to skip the NV30? If anything the R350 will only put more pressure on nVidia; it's like shooting yourself in the foot. The FX is "simply a ploy", lol. You can wish they were concentrating on the NV35 all this time, but they weren't. They obviously had to dedicate resources to the NV30, and I doubt they put more resources into the NV35 than into the NV30. If anything, had they known the NV30 was going to be a bust, they would have dropped it ASAP to get the NV35 out potentially months sooner.
Don't expect the NV35 to blow everything out of the water.
"Huang is confident that Nvidia will end up back on top. "Tiger Woods doesn't win every day. We don't deny that ATI has a wonderful product and it took the performance lead from us. But if they think they're going to hold onto it, they're smoking something hallucinogenic.''
Yet there is evidence that ATI is gaining ground. ATI President Dave Orton said the company has picked up new customers, selling chips for Dell Computer's Dimension product line and Hewlett-Packard's Compaq Presario desktops.
And by the time Nvidia gets the GeForce FX to the market, ATI will be almost ready to launch a low-cost version of last fall's chip, a project that is code-named the R350.
Still, Orton isn't gloating. "We respect Jen-Hsun,'' said Orton. "We know he [nVidia] is not standing still, and we [ATI] aren't either.''
Yeah, and don't expect the NV35 to go without competition either. ATI has had a lot more time to work on their next-gen chip (the R400) than nVidia has had to work on theirs (they don't even have the NV30 out yet).
If you think ATi won't have exactly the same problems with their move to 0.13 micron, I'd like some of what you're smoking. Whichever chip ATi makes the switch with will end up similarly delayed. It has yet to be seen whether they're willing to take that risk/reward and run with it like nVidia did. nVidia has the switch out of the way and can now move to more mature chips on the same process.
No, ATI decided to wait for TSMC to be ready; they let nVidia play guinea pig for the process. I would think ATI's 0.13 micron parts should be far less problematic than what the NV30 went through.
Liken it to the first Pentium 4s. They were slower than P3s at half their clock speed. AMD fans laughed and laughed. Now Intel isn't looking quite so stupid, are they? This process will allow FUTURE nVidia chips to ramp at an increased rate, and if you assume nVidia is going to "sit on their a$$es" and continue to "blunder" like they have over the past six months, then you are the deluded one.
Sorry, a 2 GHz Willamette is considerably faster than a 1 GHz PIII. Yeah, Intel sure isn't looking stupid: they switched cores on us, pushed SDR SDRAM before DDR SDRAM as an alternative to the RDRAM they also pushed on us, and after all that time they have yet to smite AMD, who is still very, very close on Intel's heels. Had Intel started their P4 line with Northwood cores, there might be a chance that AMD wouldn't be here today.
nVidia won't sit on their asses, and blunders can happen to anyone at any time; don't forget that ATI certainly won't sit on their asses either. The GPU world is worlds different from the CPU world. nVidia won't be able to "ramp at an increased rate" the way you can with CPUs; you can't compare apples to oranges here and get away with it.
I wonder how many fanATics realize that this nVidia-free world they hope for would be a BAD thing for the graphics market. I suppose not. Me, I'm an idealist, and I hope BOTH companies continue to prosper, pushing prices down across the board and keeping this wild speed ramp going. I see no more reason to believe nVidia is going to fall flat on their faces than that ATi will over the next generation. And further, I hope Matrox suddenly surprises everyone with a Parhelia II, living up to the hype of the original. The more the merrier! Competition is life!
Who says anyone wants to get rid of nVidia, and who says nVidia is going anywhere even if they pull out of the graphics market? nVidia has expanded beyond graphics; their nForce chipsets are a great example of this, and right now the nForce2 is a huge success for them.
I totally agree with the need for competition; I just think it's really good that nVidia has been brought back down to Earth a little. A Parhelia II would be sweet, although I have very little faith in Matrox as a gaming solution. Their first go with the Parhelia seemed so promising and was all the more disappointing for it, and their products are still insanely expensive for what I want in a video card. Something tells me that Matrox doesn't know exactly what they are doing, especially if they intend to compete in the gaming niche.