Originally posted by: Shad0hawK
I compared what was shown in the demonstration to the same images on a machine with an Athlon 3400+ and a 9800XT set at full image quality, and the simple fact is PS3.0 does in fact look better than PS2.0, which makes your counterpoint invalid.
Actually, Crytek later released an explanation saying the comparison was between PS3.0/2.0 and PS1.1.
The fact that Nvidia has explained in many interviews that PS3.0 does NOT offer image quality enhancements over PS2.0 doesn't seem to sway your opinion. PS3.0 is supposed to make things faster: it allows longer instruction counts and dynamic branching, which in principle permits deeper shaders and thus better image quality. But if you wanted to, you could write the same code in PS2.0; it would just take longer. And if ATI cards already run PS2.0 faster, there is no need for them to utilize PS3.0 at this point. Logically, if Nvidia works slower in PS2.0, it would only make sense for it to work SLOWER on the LONGER instruction counts of PS3.0.

PS2.0 can do everything PS3.0 can, so image quality is IDENTICAL; PS3.0 just optimizes the way you arrive at the same image, and that hasn't been proven yet with any game. Now Nvidia is counting on running games in fewer passes using PS3.0. I am not saying ATI is better and be done with it, but more games that utilize PS3.0 have to be compared to really decide whether that feature is important or not.
As it stands, the X800 XT is the clear performance winner if you disregard price. Comparing the X800 Pro to the GT is another story, one that needs more careful analysis.
There is no point arguing, since the ppl who have made up their minds on buying ATI or Nvidia will never change their view. That leaves the rest of us, who want to wait for prices to come down and see what the best card to buy is.
Some other factors to consider:
1) So what if Nvidia has a two-slot design? How many ppl on these forums have all five PCI slots taken up, or run a small form factor PC? Most of us have a large case with good airflow, so even if the card took up three slots it would make no difference.
2) OK, Nvidia cards draw more power than ATI cards do... big deal. Many review sites ran the card on a 350 W power supply, and many showed total system consumption for the rig at under 300 W.
Furthermore, do you care if your GPU runs at 80°C or 50°C? If there are no artifacts and the card is stable, I couldn't care less. It surely won't affect how long my card lasts, because both will last five years or until they become obsolete, so heat and energy issues are a wash again. Besides, how many ppl on this forum who have good rigs don't have a decent power supply?
So just see what games you play right now and which card runs them faster, then base the purchasing decision on that and on price. I play UT2K4, Splinter Cell, Halo, and Far Cry, so ATI is the better card for me since it's faster in all of those games; for someone else who plays other games this might not be true. Besides, by the time prices come down enough to make these cards more affordable, we'll have a better understanding of which card is better, once more benches of newer games come out.