Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys who bought an 8800 feel cheated? Assuming it has poor DirectX 10 performance, it was touted as a DirectX 10 card and yet you can't play Crysis.
Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.
I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions and the fact that graphics advances will always continue to offer diminishing returns in visual quality.
Originally posted by: Gstanfor
G80 DX10 being borked is merely the last, desperate fantasy of the fanatics, I'm afraid...
Originally posted by: PC Surgeon
Originally posted by: Gstanfor
G80 DX10 being borked is merely the last, desperate fantasy of the fanatics, I'm afraid...
That's NOT the reason I made this thread. I wanted to know why people bought the G80 and how they would feel if they couldn't play Crysis or any DX10 game at playable framerates. After seeing one of the demos (Crysis) where one of the developers mentioned working with nVidia from the beginning, I figure "it should" be able to run Crysis. The only question that remains is, do you have to run two 8800s to do so?
Originally posted by: PC Surgeon
I have a feeling that the now 6-month-old DX10 8800s will be very poor performers for what they were made for. The game I'm thinking about most is Crysis. I'm thinking even the Ultra version will struggle to give playable framerates. Of course, this is just opinion, not fact. So all the people who wanted a DX10 GPU to play a game like Crysis will in fact have to upgrade to get what they paid for the first time. If I'm wrong, correct me, but I think nVidia "gotcha" with marketing. If you bought an 8800 for DX9 apps, it's the best out there IMO.
Originally posted by: PC Surgeon
But wouldn't you guys who bought an 8800 feel cheated? Assuming it has poor DirectX 10 performance, it was touted as a DirectX 10 card and yet you can't play Crysis.
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: SPARTAN VI
Originally posted by: Jeff7181
I've said this all along. People got all excited about the 8800 being a DX10 part when the only thing they should really be concerned with is its DX9 performance. I'm willing to bet the 8800 series won't have the balls to run real DX10 games. Just look at how the 9700 Pro performs in HL2... and imagine how it would perform in a game like F.E.A.R.
Hehe, run like crap.
But I don't understand why you chose the HL2 and 9700 Pro comparison. I'd compare a Radeon 9x00 card's performance in a DX8 game against its mundane performance in a DX9 game if I wanted to illustrate how badly a G80 would do in DX10.
The 9700 Pro was the first "good" DX9 card. It was THE top of the line when DX9 was still in its infancy... and it still runs HL2 like crap.
The 9700 Pro was fine for the first wave or so of DX9 games. It didn't really run out of steam until 2005 or so, when later native DirectX 9 titles came out.
I can't disagree more. The 9700 Pro was never really "fine" for any DX9 game, IMHO. It ran HL2 like crap and it ran Far Cry like crap. It wasn't till the 6800 and X850s came around that you could actually play one of these games at resolutions of 1280x1024 and above with anti-aliasing. Granted, this wasn't necessarily due to the hardware not being up to processing DX9 shaders, but it was an "old technology" card running a next-generation application, and that's exactly what you're going to see with the 8800 and next-generation games like Crysis.
This isn't to say anyone wasted their money on the 8800 series cards... they run current games very, very well, but expecting them to run tomorrow's games just as well is ignorant and ridiculous. I'm sure someone's going to tell me I don't KNOW and I'm just guessing. Sure... but it's an educated guess based on the history of the industry. When there's a change in technology, the first generation of hardware to support that new technology doesn't perform well, and does even worse as software developers begin to exploit the technology to its fullest.
i think you're wrong here. my x800gt (barely faster than a 9800pro) blew away Lost Coast and HL2 with everything at high. i don't see how the 9700pro could have sucked completely.
Originally posted by: PC Surgeon
I have a feeling that the now 6-month-old DX10 8800s will be very poor performers for what they were made for. The game I'm thinking about most is Crysis. I'm thinking even the Ultra version will struggle to give playable framerates. Of course, this is just opinion, not fact. So all the people who wanted a DX10 GPU to play a game like Crysis will in fact have to upgrade to get what they paid for the first time. If I'm wrong, correct me, but I think nVidia "gotcha" with marketing. If you bought an 8800 for DX9 apps, it's the best out there IMO.
Originally posted by: Jeff7181
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: SPARTAN VI
Originally posted by: Jeff7181
I've said this all along. People got all excited about the 8800 being a DX10 part when the only thing they should really be concerned with is its DX9 performance. I'm willing to bet the 8800 series won't have the balls to run real DX10 games. Just look at how the 9700 Pro performs in HL2... and imagine how it would perform in a game like F.E.A.R.
Hehe, run like crap.
But I don't understand why you chose the HL2 and 9700 Pro comparison. I'd compare a Radeon 9x00 card's performance in a DX8 game against its mundane performance in a DX9 game if I wanted to illustrate how badly a G80 would do in DX10.
The 9700 Pro was the first "good" DX9 card. It was THE top of the line when DX9 was still in its infancy... and it still runs HL2 like crap.
The 9700 Pro was fine for the first wave or so of DX9 games. It didn't really run out of steam until 2005 or so, when later native DirectX 9 titles came out.
I can't disagree more. The 9700 Pro was never really "fine" for any DX9 game, IMHO. It ran HL2 like crap and it ran Far Cry like crap. It wasn't till the 6800 and X850s came around that you could actually play one of these games at resolutions of 1280x1024 and above with anti-aliasing. Granted, this wasn't necessarily due to the hardware not being up to processing DX9 shaders, but it was an "old technology" card running a next-generation application, and that's exactly what you're going to see with the 8800 and next-generation games like Crysis.
This isn't to say anyone wasted their money on the 8800 series cards... they run current games very, very well, but expecting them to run tomorrow's games just as well is ignorant and ridiculous. I'm sure someone's going to tell me I don't KNOW and I'm just guessing. Sure... but it's an educated guess based on the history of the industry. When there's a change in technology, the first generation of hardware to support that new technology doesn't perform well, and does even worse as software developers begin to exploit the technology to its fullest.
i think you're wrong here. my x800gt (barely faster than a 9800pro) blew away Lost Coast and HL2 with everything at high. i don't see how the 9700pro could have sucked completely.
One needn't look far for proof.
More specifically, a 40 FPS average (on the worst-performing level in the game) is not fine for a first-person shooter.
Keep in mind, those test results are also "run without Anti-Aliasing or Anisotropic Filtering enabled."
I don't know about anyone else, but I expect eye candy from a top-of-the-line card, so if I'm going to spend $500 on a video card, I'm going to run anti-aliasing and anisotropic filtering.
Originally posted by: RussianSensation
I don't think many people buy cards for future-proofing. Gamers buy cards to provide an adequate gaming experience for the games they play today, as long as the card meets their price/performance ratio and budget. Also, everyone's idea of acceptable framerates/image quality for gaming is different.
I played HL2 on a Radeon 8500 at 800x600 and it was perfectly smooth for me, and I enjoyed the game very much with 0 AA and 0 AF in DX8.1 mode. Sure, graphics are important, but gameplay is what makes a game enjoyable in the end. I also finished Gears of War on the 360, which graphically bests any PC game today. But I still consider HL2 at 800x600 the better game regardless.
Games like Red Alert 2, Starcraft, Quake 3, Goldeneye, and Legend of Zelda: Ocarina of Time do not have great graphics by today's standards. But many consider them some of the best their genres have to offer.
From this perspective, it's a lot more important for Crysis to succeed as a game vs. being able to play it with all settings on high at 1920x1200 with AA/AF on a G80 card, for instance. Of course G80 will run Crysis acceptably, considering 95+% of gamers will not have hardware capable of outperforming G80. I am almost certain that from a marketing perspective, the 8800GTX is catered to consumers with either high income levels or those who spend $$ on gaming as a hobby. Those very consumers will buy the fastest card that's out anyway, and they will not care whether or not the G80 will "last" them. This is just my 2 cents.
Originally posted by: tanishalfelven
hey, i intend to keep my G80 for at least 3 years, so what you're saying is not exactly true.
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys who bought an 8800 feel cheated? Assuming it has poor DirectX 10 performance, it was touted as a DirectX 10 card and yet you can't play Crysis.
Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.
I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions and the fact that graphics advances will always continue to offer diminishing returns in visual quality.
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: SPARTAN VI
Originally posted by: Jeff7181
I've said this all along. People got all excited about the 8800 being a DX10 part when the only thing they should really be concerned with is its DX9 performance. I'm willing to bet the 8800 series won't have the balls to run real DX10 games. Just look at how the 9700 Pro performs in HL2... and imagine how it would perform in a game like F.E.A.R.
Hehe, run like crap.
But I don't understand why you chose the HL2 and 9700 Pro comparison. I'd compare a Radeon 9x00 card's performance in a DX8 game against its mundane performance in a DX9 game if I wanted to illustrate how badly a G80 would do in DX10.
The 9700 Pro was the first "good" DX9 card. It was THE top of the line when DX9 was still in its infancy... and it still runs HL2 like crap.
The 9700 Pro was fine for the first wave or so of DX9 games. It didn't really run out of steam until 2005 or so, when later native DirectX 9 titles came out.
I can't disagree more. The 9700 Pro was never really "fine" for any DX9 game, IMHO. It ran HL2 like crap and it ran Far Cry like crap. It wasn't till the 6800 and X850s came around that you could actually play one of these games at resolutions of 1280x1024 and above with anti-aliasing. Granted, this wasn't necessarily due to the hardware not being up to processing DX9 shaders, but it was an "old technology" card running a next-generation application, and that's exactly what you're going to see with the 8800 and next-generation games like Crysis.
This isn't to say anyone wasted their money on the 8800 series cards... they run current games very, very well, but expecting them to run tomorrow's games just as well is ignorant and ridiculous. I'm sure someone's going to tell me I don't KNOW and I'm just guessing. Sure... but it's an educated guess based on the history of the industry. When there's a change in technology, the first generation of hardware to support that new technology doesn't perform well, and does even worse as software developers begin to exploit the technology to its fullest.
i think you're wrong here. my x800gt (barely faster than a 9800pro) blew away Lost Coast and HL2 with everything at high. i don't see how the 9700pro could have sucked completely.
One needn't look far for proof.
More specifically, a 40 FPS average (on the worst-performing level in the game) is not fine for a first-person shooter.
Keep in mind, those test results are also "run without Anti-Aliasing or Anisotropic Filtering enabled."
I don't know about anyone else, but I expect eye candy from a top-of-the-line card, so if I'm going to spend $500 on a video card, I'm going to run anti-aliasing and anisotropic filtering.
40fps seems fine to me. that's about all one needs for a smooth playing experience.
also note
you failed to realize that 12x10 then was like 19x12 is now. look at 10x7, which was a far more common res back then. the 9700pro ran great even at 12x10, and at 10x7...!
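tanishalfelven's resolution point is easy to check with arithmetic. Here is a minimal C++ sketch; the "work" ratio is just raw pixel count (shading load scales roughly with pixels, ignoring everything else a GPU does):

```cpp
#include <cstdio>

// Rough pixel-count comparison for the resolutions named above.
int main() {
    const long px10x7  = 1024L * 768;   // "10x7", the common res of the 9700 Pro era
    const long px12x10 = 1280L * 1024;  // "12x10", high-end back then
    const long px19x12 = 1920L * 1200;  // "19x12", high-end for a G80-era monitor

    std::printf("10x7  = %ld pixels\n", px10x7);
    std::printf("12x10 = %ld pixels (%.2fx the work of 10x7)\n",
                px12x10, (double)px12x10 / px10x7);
    std::printf("19x12 = %ld pixels (%.2fx the work of 12x10)\n",
                px19x12, (double)px19x12 / px12x10);
    return 0;
}
```

The relative jump from 10x7 to 12x10 (about 1.67x the pixels) is close to the jump from 12x10 to 19x12 (about 1.76x), which is the sense in which "12x10 then was like 19x12 is now".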
Originally posted by: zephyrprime
I totally agree with you. DX10 is more efficient than 9, and that fact is straight from Microsoft's mouth. A DX10 game should be faster than its DX9 counterpart, all things being equal. The primary reason DX10 games are so demanding is just that the poly count, model count, texture size, number of shaders, complexity of shaders, number of light sources, etc. have all gone up.
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys who bought an 8800 feel cheated? Assuming it has poor DirectX 10 performance, it was touted as a DirectX 10 card and yet you can't play Crysis.
Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.
I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions and the fact that graphics advances will always continue to offer diminishing returns in visual quality.
Originally posted by: Gstanfor
G80 DX10 being borked is merely the last, desperate fantasy of the fanatics, I'm afraid...
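zephyrprime's point above, that an API efficiency gain gets swamped by heavier content, can be made concrete with a back-of-envelope model. All three factors below are invented for illustration, not measurements from any real game:

```cpp
#include <cstdio>

// Illustrative only: assume the DX10 path shaves 20% off per-frame cost,
// while DX10-era content doubles the polygon count and makes shaders 50%
// more expensive. None of these numbers are measured.
int main() {
    const double apiSpeedup   = 1.2; // assumed DX10 efficiency gain
    const double polyFactor   = 2.0; // assumed increase in geometry
    const double shaderFactor = 1.5; // assumed increase in shader cost

    const double contentCost  = polyFactor * shaderFactor; // 3.0x the work
    const double netFrameCost = contentCost / apiSpeedup;  // 2.5x net

    std::printf("content makes frames %.1fx heavier; "
                "even with the API gain they stay %.1fx heavier\n",
                contentCost, netFrameCost);
    return 0;
}
```

Even granting the API win across the whole frame (an optimistic simplification), the content growth dominates, which is why "DX10 is more efficient" and "DX10 games are demanding" are both true at once.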
Originally posted by: hans007
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys who bought an 8800 feel cheated? Assuming it has poor DirectX 10 performance, it was touted as a DirectX 10 card and yet you can't play Crysis.
Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.
I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions and the fact that graphics advances will always continue to offer diminishing returns in visual quality.
carmack and the like have said that there is nothing special about dx10 that can't be done in dx9.
i think it was in relation to supporting dx9 in the future, because there is no dx10 on xp, which is a huge installed base.
dx10 is like the one maybe feature of vista that would maybe make it worth installing. maybe.
Originally posted by: destrekor
Originally posted by: hans007
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys who bought an 8800 feel cheated? Assuming it has poor DirectX 10 performance, it was touted as a DirectX 10 card and yet you can't play Crysis.
Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.
I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions and the fact that graphics advances will always continue to offer diminishing returns in visual quality.
carmack and the like have said that there is nothing special about dx10 that can't be done in dx9.
i think it was in relation to supporting dx9 in the future, because there is no dx10 on xp, which is a huge installed base.
dx10 is like the one maybe feature of vista that would maybe make it worth installing. maybe.
while Carmack is great, he has some opinions about the general gaming industry that show people need to stop treating him as if he is a god. He is rather idiotic at times in his assumptions and thoughts. He is no economist and shouldn't predict console sales. Also, apparently he doesn't know much about DX10, because every article I have read states that a full DX10 game (not one made to run on both 10 and 9) will deliver the same graphical quality with more performance when tested on the same hardware. Obviously that makes DX10 better than 9, IMO.
Originally posted by: Matt2
Originally posted by: destrekor
Originally posted by: hans007
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys who bought an 8800 feel cheated? Assuming it has poor DirectX 10 performance, it was touted as a DirectX 10 card and yet you can't play Crysis.
Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.
I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions and the fact that graphics advances will always continue to offer diminishing returns in visual quality.
carmack and the like have said that there is nothing special about dx10 that can't be done in dx9.
i think it was in relation to supporting dx9 in the future, because there is no dx10 on xp, which is a huge installed base.
dx10 is like the one maybe feature of vista that would maybe make it worth installing. maybe.
while Carmack is great, he has some opinions about the general gaming industry that show people need to stop treating him as if he is a god. He is rather idiotic at times in his assumptions and thoughts. He is no economist and shouldn't predict console sales. Also, apparently he doesn't know much about DX10, because every article I have read states that a full DX10 game (not one made to run on both 10 and 9) will deliver the same graphical quality with more performance when tested on the same hardware. Obviously that makes DX10 better than 9, IMO.
I agree that nothing done in DX10 can't be done using DX9.
At the same time, I agree with you that the same game on the same hardware should be faster using DX10 vs. DX9. With DX10 there is less driver and OS overhead.
OTOH, I have no doubt in my mind that developers will find a craptastic way to piss away that extra performance.
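Matt2's "less driver and OS overhead" remark refers, in part, to how DX10 replaces DX9's many small, individually validated per-draw state calls with immutable state objects that are validated once at creation time. A minimal C++ sketch of that difference follows; it assumes an already-initialized device for each API, the helper function names are my own, and error handling is omitted (it is not a complete program):

```cpp
#include <d3d9.h>
#include <d3d10.h>

// D3D9: blend state is set through individual SetRenderState calls.
// Each call crosses into the runtime/driver and is re-validated, so a
// state change made every draw pays this cost every draw.
void SetupBlendingD3D9(IDirect3DDevice9* dev) {
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    dev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    dev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}

// D3D10: the same blending setup is baked into an immutable state object.
// Validation happens once, at creation; binding it later is one cheap call.
ID3D10BlendState* CreateAlphaBlendStateD3D10(ID3D10Device* dev) {
    D3D10_BLEND_DESC desc = {};
    desc.BlendEnable[0]           = TRUE;
    desc.SrcBlend                 = D3D10_BLEND_SRC_ALPHA;
    desc.DestBlend                = D3D10_BLEND_INV_SRC_ALPHA;
    desc.BlendOp                  = D3D10_BLEND_OP_ADD;
    desc.SrcBlendAlpha            = D3D10_BLEND_ONE;
    desc.DestBlendAlpha           = D3D10_BLEND_ZERO;
    desc.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
    desc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

    ID3D10BlendState* state = nullptr;
    dev->CreateBlendState(&desc, &state); // validated once, here
    return state;
}

void BindAlphaBlendStateD3D10(ID3D10Device* dev, ID3D10BlendState* state) {
    const FLOAT factor[4] = {0.0f, 0.0f, 0.0f, 0.0f};
    dev->OMSetBlendState(state, factor, 0xffffffff); // one call per use
}
```

The saving is per-call CPU and driver overhead, which is exactly the kind of gain that heavier DX10-era content can, as the post above puts it, piss right back away.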