I have a feeling about nVidia's 8800s

Page 3 - AnandTech Forums

hans007

Lifer
Feb 1, 2000
20,212
17
81
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.

I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions, or with the fact that graphics advances will always offer diminishing returns in visual quality.

Carmack and the like have said that there is nothing special about DX10 that can't be done in DX9.

I think that was in relation to supporting DX9 in the future, because there is no DX10 on XP, which is a huge installed base.

DX10 is like the one maybe-feature of Vista that would maybe make it worth installing. Maybe.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
Originally posted by: Gstanfor
G80 Dx10 being borked is merely the last, desperate fantasy of the fanatics I'm afraid...

That's NOT the reason I made this thread. I wanted to know why people bought the G80 and how they would feel if they couldn't play Crysis, or any DX10 game, at playable framerates. After seeing one of the demos (Crysis), where one of the developers mentioned working with nVidia from the beginning, I figure it "should" be able to run Crysis. The only question that remains is: do you have to run two 8800s to do so?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: PC Surgeon
Originally posted by: Gstanfor
G80 Dx10 being borked is merely the last, desperate fantasy of the fanatics I'm afraid...

That's NOT the reason I made this thread. I wanted to know why people bought the G80 and how they would feel if they couldn't play Crysis, or any DX10 game, at playable framerates. After seeing one of the demos (Crysis), where one of the developers mentioned working with nVidia from the beginning, I figure it "should" be able to run Crysis. The only question that remains is: do you have to run two 8800s to do so?

well, then let me ease your fears ...
your GTX *will* run Crysis very well - just not at max resolution and full "details", nor maxed AA/AF

the *other question* is whether the HD 2900 XT will run it better
-or not

i doubt anyone in their right mind will 'dump' their GTX for an XT even if it is a "little faster"


 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: PC Surgeon
I have a feeling that the now six-month-old DX10 8800s will be very poor performers at what they were made for. The game I'm thinking the most about is Crysis. I'm thinking even the Ultra version will struggle to give playable framerates. Of course, this is just opinion, not fact. So all the people that wanted a DX10 GPU to play a game like Crysis will in fact have to upgrade to get what they paid for the first time. If I'm wrong, correct me, but I think nVidia "gotcha" with marketing. If you bought an 8800 for DX9 apps, it's the best out there IMO.

No clue what you're basing this "feeling" on... in fact, initial DX10 titles have for the most part been developed on nv hardware.

DX10 doesn't bring anything to the table other than more efficient ways of performing current effects; you should be able to do more in DX10 than in DX9 while using fewer resources... but not anything you couldn't do in DX9 (other than, perhaps, at the 'expense' of a performance hit).

I'd bet the 8800 GTS/GTX will play Crysis just fine, as well as most other initial DX10 titles. Perhaps not at the highest resolutions, but with most "eye candy" available.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

No. My card is also quite good at DX9 and much better than anything else I could buy. DX10-based cards also support Aero a lot better in Vista: you can have multiple DirectGraphics sessions, so Aero can keep running while something that uses DirectGraphics (i.e. a game) is running.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: SPARTAN VI
Originally posted by: Jeff7181
I've said this all along. People got all excited about the 8800 being a DX10 part when the only thing they should really be concerned with is their DX9 performance. I'm willing to bet the 8800 series won't have the balls to run real DX10 games. Just look at how the 9700 Pro performs in HL2... and imagine how it would perform in a game like F.E.A.R.

Hehe, run like crap.

But I don't understand why you chose the HL2 and 9700 Pro comparison. I'd compare a Radeon 9x00 card's performance in a DX8 game against its mundane performance in a DX9 game if I wanted to illustrate how badly a G80 would do in DX10.

The 9700 Pro was the first "good" DX9 card. It was THE top of the line when DX9 was still in its infancy... and it still runs HL2 like crap.

The 9700 Pro was fine for the first wave or so of DX9 games. It didn't really run out of steam until 2005 or so, when later, native DirectX 9 titles came out.

I can't disagree more. The 9700 Pro was never really "fine" for any DX9 game, IMHO. It ran HL2 like crap and it ran Far Cry like crap. It wasn't till the 6800 and X850s came around that you could actually play one of those games at resolutions of 1280x1024 and above with anti-aliasing. Granted, this wasn't necessarily due to the hardware not being up to processing DX9 shaders, but it was an "old technology" card running a next-generation application, and that's exactly what you're going to see with the 8800 and next-generation games like Crysis.

This isn't to say anyone wasted their money on the 8800 series cards... they run current games very, very well, but expecting them to run tomorrow's games equally well is ignorant and ridiculous. I'm sure someone's going to tell me I don't KNOW and I'm just guessing. Sure... but it's an educated guess based on the history of the industry. When there's a change in technology, the first generation of hardware to support that new technology doesn't perform well, and does even worse as software developers begin to exploit the technology to its fullest.

I think you're wrong here. My X800 GT (barely faster than a 9800 Pro) blew away Lost Coast and HL2 with everything on high. I don't see how the 9700 Pro could have sucked completely.

One needn't look far for proof.

More specifically, 40 FPS average (in the worst-performing level in the game) is not fine for a first-person shooter.

Keep in mind, those test results are also "run without Anti-Aliasing or Anisotropic Filtering enabled."

I don't know about anyone else, but I expect eye candy from a top of the line card, so if I'm going to spend $500 on a video card, I'm going to run Anti-aliasing and Anisotropic Filtering.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: PC Surgeon
I have a feeling that the now six-month-old DX10 8800s will be very poor performers at what they were made for. The game I'm thinking the most about is Crysis. I'm thinking even the Ultra version will struggle to give playable framerates. Of course, this is just opinion, not fact. So all the people that wanted a DX10 GPU to play a game like Crysis will in fact have to upgrade to get what they paid for the first time. If I'm wrong, correct me, but I think nVidia "gotcha" with marketing. If you bought an 8800 for DX9 apps, it's the best out there IMO.

Guess it depends on what you think it was made for. Personally, I see any 1st-gen hardware as nothing more than a dev tool. When the 8800 was released, how many DX10 titles and platforms were in the channel? During its life cycle (12-18 months), what generation of games will predominantly be played on it?

When I bought my 8800 it wasn't for the DX10 performance. It was for the DX9 performance in current games.

/shrug
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
I expect the 8800 to perform adequately in DX10, but it won't be able to play any native DX10 title all that well, as expected. How well does a Radeon 9700 Pro play a game like Oblivion, which is one of the first native DX9 games?

 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: Jeff7181
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: SPARTAN VI
Originally posted by: Jeff7181
I've said this all along. People got all excited about the 8800 being a DX10 part when the only thing they should really be concerned with is their DX9 performance. I'm willing to bet the 8800 series won't have the balls to run real DX10 games. Just look at how the 9700 Pro performs in HL2... and imagine how it would perform in a game like F.E.A.R.

Hehe, run like crap.

But I don't understand why you chose the HL2 and 9700 Pro comparison. I'd compare a Radeon 9x00 card's performance in a DX8 game against its mundane performance in a DX9 game if I wanted to illustrate how badly a G80 would do in DX10.

The 9700 Pro was the first "good" DX9 card. It was THE top of the line when DX9 was still in its infancy... and it still runs HL2 like crap.

The 9700 Pro was fine for the first wave or so of DX9 games. It didn't really run out of steam until 2005 or so, when later, native DirectX 9 titles came out.

I can't disagree more. The 9700 Pro was never really "fine" for any DX9 game, IMHO. It ran HL2 like crap and it ran Far Cry like crap. It wasn't till the 6800 and X850s came around that you could actually play one of those games at resolutions of 1280x1024 and above with anti-aliasing. Granted, this wasn't necessarily due to the hardware not being up to processing DX9 shaders, but it was an "old technology" card running a next-generation application, and that's exactly what you're going to see with the 8800 and next-generation games like Crysis.

This isn't to say anyone wasted their money on the 8800 series cards... they run current games very, very well, but expecting them to run tomorrow's games equally well is ignorant and ridiculous. I'm sure someone's going to tell me I don't KNOW and I'm just guessing. Sure... but it's an educated guess based on the history of the industry. When there's a change in technology, the first generation of hardware to support that new technology doesn't perform well, and does even worse as software developers begin to exploit the technology to its fullest.

I think you're wrong here. My X800 GT (barely faster than a 9800 Pro) blew away Lost Coast and HL2 with everything on high. I don't see how the 9700 Pro could have sucked completely.

One needn't look far for proof.

More specifically, 40 FPS average (in the worst-performing level in the game) is not fine for a first-person shooter.

Keep in mind, those test results are also "run without Anti-Aliasing or Anisotropic Filtering enabled."

I don't know about anyone else, but I expect eye candy from a top of the line card, so if I'm going to spend $500 on a video card, I'm going to run Anti-aliasing and Anisotropic Filtering.

40fps seems fine to me. That's about all one needs for a smooth playing experience.

Also note: you failed to realize that 12x10 then was like 19x12 is now. Look at 10x7, which was a far more common res back then. The 9700 Pro ran great even at 12x10, but at 10x7...!
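[Editor's note: the "12x10 then was like 19x12 is now" comparison above is roughly borne out by raw pixel counts. A quick sketch (the ratios are simple arithmetic; treating pixel count as a proxy for rendering workload is an illustrative simplification, not a claim from the thread):

```python
# Pixel counts for the resolutions discussed in the thread.
resolutions = {
    "1024x768":  1024 * 768,    # "10x7", the common CRT resolution of the 9700 Pro era
    "1280x1024": 1280 * 1024,   # "12x10", the common early-LCD resolution
    "1920x1200": 1920 * 1200,   # "19x12", a high-end resolution at the time of this thread
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

# The step from 10x7 to 12x10 is ~1.67x the pixels; the step from
# 12x10 to 19x12 is ~1.76x -- a similar relative jump, which is the
# poster's point about what counted as a "demanding" resolution then vs. now.
print(resolutions["1280x1024"] / resolutions["1024x768"])   # ~1.67
print(resolutions["1920x1200"] / resolutions["1280x1024"])  # ~1.76
```

So asking a 9700 Pro to run 12x10 with AA was, in relative pixel-pushing terms, close to asking an 8800 to run 19x12 with AA.]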

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't think many buy cards for future-proofing. Gamers buy cards to provide an adequate gaming experience for the games they play today, as long as the card meets their price/performance ratio and budget. Also, everyone's idea of acceptable framerates/image quality for gaming is different.

I played HL2 on a Radeon 8500 at 800x600 and it was perfectly smooth for me; I enjoyed the game very much with 0 AA and 0 AF in DX8.1 mode. Sure, graphics are important, but gameplay is what makes a game enjoyable in the end. I also finished Gears of War on the 360, which graphically bests any PC game today. But I still consider HL2 at 800x600 the better game regardless.

Games like Red Alert 2, Starcraft, Quake 3, Goldeneye, and Legend of Zelda: Ocarina of Time do not have great graphics by today's standards. But many consider them some of the best their genres have to offer.

From this perspective, it's a lot more important for Crysis to succeed as a game than to be playable with all settings on high at 1920x1200 with AA/AF on a G80 card, for instance. Of course G80 will run Crysis acceptably, considering 95+% of gamers will not have hardware capable of outperforming G80. I am almost certain that, from a marketing perspective, the 8800GTX is catered to consumers with either high income levels or those who spend $$ on gaming as a hobby. Those very consumers will buy the fastest card that's out anyway, and they will not care whether or not G80 will "last" them. This is just my 2 cents.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: RussianSensation
I don't think many buy cards for future-proofing. Gamers buy cards to provide an adequate gaming experience for the games they play today, as long as the card meets their price/performance ratio and budget. Also, everyone's idea of acceptable framerates/image quality for gaming is different.

I played HL2 on a Radeon 8500 at 800x600 and it was perfectly smooth for me; I enjoyed the game very much with 0 AA and 0 AF in DX8.1 mode. Sure, graphics are important, but gameplay is what makes a game enjoyable in the end. I also finished Gears of War on the 360, which graphically bests any PC game today. But I still consider HL2 at 800x600 the better game regardless.

Games like Red Alert 2, Starcraft, Quake 3, Goldeneye, and Legend of Zelda: Ocarina of Time do not have great graphics by today's standards. But many consider them some of the best their genres have to offer.

From this perspective, it's a lot more important for Crysis to succeed as a game than to be playable with all settings on high at 1920x1200 with AA/AF on a G80 card, for instance. Of course G80 will run Crysis acceptably, considering 95+% of gamers will not have hardware capable of outperforming G80. I am almost certain that, from a marketing perspective, the 8800GTX is catered to consumers with either high income levels or those who spend $$ on gaming as a hobby. Those very consumers will buy the fastest card that's out anyway, and they will not care whether or not G80 will "last" them. This is just my 2 cents.


Hey, I intend to keep my G80 for at least 3 years, so what you're saying is not exactly true.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: tanishalfelven

Hey, I intend to keep my G80 for at least 3 years, so what you're saying is not exactly true.

Sorry, I was specifically referring to the 8800GTX for $550. I think it would probably have made more sense for someone to buy an 8800GTS 320MB for $260 today and a 9800GTS for $260 in 1.5 years. The 8800GTS 320MB would probably have provided more than adequate performance for today's games, and the 9800GTS would provide significantly more performance for the remaining 1.5 years of your 8800GTX's ownership.

But you also bring up an important point. You will keep your 8800GTS 320MB for 3 years, which means you are willing to compromise image quality for acceptable smoothness in the future. Those who have an 8800GTX are either not willing to accept the inferior detail level in games today or have enough $ not to care (that is one logical conclusion for the existence of high-end gaming cards - to cater to these types of consumers).

Therefore, how can you argue whether or not the G80 will play Crysis smoothly, when a large determinant is each gamer's equilibrium between image quality and acceptable performance? That's like trying to argue the personal preference of "everyone." You won't be able to reach anything conclusive.

If the OP said he doesn't believe the G80 will be able to play Crysis with all settings on full, one could say, "Most likely, it will not." But trying to argue that the G80 won't be "playable" for Crysis is too vague. What is "playable"? To me it's 800x600 noAA/noAF, but to you it could be 1600x1200 with 8xAA/16xAF.
 

RallyMaster

Diamond Member
Dec 28, 2004
5,581
0
0
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.

I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions, or with the fact that graphics advances will always offer diminishing returns in visual quality.

Well, same reason I question the move to SM3.0 when 2.0 plainly looks just about the same. I honestly noticed no difference going from DX9.0b to 9.0c, other than the fact that my frame rates dropped in some games; I noticed no great improvement in rendering quality.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: tanishalfelven
Originally posted by: Jeff7181
Originally posted by: SPARTAN VI
Originally posted by: Jeff7181
I've said this all along. People got all excited about the 8800 being a DX10 part when the only thing they should really be concerned with is their DX9 performance. I'm willing to bet the 8800 series won't have the balls to run real DX10 games. Just look at how the 9700 Pro performs in HL2... and imagine how it would perform in a game like F.E.A.R.

Hehe, run like crap.

But I don't understand why you chose the HL2 and 9700 Pro comparison. I'd compare a Radeon 9x00 card's performance in a DX8 game against its mundane performance in a DX9 game if I wanted to illustrate how badly a G80 would do in DX10.

The 9700 Pro was the first "good" DX9 card. It was THE top of the line when DX9 was still in its infancy... and it still runs HL2 like crap.

The 9700 Pro was fine for the first wave or so of DX9 games. It didn't really run out of steam until 2005 or so, when later, native DirectX 9 titles came out.

I can't disagree more. The 9700 Pro was never really "fine" for any DX9 game, IMHO. It ran HL2 like crap and it ran Far Cry like crap. It wasn't till the 6800 and X850s came around that you could actually play one of those games at resolutions of 1280x1024 and above with anti-aliasing. Granted, this wasn't necessarily due to the hardware not being up to processing DX9 shaders, but it was an "old technology" card running a next-generation application, and that's exactly what you're going to see with the 8800 and next-generation games like Crysis.

This isn't to say anyone wasted their money on the 8800 series cards... they run current games very, very well, but expecting them to run tomorrow's games equally well is ignorant and ridiculous. I'm sure someone's going to tell me I don't KNOW and I'm just guessing. Sure... but it's an educated guess based on the history of the industry. When there's a change in technology, the first generation of hardware to support that new technology doesn't perform well, and does even worse as software developers begin to exploit the technology to its fullest.

I think you're wrong here. My X800 GT (barely faster than a 9800 Pro) blew away Lost Coast and HL2 with everything on high. I don't see how the 9700 Pro could have sucked completely.

One needn't look far for proof.

More specifically, 40 FPS average (in the worst-performing level in the game) is not fine for a first-person shooter.

Keep in mind, those test results are also "run without Anti-Aliasing or Anisotropic Filtering enabled."

I don't know about anyone else, but I expect eye candy from a top of the line card, so if I'm going to spend $500 on a video card, I'm going to run Anti-aliasing and Anisotropic Filtering.

40fps seems fine to me. That's about all one needs for a smooth playing experience.

Also note: you failed to realize that 12x10 then was like 19x12 is now. Look at 10x7, which was a far more common res back then. The 9700 Pro ran great even at 12x10, but at 10x7...!

I didn't fail to realize anything. 1280x1024 is a very common resolution for LCD displays, and has been for many years. Around that time I was using 1024x768 because I had a CRT monitor and the refresh rate had to be 60 Hz to play at 1280x1024, not to mention the aspect ratio was wrong. But nobody back then paid $400 for a 9700 Pro to play a game at 1024x768 with no AA and no AF, just like nobody today buys an 8800GTX to play a game at 1280x1024 with 2xAA and 8xAF.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.

I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions, or with the fact that graphics advances will always offer diminishing returns in visual quality.
I totally agree with you. DX10 is more efficient than 9, and that fact is straight from Microsoft's mouth. A DX10 game should be faster than its DX9 counterpart, all things being equal. The primary reason DX10 games are so demanding is just that the poly count, model count, texture size, number of shaders, complexity of shaders, number of light sources, etc., have all gone up.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: zephyrprime
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.

I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions, or with the fact that graphics advances will always offer diminishing returns in visual quality.
I totally agree with you. DX10 is more efficient than 9, and that fact is straight from Microsoft's mouth. A DX10 game should be faster than its DX9 counterpart, all things being equal. The primary reason DX10 games are so demanding is just that the poly count, model count, texture size, number of shaders, complexity of shaders, number of light sources, etc., have all gone up.

That doesn't change the fact that the current crop of DX10 cards doesn't have the balls to run next-generation software with all the bells and whistles. Whatever the reason, that's the truth. You WILL NOT play Crysis on an 8800GTX at 1920x1200 with 4xAA and 16xAF, with in-game details set as high as they can go, and see decent frame rates.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Gstanfor
G80 Dx10 being borked is merely the last, desperate fantasy of the fanatics I'm afraid...

You among all of us would know best about fantasyland. You need to be banned from here and I mean pronto. :|

 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Originally posted by: hans007
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.

I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions, or with the fact that graphics advances will always offer diminishing returns in visual quality.

Carmack and the like have said that there is nothing special about DX10 that can't be done in DX9.

I think that was in relation to supporting DX9 in the future, because there is no DX10 on XP, which is a huge installed base.

DX10 is like the one maybe-feature of Vista that would maybe make it worth installing. Maybe.

While Carmack is great, he has some opinions on the general gaming industry that show people need to stop treating him as if he were a god. He is rather idiotic at times in his assumptions and thoughts. He is no economist and shouldn't predict console sales. Also, apparently he doesn't know much about DX10, because every article I have read has stated that a full DX10 game (not one made to run on both 10 and 9) will run at the same graphical quality with more performance when tested on the same hardware. Obviously that makes DX10 better than 9, IMO.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: destrekor
Originally posted by: hans007
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.

I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions, or with the fact that graphics advances will always offer diminishing returns in visual quality.

Carmack and the like have said that there is nothing special about DX10 that can't be done in DX9.

I think that was in relation to supporting DX9 in the future, because there is no DX10 on XP, which is a huge installed base.

DX10 is like the one maybe-feature of Vista that would maybe make it worth installing. Maybe.

While Carmack is great, he has some opinions on the general gaming industry that show people need to stop treating him as if he were a god. He is rather idiotic at times in his assumptions and thoughts. He is no economist and shouldn't predict console sales. Also, apparently he doesn't know much about DX10, because every article I have read has stated that a full DX10 game (not one made to run on both 10 and 9) will run at the same graphical quality with more performance when tested on the same hardware. Obviously that makes DX10 better than 9, IMO.

I agree that nothing done in DX10 can't be done using DX9.

At the same time, I agree with you that the same game on the same hardware should be faster using DX10 vs. DX9. With DX10 there is less driver and OS overhead.

OTOH, I have no doubt in my mind that developers will find a craptastic way to piss away that extra performance.
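[Editor's note: the driver-overhead point is easy to illustrate in miniature. The toy simulation below uses made-up per-call and per-vertex costs, not real D3D9/D3D10 measurements, purely to show how shrinking the fixed per-draw-call overhead (one of DX10's stated goals) frees frame time for actual rendering:

```python
def frame_time_ms(draw_calls, verts_per_call, per_call_overhead_ms, per_vert_cost_ms):
    """CPU-side frame cost: a fixed API/driver overhead per draw call,
    plus work proportional to the geometry actually submitted."""
    return draw_calls * (per_call_overhead_ms + verts_per_call * per_vert_cost_ms)

# Hypothetical numbers: the same scene (2,000 draw calls of 500 verts each),
# but the "DX10-style" API charges one fifth the fixed overhead per call.
dx9_style  = frame_time_ms(2000, 500, per_call_overhead_ms=0.01,  per_vert_cost_ms=0.00001)
dx10_style = frame_time_ms(2000, 500, per_call_overhead_ms=0.002, per_vert_cost_ms=0.00001)

print(f"old API: {dx9_style:.1f} ms/frame")   # 30.0 ms
print(f"new API: {dx10_style:.1f} ms/frame")  # 14.0 ms
```

In this sketch more than half the frame time was API overhead, so the lower-overhead path roughly doubles the headroom; in practice, as the post above predicts, developers tend to spend that headroom on heavier scenes rather than higher framerates.]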
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Originally posted by: Matt2
Originally posted by: destrekor
Originally posted by: hans007
Originally posted by: PingSpike
Originally posted by: PC Surgeon
But wouldn't you guys that bought an 8800 feel cheated? Assuming that it has poor DirectX 10 performance, being touted as a DirectX 10 card and yet you can't play Crysis?

Why is the task of rendering DX10 assumed to be so much different and more complicated than DX9? Of the few DX10 screenshots I've seen (at least the few that obviously weren't a Photoshop job), I haven't seen anything that looks different than the best stuff rendered in DX9.

I see posts about DX10 all the time like it's some sort of magic technology that's going to drastically change everything overnight... which to me doesn't line up with past versions, or with the fact that graphics advances will always offer diminishing returns in visual quality.

Carmack and the like have said that there is nothing special about DX10 that can't be done in DX9.

I think that was in relation to supporting DX9 in the future, because there is no DX10 on XP, which is a huge installed base.

DX10 is like the one maybe-feature of Vista that would maybe make it worth installing. Maybe.

While Carmack is great, he has some opinions on the general gaming industry that show people need to stop treating him as if he were a god. He is rather idiotic at times in his assumptions and thoughts. He is no economist and shouldn't predict console sales. Also, apparently he doesn't know much about DX10, because every article I have read has stated that a full DX10 game (not one made to run on both 10 and 9) will run at the same graphical quality with more performance when tested on the same hardware. Obviously that makes DX10 better than 9, IMO.

I agree that nothing done in DX10 can't be done using DX9.

At the same time, I agree with you that the same game on the same hardware should be faster using DX10 vs. DX9. With DX10 there is less driver and OS overhead.

OTOH, I have no doubt in my mind that developers will find a craptastic way to piss away that extra performance.

Well, yeah. They've got to adapt to the DX10 APIs and how they make calls to the hardware; it may take until the second generation to truly reap the benefits of DX10. However, I have a feeling CryEngine2 may be up to the challenge of being a great DX10 engine.
 