fierydemise
Will someone show me where they're getting the data that there is higher demand for the 7900 than the X1900? People keep saying this, but I haven't seen any facts to back it up.
Originally posted by: fierydemise
Will someone show me where they're getting the data that there is higher demand for the 7900 than the X1900? People keep saying this, but I haven't seen any facts to back it up.
Originally posted by: Zstream
OMG, just shut up. I read all of your responses and you point to IQ without talking about FPS.
It DOES matter whether they used grass shadows; the performance hit is around 5-6%, and when a game averages 15-60 FPS that matters a TON! If you don't believe me, run the FREAKING game with them on and off! If you cannot do this, then try Google! If you still do not understand, then please stop posting.
Originally posted by: Ackmed
Typical biased response. I guess NV was not experiencing high demand back with the 6800U, and ATi was with the X800XT/PE. According to your logic, that's how it works.
As I said before, NV didn't have this problem with the 7800GT/GTX launch. They never sold out, and prices dropped rather than increased due to a lack of cards. Those cards were in very high demand as well, yet none of the current problems existed back then. Thus, your argument is false.
Yeah, I think this is the real question with the benchmarks. Which driver revisions they were using might make a big difference (both the Omega 6.3s and the NGO-optimized 81.25s have reportedly produced performance increases), as well as the Direct3D render-ahead/flip-queue size settings in the drivers, plus any .ini tweaks like bAllow30Shaders, iPreLoadSizeLimit, bUseHardDriveCache, etc. (see the example after the quoted post below).
Originally posted by: hans030390
I didn't know you could bench Oblivion...but I guess they did.
Also, did it mention what drivers they were running? Were they using the optimized ones? If not, that may be why there is a difference in performance/IQ.
About SM3: I think there's an option in the .ini file that lets you run it on an SM3 path. Not sure how that affects IQ or performance.
I think the article isn't completely believable, because usually the 7900GTX performs very closely to the X1900XTX, and this is a bigger gap than I expected.
I guess we'll just have to wait for more benchmarks.
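For reference, here's a minimal sketch of what those Oblivion.ini tweaks look like. The setting names are the ones mentioned above; the values are illustrative guesses, not tested recommendations, so back up your Oblivion.ini before touching anything:

[General]
bUseHardDriveCache=1 ; cache game data to disk (illustrative value)
bAllow30Shaders=1 ; allow the Shader Model 3.0 shader path (illustrative value)
iPreLoadSizeLimit=104857600 ; preload buffer in bytes, ~100 MB here (illustrative value)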
Originally posted by: Ackmed
Oh ok. So it's fine for NV to be sold out, and that means nobody is buying ATi cards, but it doesn't work the other way around. Just about what I expected you to say. The X800XT/PE was sold out, and the 6800U wasn't. Now that the roles have reversed, suddenly it's not the same anymore. Funny, that.
"Where do you think ALL those 7900GT/GTXs went? Down the toilet? Why do you think they are pretty much giving away those expensive X1800XT/X1900XTXs?"
"All"? Could you give us some numbers for these "all"? As I said before, the 7800GT/GTX launch did not have these problems. I'm betting there were far more 7800s available in the first month than 7900s; no facts to back this up, just an observation. And the 7800 series dropped in price in less than a month, just like the X1900 series. I guess they were "pretty much giving away" the 7800 series as well? No. Don't be so ignorant; what's good for the goose is good for the gander, so to speak.
Originally posted by: CaiNaM
i have hq af. it makes no difference in oblivion. if it makes no visible difference, then is it in fact really doing "more"? or is that a difficult concept to grasp?
Originally posted by: Cookie Monster
Bah, I don't really care anymore. It's hard, if not impossible, to have a discussion with someone who is already convinced they're right no matter what.
Here, have a cookie.
Originally posted by: wizboy11
The subtopic said "it wasn't even close". The difference was 3fps.
Seems close to me.
But the ATI card IS using HQ AF, so you kind of have to take that into account. Luckily for ATI, their AF doesn't carry the same performance hit that it does on NV hardware.
Overall, I'd say it was pretty close, with the X1900 coming out a "hair" in the lead. Either way, Oblivion will pretty much make any card chug along, even my 7900GTs in SLI.
Also, can someone explain why, even with my 7900GTs (@ 556/1800), it still chugs and has noticeable slowdowns? Is it because I have everything maxed out? Are soft shadows a big hit in this game?
Originally posted by: 5150Joker
Originally posted by: wizboy11
The subtopic said "it wasn't even close". The difference was 3fps.
Seems close to me.
But the ATI card IS using HQ AF, so you kind of have to take that into account. Luckily for ATI, their AF doesn't carry the same performance hit that it does on NV hardware.
Overall, I'd say it was pretty close, with the X1900 coming out a "hair" in the lead. Either way, Oblivion will pretty much make any card chug along, even my 7900GTs in SLI.
Also, can someone explain why, even with my 7900GTs (@ 556/1800), it still chugs and has noticeable slowdowns? Is it because I have everything maxed out? Are soft shadows a big hit in this game?
Did you read any of this thread? If not, see my post above. The GTX was:
1. OC'd
2. Not using grass shadows, which netted a 41.7% increase in min fps (12 -> 17 fps)
3. Not using angle-independent AF (not possible on NV hardware)
4. Costs $60 more than a stock XTX.
That's a major thrashing for a card that costs more, has bad availability, is forced to use lower settings and STILL loses.
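Just to spell out the math on point 2, using the figures quoted above: going from 12 to 17 fps is (17 - 12) / 12 ≈ 0.417, i.e. roughly a 41.7% improvement in minimum framerate from disabling grass shadows alone.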
Originally posted by: mazeroth
it's pretty pathetic when people have to grasp at every little straw to try and claim an advantage. while i'd have to agree the xtx has an edge (it also costs more, btw), the reality is they offer the same gameplay and same iq (though you have to "suffer" with a slightly lighter grass texture if you have a GTX).
Actually, the 7900 GTX is CONSIDERABLY more expensive than the X1900XT. Yes, I know the review used the X1900XTX, but I've never seen an X1900XT that couldn't reach XTX speeds, ever. The X1900XT can be had for $100 less than a 7900 GTX, so the comparison tilts even further in ATI's favor. No, I'm not a fan of either side; I have a 7800GT and an X1900XT.
Originally posted by: Yreka
Originally posted by: mazeroth
it's pretty pathetic when people have to grasp at every little straw to try and claim an advantage. while i'd have to agree the xtx has an edge (it also costs more, btw), the reality is they offer the same gameplay and same iq (though you have to "suffer" with a slightly lighter grass texture if you have a GTX).
Actually, the 7900 GTX is CONSIDERABLY more expensive than the X1900XT. Yes, I know the review used the X1900XTX, but I've never seen an X1900XT that couldn't reach XTX speeds, ever. The X1900XT can be had for $100 less than a 7900 GTX, so the comparison tilts even further in ATI's favor. No, I'm not a fan of either side; I have a 7800GT and an X1900XT.
Mine wouldn't reach XTX speeds with the stock HSF. It has the 1.1ns memory too; I was limited by temps.
Originally posted by: CaiNaM
Originally posted by: 5150Joker
The X1900 XTX had grass shadows enabled in addition to HQ AF (something nVidia can't even reproduce), and it STILL beat out the 7900 GTX in Oblivion; that's pretty significant. If they had done an apples-to-apples comparison, turning off grass shadows for the XTX and using standard AF, it would've stomped the 7900 GTX; the GTX went from a minimum of 12 fps to 17 fps just by shutting off grass shadows.
a feature with no benefit is hardly a feature:
The difference between the image quality on the Radeon X1900XTX and GeForce 7900 GTX wasn't noticeable, even with Grass Shadows turned off - all that seemed to do was darken the grass texture a little.
it's pretty pathetic when people have to grasp at every little straw to try and claim an advantage. while i'd have to agree the xtx has an edge (it also costs more, btw), the reality is they offer the same gameplay and same iq (though you have to "suffer" with a slightly lighter grass texture if you have a GTX).
Originally posted by: Clauzii
Originally posted by: CaiNaM
Originally posted by: 5150Joker
The X1900 XTX had grass shadows enabled in addition to HQ AF (something nVidia can't even reproduce), and it STILL beat out the 7900 GTX in Oblivion; that's pretty significant. If they had done an apples-to-apples comparison, turning off grass shadows for the XTX and using standard AF, it would've stomped the 7900 GTX; the GTX went from a minimum of 12 fps to 17 fps just by shutting off grass shadows.
a feature with no benefit is hardly a feature:
The difference between the image quality on the Radeon X1900XTX and GeForce 7900 GTX wasn't noticeable, even with Grass Shadows turned off - all that seemed to do was darken the grass texture a little.
it's pretty pathetic when people have to grasp at every little straw to try and claim an advantage. while i'd have to agree the xtx has an edge (it also costs more, btw), the reality is they offer the same gameplay and same iq (though you have to "suffer" with a slightly lighter grass texture if you have a GTX).
And given that the X1900XTX runs at a higher resolution, your statement(s) seem like babble to me :roll:
Originally posted by: zzzvideocardzzz
Originally posted by: nib95
Well, I'm running mine with nearly everything maxed at 1920x1200, and my average frame rate is 60 fps.
60-160 fps in dungeons, 40-100 fps in cities, and 40-60 fps outdoors.
SLI FTW!
And exactly how many people can afford 600 dollars? Plus the extra money for an SLI mobo and an SLI-capable PSU? And we all know how badly SLI setups do in the minimum-framerate department. Just buy an X1900XT for around 400 dollars now and OC it to the max, FTW.
Originally posted by: Cookie Monster
Originally posted by: Ackmed
Typical biased response. I guess NV was not experiencing high demand back with the 6800U, and ATi was with the X800XT/PE. According to your logic, that's how it works.
As I said before, NV didn't have this problem with the 7800GT/GTX launch. They never sold out, and prices dropped rather than increased due to a lack of cards. Those cards were in very high demand as well, yet none of the current problems existed back then. Thus, your argument is false.
When the 6 series was released alongside the X series, LOTS of people thought ATi had the edge. The success of the 9700/9800 cards gave ATi a great reputation, along with other things like better IQ. No, the 6800U wasn't as successful as the GT variant, but the whole 6 series was the first step to redemption for NV after the FX series lost them a lot of market share, reputation, AND money (they had to convince many that their product wasn't a failure like last time). Back then, MANY people had an ATi card, unlike some (Rollo, for example, although he had a reason).
Notice why NV didn't release the NV47 (G70) against the X850 refreshes? NV had spent a year stockpiling "7800" chips. They also had SLI, so releasing "refreshes" would have made no sense. Even with the demand, they never sold out, because they had the quantity to back it up.
As of now, MANY people think NV is better than ATi. Performance is the ONLY thing that matters to the average Joe. Only you and I, plus the rest of the hardware folks, talk about OCing, IQ, etc. The 7800GT, along with the hard launch of the 7800GTX, pretty much delivered what the average Joe wanted, whereas ATi delayed its launch and had problems.
As I was saying, NV can produce twice as many chips per wafer as ATi can R580s, thanks to the die shrink plus the decrease in transistor count. There have been NO reports of heat issues NOR power leakage. NV has now moved fully to 90nm, from low end to high end. They are getting good yields, so the question is: why are they out of stock already?
There are two possible explanations. One is high demand. The second is that they are having production problems. (A third could be that the partners didn't buy enough chips.) I doubt the second, because we would know about it by now. I'm guessing it's high demand, based on the current trend in market share/capitalisation and what NV is doing right now.
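As a rough sanity check on that "twice as much per wafer" claim, using die sizes that were widely reported at the time (ballpark figures, not confirmed numbers): G71 is said to be around 196 mm² versus roughly 352 mm² for R580. A 300 mm wafer has about 70,000 mm² of area, so ignoring edge loss and yield that's roughly 70,000 / 196 ≈ 357 G71 candidates versus 70,000 / 352 ≈ 198 R580 candidates per wafer, about 1.8x, which is at least in the same ballpark as "twice as much".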
"Typical bias response."
Typical what? bias? Think about it. Where do you think ALL those 7900GT/GTXs went? Down the toliet? Why do you think they are pretty much giving away those expensive X1800XT/X1900XTX?