Oblivion finally benchmarked!


Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: CaiNaM
Originally posted by: Janooo
Originally posted by: CaiNaM
...

yes, i stated that, as an XTX was reviewed, not an XT.

but using your logic, would it not be true that the GTX could be overclocked as well (they use stock speeds on a BFG GTX)? heck, XFX ships with a 700mhz core (BFG is 670mhz).


Yes, an XT(X) can go higher than 650. So what's your point?

how many? and how high?

how about the GTX, what do they all OC to? should we use the 800mhz GT? where do we stop?

the point is we can speculate all we want. i'm not the one speculating. the review was the review and it uses what it used, and i made statements based on that.


So why do you bring XFX into the picture? The review didn't use it. You are speculating!
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Janooo
Originally posted by: CaiNaM
...

i personally don't like the methodology, but the review still shows that both cards offer a comparable quality/performance curve.

I assume you would see the difference whether the grass shadows are present or not. It means the XT is giving you a more realistic image. That's a quality difference right there.

you shouldn't assume:

The difference between the image quality on the Radeon X1900XTX and GeForce 7900 GTX wasn't noticeable, even with Grass Shadows turned off - all that seemed to do was darken the grass texture a little.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Janooo
So why do you bring XFX into the picture? The review didn't use it. You are speculating!

read the thread. i made statements based on the article. someone else then stated "well, you can overclock an XT to an XTX." i replied, "well, you can OC the GTX as well"..

following? make sense now?


Originally posted by: mazeroth
So CaiNaM, what kind of video card do YOU own? I wonder...

i have a sapphire x1800xt 512 (690/1600mhz) in my main rig and a bfg 7800oc in my second rig (hmm.. perhaps i should say wife's rig, hehe), a 6800GT and 9800AIW in my other 2 pc's, and a 9600 in my laptop.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: CaiNaM
Originally posted by: Janooo
So why do you bring XFX into the picture? The review didn't use it. You are speculating!

read the thread. i made statements based on the article. someone else then stated "well, you can overclock an XT to an XTX." i replied, "well, you can OC the GTX as well"..

following? make sense now?
...

Nevertheless you are speculating. The guy just said that for the price of an XT you can get XTX performance. He wasn't talking about unlimited overclocking. You brought it up.

 

NoDamage

Member
Oct 7, 2000
65
0
0
Originally posted by: CaiNaM

it's not me who doesn't "get it" bud.

it's just hilarious (as well as annoying) to see the "technicalities" you have to reach for in order to make the 1900 superior. kinda like rollo used to do on behalf of nvidia...

like i stated earlier, the x1900xtx seems to have the edge, but for all intents and purposes they offer similar play/IQ.

as for "why" they did it, it's pretty simple. they were reaching for the highest "playable" setting. this isn't a "who has the highest fps" comparison or e-penis measuring stick, rather a hardocp style "best playable setting" comparison which tries to reflect "real life" gaming scenarios.

i personally don't like the methodology, but the review still shows that both cards offer a comparable quality/performance curve.
In this case I don't see how this is reaching for technicalities or fanboyism. In the benchmarks, the X1900XTX shows a marginal (5 fps) lead over the 7900GTX. His point was that the lead would be much more significant if the two cards were run at the same settings (with the same detail level, same AF, and same shadows). From the numbers it is clear that with everything else being equal, the X1900XTX would offer a smoother gameplay experience compared to the 7900GTX.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: NoDamage
In this case I don't see how this is reaching for technicalities or fanboyism. In the benchmarks, the X1900XTX shows a marginal (5 fps) lead over the 7900GTX. His point was that the lead would be much more significant if the two cards were run at the same settings (with the same detail level, same AF, and same shadows). From the numbers it is clear that with everything else being equal, the X1900XTX would offer a smoother gameplay experience compared to the 7900GTX.

the problem is he is using speculation to make one product look significantly better than another. the XTX does indeed offer a bit more performance in whatever particular area they benchmarked, however it's not significant enough to make a lot of real-world difference.

there are other problems.. in this type of game, different areas are going to be limited by different hardware (cpu is a big thing in this game), and the nature of the game doesn't really lend itself well to being a performance benchmark. further, there can be no "apples to apples" comparison as they run different shader paths (sm2.0a for nv, sm2.0b for ati).

regardless of which product you have, you're limited to 1280 with high settings, yet for whatever reason some feel the need to make a big deal out of which is slightly faster by exaggerating the facts.

yes, the XTX in this review is a little faster. nothing gets "stomped" or "owned", and neither offers substantially more gameplay/iq than the other. regardless of which card you have, you can enjoy a pretty decent game. why do some people have to troll and take things out of context and to extremes to make it an nv vs ati issue?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I think I'll wait for a more in-depth review. Preferably by xbitlabs/hothardware/techreport (xbitlabs didn't review the 7 series for some reason, however).
 

nib95

Senior member
Jan 31, 2006
997
0
0
For those of you who have the game, you will realise that grass shadows really affect performance.
So all I can say is kudos to ATI for winning this benchmark.

But on another note, this benchmark means diddly squat.

Again, for those of you with the game, and who follow it on the forums etc:
no two machines run this game exactly the same, even WITH the same hardware.
It is one of the glitchiest, oddest and most peculiar performing games ever.
Until a patch is released no one can really take much heed.

I've seen some 7900 GTX's manage max settings at 1600 x 1200 with high 50/60fps averages, but then another person with nearly exactly the same hardware gets 20fps.
It's just messed up. Without the necessary tweaks and so on, these results are nothing more than a rough indication.

My PC, for instance: one day it will run the same level at a 50fps average (outdoors at 1920 x 1200, max), and the next, I'll load from exactly the same spot and get a 30fps average.
Really weird and hard to explain. Some think it may have to do with a memory leak within the game.

 

NoDamage

Member
Oct 7, 2000
65
0
0
Originally posted by: CaiNaM
Originally posted by: NoDamage
In this case I don't see how this is reaching for technicalities or fanboyism. In the benchmarks, the X1900XTX shows a marginal (5 fps) lead over the 7900GTX. His point was that the lead would be much more significant if the two cards were run at the same settings (with the same detail level, same AF, and same shadows). From the numbers it is clear that with everything else being equal, the X1900XTX would offer a smoother gameplay experience compared to the 7900GTX.

the problem is he is using speculation to make one product look significantly better than another. the XTX does indeed offer a bit more performance in whatever particular area they benchmarked, however it's not significant enough to make a lot of real-world difference.

there are other problems.. in this type of game, different areas are going to be limited by different hardware (cpu is a big thing in this game), and the nature of the game doesn't really lend itself well to being a performance benchmark. further, there can be no "apples to apples" comparison as they run different shader paths (sm2.0a for nv, sm2.0b for ati).

regardless of which product you have, you're limited to 1280 with high settings, yet for whatever reason some feel the need to make a big deal out of which is slightly faster by exaggerating the facts.

yes, the XTX in this review is a little faster. nothing gets "stomped" or "owned", and neither offers substantially more gameplay/iq than the other. regardless of which card you have, you can enjoy a pretty decent game. why do some people have to troll and take things out of context and to extremes to make it an nv vs ati issue?
I agree that it is speculation so we don't really know for sure, but it does seem highly plausible given the reports we have seen from others. I am curious what the differences between the shader paths are, but even with those differences it would still be a good idea to attempt to get as close to a direct comparison as possible. Even if we can't change the fact that one card is using SM2.0a and one is using SM2.0b, we can at least adjust the AF, draw distance, shadows, and other settings so that they are equal on both cards. The point of these benchmarks is to show which card is better at handling this particular game, and consequently which card to recommend to someone who is looking to upgrade specifically for Oblivion. If they are on different shader paths and that is the maximum performance each card can pump out, then so be it.

As for the testing issues, unfortunately I don't think these guys explained the methodology they used to get their numbers. In order to eliminate as many variables as possible, I would have tested the cards in multiple scenes (city, forest, landscape, dungeon, etc.) and then collected the results for each one. Hopefully a more thorough review is forthcoming in which something like this is done.
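A multi-scene, multi-run scheme like the one described above could be sketched roughly like this (a hypothetical Python harness; `run_benchmark` and its canned fps samples are stand-ins, since real numbers would come from a frame-rate logger such as FRAPS):

```python
# Stand-in for an actual benchmark run; real per-frame fps samples would
# come from a frame-rate logger, not from this canned table.
def run_benchmark(scene, run):
    canned = {
        ("city", 0): [31, 28, 35], ("city", 1): [30, 29, 33],
        ("forest", 0): [22, 19, 25], ("forest", 1): [21, 20, 24],
    }
    return canned[(scene, run)]

def summarize(scenes, runs=2):
    """Aggregate average and minimum fps per scene over several runs,
    smoothing out Oblivion's run-to-run variation."""
    results = {}
    for scene in scenes:
        samples = []
        for run in range(runs):
            samples.extend(run_benchmark(scene, run))
        results[scene] = {
            "avg_fps": round(sum(samples) / len(samples), 1),
            "min_fps": min(samples),
        }
    return results

print(summarize(["city", "forest"]))
# {'city': {'avg_fps': 31.0, 'min_fps': 28}, 'forest': {'avg_fps': 21.8, 'min_fps': 19}}
```

Reporting the minimum alongside the average matters here, since the thread's whole argument turns on minimum framerates (12 vs 17 fps) rather than averages.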




 

Madellga

Senior member
Sep 9, 2004
713
0
0
Yes, ATI is better on this game. Then Nvidia is better on others. So what?

Read the reviews, buy what you think is better for you. Rollo was in the wrong here, but even so it is not forbidden to buy a Nvidia card and like it, is it?

I tried both cards and picked the one that was best for me. I prefer a balanced solution, which will play most games well. And a quieter one also, btw.

Some European magazines benchmark several games and give a score at the end that's the average across all tested games. This is a far better system than fighting over scores in specific games.

BTW, I am playing several games at 1920x1200, HQ 4AA and 8AF with a GTX and have absolutely no issues on my rig. Haven't tried Oblivion yet.
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Originally posted by: CaiNaM
Originally posted by: 5150Joker
Originally posted by: CaiNaM
Originally posted by: 5150Joker
Who said anything about image quality? My post was very clear: the XTX has a heavier workload with HQ AF and grass shadows turned on. Set it to standard AF and no grass shadows and the GTX would've gotten stomped on even harder. Your reposting of what bit-tech said just reinforces my point. If there was no noticeable IQ difference with grass shadows turned ON, then they should've disabled them for the XTX as well - of course, if they did that, the poor old GTX would've lost even worse.

like i said, a feature which offers no benefit is hardly a feature... not sure what's so hard to understand about that.


So what's your point? If the grass shadows offer no noticeable IQ gain yet incur a noticeable performance penalty, then they should've been turned off for the XTX as well. The GTX benefited by having them turned off, because it went from a min fps of 12 to 17. How hard is that for you to grasp? Do I need to send smoke signals so it finally gets through to you?
it's not me who doesn't "get it" bud.

it's just hilarious (as well as annoying) to see the "technicalities" you have to reach for in order to make the 1900 superior. kinda like rollo used to do on behalf of nvidia...

like i stated earlier, the x1900xtx seems to have the edge, but for all intents and purposes they offer similar play/IQ.

as for "why" they did it, it's pretty simple. they were reaching for the highest "playable" setting. this isn't a "who has the highest fps" comparison or e-penis measuring stick, rather a hardocp style "best playable setting" comparison which tries to reflect "real life" gaming scenarios.

i personally don't like the methodology, but the review still shows that both cards offer a comparable quality/performance curve.



It's pretty pathetic how much you have to reach to make excuses for nVidia's pitiful performance. nVidia had to resort to emergency drivers, plus the review used much less demanding settings that gave the GTX a 5 fps boost to its minimum framerate, and it STILL lost. On top of that, the GTX is overclocked vs. a standard XTX. People who want the best D3D gaming card, one set to take advantage of shader-heavy titles, have one choice: the X1900 XT/X.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Yup.. Need more OGL titles like Riddick for nV to shine... that was one of my top 3 games of the last couple of years.. Until nV addresses their lack of D3D, shimmering and shader performance, they will continue to rank second best with those informed.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: NoDamage
The point of these benchmarks is to show which card is better at handling this particular game, and consequently which card to recommend to someone who is looking to upgrade specifically for Oblivion.

exactly.

the issue I have is that the verbiage used by some does not reflect that. so does the 7900gtx suck at this game? NO.

the point is they both perform similarly; however, in this particular game the XTX performs a little better. why is that so hard for some people to say? does everything have to be made into a drama?

i have no problem with saying the XTX has an advantage here, but why do ppl have to say "this card stomps that", "not even fair", etc. to make it appear one got its ass handed to it, when that is not what happened?

or if i say something like "well, the XTX does cost more" (which it does), then someone else jumps in with "well, you can overclock an XT to XTX speeds".. so what? you can overclock a GTX as well... i mean, if you want to speculate about that, get an XT and a GTX, overclock them both, then compare. at least then we're not running from one speculation to another.



Originally posted by: 5150Joker
It's pretty pathetic how much you have to reach to make excuses for nVidia's pitiful performance. nVidia had to resort to emergency drivers, plus the review used much less demanding settings that gave the GTX a 5 fps boost to its minimum framerate, and it STILL lost. On top of that, the GTX is overclocked vs. a standard XTX. People who want the best D3D gaming card, one set to take advantage of shader-heavy titles, have one choice: the X1900 XT/X.

what excuse? i haven't made any. i simply quoted what the article stated:

"The BFG Tech GeForce 7900 GTX OC wasn't quite as fast as the Radeon X1900XTX, as we were unable to run the game at its maximum settings. We were able to use almost maximum settings and high quality drivers too, as there were some areas where texture filtering could have been a little better - this was removed when optimisations were removed."

"The main difference was that the card didn't seem to want to achieve reasonably smooth gameplay when Grass Shadows were enabled. After disabling the grass shadowing, the frame rate improved enough to make the game feel reasonably smooth - the minimum frame rate improved from around 12 fps up to 17 fps. The difference between the image quality on the Radeon X1900XTX and GeForce 7900 GTX wasn't noticeable, even with Grass Shadows turned off - all that seemed to do was darken the grass texture a little."


the article concludes both cards perform similarly while offering the same level of image quality. have i stated anything other than that?

it's you who is twisting what they stated, with your own commentary and overexaggerated adjectives, to try and convince others the 7900gtx gets "stomped". i'm not the one with a hidden agenda here who has to resort to misquoting/misrepresenting the article - you are.
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
You keep harping on the image quality difference between the grass shadows being turned on or off, yet nobody is discussing that except you. It's pretty damn clear that the GTX has subpar performance considering:

1. It's an overclocked GTX and not a stock one which would lose even worse.
2. It has grass shadows turned off that results in a shift of the min fps from 12 to 17.
3. It does not have the workload of angle independent AF.


Despite all 3 of the above, it still LOSES to a stock-clocked XTX. That shows the GTX is quite an inferior card to the XTX in newer shader-heavy games. There's no speculation needed: if you removed the extra workload on the XTX, it would get even further ahead of the OVERCLOCKED GTX. A standard GTX would get stomped on even harder. Pretty sad for nVidia.. just proves their 7900 is a disappointment for newer shader-heavy D3D games, as predicted.
 

akugami

Diamond Member
Feb 14, 2005
5,837
2,101
136
Originally posted by: Bull Dog
To the whole mess of people posting above please keep in mind these facts:

A. The 7900GTX is running slightly OVERCLOCKED (a 4.6% core OC and 2.5% memory OC, which probably provides a 2% increase in FPS)
B. The X1900XTX is using 16x HQ AF (this causes a very measurable performance hit)
C. The X1900XTX has grass shadows turned on (on the GTX, turning these off took the minimum FPS from 12 up to 17, a 5 FPS gain)

Cliff notes: The X1900XTX gives the 7900GTX a nice thrashing.

QFT.

Seriously, I'm not talking about image quality or how well the extra grass shadows enhance the overall quality of the graphics; the fact that they cause a noticeable framerate hit means it takes a lot of GPU power to render them. Now, it's pure speculation that the 7900GTX would take an even worse beating if the X1900XTX didn't have grass shadows on, but it's logical speculation given the noticeable performance hit of enabling grass shadows. And as others have stated till they're blue in the face, the X1900XTX has a heavier workload and still beats the 7900GTX, which was slightly overclocked out of the box. I don't even understand why there is an argument, as bit-tech's article seemed pretty fair and they were going after max playable settings rather than the normal gameplay benchmarks.
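The "probably provides a 2% increase in FPS" estimate in the quoted post can be sanity-checked with a quick back-of-the-envelope calculation (a sketch only; the assumption that roughly half the frame time scales with GPU clocks is mine, not a measured figure):

```python
def estimated_fps_gain(core_oc, mem_oc, gpu_bound_fraction=0.5):
    """Very rough fps gain from a factory overclock: assume the
    GPU-clock-bound share of each frame speeds up with the average of
    the core and memory clock bumps, and the rest (CPU, etc.) does not."""
    clock_gain = (core_oc + mem_oc) / 2
    return gpu_bound_fraction * clock_gain

# The quoted figures: a 4.6% core OC and a 2.5% memory OC.
gain = estimated_fps_gain(core_oc=0.046, mem_oc=0.025)
print(f"{gain:.1%}")  # prints "1.8%", in the ballpark of the quoted ~2%
```

If the game were fully GPU-bound (`gpu_bound_fraction=1.0`) the same clocks would suggest about 3.6%, so the quoted ~2% sits between the CPU-limited and GPU-limited extremes.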

Originally posted by: nib95
For those of you who have the game, you will realise that grass shadows really affect performance.
So all I can say is kudos to ATI for winning this benchmark.

But on another note, this benchmark means diddly squat.

Again, for those of you with the game, and who follow it on the forums etc:
no two machines run this game exactly the same, even WITH the same hardware.
It is one of the glitchiest, oddest and most peculiar performing games ever.
Until a patch is released no one can really take much heed.

I've seen some 7900 GTX's manage max settings at 1600 x 1200 with high 50/60fps averages, but then another person with nearly exactly the same hardware gets 20fps.
It's just messed up. Without the necessary tweaks and so on, these results are nothing more than a rough indication.

My PC, for instance: one day it will run the same level at a 50fps average (outdoors at 1920 x 1200, max), and the next, I'll load from exactly the same spot and get a 30fps average.
Really weird and hard to explain. Some think it may have to do with a memory leak within the game.

I don't think the performance differences are a glitch. I read somewhere that Oblivion creates random terrain elements (rocks and other small objects), which could account for the differences in performance: depending on what the game puts up, the number of objects that need to be drawn on screen can differ from run to run. This also means that multiple runs, with a HardOCP-styled benchmark, would be the only way to benchmark such a game if it has dynamic objects being produced that affect fps.

A Google search for "oblivion random object" comes up with a lot of links, most notably the one below from FiringSquad. http://www.firingsquad.com/news/newsarticle.asp?searchid=9465

BTW, this could account for some of the performance differences between the 7900GTX and the X1900XTX, so I wouldn't say this is a 100% sure win for the X1900XTX. Without knowing further details of the testing, such as the exact area they tested, how many runs were made and how long each run was for each card, I can't conclusively say this was a win for the X1900XTX. This game is, after all, not like other games where if you run through the same area the same objects get rendered in the same way every time, regardless of which card you're using. It is conceivable that during the runs for the nVidia cards the game threw up more terrain elements than during the runs for the ATI cards. Conceivable, but highly unlikely.
 

Capt Caveman

Lifer
Jan 30, 2005
34,547
651
126
Originally posted by: mazeroth
it's pretty pathetic when ppl have to grasp at every little straw to try and claim an advantage. while i'd have to agree the xtx has an edge (also cost more, btw), the reality is they offer the same gameplay and same iq (tho you have to "suffer" with a little lighter grass texture if you have a GTX).

Actually, the 7900 GTX is CONSIDERABLY more expensive than the X1900XT. Yes, I know the review used the X1900XTX, but I've never seen an X1900XT that can't reach XTX speeds, ever. The X1900XT can be had for $100 less than a 7900 GTX, so the comparison is even more in favor of ATI. No, I'm not a fan of either, as I have a 7800GT and an X1900XT.

FWIW - I just received my HIS X1900XTX from Monarch today for $430 AR. Significantly lower than the 7900GTX.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
I'm not keen on these kinds of benchmarks... I want a "head to head" comparison, not these kinds of "selective" settings..
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Capt Caveman
FWIW - I just received my HIS X1900XTX from Monarch today for $430 AR. Significantly lower than the 7900GTX.

while i don't consider limited time rebates as overall price, it is true some ati partners have dropped their prices, tho for the most part they are still at $500 and up (not to say you can't find a deal somewhere from time to time if you look hard enough), tho with tight supply it's not like the gtx is any cheaper.

Originally posted by: jim1976
I'm not keen on these kinds of benchmarks... I want a "head to head" comparison, not these kinds of "selective" settings..

i think this is somewhat useful for comparing usable settings between different cards (tho it's still not my favorite testing methodology), but it certainly isn't an accurate way of determining whose e-penis is larger


Originally posted by: 5150Joker
You keep harping on the image quality difference between the grass shadows being turned on or off, yet nobody is discussing that except you. It's pretty damn clear that the GTX has subpar performance considering:

1. It's an overclocked GTX and not a stock one which would lose even worse.
it's a GTX clocked at 670mhz. other brands (XFX is an example) clock at 700mhz. so maybe i should (like you) just take the ball and run, and state the xtx would lose to the 700mhz part (even tho that would be inaccurate on my part, as it would simply be speculation)?

2. It has grass shadows turned off that results in a shift of the min fps from 12 to 17.

so? it makes no diff to IQ.

3. It does not have the workload of angle independent AF.

so? again, it makes no diff to IQ

Despite all 3 of the above, it still LOSES to a stock-clocked XTX. That shows the GTX is quite an inferior card to the XTX in newer shader-heavy games. There's no speculation needed: if you removed the extra workload on the XTX, it would get even further ahead of the OVERCLOCKED GTX. A standard GTX would get stomped on even harder. Pretty sad for nVidia.. just proves their 7900 is a disappointment for newer shader-heavy D3D games, as predicted.

you certainly make plenty of excuses (not to mention speculation).

i simply stated they both have comparable performance and IQ, with an edge in performance to the XTX part. nothing more, nothing less. what is inaccurate about my statement?

and i'm not asking you to make excuses or speculate on other things to make your e-penis feel larger.. while you speculate and hypothesize regarding "what if this was on", "what if this was such and such speed".. just a simple question: what is inaccurate about my statements?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: CaiNaM
Originally posted by: Capt Caveman
FWIW - I just received my HIS X1900XTX from Monarch today for $430 AR. Significantly lower than the 7900GTX.

while i don't consider limited time rebates as overall price, it is true some ati partners have dropped their prices, tho for the most part they are still at $500 and up (not to say you can't find a deal somewhere from time to time if you look hard enough), tho with tight supply it's not like the gtx is any cheaper.

you got a great buy tho

...
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: CaiNaM
i think this is somewhat useful in comparing usable settings between different cards (tho it's still not my favorite testing methodology), but it certianly isn't an accurate way of determining whose e-penis is larger

LOL. Well if you see it this way m8, then every single benchmark is an "e-penis measurement". I would like to see what my x1900xt@xtx can do in a straight comparison with Nvidia's solutions..
And if they didn't see a difference in IQ, why did they use it on the X1900XTX anyway?
You see now why I don't favor these kinds of "selective" benchmarks?
And though it might not make a big difference that the 7900gtx is clocked @ 670, it's still NOT a direct comparison.. It's not excuses, it's not a fair comparison..
 

Frostwake

Member
Jan 12, 2006
163
0
0
quote:
2. It has grass shadows turned off that results in a shift of the min fps from 12 to 17.



so? it makes no diff to IQ.

quote:
3. It does not have the workload of angle independent AF.



so? again, it makes no diff to IQ

Omg... Are you just plain dumb or a troll? IF IT MAKES NO DIFFERENCE, WHY USE IT ON THE X1900 AND NOT ON THE 7900? Don't you get it? It's COMPLETELY stupid and biased.

And HQ AF doesn't make any diff to iq? LOL, right, that's why it's the most sought-after feature on new ati cards.

Seriously, stop trolling already
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: jim1976
LOL. Well if you see it this way m8, then every single benchmark is an "e-penis measurement". I would like to see what my x1900xt@xtx can do in a straight comparison with Nvidia's solutions..

well, really, isn't it? if they both perform similarly and give the user the same "experience", then does it really matter which is 5fps faster than the other?

i mean.. since i have an x1800xt, it's slower than both of em. not like it ruins my day. if they compared it to a 7900gt, it really would not be a big deal to me which is a few fps faster, if both can give similar performance and image quality. if one gave way better performance than the other, then i might switch to the other card, but i'm certainly not gonna get bent out of shape over it.

And if they didn't see a difference in IQ why did they use it in X1900XTX anyway?

i don't know. you would have to ask them. maybe they left it on because it made no difference in performance, maybe they are biased against ati and are nv fanboys? only they would know for certain...

You see now why I don't favor this kind of "selective" benchmarks?
And though it might not make the big diffrence if 7900gtx is clocked @670 it's still NOT a direct comparison.. It's not excuses, it's not a fair comparsion..

then what is? one clocked at 700mhz? they compared a bfg 7900gtx at its shipping clockspeed. what's unfair about that?

at any rate, i don't put too much stock in any single benchmark from a single site. i tend to look at multiple reviews and benchmarks, and read the editorials as well.. then make a decision. often i won't even reach a conclusion until i get to test them myself (which is why i had both an x800 and a 6800 last generation).

Originally posted by: Frostwake
quote:
2. It has grass shadows turned off that results in a shift of the min fps from 12 to 17.



so? it makes no diff to IQ.

quote:
3. It does not have the workload of angle independent AF.



so? again, it makes no diff to IQ

Omg... Are you just plain dumb or a troll? IF IT MAKES NO DIFFERENCE, WHY USE IT ON THE X1900 AND NOT ON THE 7900? Don't you get it? It's COMPLETELY stupid and biased.

And HQ AF doesn't make any diff to iq? LOL, right, that's why it's the most sought-after feature on new ati cards

i have hq af. it makes no difference in oblivion. if it makes no visible difference, then is it in fact really doing "more"? or is that a difficult concept to grasp?

Seriously, stop trolling already

what they state is that both cards perform similarly and give the same IQ. that was the point of the article. their endeavor was not to find out who had the almightiest video card on the planet.

it's YOU guys who have to turn it into a "my card is better than your card" argument. if you don't "get" that, then heh.. you need to look in the mirror and question your own intelligence

and if it really ruins your day that they did that, then complain to them and have them "turn off" shadows on the XTX and publish the results. just explain to them that if they make everything equal and the xtx "stomps" the geforce, it will make you feel much better about yourself
 

Frostwake

Member
Jan 12, 2006
163
0
0
Actually i'm running a dinosaur gf2 mx here lol, but i just got so annoyed that you wouldn't understand the point joker was making - they should have used the same settings on each card if there was no iq diff... I'll wait for some "apples to apples" site to review it though
 

blatherbeard

Member
Mar 31, 2005
55
0
0
Playability is in the eyes (literally) of the beholder. I can play this game on my 6800gt even at 12-17 fps (i actually get 22-50 now with that tweaked driver and the card clocked slightly under an ultra). My eyes can take the low fps; i just hate load times and slowdowns, which i don't get at all now. (all my fades are turned all the way up to max, shadows are off (i don't like the way the shadows look on the people), grass is on but not maxed, and it's still a great looking game "to me".) But i think i'm abnormal when it comes to playing games anyway, cause i know tons of people who can't play games at lower fps like i can. i think it goes back to the days when i had a really, really crappy pc and tried playing duke nukem 3d for hours, just to find out almost at the end that my pc couldn't handle the end-game graphics. lol
 