Xbitlab's G71 review!


tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
ahhhh keysplayr2003/rollo/falllenangell/cainam/wreckage/crusader/hemmy vs 5150Joker

Wooo, ATI's days are numbered ;*(

The balance of power has changed dramatically, and 5150Joker was on the NVIDIA side once, when he owned a 7800GTX.

Nvidia bought everyone, it seems...

ATI's days are bad and they aren't getting any better
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: keysplayr2003
Originally posted by: nib95
Originally posted by: keysplayr2003
Originally posted by: BFG10K
Why would adaptive AA make any difference?
Alpha textures.

And that means......???? If you have a link or something explaining Alpha textures, that would be cool. Thanks.

Well I have no idea what it means.
All I know is, with my own two eyes, I saw that with Adaptive AA enabled on my old X1900 XTX, I saw AA even in the distance. That's all I know.

Well, that's a good thing. And is your performance different with and without adaptive AA?


Though in theory there should have been.
The only game I really noticed a performance difference in with Adaptive AA enabled was Counter-Strike. Oddly, COD2 didn't make a difference.
But asking me now means going from memory, because I now own two Nvidia cards.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: 5150Joker
Originally posted by: keysplayr2003
Originally posted by: 5150Joker
Originally posted by: keysplayr2003


Shhhh.... Joker is hard at work here..


It just shows Xbit's newest review results are inconsistent with other reviews out there including their own. Furthermore they failed to test Oblivion with AF because as I pointed out, if they had, nVidia's numbers would be even lower.

I saw no mention of them not using AF. I did see that they disabled FSAA on both cards so they could use HDR.

Quote: "We decided not to test our solutions with enabled FSAA, because the HDR support gets disabled even on ATI cards in this case and the graphics quality drops down significantly."

I read through the oblivion page twice and saw no mention of AF unless I am missing something. Are you just assuming they aren't using AF? Or did you actually read that somewhere and I missed it?



They used their Pure Speed setting which has no AF.

Ahh, thank you. Knew I must have overlooked something.

 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?

I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.

Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?

Driver revision improvements in performance are always a possibility and a frequently used excuse from both sides in this and a bazillion other forums. Much like the way folks are saying the 6.4 Cats will correct the problem for Crossfire and oblivion. So yes, Nvidia can improve almost anything with driver revisions and they have well proven this.
ATI can do the same thing. So I don't really see any merit in your post here.


The "merit in my post", as you would have it, is to point out that trying to pass off what is a pretty hefty beating by saying that future hypothetical driver improvements could produce different results is pointless and, frankly, desperate. The results are what they are. Whether they'd be different with drivers that don't currently exist or not is neither here nor there, especially as - as you agree - there's no reason to believe that both sides can't improve their performance.

I'm not saying that there wouldn't be performance improvements. I'm saying that using hypothetical future improvements to attempt to explain away what is a bad result in one game for one manufacturer is a bogus and desperate tactic which I, personally, "don't really see any merit" in.

We don't even know how long that minimum framerate was sustained. It could have been a nanosecond drop and FRAPS recorded that minimum. So as for now, unless someone can show the duration of that drop in framerate on the 7900GTX card, it is open to interpretation and opinion.

And what did you make of Joker's very informative post about the various review-site discrepancies?


"Could have" - sure it could have. And? People can either take the figures as they are or try to pick holes in them and second-guess them to suit their own particular agenda. What's the point in saying that the frame-rate "could have" just been a nanosecond drop as if that's some kind of argument against the results? From those graphs, it's just as easy to argue that the ATI card stayed at 40FPS the whole time apart from a nanosecond drop to 27, or 22 or whatever. I don't see that it remotely helps anyone to claim that though - it just makes the people making these kinds of hypothetical excuses look ridiculous.

Saying, "oh it might have been a nanosecond drop" isn't "interpretation and opinion" - I just don't see that it brings anything worthwhile to the discussion, and sounds more like spin and PR than anything. What's the point in having reviews at all if we're just going to pull random statements out of the air which make them meaningless?

As for the differences between review sites, personally, I find it hard to compare the Xbit and Extremetech reviews - they seem to quote totally different figures for the same games. Are Xbit listing minimum and maximum in some reviews, versus just the maximum on Extremetech? Extremetech also don't list (as far as I can see) what the various Catalyst A.I/ NV Control panel thingy settings are - quality, performance, adaptive etc. etc. Any changes here could have a big effect on FPS and IQ.

I suppose people have to do what they tend to do, which is either:

a) look at a whole bunch of reviews and see what comes out on top for the games they play. For me, on balance, this is the X1900XT (the fact that it's about £80 cheaper than the only 7900GTXs I can find in stock is also a consideration), especially as I can't see myself playing anything other than Oblivion for quite a while yet . If I was particularly into Quake or something I might have gone for the 7900. I did actually pre-order the 7900GT but cancelled because I saw the XT for not a whole lot more.

b) pick the reviews they read according to whether it shows their favoured manufacturer (which is a weird idea. How can someone who doesn't work for the companies care either way which vendor comes out on top? Just buy the best card, weirdos... ) in the best light or not, and explain away any results which show "their team" losing.
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Also remember this article didn't test with the Oblivion-enhanced ForceWare 84.25; it's using the older 84.21.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Barkotron
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?

I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.

Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?

Driver revision improvements in performance are always a possibility and a frequently used excuse from both sides in this and a bazillion other forums. Much like the way folks are saying the 6.4 Cats will correct the problem for Crossfire and oblivion. So yes, Nvidia can improve almost anything with driver revisions and they have well proven this.
ATI can do the same thing. So I don't really see any merit in your post here.


The "merit in my post", as you would have it, is to point out that trying to pass off what is a pretty hefty beating by saying that future hypothetical driver improvements could produce different results is pointless and, frankly, desperate. The results are what they are. Whether they'd be different with drivers that don't currently exist or not is neither here nor there, especially as - as you agree - there's no reason to believe that both sides can't improve their performance.

I'm not saying that there wouldn't be performance improvements. I'm saying that using hypothetical future improvements to attempt to explain away what is a bad result in one game for one manufacturer is a bogus and desperate tactic which I, personally, "don't really see any merit" in.

We don't even know how long that minimum framerate was sustained. It could have been a nanosecond drop and FRAPS recorded that minimum. So as for now, unless someone can show the duration of that drop in framerate on the 7900GTX card, it is open to interpretation and opinion.

And what did you make of Joker's very informative post about the various review-site discrepancies?


"Could have" - sure it could have. And? People can either take the figures as they are or try to pick holes in them and second-guess them to suit their own particular agenda. What's the point in saying that the frame-rate "could have" just been a nanosecond drop as if that's some kind of argument against the results? From those graphs, it's just as easy to argue that the ATI card stayed at 40FPS the whole time apart from a nanosecond drop to 27, or 22 or whatever. I don't see that it remotely helps anyone to claim that though - it just makes the people making these kinds of hypothetical excuses look ridiculous.

Saying, "oh it might have been a nanosecond drop" isn't "interpretation and opinion" - I just don't see that it brings anything worthwhile to the discussion, and sounds more like spin and PR than anything. What's the point in having reviews at all if we're just going to pull random statements out of the air which make them meaningless?

As for the differences between review sites, personally, I find it hard to compare the Xbit and Extremetech reviews - they seem to quote totally different figures for the same games. Are Xbit listing minimum and maximum in some reviews, versus just the maximum on Extremetech? Extremetech also don't list (as far as I can see) what the various Catalyst A.I/ NV Control panel thingy settings are - quality, performance, adaptive etc. etc. Any changes here could have a big effect on FPS and IQ.

I suppose people have to do what they tend to do, which is either:

a) look at a whole bunch of reviews and see what comes out on top for the games they play. For me, on balance, this is the X1900XT (the fact that it's about £80 cheaper than the only 7900GTXs I can find in stock is also a consideration), especially as I can't see myself playing anything other than Oblivion for quite a while yet . If I was particularly into Quake or something I might have gone for the 7900. I did actually pre-order the 7900GT but cancelled because I saw the XT for not a whole lot more.

b) pick the reviews they read according to whether it shows their favoured manufacturer (which is a weird idea. How can someone who doesn't work for the companies care either way which vendor comes out on top? Just buy the best card, weirdos... ) in the best light or not, and explain away any results which show "their team" losing.


Well, both cards compare very closely to each other. So it looks like we can't go wrong with either choice. Each has its pros and cons. Both are great products.

And driver updates are a very valid consideration. Look how much Nvidia improved F.E.A.R. performance with a driver update; it was previously getting owned by R580 and its shader power. So for you to dismiss the potential of driver improvements makes no sense. Look how much ATI improved OpenGL performance lately. You shouldn't rule these things out.

About the nanosecond minimum frame drop: since you stated "sure it could have", that's enough for me.
Every comment thereafter went against your initial statement of "sure it could have", so it was kind of moot.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: nib95
Originally posted by: keysplayr2003
Originally posted by: nib95
Originally posted by: keysplayr2003
Originally posted by: BFG10K
Why would adaptive AA make any difference?
Alpha textures.

And that means......???? If you have a link or something explaining Alpha textures, that would be cool. Thanks.

Well I have no idea what it means.
All I know is, with my own two eyes, I saw that with Adaptive AA enabled on my old X1900 XTX, I saw AA even in the distance. That's all I know.

Well, that's a good thing. And is your performance different with and without adaptive AA?


Though in theory there should have been.
The only game I really noticed a performance difference in with Adaptive AA enabled was Counter-Strike. Oddly, COD2 didn't make a difference.
But asking me now means going from memory, because I now own two Nvidia cards.

ATI card gone?

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: coldpower27
Also remember this article didn't test with the Oblivion-enhanced ForceWare 84.25; it's using the older 84.21.

Kind of my point. If the 84.25 drivers enhance oblivion performance, they should be tried and tested. Even if it is a beta, it would be good to see any improvements as a hint of performance gains to come. This goes for both companies mind you. The 6.4 CATs are supposed to fix the Xfire issue with oblivion.

 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Originally posted by: Barkotron
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: keysplayr2003
Originally posted by: Barkotron
Originally posted by: Cookie Monster
Well, what do you think?

I think it's a desperate attempt to make a pretty bad beating not look so bad, to be honest.

Do you really think they're going to be able to make up 80% in the minimum frame rates with driver updates? If so, why wouldn't ATI be able to bring performance improvements as well? Why do you think the position is somehow magically going to change after a couple of driver revisions?

Driver revision improvements in performance are always a possibility and a frequently used excuse from both sides in this and a bazillion other forums. Much like the way folks are saying the 6.4 Cats will correct the problem for Crossfire and oblivion. So yes, Nvidia can improve almost anything with driver revisions and they have well proven this.
ATI can do the same thing. So I don't really see any merit in your post here.


The "merit in my post", as you would have it, is to point out that trying to pass off what is a pretty hefty beating by saying that future hypothetical driver improvements could produce different results is pointless and, frankly, desperate. The results are what they are. Whether they'd be different with drivers that don't currently exist or not is neither here nor there, especially as - as you agree - there's no reason to believe that both sides can't improve their performance.

I'm not saying that there wouldn't be performance improvements. I'm saying that using hypothetical future improvements to attempt to explain away what is a bad result in one game for one manufacturer is a bogus and desperate tactic which I, personally, "don't really see any merit" in.

We don't even know how long that minimum framerate was sustained. It could have been a nanosecond drop and FRAPS recorded that minimum. So as for now, unless someone can show the duration of that drop in framerate on the 7900GTX card, it is open to interpretation and opinion.

And what did you make of Joker's very informative post about the various review-site discrepancies?


"Could have" - sure it could have. And? People can either take the figures as they are or try to pick holes in them and second-guess them to suit their own particular agenda. What's the point in saying that the frame-rate "could have" just been a nanosecond drop as if that's some kind of argument against the results? From those graphs, it's just as easy to argue that the ATI card stayed at 40FPS the whole time apart from a nanosecond drop to 27, or 22 or whatever. I don't see that it remotely helps anyone to claim that though - it just makes the people making these kinds of hypothetical excuses look ridiculous.

Saying, "oh it might have been a nanosecond drop" isn't "interpretation and opinion" - I just don't see that it brings anything worthwhile to the discussion, and sounds more like spin and PR than anything. What's the point in having reviews at all if we're just going to pull random statements out of the air which make them meaningless?

As for the differences between review sites, personally, I find it hard to compare the Xbit and Extremetech reviews - they seem to quote totally different figures for the same games. Are Xbit listing minimum and maximum in some reviews, versus just the maximum on Extremetech? Extremetech also don't list (as far as I can see) what the various Catalyst A.I/ NV Control panel thingy settings are - quality, performance, adaptive etc. etc. Any changes here could have a big effect on FPS and IQ.

I suppose people have to do what they tend to do, which is either:

a) look at a whole bunch of reviews and see what comes out on top for the games they play. For me, on balance, this is the X1900XT (the fact that it's about £80 cheaper than the only 7900GTXs I can find in stock is also a consideration), especially as I can't see myself playing anything other than Oblivion for quite a while yet . If I was particularly into Quake or something I might have gone for the 7900. I did actually pre-order the 7900GT but cancelled because I saw the XT for not a whole lot more.

b) pick the reviews they read according to whether it shows their favoured manufacturer (which is a weird idea. How can someone who doesn't work for the companies care either way which vendor comes out on top? Just buy the best card, weirdos... ) in the best light or not, and explain away any results which show "their team" losing.



Ahh, what it's like to be young (as in post count), honest, and uncorrupted. Strange concept, huh?
 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: keysplayr2003
Originally posted by: coldpower27
Also remember this article didn't test with the Oblivion-enhanced ForceWare 84.25; it's using the older 84.21.

Kind of my point. If the 84.25 drivers enhance oblivion performance, they should be tried and tested. Even if it is a beta, it would be good to see any improvements as a hint of performance gains to come. This goes for both companies mind you. The 6.4 CATs are supposed to fix the Xfire issue with oblivion.


Yeah, I have also heard of Catalyst 6.4 helping improve Crossfire performance on ATI cards. It will be interesting to see if Nvidia can improve performance further in Oblivion like they did with F.E.A.R.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: keysplayr2003
Well, both cards compare very closely to each other. So it looks like we can't go wrong with either choice. Each has its pros and cons. Both are great products.

And driver updates are a very valid consideration. Look how much Nvidia improved F.E.A.R. performance with a driver update; it was previously getting owned by R580 and its shader power. So for you to dismiss the potential of driver improvements makes no sense. Look how much ATI improved OpenGL performance lately. You shouldn't rule these things out.

About the nanosecond minimum frame drop: since you stated "sure it could have", that's enough for me.
Every comment thereafter went against your initial statement of "sure it could have", so it was kind of moot.

I didn't rule the driver updates out. What I said was that using non-existent drivers as a basis for trying to dismiss test results is bogus. Which it is.

As for the "sure it could have", you need to read the rest of my post again. My point is that saying "it could have been a nanosecond frame drop" is totally useless pontificating which doesn't lead us anywhere and makes you look ridiculous. If people are just going to come up with bizarre suggestions to explain away results in one vendor's favour instead of taking the results as they are, then there's no point in anyone writing reviews and actually coming out with real, tested figures. We might as well all just make up whatever numbers we like in our heads. Maybe that way all the supporters of ATI or NVidia will get the numbers they're happy with and forums will be spared the idiocy...
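
(As an aside, this particular dispute is one the raw data could settle: FRAPS can log per-frame times, and a few lines of script show whether a reported minimum came from one stray frame or from a sustained slowdown. Below is a minimal sketch, assuming a FRAPS-style frametimes log with one cumulative millisecond timestamp per frame; the filename and exact column layout are placeholders, not anything taken from the Xbit review.)

# Minimal sketch: turn a FRAPS-style frametimes log (cumulative ms per frame)
# into average FPS, worst-single-frame "minimum" FPS, and a 1% low figure.
# If the minimum sits far below the 1% low, the dip was momentary; if they
# are close, the card really was struggling for a sustained stretch.

def load_frame_times_ms(path):
    """Return per-frame durations in ms from a cumulative-timestamp log."""
    stamps = []
    with open(path) as f:
        for line in f:
            parts = line.strip().split(",")
            try:
                stamps.append(float(parts[-1]))  # last column = cumulative ms
            except ValueError:
                continue                         # skip header / blank lines
    return [b - a for a, b in zip(stamps, stamps[1:])]

def summarize(durations_ms):
    durations = sorted(durations_ms)
    n = len(durations)
    avg_fps = 1000.0 * n / sum(durations)
    min_fps = 1000.0 / durations[-1]             # worst single frame
    p99_frame_time = durations[int(0.99 * (n - 1))]
    one_pct_low_fps = 1000.0 / p99_frame_time
    return avg_fps, min_fps, one_pct_low_fps

if __name__ == "__main__":
    avg, mn, low1 = summarize(load_frame_times_ms("frametimes.csv"))
    print(f"avg {avg:.1f} fps, min {mn:.1f} fps, 1% low {low1:.1f} fps")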
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
My thought on this:

1. They should have used high quality mode on both cards. I wouldn't buy a $500 card and run games with lower IQ than the card is capable of.

2. Beta drivers should not be used in any official benchmarks. They may improve performance in some games but lower it in others, and the increased performance may have come from some IQ "optimizations". I still have not seen the FEAR improvement on Nv cards benchmarked using actual gameplay rather than the built-in timedemo. Given that the 7800 series was released almost a year ago, I find it unusual that with the release of the almost identical 7900 series Nv somehow found a magical fix for their FEAR performance that had eluded them all this time.

3. Interesting results from their shader tests. It seems like the 7900gtx is on par with the x1900xtx in most shader tests, despite having fewer shader processors. Also, it's obvious that dynamic branching is a major weakness of Nv cards; the G71 offers no improvement over the G70 in that area except for higher clock speeds.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: keysplayr2003
Originally posted by: coldpower27
Also remember this article didn't test with the Oblivion-enhanced ForceWare 84.25; it's using the older 84.21.

Kind of my point. If the 84.25 drivers enhance oblivion performance, they should be tried and tested. Even if it is a beta, it would be good to see any improvements as a hint of performance gains to come. This goes for both companies mind you. The 6.4 CATs are supposed to fix the Xfire issue with oblivion.

Fine. If they're there they should be tested. If they can make up the 80% minimum frame rate gap at 12x10 without cheating on IQ then the people who wrote them should be given a big ole wodge of cash, and the people who wrote the 84.21s should be taken out back and shot.

From looking around, I don't see any tests yet. There are a bunch of totally conflicting reports in various forums though, from "OMG an extra 20FPS!" to "WTF these drivers broke my computer", which is to be expected, I suppose.

Anyone sensible tested them yet?
 
Sep 6, 2005
135
0
0
Feh. As much as I'd hate to say it, Joker really is becoming the Rollo of ATi at this point, regardless of past sidings.

First off, I won't dismiss possible driver updates. We've seen what's happened with them in the past (ATi's OpenGL enhancement, nVidia's FEAR improvements, etc), so I won't count them out in the future, especially in such popular titles like Oblivion. Sure, ATi can also improve performance here, but hey, better for both companies.

Also, I agree with what Munky said: If you're shelling out 500 dollars, you're not gonna play on anything lower than the highest settings. It seems companies are using the Doom 3 precedent, that being to make it so cards can't (technically) fully utilize a game's maximum settings with current-gen videocards (I believe it was said that 512MB of videocard memory was required for Ultra High in Doom3, although folks seemed to be able to run it just fine with 256 parts), as an excuse to poorly code a game. As good as Oblivion looks, regardless of what videocard, there's no excuse, especially on a HIGH END CARD, for it to drop below 30 AT ALL, even on max settings. Sure, dual-cards may have taken the ultra-highend, but that doesn't mean that single-card users should have to be stuck with sub-par graphical settings.

However, even so, I don't think this review should be disregarded. Results from various other websites have varied in their numbers as well, and since X-bit has proven to be plenty reliable in the past, I think it can be trusted again. Furthermore, I don't believe that nVidia supporters should be castrated just for liking the results here.
 

Barkotron

Member
Mar 30, 2006
66
0
0
Originally posted by: Finny
Feh. As much as I'd hate to say it, Joker really is becoming the Rollo of ATi at this point, regardless of past sidings.

First off, I won't dismiss possible driver updates. We've seen what's happened with them in the past (ATi's OpenGL enhancement, nVidia's FEAR improvements, etc), so I won't count them out in the future, especially in such popular titles like Oblivion. Sure, ATi can also improve performance here, but hey, better for both companies.

Also, I agree with what Munky said: If you're shelling out 500 dollars, you're not gonna play on anything lower than the highest settings. It seems companies are using the Doom 3 precedent, that being to make it so cards can't (technically) fully utilize a game's maximum settings with current-gen videocards (I believe it was said that 512MB of videocard memory was required for Ultra High in Doom3, although folks seemed to be able to run it just fine with 256 parts), as an excuse to poorly code a game. As good as Oblivion looks, regardless of what videocard, there's no excuse, especially on a HIGH END CARD, for it to drop below 30 AT ALL, even on max settings. Sure, dual-cards may have taken the ultra-highend, but that doesn't mean that single-card users should have to be stuck with sub-par graphical settings.

However, even so, I don't think this review should be disregarded. Results from various other websites have varied in their numbers as well, and since X-bit has proven to be plenty reliable in the past, I think it can be trusted again. Furthermore, I don't believe that nVidia supporters should be castrated just for liking the results here.


1. Nobody's dismissing possible driver updates. I, however, am dismissing the idea that possible future driver updates should be used as an excuse for a poor showing on either side. Well, unless there's a very clear driver bug, as there have been in the past with SLI/Crossfire setups, or the renaming of fear.exe in earlier Cats, etc.

2. Why is there a right to expect a minimum of 30FPS just because someone's spent a lot of money on a videocard? Have you actually seen the amount of stuff that goes on outdoors in Oblivion? It looks incredible, and screenshots just don't do it justice. Frankly I'm impressed that frame rates are as high as they are.

3. Xbit seems to be as reliable as other sites out there. Personally I have no problem with the numbers they're putting out - I'm just pointing out that specifically those Oblivion scores are not numbers "NVidia fans" should be shouting about. The ATI card has 80% better minimum FPS @12x10 and nearly 60% better @16x12. That's a big difference at any level, but when the difference is between 27FPS and 15FPS, or 22FPS and 14FPS, then it's the difference between "just about playable" and "severely affecting gameplay". Those scores are a hammering, and no meaningless drivel about "nanosecond frame rate drops" can hide that.

Sure, driver updates can bring big improvements, but until they've been tested, using possible future driver enhancements as an excuse is pure spin.
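
(For reference, the percentages in point 3 come straight from the minimum-FPS figures quoted; a quick check, using the numbers as posted rather than anything re-measured.)

# Relative minimum-FPS gap from the figures quoted above.
for res, ati_min, nv_min in [("12x10", 27, 15), ("16x12", 22, 14)]:
    gap = (ati_min - nv_min) / nv_min * 100
    print(f"{res}: ATI minimum is {gap:.0f}% higher")  # ~80% and ~57%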
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: Finny
Feh. As much as I'd hate to say it, Joker really is becoming the Rollo of ATi at this point, regardless of past sidings.

First off, I won't dismiss possible driver updates. We've seen what's happened with them in the past (ATi's OpenGL enhancement, nVidia's FEAR improvements, etc), so I won't count them out in the future, especially in such popular titles like Oblivion. Sure, ATi can also improve performance here, but hey, better for both companies.

Also, I agree with what Munky said: If you're shelling out 500 dollars, you're not gonna play on anything lower than the highest settings. It seems companies are using the Doom 3 precedent, that being to make it so cards can't (technically) fully utilize a game's maximum settings with current-gen videocards (I believe it was said that 512MB of videocard memory was required for Ultra High in Doom3, although folks seemed to be able to run it just fine with 256 parts), as an excuse to poorly code a game. As good as Oblivion looks, regardless of what videocard, there's no excuse, especially on a HIGH END CARD, for it to drop below 30 AT ALL, even on max settings. Sure, dual-cards may have taken the ultra-highend, but that doesn't mean that single-card users should have to be stuck with sub-par graphical settings.

However, even so, I don't think this review should be disregarded. Results from various other websites have varied in their numbers as well, and since X-bit has proven to be plenty reliable in the past, I think it can be trusted again. Furthermore, I don't believe that nVidia supporters should be castrated just for liking the results here.


lol, have you noticed that no one really defends ATI anymore ;( I remember this forum having more than a few ATI fanboys; now it's basically craploads of Nvidia fanboys vs Joker

What that shows is that ATI has lost control of this forum and now it's Nvidia-infested ; !

Like, for example, if I said something outrageous about NVidia, I bet I'd get flamed, but if keysplayr2003/rollo/falllenangell/cainam/wreckage/crusader/hemmy said something outrageous, I bet the only real dude who would flame them would be Joker.. !

So my point is that Joker really isn't as bad as you think.. it's just that no one is really there to back him up the way the NVIDIA fanboy team does : )

Nvidia seem to be buying everything they need to win. They have a great PR dude hyping up crap and an awesome program like AEG that suits the needs of popular enthusiasts wanting cheaper/free video cards and also cash. They have wholesalers and editors on their side. They have also paid game development teams to make sure games run better on Nvidia cards (id's DOOM 3, Epic's UT2007).
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Indeed, minimum frames are really all that counts and ATI does a much better job there. 20 fps in some games can be playable, probably not enjoyable, but tolerable for short periods. 15 fps or less will be a killer in most games.

It's pretty clear that ATI is the better choice for this generation in terms of performance/IQ/features. But I'm guessing that will all change in June. G71 is a time-filler until G80 comes out.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: tuteja1986

Like, for example, if I said something outrageous about NVidia, I bet I'd get flamed, but if keysplayr2003/rollo/falllenangell/cainam/wreckage/crusader/hemmy said something outrageous, I bet the only real dude who would flame them would be Joker.. !

So my point is that Joker really isn't as bad as you think.. it's just that no one is really there to back him up the way the NVIDIA fanboy team does : )

Well, there is always yourself, Ackmed, Morph and a few others to side with Joker. Just as you are now.

 
Sep 6, 2005
135
0
0
Originally posted by: Barkotron
1. Nobody's dismissing possible driver updates. I, however, am dismissing the idea that possible future driver updates should be used as an excuse for a poor showing on either side. Well, unless there's a very clear driver bug, as there have been in the past with SLI/Crossfire setups, or the renaming of fear.exe in earlier Cats, etc.

2. Why is there a right to expect a minimum of 30FPS just because someone's spent a lot of money on a videocard? Have you actually seen the amount of stuff that goes on outdoors in Oblivion? It looks incredible, and screenshots just don't do it justice. Frankly I'm impressed that frame rates are as high as they are.

3. Xbit seems to be as reliable as other sites out there. Personally I have no problem with the numbers they're putting out - I'm just pointing out that specifically those Oblivion scores are not numbers "NVidia fans" should be shouting about. The ATI card has 80% better minimum FPS @12x10 and nearly 60% better @16x12. That's a big difference at any level, but when the difference is between 27FPS and 15FPS, or 22FPS and 14FPS, then it's the difference between "just about playable" and "severely affecting gameplay". Those scores are a hammering, and no meaningless drivel about "nanosecond frame rate drops" can hide that.

Sure, driver updates can bring big improvements, but until they've been tested, using possible future driver enhancements as an excuse is pure spin.

1. I'm just saying that it can't be discounted that there may indeed be a driver issue at hand here. If there isn't, then some work on the drivers themselves wouldn't hurt too much either (people like to conveniently call it "optimization", but any performance increase in any game is just fine, IMO).

2. Ehh... From my standpoint, I wouldn't ever spend such an insane amount of money for what I believe to be sub-par framerates. Granted, it's no FPS, but I've seen Oblivion in action before myself (granted, it was the 360 version). Sure, it looks good, but it's quite obviously suffering from the same thing that hurt FEAR, CoD2, Black & White 2, and so many other big-name games coming out recently: abysmal coding efficiency. If you can buy a single console for $300 (retail, anyway) that can play with such beauty, then I really can't see any argument that would warrant paying $500 for a single part that doesn't top it with ease. However, both of these cards are plenty powerful; developers simply aren't spending any time making sure the game can run well across all hardware. They seem to be thinking "Hell, SLI can run anything, so it'll be fine!", or something along those lines. As great as SLI may be, it's really hurting the high end of single videocards by giving developers a little too much headroom to work with in graphical efficiency.

3. Well, I for one didn't say that nVidia was on par with ATi here, and I wouldn't call it a hammering per se - that was the 5800 in DX9 - but it's not really something to just scoff at either. Even so, I see why you'd note this, as many did state that they were about equal, when there's really more to it than that. Even if, IMO, averages may be the most important factor (others have already proven that minimums & maximums can be debated till the end of the world), it's still reassuring to know exactly how far one's card is going to drop under stress, and ATi definitively wins in that case.

Anywho, I'm not saying that I'm going to sit here and wait for updates that'll never come; I'm just saying that you can't discount the possibility. If nothing is announced, then there's nothing to debate. However, there are already drivers that are apparently going to help nVidia with Oblivion's performance, so hopefully it'll help mend the situation a tidbit, and make the Oblivion experience on the PC a little less suck-tastic.

Speaking of which, I wanted to say something about Munky's position on driver usage - I kind of agree & disagree at the same time. Some consumers will use any driver that comes out, unafraid of mucking around with driver cleaner & whatnot if anything goes wrong. On the other hand, others will swear by official drivers, wary of any potential problems (haha, alliteration!) caused by untested beta drivers. If I had it my way, I'd test both the latest beta drivers and the official drivers, so as to see how they compare in performance, to see the benefits & deficiencies that come with using a newer beta driver, and to prove whether or not said beta driver actually delivers the promised performance increase. Sure, it's doubtful that any site would do that, but hey, it's a thought.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: Wreckage
Originally posted by: tuteja1986

Like, for example, if I said something outrageous about NVidia, I bet I'd get flamed, but if keysplayr2003/rollo/falllenangell/cainam/wreckage/crusader/hemmy said something outrageous, I bet the only real dude who would flame them would be Joker.. !

So my point is that Joker really isn't as bad as you think.. it's just that no one is really there to back him up the way the NVIDIA fanboy team does : )

Well, there is always yourself, Ackmed, Morph and a few others to side with Joker. Just as you are now.

Quote me in full or die like a man.
Ackmed doesn't do the full-on ATI fanboy thing nowadays... I have seen him recommend Nvidia products many times, and as for Morph, I haven't seen him do ATI fanboy acts like you or Joker have :! It's all in your head...

 
Sep 6, 2005
135
0
0
Originally posted by: tuteja1986
lol, have you noticed that no one really defends ATI anymore ;( I remember this forum having more than a few ATI fanboys; now it's basically craploads of Nvidia fanboys vs Joker

What that shows is that ATI has lost control of this forum and now it's Nvidia-infested ; !

Like, for example, if I said something outrageous about NVidia, I bet I'd get flamed, but if keysplayr2003/rollo/falllenangell/cainam/wreckage/crusader/hemmy said something outrageous, I bet the only real dude who would flame them would be Joker.. !

So my point is that Joker really isn't as bad as you think.. it's just that no one is really there to back him up the way the NVIDIA fanboy team does : )

Nvidia seem to be buying everything they need to win. They have a great PR dude hyping up crap and an awesome program like AEG that suits the needs of popular enthusiasts wanting cheaper/free video cards and also cash. They have wholesalers and editors on their side. They have also paid game development teams to make sure games run better on Nvidia cards (id's DOOM 3, Epic's UT2007).

Huh?! Maybe I'm just completely stupid (that'll be used against me some time, I bet), but I've seen plenty, plenty of other ATi boys out there myself. To replicate the little list thingy there, there's Joker/Ackmed (IIRC, he uses nVidia hardware, but seems to defend ATi like a hawk)/crazydingo/M0RPH/Excalibur (I think that's his name)/etc... The list could go on, but hell, I don't need to offend any more folks by categorizing them all...

As far as I can tell, there may be plenty of nVidia defenders, but the same can be said about ATi. Hell, anyone who's stood up for one company before could be named as a fanboy (A lot of you probably believe I'm an nVidia fanboy right now ). TBH, I really don't mind (So long as their comments don't get too close to becoming merely blindly fighting the enemy), but the constant flaming is what's pissing me off myself. That's why I pointed Joker out; I really used to think he was a pretty good debater (Not sure if that's even a real word, but what the hell), but at this point, he's sounding like another Rollo (As an admitted nVidia fan, I'll admit that whenever he posted, I generally felt embarrassed by his pathetic comments), and we certainly don't need that. The constant "Oh, hi Rollo! Defending nVidia and AEG again, huh?" comments are really getting old.

I won't say that I like anyone's business practices, but this isn't about that; this is about the main product: The video cards. Some may call ATi the AMD of video cards, but, in the end, it's all about the pure performance here. IMO
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: tuteja1986
Originally posted by: Finny
Feh. As much as I'd hate to say it, Joker really is becoming the Rollo of ATi at this point, regardless of past sidings.

First off, I won't dismiss possible driver updates. We've seen what's happened with them in the past (ATi's OpenGL enhancement, nVidia's FEAR improvements, etc), so I won't count them out in the future, especially in such popular titles like Oblivion. Sure, ATi can also improve performance here, but hey, better for both companies.

Also, I agree with what Munky said: If you're shelling out 500 dollars, you're not gonna play on anything lower than the highest settings. It seems companies are using the Doom 3 precedent, that being to make it so cards can't (technically) fully utilize a game's maximum settings with current-gen videocards (I believe it was said that 512MB of videocard memory was required for Ultra High in Doom3, although folks seemed to be able to run it just fine with 256 parts), as an excuse to poorly code a game. As good as Oblivion looks, regardless of what videocard, there's no excuse, especially on a HIGH END CARD, for it to drop below 30 AT ALL, even on max settings. Sure, dual-cards may have taken the ultra-highend, but that doesn't mean that single-card users should have to be stuck with sub-par graphical settings.

However, even so, I don't think this review should be disregarded. Results from various other websites have varied in their numbers as well, and since X-bit has proven to be plenty reliable in the past, I think it can be trusted again. Furthermore, I don't believe that nVidia supporters should be castrated just for liking the results here.


lol, have you noticed that no one really defends ATI anymore ;( I remember this forum having more than a few ATI fanboys; now it's basically craploads of Nvidia fanboys vs Joker

What that shows is that ATI has lost control of this forum and now it's Nvidia-infested ; !

Like, for example, if I said something outrageous about NVidia, I bet I'd get flamed, but if keysplayr2003/rollo/falllenangell/cainam/wreckage/crusader/hemmy said something outrageous, I bet the only real dude who would flame them would be Joker.. !

So my point is that Joker really isn't as bad as you think.. it's just that no one is really there to back him up the way the NVIDIA fanboy team does : )

Nvidia seem to be buying everything they need to win. They have a great PR dude hyping up crap and an awesome program like AEG that suits the needs of popular enthusiasts wanting cheaper/free video cards and also cash. They have wholesalers and editors on their side. They have also paid game development teams to make sure games run better on Nvidia cards (id's DOOM 3, Epic's UT2007).

It's not that Nv has totally taken control of this forum. It's just that some people, including me, got an x1900xt(x) before Nv even released their g71, and as a single card it's still better than the g71, so I really don't care how many trolls spout pro-Nv crap, since I got a card that's faster and technologically more advanced than what they're raving about. Or maybe it's because a certain AEG shill has been keeping his head low these days, so there's less flaming going on.
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
Originally posted by: munky
Originally posted by: tuteja1986
Originally posted by: Finny
Feh. As much as I'd hate to say it, Joker really is becoming the Rollo of ATi at this point, regardless of past sidings.

First off, I won't dismiss possible driver updates. We've seen what's happened with them in the past (ATi's OpenGL enhancement, nVidia's FEAR improvements, etc), so I won't count them out in the future, especially in such popular titles like Oblivion. Sure, ATi can also improve performance here, but hey, better for both companies.

Also, I agree with what Munky said: If you're shelling out 500 dollars, you're not gonna play on anything lower than the highest settings. It seems companies are using the Doom 3 precedent, that being to make it so cards can't (technically) fully utilize a game's maximum settings with current-gen videocards (I believe it was said that 512MB of videocard memory was required for Ultra High in Doom3, although folks seemed to be able to run it just fine with 256 parts), as an excuse to poorly code a game. As good as Oblivion looks, regardless of what videocard, there's no excuse, especially on a HIGH END CARD, for it to drop below 30 AT ALL, even on max settings. Sure, dual-cards may have taken the ultra-highend, but that doesn't mean that single-card users should have to be stuck with sub-par graphical settings.

However, even so, I don't think this review should be disregarded. Results from various other websites have varied in their numbers as well, and since X-bit has proven to be plenty reliable in the past, I think it can be trusted again. Furthermore, I don't believe that nVidia supporters should be castrated just for liking the results here.


lol, have you noticed that no one really defends ATI anymore ;( I remember this forum having more than a few ATI fanboys; now it's basically craploads of Nvidia fanboys vs Joker

What that shows is that ATI has lost control of this forum and now it's Nvidia-infested ; !

Like, for example, if I said something outrageous about NVidia, I bet I'd get flamed, but if keysplayr2003/rollo/falllenangell/cainam/wreckage/crusader/hemmy said something outrageous, I bet the only real dude who would flame them would be Joker.. !

So my point is that Joker really isn't as bad as you think.. it's just that no one is really there to back him up the way the NVIDIA fanboy team does : )

Nvidia seem to be buying everything they need to win. They have a great PR dude hyping up crap and an awesome program like AEG that suits the needs of popular enthusiasts wanting cheaper/free video cards and also cash. They have wholesalers and editors on their side. They have also paid game development teams to make sure games run better on Nvidia cards (id's DOOM 3, Epic's UT2007).

It's not that Nv has totally taken control of this forum. It's just that some people, including me, got an x1900xt(x) before Nv even released their g71, and as a single card it's still better than the g71, so I really don't care how many trolls spout pro-Nv crap, since I got a card that's faster and technologically more advanced than what they're raving about. Or maybe it's because a certain AEG shill has been keeping his head low these days, so there's less flaming going on.

The less people flame each other, the better this forum will be.
This forum is designed for us to help each other out, not kill each other.

 