Catalyst 5.7 available


Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
Why are people with 256MB cards even piping in here? The release notes clearly state that the real benefits of this driver release are for 64MB-128MB cards. Also, the cards must be ATI and not nVidia.
 

blckgrffn

Diamond Member
May 1, 2003
9,198
3,185
136
www.teamjuchems.com
LOL ^^^^ that has got to be, LOL, wow, captain obvious to the rescue, explaining to rollo how the newest cats don't support his SLI setup.... ROFL!!!!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Aries64
Originally posted by: Rollo
Originally posted by: Sylvanas
Why the negativity? ATI release new drivers every month without fail, and they provide various performance enhancements and improvements. With every new release, Rollo is always the first to tell you how crap they are and to point out anything negative.
It should be easy for you to find another post where I've done so, then? I can't remember flaming an ATI driver since the infamous Humus tweaks.

If ATI have such 'Old GPU tech', why do they more than keep up with Nv and often best them in game benchmarks? Keep an open mind.
They don't. 6800GT SLI, 6800U SLI, 7800GTX, and 7800GTX SLI all smoke all ATI cards in any benchmark, usually by a lot.

I'll re-open my mind when I have an R520 in my second rig and it shows itself worthy.
Rollo, you are shading the numbers and twisting facts by using SLI configurations for comparison. Not exactly "fair" comparing SLI configurations (current or last-gen) to last-gen single-card ATI stuff, wouldn't you agree?

With SLI/6800s, we are talking about a last-gen product that is over nine months old at this point.

My point was that ATI hasn't been keeping up at the high end for that long; they offer no alternatives.

Last-gen single-card performance was very similar; to me the defining factor was the NV40 chipset and the options it gave you in some of the newer games.
 

zendari

Banned
May 27, 2005
6,558
0
0
Rollo is correct; ATI hasn't kept up with the high end (>$400) of the market.

C'mon BFG, I know you aren't stupid. You know full well that the X850 is only faster than the SLI parts at really low resolution or games where SLI does not fully work.
And SLI not working is whose fault?

Text

X850 XT PE beats 6800U SLI in high-res gaming.
 

Aries64

Golden Member
Jul 30, 2004
1,030
0
0
Originally posted by: Rollo
Originally posted by: Aries64
Originally posted by: Rollo
Originally posted by: Sylvanas
Why the negativity? ATI release new drivers every month without fail, and they provide various performance enhancements and improvements. With every new release, Rollo is always the first to tell you how crap they are and to point out anything negative.
It should be easy for you to find another post where I've done so, then? I can't remember flaming an ATI driver since the infamous Humus tweaks.

If ATI have such 'Old GPU tech', why do they more than keep up with Nv and often best them in game benchmarks? Keep an open mind.
They don't. 6800GT SLI, 6800U SLI, 7800GTX, and 7800GTX SLI all smoke all ATI cards in any benchmark, usually by a lot.

I'll re-open my mind when I have an R520 in my second rig and it shows itself worthy.
Rollo, you are shading the numbers and twisting facts by using SLI configurations for comparison. Not exactly "fair" comparing SLI configurations (current or last-gen) to last-gen single-card ATI stuff, wouldn't you agree?

With SLI/6800s, we are talking about a last-gen product that is over nine months old at this point.

My point was that ATI hasn't been keeping up at the high end for that long; they offer no alternatives.

Last-gen single-card performance was very similar; to me the defining factor was the NV40 chipset and the options it gave you in some of the newer games.
OK, in that context I tend to agree with you. Nvidia has won the summer by default due to ATI being a no-show. I envy your slew of videocards, you fvcker... (that's my envious side, no offense Rollo).

I am waiting to see what happens when R520 and R580 are released against Nvidia's next card (must be 32 pipes). I also want to see how Crossfire setups work out once there are benches and gamers running them. Although, I'll probably just get the fastest single card since I'm currently limited to 1,280x1,024 on my 172X. I need at least an 8ms, 0.25mm dot pitch, 1,600x1,200 or higher 19" LCD before I can really justify SLI or Crossfire. But I am somewhat GPU-bound with my current card (HIS Excaliber IceQ II X800 XT 256MB PE).
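A quick sanity check on that dot-pitch spec (my own arithmetic, not from any spec sheet), assuming a 4:3 19" panel running 1,600x1,200:

[code]
% dot pitch of a 4:3 19" panel at 1600x1200
\[
\text{diagonal pixels} = \sqrt{1600^2 + 1200^2} = 2000,
\qquad
\text{pitch} = \frac{19 \times 25.4\ \text{mm}}{2000} \approx 0.24\ \text{mm}
\]
[/code]

So roughly 0.24-0.25mm is what such a panel would physically have; a 0.025mm pitch would be ten times finer than any LCD of the era.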



 

Aries64

Golden Member
Jul 30, 2004
1,030
0
0
Originally posted by: blckgrffn

lol, GPU-bound at 1280x1024? Well, I guess if you want maxed-out, I mean maxed-out, settings at that resolution, maybe...

Nat
There's no "maybe" about it - my system is definitely GPU-bound at 1,280x1,024 with everything turned on and set to "High". If I drop down to 1,024x768, keeping all other settings the same, my Halo timedemo framerates go from 88+ FPS (1,280x1,024) to 116+ FPS (1,024x768). I know that going to a PCI-e mobo and getting a 7800 GTX would have given me much higher framerates, but I'm waiting until at least Q3 before upgrading my mobo and videocard. When I do I'll have two gaming PCs I can run on a gigabit LAN. In the meantime, my move from an FX-53 to my FX-57 allows me to run a little faster and a lot cooler without overclocking.
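That resolution-drop comparison is the classic GPU-bound test: cut the pixel count while holding everything else fixed, and see whether the framerate climbs. A minimal sketch of the arithmetic, using the Halo numbers above (variable names are just illustrative):

[code]
#include <cstdio>

// GPU-bound heuristic: drop the pixel count (and nothing else) and see
// whether the framerate rises. Figures are the Halo timedemo numbers above.
int main() {
    const double hiPixels = 1280.0 * 1024.0, hiFps = 88.0;
    const double loPixels = 1024.0 * 768.0,  loFps = 116.0;

    printf("pixel ratio: %.2f, fps ratio: %.2f\n",
           hiPixels / loPixels,  // ~1.67x more pixels at 1280x1024
           loFps / hiFps);       // ~1.32x faster after the drop
    // The framerate clearly scales with resolution (though not 1:1), so the
    // GPU is at least part of the limit; a pure CPU bind would stay flat.
    return 0;
}
[/code]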

Nat, my question to you is: why wouldn't you want to turn on all the eye candy and max out all settings to make the game look as pretty (read "cool") as your system is capable of?
 

blckgrffn

Diamond Member
May 1, 2003
9,198
3,185
136
www.teamjuchems.com
Well, using Halo as a benchmark is, in my opinion, worse than using any synthetic benchmark. What a horribly coded game. I would also say 88+ FPS is playable and indistinguishable from 116. Furthermore, my point was that, without taking screenshots and comparing them side by side, can you tell the difference between 8xAF and 16x? Or plain old 4xAA and 6x? Supersampling aside, if enabling those extremes shoots your performance down to unplayable levels, then why turn them on? If there is no tangible benefit, what is the point? I always turn all the game settings to high and turn on a little AA and AF to minimize the jaggies and bring texture detail up, and when I can't do that I buy a new card. But running what seem to me to be wastefully high settings, just for the sake of running high settings, seems foolish.

I agree that a game should look as cool as possible, no doubt. But there will always be a bottleneck on your system, and I would say what you have right now is incredibly well rounded. Getting a 7800 series card will just make your bottleneck the CPU, and what fun would that be when there won't be a faster one than yours for who knows how long?

If you have the money, why aren't you playing on an uber-nice widescreen CRT? That would be my next purchase if I had 7800GTX money lying around.

Nat
 

coomar

Banned
Apr 4, 2005
2,431
0
0
The 5.7 x64 drivers seem fine; I played CS:S and Far Cry with them. I'll try out the 32-bit version.
 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: Gamingphreek


Not to mention you trade HDR for AA, which is not worth it to everyone, myself included.

Which is a flaw of that implementation of HDR and has nothing to do with Nvidia. They simply support it, which is more than you can say for ATI.

Yes, they support it, and probably had some input since Far Cry is an NV-supported game. The fact is, not everyone likes it better.

Considering it's not playable, I would say so. Having to drop the res to get close to playable is not a viable "feature" to me. Also, for the millionth time, it doesn't look better to everyone. I don't like how Far Cry looks most of the time with HDR.

Well, let's see: if you jump all the way to 16x12 on last-gen cards and enable HDR, then yeah, it won't be playable. However, the 7800s do just fine.
Just because it doesn't look better to you doesn't mean it isn't an IQ-enhancing feature; it just doesn't appeal to all people.


I don't know what planet you're on, but even with a 7800GTX, 1600x1200 is not playable with HDR in Far Cry, unless you turn down every other setting, which defeats the point.

I didn't say it wasn't an "IQ enhancing feature" for everyone. I like how it looks sometimes, and other times I don't. It can look too fake and shiny. I simply pointed out that it's not for everyone, no matter what some people try to shove down others' throats.

But what does it really matter? Who even plays Far Cry anymore? Not anyone I know. The multiplayer sucks. The SP was great, but you can only do it so many times. Other than a few mods out there, it's not worth playing anymore to me.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
And SLI not working is whose fault?

Text

X850 XT PE beats 6800U SLI in high-res gaming.

Ok, STOP and THINK.

Now, don't you think there is a problem when SLI gives you only another 5 FPS? Don't you think there could, POSSIBLY, be a bug there? Yeah, that's what I thought. On the second one, why even compare the last-gen cards at 20x15? So the 6800 SLI wins by a small margin; neither card is really playable. Personally, if I was running 20x15, I wouldn't be investing in an X850 or a 6800.

Finally, you do remember that Nvidia did not release any refresh? I would say the one-year-old product does a good job of keeping up with a much, much higher-clocked component.

-Kevin
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
I don't know what planet you're on, but even with a 7800GTX, 1600x1200 is not playable with HDR in Far Cry, unless you turn down every other setting, which defeats the point.

Link

Looks to me like it does just fine. 56 FPS leaves room for some AF too. You still cannot use AA because of the implementation. Why would you blame Nvidia for the design of that particular implementation of HDR? Just because they worked with the developer doesn't mean they wrote the implementation. Do I think it would have been smarter to use another implementation? Yes. Can you fault Nvidia for the way it is now? No.

I didn't say it wasn't an "IQ enhancing feature" for everyone. I like how it looks sometimes, and other times I don't. It can look too fake and shiny. I simply pointed out that it's not for everyone, no matter what some people try to shove down others' throats.
I never said you had to appreciate it. However, to stress the video card as much as possible and to get the arguably "best"-looking IQ, you want HDR.

-Kevin
 

zendari

Banned
May 27, 2005
6,558
0
0
Originally posted by: Gamingphreek
And SLI not working is whose fault?

Text

X850 XT PE beats 6800U SLI in high-res gaming.

Ok, STOP and THINK.

Now, don't you think there is a problem when SLI gives you only another 5 FPS? Don't you think there could, POSSIBLY, be a bug there? Yeah, that's what I thought. On the second one, why even compare the last-gen cards at 20x15? So the 6800 SLI wins by a small margin; neither card is really playable. Personally, if I was running 20x15, I wouldn't be investing in an X850 or a 6800.

Finally, you do remember that Nvidia did not release any refresh? I would say the one-year-old product does a good job of keeping up with a much, much higher-clocked component.

-Kevin

Are you saying Nvidia has bugs in their drivers? Again, SLI not working is whose fault?
 

Aries64

Golden Member
Jul 30, 2004
1,030
0
0
Originally posted by: blckgrffn
Well, using Halo as a benchmark is, in my opinion, worse than using any synthetic benchmark. What a horribly coded game. I would also say 88+ FPS is playable and indistinguishable from 116. Furthermore, my point was that, without taking screenshots and comparing them side by side, can you tell the difference between 8xAF and 16x? Or plain old 4xAA and 6x? Supersampling aside, if enabling those extremes shoots your performance down to unplayable levels, then why turn them on? If there is no tangible benefit, what is the point? I always turn all the game settings to high and turn on a little AA and AF to minimize the jaggies and bring texture detail up, and when I can't do that I buy a new card. But running what seem to me to be wastefully high settings, just for the sake of running high settings, seems foolish.
Well, you didn't say anything like this "point" in your post before. All you did was sound doubtful that I might be GPU-bound at 1,280x1,024 with everything maxed. And while 88+ FPS is great in a timedemo, I can guarantee that I don't get that framerate in an online 16-player game. On a fast server, when you are playing a map with a full complement of vehicles and full teams, you need a fast videocard (or two) to maintain smooth gameplay. Lots of players and vehicles shooting rockets and plasma charges, along with the myriad of other special effects, tax the videocard. As far as screenshots go, no, I can't tell the difference between 8xAF and 16x while I'm playing. I'm too busy playing and trying to stay alive.

Originally posted by: blckgrffn
I agree that a game should look as cool as possible, no doubt. But there will always be a bottleneck on your system, and I would say what you have right now is incredibly well rounded. Getting a 7800 series card will just make your bottleneck the CPU, and what fun would that be when there won't be a faster one than yours for who knows how long?
Yes, my system is well-rounded, but are you telling me you are perfectly satisfied with the performance of your PC and that you wouldn't like to go faster? C'mon, if you are a true performance enthusiast, "it's never fast enough", is it?

Originally posted by: blckgrffn
If you have the money, why aren't you playing on an uber-nice widescreen CRT? That would be my next purchase if I had 7800GTX money lying around.
I am limited in my desktop space. This system is used for work as well as play, and with a laser printer, DSL router, telephone, speakers, mouse charger/stand, PDA charger/stand, a laptop, and my keyboard, I barely have enough room to write with my 17" LCD as it is. Current 20" LCDs such as the new Samsung 204T are about the maximum I can fit on my desk - so a big CRT, WS or not, is out of the question. My desk is only so large, and moving my desk back from the wall is unacceptable.

 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Originally posted by: Gamingphreek
I don't know what planet you're on, but even with a 7800GTX, 1600x1200 is not playable with HDR in Far Cry, unless you turn down every other setting, which defeats the point.

Link

Looks to me like it does just fine. 56 FPS leaves room for some AF too. You still cannot use AA because of the implementation. Why would you blame Nvidia for the design of that particular implementation of HDR? Just because they worked with the developer doesn't mean they wrote the implementation. Do I think it would have been smarter to use another implementation? Yes. Can you fault Nvidia for the way it is now? No.

I didn't say it wasn't an "IQ enhancing feature" for everyone. I like how it looks sometimes, and other times I don't. It can look too fake and shiny. I simply pointed out that it's not for everyone, no matter what some people try to shove down others' throats.
I never said you had to appreciate it. However, to stress the video card as much as possible and to get the arguably "best"-looking IQ, you want HDR.

-Kevin


In that review it does. However, in this one it doesn't: http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page14.asp With an average of 29.9 FPS, you'll get single digits from time to time, and lots of frames in the teens. Not playable in my book. I trust FS more, but that's just me.

For you perhaps. AA is also "arguably" better than HDR for different people.
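For what it's worth, an average near 30 FPS really can hide single-digit dips, since the mean says nothing about frame-time spikes. A small sketch with made-up frame times (not data from either review):

[code]
#include <algorithm>
#include <cstdio>
#include <vector>

// Hypothetical frame times (ms) engineered to average ~30 fps overall.
int main() {
    std::vector<double> frameMs = {22, 23, 24, 24, 25, 25, 26, 27, 27, 110};

    double totalMs = 0.0;
    for (double ms : frameMs) totalMs += ms;

    double avgFps   = 1000.0 * frameMs.size() / totalMs;
    double worstFps = 1000.0 / *std::max_element(frameMs.begin(), frameMs.end());

    printf("average: %.1f fps, worst frame: %.1f fps\n", avgFps, worstFps);
    // -> average: 30.0 fps, worst frame: 9.1 fps
    // A "30 fps average" run can still stutter into single digits.
    return 0;
}
[/code]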
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Originally posted by: Gamingphreek
I don't know what planet you're on, but even with a 7800GTX, 1600x1200 is not playable with HDR in Far Cry, unless you turn down every other setting, which defeats the point.

Link

Looks to me like it does just fine. 56 FPS leaves room for some AF too. You still cannot use AA because of the implementation. Why would you blame Nvidia for the design of that particular implementation of HDR? Just because they worked with the developer doesn't mean they wrote the implementation. Do I think it would have been smarter to use another implementation? Yes. Can you fault Nvidia for the way it is now? No.

I didn't say it wasn't an "IQ enhancing feature" for everyone. I like how it looks sometimes, and other times I don't. It can look too fake and shiny. I simply pointed out that it's not for everyone, no matter what some people try to shove down others' throats.
I never said you had to appreciate it. However, to stress the video card as much as possible and to get the arguably "best"-looking IQ, you want HDR.

-Kevin


In that review it does. However, in this one it doesn't: http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page14.asp With an average of 29.9 FPS, you'll get single digits from time to time, and lots of frames in the teens. Not playable in my book. I trust FS more, but that's just me.

For you perhaps. AA is also "arguably" better than HDR for different people.


Hmmm. That's odd. I got 37-48 FPS on the four Far Cry demos I tested in HDR.
http://endeavorquest.net:8880/SLI.htm
 

ponyo

Lifer
Feb 14, 2002
19,689
2,811
126
Originally posted by: BFG10K
They don't. 6800GT SLI, 6800U SLI, 7800GTX, and 7800GTX SLI all smoke all ATI cards in any benchmark, usually by a lot.
That's just BS and you know it. The X850 is faster than a 6800U SLI in some cases.

I didn't know running SC:CT in SM1.1, Riddick without soft shadows, Lego Star Wars without shadows, and Far Cry without HDR was "as good"?
And you would know this how? None of those games were in your list of three that you've finished in the last six months.

Not to mention that you claimed you run games at 1920x1440 with 8x/4x which would give you what, 10 FPS average on your rig with those settings?

So which is it Rollo?

Are you running low resolution 1999 settings or are you not even using any of the features you continually parrot?

What? You didn't know Rollo plays SC:CT, Riddick, and Lego Star Wars? He plays them every day, just like he plays that Tomb Raider game he so loves. These four games are the sole reason he has SLI.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: Gamingphreek
I don't know what planet you're on, but even with a 7800GTX, 1600x1200 is not playable with HDR in Far Cry, unless you turn down every other setting, which defeats the point.

Link

Looks to me like it does just fine. 56 FPS leaves room for some AF too. You still cannot use AA because of the implementation. Why would you blame Nvidia for the design of that particular implementation of HDR? Just because they worked with the developer doesn't mean they wrote the implementation. Do I think it would have been smarter to use another implementation? Yes. Can you fault Nvidia for the way it is now? No.

I didn't say it wasn't an "IQ enhancing feature" for everyone. I like how it looks sometimes, and other times I don't. It can look too fake and shiny. I simply pointed out that it's not for everyone, no matter what some people try to shove down others' throats.
I never said you had to appreciate it. However, to stress the video card as much as possible and to get the arguably "best"-looking IQ, you want HDR.

-Kevin


But if I remember correctly, the frame buffer format used for HDR on that hardware doesn't support AA, so the card can't produce it...

So it's not the devs' implementation; it's that the hardware can't physically do it.
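That point can actually be probed programmatically. A sketch, assuming a Windows machine with the DirectX 9 SDK installed: ask the runtime whether the adapter can multisample an FP16 render target, the floating-point surface format this style of HDR renders into. On NV4x/G70-era hardware this check reports no support, which is the hardware limit being described rather than anything the devs chose.

[code]
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

// Can this adapter do 4x MSAA on an FP16 (64-bit float) render target?
int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,        // FP16 surface used for HDR rendering
        FALSE,                       // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,    // 4x multisampling
        &quality);

    printf("4x MSAA on an FP16 target: %s\n",
           SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}
[/code]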
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
You know full well that the X850 is only faster than the SLI parts at really low resolution or games where SLI does not fully work.
Really.

56 FPS leaves room for some AF too
Yes, but Rollo claims he games at 1920x1440 with 8xAF and 4xAA, which is actually impossible if he's running HDR in Far Cry. Which means either he doesn't run HDR or he doesn't run at the settings above.

In either case, his comments are typically deceptive and misleading pro-nVidia pimpage.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Getting back to the drivers, I got an almost 200-point increase in 3DMark05 on my modded 128MB 9800 XT.

Cat 5.3 = 2916
Cat 5.7 = 3107

The weird thing is that 3DMark05 now reports the card as having 256MB of memory, even though it has 128. Anyway, I'm about to build a new rig, so it won't matter much to me anymore.
 