Originally posted by: Aries64
Rollo, you are shading the numbers and twisting facts by using SLI configurations for comparison. It's not exactly "fair" to compare SLI configurations (current or last-gen) against last-gen single-card ATI hardware, wouldn't you agree?

Originally posted by: Rollo
It should be easy for you to find another post where I've done so, then? I can't remember flaming an ATI driver since the infamous Humus tweaks.

Originally posted by: Sylvanas
Why the negativity? ATI release new drivers every month without fail, and they provide various performance enhancements and improvements. With every new release, Rollo is always the first to tell you how crap they are and to point out anything negative.
They don't. 6800GT SLI, 6800U SLI, 7800GTX, and 7800GTX SLI all smoke all ATI cards in any benchmark, usually by a lot.

If ATI have such 'old GPU tech', why do they more than keep up with NV and often best them in game benchmarks? Keep an open mind.
I'll re-open my mind when I have an R520 in my second rig and it shows itself worthy.
And SLI not working is whose fault?

C'mon BFG, I know you aren't stupid. You know full well that the X850 is only faster than the SLI parts at really low resolutions, or in games where SLI does not fully work.
OK, in that context I tend to agree with you. Nvidia has won the summer by default due to ATI being a no-show. I envy your slew of videocards, you fvcker... (that's my envious side, no offense Rollo).

Originally posted by: Rollo
With SLI/6800s, we are talking about a last-gen product that is over nine months old at this point.

My point was that ATI hasn't been keeping up at the high end for that long; they offer no alternatives.

Last-gen single-card performance was very similar; to me the defining factor was the NV40 chipset and the options it gave you in some of the newer games.
There's no "maybe" about it - my system is definitely GPU-bound at 1,280x1,024 with everything turned on and set to "High". If I drop down to 1,024x768, keeping all other settings the same, my Halo timedemo framerates go from 88+ FPS (1,280x1,024) to 116+ FPS (1,024x768). I know that going to a PCI-E mobo and getting a 7800 GTX would have given me much higher framerates, but I'm waiting until at least Q3 before upgrading my mobo and videocard. When I do, I'll have two gaming PCs I can run on a gigabit LAN. In the meantime, my move from an FX-53 to my FX-57 lets me run a little faster and a lot cooler without overclocking.

Originally posted by: blckgrffn
lol, GPU-bound at 1280x1024? Well, I guess if you want maxed-out, I mean maxed-out, settings at that resolution, maybe...
Nat
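[Editor's note: the GPU-bound claim in the exchange above can be sanity-checked with some back-of-the-envelope arithmetic. The sketch below uses only the framerates quoted in the post; the interpretation is an assumption, not a measurement.]

```python
# Back-of-the-envelope check for GPU-limited rendering: if the frame rate
# rises when the resolution (pixel count) drops, the GPU is a bottleneck.
# The figures below are the Halo timedemo numbers quoted in the post.

def pixels(width: int, height: int) -> int:
    """Pixels rendered per frame at a given resolution."""
    return width * height

fps_at_1280x1024 = 88
fps_at_1024x768 = 116

pixel_ratio = pixels(1280, 1024) / pixels(1024, 768)  # ~1.67x more pixels
fps_ratio = fps_at_1024x768 / fps_at_1280x1024        # ~1.32x more FPS

print(f"pixel ratio: {pixel_ratio:.2f}, FPS ratio: {fps_ratio:.2f}")
# FPS clearly scales with resolution, so the GPU is a limiting factor at
# 1280x1024; that it scales *less* than the pixel count suggests the CPU
# starts to cap the frame rate at the lower resolution.
```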
Originally posted by: Gamingphreek
Not to mention you trade HDR for AA, which is not worth it to everyone, myself included.

Which is a flaw of that implementation of HDR and has nothing to do with Nvidia. They simply support it, which is more than you can say for ATI.

Considering it's not playable, I would say so. Having to drop the resolution to get close to playable is not a viable "feature" to me. Also, for the millionth time, it doesn't look better to everyone. I don't like how Far Cry looks most of the time with HDR.
And SLI not working is whose fault?
Text
X850 XTPE beats 6800U SLI in high-res gaming.
I don't know what planet you're on, but even with a 7800GTX, 1600x1200 is not playable with HDR in Far Cry. Unless you turn down every other setting, which defeats the point.
I never said you had to appreciate it. However, to stress the video card as much as possible and to get the arguably "best"-looking IQ, you want HDR.

I didn't say it wasn't an "IQ-enhancing feature" for everyone. I like how it looks sometimes, and other times I don't. It can look too fake and shiny. I simply pointed out that it's not for everyone, no matter what some people try to shove down others' throats.
Originally posted by: Gamingphreek
And SLI not working is whose fault?
Text
X850 XTPE beats 6800 U SLI in highres gaming.
Ok, STOP and THINK.
Now, don't you think there is a problem when SLI gives you another 5 FPS? Don't you think there could, POSSIBLY, be a bug there? Yeah, that's what I thought. On the second one, why even compare the last-gen cards at 20x15? So the 6800 SLI wins by a small margin - neither card is really playable. Personally, if I was running 20x15, I wouldn't be investing in an X850 or a 6800.

Finally, you do remember that Nvidia did not release any refresh. I would say the year-old product does a good job of keeping up with a much, much higher-clocked component.
-Kevin
Well, you didn't say anything like this "point" in your post before. All you did was sound doubtful that I might be GPU-bound at 1,280x1,024 with everything maxed. And while 88+ FPS is great in a timedemo, I can guarantee that I don't get that framerate in an online 16-player game. On a fast server, when you are playing a map with a full complement of vehicles and full teams, you need a fast videocard(s) to maintain smooth gameplay. Lots of players and vehicles shooting rockets and plasma charges, along with the myriad of other special effects, tax the videocard. As far as screenshots go, no, I can't tell the difference while I'm playing between 8xAF and 16x. I'm too busy playing and trying to stay alive.

Originally posted by: blckgrffn
Well, using Halo as a benchmark is, in my opinion, worse than using any synthetic benchmark. What a horribly coded game. I would also say 88+ FPS is playable and indistinguishable from 116. Furthermore, my point was that, without taking screenshots and comparing them side by side, can you tell the difference between 8xAF and 16x? Or plain old 4xAA and 6x? Supersampling aside, if enabling those extremes shoots your performance down to unplayable levels, then why turn them on? If there is no tangible benefit, what is the point? I always turn all the game settings to high and turn on a little AA and AF to minimize the jaggies and bring texture detail up, and when I can't do that I buy a new card. But running what seem to me just wasteful high settings, for the sake of running high settings, seems foolish.
Yes, my system is well-rounded, but are you telling me you are perfectly satisfied with the performance of your PC, and that you wouldn't like to go faster? C'mon, if you are a true performance enthusiast, "it's never fast enough", is it?

Originally posted by: blckgrffn
I agree that a game should look as cool as possible, no doubt. But there will always be a bottleneck in your system, and I would say what you have right now is incredibly well-rounded. Getting a 7800-series card will just make the CPU your bottleneck, and what fun would that be when there won't be a faster one than yours for who knows how long?
I am limited in my desktop space. This system is used for work as well as play, and with a laser printer, DSL router, telephone, speakers, mouse charger/stand, PDA charger/stand, a laptop, and my keyboard, I barely have enough room to write with my 17" LCD as it is. Current 20" LCDs such as the new Samsung 204T are about the maximum I can fit on my desk - so a big CRT, WS or not, is out of the question. My desk is only so large, and moving it back from the wall is unacceptable.

Originally posted by: blckgrffn
If you have the money, why aren't you playing on an uber-nice widescreen CRT? That would be my next purchase if I had 7800GTX money lying around.
Originally posted by: Gamingphreek
I don't know what planet you're on, but even with a 7800GTX, 1600x1200 is not playable with HDR in Far Cry. Unless you turn down every other setting, which defeats the point.

Link

Looks to me like it does just fine. 56 FPS leaves room for some AF too. You still cannot use AA because of the implementation. Why would you blame Nvidia for the design of that particular implementation of HDR? Just because they worked with the developer doesn't mean they wrote the implementation. Do I think it would have been smarter to use another implementation? Yes. Can you fault Nvidia for the way it is now? No.

I never said you had to appreciate it. However, to stress the video card as much as possible and to get the arguably "best"-looking IQ, you want HDR.

I didn't say it wasn't an "IQ-enhancing feature" for everyone. I like how it looks sometimes, and other times I don't. It can look too fake and shiny. I simply pointed out that it's not for everyone, no matter what some people try to shove down others' throats.
-Kevin
Originally posted by: Ackmed
In that review it does. However, in this one it doesn't: http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page14.asp With an average of 29.9, you'll get single digits from time to time, and lots of frames in the teens. Not playable in my book. I trust FS more, but that's just me.
For you perhaps. AA is also "arguably" better than HDR for different people.
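[Editor's note: the point above about a ~30 FPS average hiding unplayable dips can be illustrated with a quick sketch. The frame-time samples below are hypothetical, chosen only so they average out near the 29.9 FPS figure cited from the FiringSquad review.]

```python
# Why an average near 30 FPS can still feel unplayable: the mean hides
# momentary dips. These frame times (milliseconds per frame) are made-up
# samples whose average lands close to 29.9 FPS.

frame_times_ms = [20, 22, 25, 28, 30, 33, 35, 40, 66, 35]

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds  # frames / elapsed time
min_fps = 1000.0 / max(frame_times_ms)         # slowest single frame

print(f"average: {avg_fps:.1f} FPS, worst frame: {min_fps:.1f} FPS")
# average: 29.9 FPS, worst frame: 15.2 FPS
```

The average looks borderline acceptable, yet the slowest frame is well down in the teens, which is what registers as stutter in play.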
Originally posted by: BFG10K
That's just BS and you know it. The X850 is faster than a 6800U SLI in some cases.

They don't. 6800GT SLI, 6800U SLI, 7800GTX, and 7800GTX SLI all smoke all ATI cards in any benchmark, usually by a lot.

You would know this how? None of those games was in your list of the three you've finished in the last six months.

I didn't know running SC:CT in SM1.1, Riddick without soft shadows, Lego Star Wars without shadows, and Far Cry without HDR was "as good"?
Not to mention that you claimed you run games at 1920x1440 with 8x/4x which would give you what, 10 FPS average on your rig with those settings?
So which is it Rollo?
Are you running low resolution 1999 settings or are you not even using any of the features you continually parrot?
Really.

You know full well that the X850 is only faster than the SLI parts at really low resolution or games where SLI does not fully work.

Yes, but Rollo claims he games at 1920x1440 with 8xAF and 4xAA, which is actually impossible if he's running HDR in Far Cry. Which means either he doesn't run HDR, or he doesn't run at the settings above.

56FPS leaves room for some AF too