Geforce 7800GTX Confirmed Specs + Picture


Zenoth

Diamond Member
Jan 29, 2005
5,196
197
106
It was a good gamble for nVidia to include SM 3.0 in their latest generation.

And it worked, because most of the people who bought, or want to buy, a 6800 series GeForce GPU do so with the excuse of "investment", so that they'll have SM 3.0 the day it's actually in use, fully or not.

In my opinion, nVidia fooled them big time, and got their money.

By the time games use SM 3.0 fully, or even partially, the new generation will be out, and their "investment" excuse will fall into darkness. Technically speaking, there are what, one, two, or three games that use SM 3.0 OFFICIALLY, WITHOUT modifying game files? I know there is Painkiller, with a patch, and that's it.

And having SM 3.0 is not, by itself, a reason to buy a game.

You buy a game because its core features and setting interest you. The supported visual features are pure extras.

But again ... people fall for that. And give money blindly. In my humble opinion.

I wanted to buy one of ATi's X800 series GPUs, and then I told myself "oh, SM 3.0, huh? Hmmm... why not".

But after more thinking I said to myself "oh c'mon... no mass support, and barely any official support for SM 3.0 so far, and that UltraShadow II feature is only good for two games, including Doom 3. I don't shell out $300 to $500 for the sake of being able to run two or three games at their highest settings, and then stop playing them because they're usually, ultimately, linear and boring, unless there is a good multi-player mode".

Really ...

So I stayed with my 9800 Pro, which does the job extremely well, at good resolutions.

I'm waiting for R520 out of curiosity, first. Then, I might hype myself once it is officially revealed and benchmarked.

I'm not saying nVidia GPUs are "crap" overall. No. I'm just saying that even though they're good, they feature stuff that people fall for, just like prey walking straight into a trap. They got your money with your "investment" reasoning. They got it.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
While I agree that it's nice for nVidia to innovate, innovation for the sake of innovation is pointless. Intel has done that and very few people have gone for it: they tried to innovate with DDR2 and that hasn't done much for performance, and they tried to make a new standard with BTX and no one's biting. SM3 is a better innovation; SM3 is being used and does do some things for image quality, but it's not widely utilized enough (I can only come up with 5 games) to be worth the extra cost (in my opinion). nVidia was right to innovate to SM3, but until it's more widely utilized (next-gen games) it doesn't seem worth it.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Zenoth
It was a good gamble for nVidia to include SM 3.0 in their latest generation.
Yes, it was, because it's always better to have the MS standard than not.

And it worked, because most of the people who bought, or want to buy, a 6800 series GeForce GPU do so with the excuse of "investment", so that they'll have SM 3.0 the day it's actually in use, fully or not.
Only a fool "invests" in video cards; they depreciate FAST.

In my opinion, nVidia fooled them big time, and got their money.
There are other reasons to buy nVidia cards: better OpenGL, better Linux support, SLI, HDR. The people who are fooled are the ones buying current-gen ATI with its 2003 feature set plus 3Dc.

By the time games use SM 3.0 fully, or even partially, the new generation will be out, and their "investment" excuse will fall into darkness. Technically speaking, there are what, one, two, or three games that use SM 3.0 OFFICIALLY, WITHOUT modifying game files? I know there is Painkiller, with a patch, and that's it.
Splinter Cell: Chaos Theory is SM3/SM1.1 only. Far Cry, Pacific Fighters, and Pitfall are SM3. Guess you're wrong.

And having SM 3.0 is not, by itself, a reason to buy a game.
This makes no sense.

You buy a game because its core features and setting interest you. The supported visual features are pure extras.
SM3, other than in Splinter Cell, currently offers no visual differences, only performance.

But again ... people fall for that. And give money blindly. In my humble opinion.

I wanted to buy one of ATi's X800 series GPUs, and then I told myself "oh, SM 3.0, huh? Hmmm... why not".

But after more thinking I said to myself "oh c'mon... no mass support, and barely any official support for SM 3.0 so far, and that UltraShadow II feature is only good for two games, including Doom 3. I don't shell out $300 to $500 for the sake of being able to run two or three games at their highest settings, and then stop playing them because they're usually, ultimately, linear and boring, unless there is a good multi-player mode".

I guess you forgot about Prey, Quake 4, and Wolfenstein 2, which will all be based on the Doom 3 engine, and probably others?

Really ...

So I stayed with my 9800 Pro, which does the job extremely well, at good resolutions.
If you're happy with low settings on modern games, you can save a lot of money. A lot of us like to see the new features and play at higher settings than 10X7.

I'm waiting for R520 out of curiosity, first. Then, I might hype myself once it is officially revealed and benchmarked.
Word from devs is that G70 owns it.

I'm not saying nVidia GPUs are "crap" overall. No. I'm just saying that even though they're good, they feature stuff that people fall for, just like prey walking straight into a trap. They got your money with your "investment" reasoning. They got it.
I bought my SLI setup to have my games look more realistic than you can imagine using a 9800 Pro.

 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
That is one fat card, but it'll shrink. This is just a prototype; like the others before it, they're long at first, then smaller when released.
 

cmp1223

Senior member
Jun 7, 2004
522
0
0
Zenoth, nobody buys a graphics card as any sort of investment. First, understand that the people who buy $300+ graphics cards are a small market segment of very dedicated gamers, many of whom are glad to pay a premium not only for general performance, but also for neat features such as SM3.0. There will always be a next generation, and every product on earth markets itself as bigger and better. I bought my BFG 6800nu for around $250 with HL2 and Far Cry. I unlocked it to 16 pipes and 6 vertex processors. I couldn't be happier with my so-called investment, especially with the crop of games we had this year and what's to come next year, which I'm sure it can handle.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Zenoth
Better engineering or not, the fact is... within the latest generation of GPUs from nVidia and ATi, the winner in terms of overall power consumption is clearly ATi. It's not an opinion; it's proven in black and white in reviews on the web. Especially among the very high end GPUs.

My guess, and my hope, is that ATi keeps the pace.

Personally, overall power consumption is important to me. Maybe it's not to others, but in my book, it is. So, so far, it's ATi for me.

http://www.anandtech.com/video/showdoc.aspx?i=2290&p=2

In terms of power consumption the X850 XT PE breaks new records for single card power consumption as it drew more power than its predecessor as well as NVIDIA's GeForce 6800 Ultra

http://www.gamepc.com/labs/view_content.asp?id=x850xtpt&page=12&cookie%5Ftest=1

...the Radeon X850 XT Platinum consumes quite a lot of power, in fact, it consumes the most power of any single consumer level graphics card to date...
 

Avalon

Diamond Member
Jul 16, 2001
7,567
156
106
Originally posted by: nitromullet
Originally posted by: Zenoth
Better engineering or not, the fact is... within the latest generation of GPUs from nVidia and ATi, the winner in terms of overall power consumption is clearly ATi. It's not an opinion; it's proven in black and white in reviews on the web. Especially among the very high end GPUs.

My guess, and my hope, is that ATi keeps the pace.

Personally, overall power consumption is important to me. Maybe it's not to others, but in my book, it is. So, so far, it's ATi for me.

http://www.anandtech.com/video/showdoc.aspx?i=2290&p=2

In terms of power consumption the X850 XT PE breaks new records for single card power consumption as it drew more power than its predecessor as well as NVIDIA's GeForce 6800 Ultra

http://www.gamepc.com/labs/view_content.asp?id=x850xtpt&page=12&cookie%5Ftest=1

...the Radeon X850 XT Platinum consumes quite a lot of power, in fact, it consumes the most power of any single consumer level graphics card to date...

Harrrr!
 

Zenoth

Diamond Member
Jan 29, 2005
5,196
197
106
Burned?

Wait, you think I got somewhat "owned", by facts no less?

I still prefer ATi GPUs.

I do NOT "hate" nVidia. I can consider their products; in fact I almost did, like I said. I wanted to get a 6800 GT, especially for its SM 3.0 feature, because that feature on its own DOES interest me. The thing is, there is not enough support for it at the moment, and this is not an opinion, but a fact.

As for the X850 XT consuming more power than any single GPU on the market, yes, I know. It's actually an EXCEPTION. Generally speaking, ATi IS better at producing GPUs that consume less power.

I feel like you guys searched for facts just to make me close my rookie mouth, huh?

I still strongly believe ATi does a superb job.

Only a fool "invests" in video cards; they depreciate FAST.

Then I think you should be surprised, because there are many "fools" around who ARE buying the latest generation nVidia GPUs over any ATi GPUs BECAUSE of SM 3.0, exactly because ATi's do not feature it. With the number of posts you have, I presume you knew this. You have surely seen people around here deciding to buy nVidia solutions over ATi's specifically because of that feature, and treating it as a good reason for the higher price over the X800 XL, for example.

Splinter Cell: Chaos Theory is SM3/SM1.1 only. Far Cry, Pacific Fighters, and Pitfall are SM3. Guess you're wrong.

As far as Far Cry goes, IF I am not mistaken (and I think I'm not, since I take the risk of writing this and then getting "burned" by a fact), it doesn't support it "officially". You must change something within the game's files; I'm sure I read that somewhere. What I mean by "official" is: you open the game's video options and you can select yes or no, that's it. That's the kind of "support" I'm talking about. From the developers. Not something like "OMG OMG OMG I FOUND OUT HOW TO USE SM 3.0 / HDR IN FAR CRY BUT YOU MUST CHANGE THE FOLLOWING...".

This makes no sense.

In your head perhaps, but I've read (and sadly, I actually know someone who did this) that there are rare (granted) cases of people buying Painkiller along with their newly bought 6800 GPU just so they could see the patched-in SM 3.0 effect that the game didn't originally support. Of course, that's an extreme case, but my point is... there ARE fools around, see? Those are good prey for nVidia at the moment. Yeah, yeah... it's rare. OK, that's alright. I don't feel burned for much.

If you're happy with low settings on modern games, you can save a lot of money. A lot of us like to see the new features and play at higher settings than 10X7.

Did you own, or do you own, a 9800 Pro? People still under-estimate it these days. Who says I'm playing my games on "low settings"? What about playing Half-Life 2 at 1280x1024, without any AA, but with 8x AF? At a minimum (so far, from what I've played) of around 45 frames per second, and as high as 60? Yes, with a 9800 Pro. The type of game doesn't matter. My point here is... I don't play at "low settings". Unless you personally consider what I mentioned to be "low"; then that's in your book. And yes, I actually agree with the rest of your sentence.

Word from devs is that G70 owns it.

And words from the "vets" of this board make the specs of G70 not completely official. My point is not that it's actually "crap"; it might even be better than what we saw. What I mean is: what's your point? What does "owns" mean in your head? Let's presume the highest G70 version gets 15 FPS more than the highest R520 version in a specific game benchmark at 1600x1200 (pure example). And let's pretend the FPS involved were 100 for G70 and 85 for R520. You call that "owned"? One thing is quite sure: ATi will support SM 3.0 in the next generation, so feature-wise it will be even, or at least near it. Anyway, if you want, give me your own personal definition of "owned".

I bought my SLI to have my games look more realistic than you can imagine usinga 9800Pro?

More realistic than I can even imagine? Then I've got news for you. If your GPU(s) are capable of producing image quality and an overall visual presentation that meets my imagination, then they are certainly NOT 6800 GPUs of any variant, even in SLI mode.

Seriously, what's more realistic about compressing more pixels into a higher resolution? The polygon count remains the same on your Doom marine as on mine. But because it's a higher resolution with fewer "jaggies", it makes things more realistic? Alright, well, if that's how you see it, I can't comment. It won't change how you see things.

Man ...

C'mon guys, it's not like I'm here to show off or anything. Jesus, first of all I wanted to post opinions.

I LIKE BOTH.

And whenever people post things here, automatically some long-time members, vets or not, counter everything they say. I didn't want to start a debate here.

Yes, I did post stuff without facts to back it up; that means it's opinion, with a little dose of ignorance too. Happy now? Sleep well.

I might consider G70, I might consider R520.

Noticed the first sentence of my first post in this thread? I still do not believe the specs are true.

So I will wait.

One thing is for sure: I am not interested in the current generation of GPUs. I'm happy with my 9800 Pro.

And later this summer or this fall, I will be the one here saying things like "I can play my games with a more realistic-looking picture and overall visuals than most of you here with your mere 6800 or X800 or X850 GPUs", just for the hell of it and to get it off my chest. Heh...
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: Insomniak
Originally posted by: Zenoth
Better engineering or not, the fact is... within the latest generation of GPUs from nVidia and ATi, the winner in terms of overall power consumption is clearly ATi. It's not an opinion; it's proven in black and white in reviews on the web. Especially among the very high end GPUs.

My guess, and my hope, is that ATi keeps the pace.

Personally, overall power consumption is important to me. Maybe it's not to others, but in my book, it is. So, so far, it's ATi for me.


That would all be well and good if ATi would push new ideas out the door as often as Nvidia does. The problem is, they follow NV most of the time.

TWIMTBP -> GITG
HardWare TnL -> GeForce was 1st
SLi -> AMR


There are other examples. I like ATi, but I wish they'd innovate every now and again.


TWIMTBP has nothing to do with engineering innovation. It's a marketing campaign. If you threw enough money at the game devs, they would plaster your logo on splash screens as well.

Hardware TnL: the Rendition V2200 was the first.

SLI was the product of 3dfx. Not to mention we have yet to see AMR in action, so at this point you can hardly say one is superior to the other. And ATI has had an SLI solution for professional graphics workstations for years; they're just now making it available for the consumer market.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: fierydemise
While I agree that it's nice for nVidia to innovate, innovation for the sake of innovation is pointless. Intel has done that and very few people have gone for it: they tried to innovate with DDR2 and that hasn't done much for performance, and they tried to make a new standard with BTX and no one's biting. SM3 is a better innovation; SM3 is being used and does do some things for image quality, but it's not widely utilized enough (I can only come up with 5 games) to be worth the extra cost (in my opinion). nVidia was right to innovate to SM3, but until it's more widely utilized (next-gen games) it doesn't seem worth it.

It ISN'T nVidia's innovation, it's Microsoft's. You may not have noticed, but DirectX 9.0c launched at the same time NV40 and R420 launched. One of those chips had the foresight to support the new standard while also being strong in the old standard (SM2.0 and below), while the other ignored the new standard altogether, milked the old standard for all it was worth (who does that benefit, the consumer or the company?), put out an infamous .pdf file rubbishing "nVidia's SM3.0" (funny how they are using SM3.0 in the Xbox 2 GPU...), and stated that SM3.0 was "worthless until we decide to support it"...
 
Jun 14, 2003
10,442
0
0
Originally posted by: Rudee
Originally posted by: Zenoth
I do not believe those specifications are true.

But if they are, then I am definitely waiting for ATi's R520.

Also, ATi is better at producing superb GPUs with lower power consumption than nVidia. And, personally, that's important to me. Less power consumption means less heat, more room for over-clocking, or just better stability potential at stock speeds.

I agree. ATI cards are better cards in terms of engineering IMHO.


Hmm, I don't know, I've been more impressed by the quality of my 6800GT than by any ATI card I've owned; the cooler on the 9700/9800 series just never seemed substantial enough to me.

I've had a few nVidia cards, and barring the stripped-down MSI TNT2, which was just the Bronx of video cards, I've found NV to be better. But I guess with nVidia it's more about the AIB makers, since nVidia don't produce the actual cards.

And as far as this round goes, nVidia's high end has overclocked much better than ATI's. My friend's X800XT won't budge 5MHz, yet another guy I know has his 6800U at 465MHz on air. Everyone I know with a GT routinely gets 400+ too, so I guess it's just a game of chance.

My Leadtek 6800GT is a solid card, with a lot of thought and engineering put into it, but then again maybe that's just Leadtek.
 

Emultra

Golden Member
Jul 6, 2002
1,166
0
0
Does the entire 6x00 series support SM3.0? Including my 6600GT?

I rely on the competition to remind either nVidia or ATi that they need to do better, whenever one of them is failing. I've used two nVidia cards in a row, and if, when the time comes to upgrade, ATi happens to be the better choice, I'll go that way.
Why cheat yourself out of something good?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Zenoth
Better engineering or not, the fact is... within the latest generation of GPUs from nVidia and ATi, the winner in terms of overall power consumption is clearly ATi. It's not an opinion; it's proven in black and white in reviews on the web. Especially among the very high end GPUs.

My guess, and my hope, is that ATi keeps the pace.

Personally, overall power consumption is important to me. Maybe it's not to others, but in my book, it is. So, so far, it's ATi for me.

Errrr, yeah. When I look at video card specs, how much power they consume is always first on my list....


What the heck? Who cares about power consumption?

Personally, I don't know if I've even looked at an electric bill in the last 10 years.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Emultra
Does the entire 6x00 series support SM3.0? Including my 6600GT?

I rely on the competition to remind either nVidia or ATi that they need to do better, whenever one of them is failing. I've used two nVidia cards in a row, and if, when the time comes to upgrade, ATi happens to be the better choice, I'll go that way.
Why cheat yourself out of something good?

Yes, they all do SM3.0 as defined by the DirectX 9.0c specification.

The 64-bit version of the 6200 won't do OpenEXR, and the whole 6200 family lacks the compression technology present on higher-end 6x00 GPUs that speeds up depth/Z-culling and antialiasing.

Other than that, though, all members of the 6x00 family are feature-complete. The differences lie in the number of quads, the number of vertex pipes, the GPU bus width, and the memory technology.
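For anyone wondering how a game actually decides this at runtime: it doesn't need to know the GPU by name, it just queries the Direct3D caps the driver reports. A minimal sketch of the standard DirectX 9 caps query (assuming the DirectX 9 SDK headers and d3d9.lib; error handling trimmed):

```cpp
// Minimal sketch: ask Direct3D 9 whether the installed GPU exposes
// Shader Model 3.0 (both vertex and pixel shaders at version 3.0).
// Build against the DirectX 9 SDK and link with d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        bool sm3 = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0)
                && caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
        std::printf("Shader Model 3.0 supported: %s\n", sm3 ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}
```

This is why "SM3.0 support" is a yes/no matter of the DirectX 9.0c caps a driver reports, independent of how many quads or vertex pipes a particular 6x00 part has.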
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Zenoth
Burned?

Wait, you think I got somewhat "owned", by facts no less?
Dude, you put up a strange "anti new card" speech; I just pointed out the flaws.

I still prefer ATi GPUs.
Today-why? Tomorrow-maybe?


As for the X850 XT consuming more power than any single GPU on the market, yes, I know. It's actually an EXCEPTION. Generally speaking, ATi IS better at producing GPUs that consume less power.
Who cares? One costs $.03 a day to run, the other $.05?
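Those per-day figures are about the right order of magnitude. A back-of-the-envelope sketch, with assumed numbers that are not from the thread (a ~30 W difference between cards, 8 hours of gaming a day, $0.10 per kWh):

```cpp
// Back-of-the-envelope cost of a GPU power-draw difference.
// All three inputs are assumptions for illustration, not measured values.
#include <cstdio>

int main()
{
    const double watts_difference = 30.0;  // assumed extra draw of the hungrier card (W)
    const double hours_per_day    = 8.0;   // assumed daily gaming time (h)
    const double dollars_per_kwh  = 0.10;  // assumed electricity rate ($/kWh)

    double kwh_per_day  = watts_difference * hours_per_day / 1000.0;
    double cost_per_day = kwh_per_day * dollars_per_kwh;

    std::printf("Extra cost per day: $%.3f\n", cost_per_day);  // ~$0.024
    return 0;
}
```

Under those assumptions the gap works out to roughly two cents a day, so the point stands either way.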

I feel like you guys searched for facts just to make me close my rookie mouth, huh?
Nope, but if you post things that are questionable here, they'll get questioned.

I still strongly believe ATi does a superb job.
They do. Being number 1 or 2 in the world at what you do is amazing.

Only a fool "invests" in video cards; they depreciate FAST.

Then I think you should be surprised, because there are many "fools" around who ARE buying the latest generation nVidia GPUs over any ATi GPUs BECAUSE of SM 3.0, exactly because ATi's do not feature it. With the number of posts you have, I presume you knew this. You have surely seen people around here deciding to buy nVidia solutions over ATi's specifically because of that feature, and treating it as a good reason for the higher price over the X800 XL, for example.
It's one of the justifications of the higher price, not the only one.

Splinter Cell: Chaos Theory is SM3/SM1.1 only. Far Cry, Pacific Fighters, and Pitfall are SM3. Guess you're wrong.

As far as Far Cry goes, IF I am not mistaken (and I think I'm not, since I take the risk of writing this and then getting "burned" by a fact), it doesn't support it "officially". You must change something within the game's files; I'm sure I read that somewhere. What I mean by "official" is: you open the game's video options and you can select yes or no, that's it. That's the kind of "support" I'm talking about. From the developers. Not something like "OMG OMG OMG I FOUND OUT HOW TO USE SM 3.0 / HDR IN FAR CRY BUT YOU MUST CHANGE THE FOLLOWING...".
I believe Far Cry runs SM3 by default with patch 1.3? HDR does need to be switched on.

This makes no sense.

In your head perhaps, but I've read (and sadly, I actually know someone who did this) that there are rare (granted) cases of people buying Painkiller along with their newly bought 6800 GPU just so they could see the patched-in SM 3.0 effect that the game didn't originally support. Of course, that's an extreme case, but my point is... there ARE fools around, see? Those are good prey for nVidia at the moment. Yeah, yeah... it's rare. OK, that's alright. I don't feel burned for much.
You keep talking about people being nVidia's "prey"? WTH? No one is any worse off buying an nVidia card; they get more for their money. Better OpenGL, better Linux support, SM3, soft stencil shadows, HDR, and SLI capability are worth cash, and in many cases nVidia cards are comparably priced with ATI's.

If you're happy with low settings on modern games, you can save a lot of money. A lot of us like to see the new features and play at higher settings than 10X7.

Did you own, or do you own, a 9800 Pro? People still under-estimate it these days. Who says I'm playing my games on "low settings"? What about playing Half-Life 2 at 1280x1024, without any AA, but with 8x AF? At a minimum (so far, from what I've played) of around 45 frames per second, and as high as 60? Yes, with a 9800 Pro. The type of game doesn't matter. My point here is... I don't play at "low settings". Unless you personally consider what I mentioned to be "low"; then that's in your book. And yes, I actually agree with the rest of your sentence.
I had a 9700Pro and a 9800 Pro the month they came out, and am well familiar with their performance.

Word from devs is that G70 owns it.

And words from the "vets" of this board make the specs of G70 not completely official. My point is not that it's actually "crap"; it might even be better than what we saw. What I mean is: what's your point? What does "owns" mean in your head? Let's presume the highest G70 version gets 15 FPS more than the highest R520 version in a specific game benchmark at 1600x1200 (pure example). And let's pretend the FPS involved were 100 for G70 and 85 for R520. You call that "owned"? One thing is quite sure: ATi will support SM 3.0 in the next generation, so feature-wise it will be even, or at least near it. Anyway, if you want, give me your own personal definition of "owned".
I agree with what you say here to some extent.

I bought my SLI setup to have my games look more realistic than you can imagine using a 9800 Pro.

More realistic than I can even imagine? Then I've got news for you. If your GPU(s) are capable of producing image quality and an overall visual presentation that meets my imagination, then they are certainly NOT 6800 GPUs of any variant, even in SLI mode.
OK, so imagination is a tall order, but you can trust me that the settings I start with in all games (16x12 with 4xAA/8xAF) look way better than whatever you're running new games at.

Seriously, what's more realistic about compressing more pixels into a higher resolution? The polygon count remains the same on your Doom marine as on mine. But because it's a higher resolution with fewer "jaggies", it makes things more realistic? Alright, well, if that's how you see it, I can't comment. It won't change how you see things.
More pixels DO change how you see things: they add visible detail. An image in the distance at 10x7 might have 10 pixels in it; at 16x12 it would have about 24. That may be the difference between being able to tell what it is and not.
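The arithmetic behind that, as a quick sketch: 1600x1200 has about 2.44 times the pixels of 1024x768, and an object covering a fixed fraction of the frame scales its pixel count by the same factor:

```cpp
// Pixel-count scaling between two resolutions. A distant object that
// covers a fixed fraction of the frame gets ~2.44x the pixels at 16x12.
#include <cstdio>

int main()
{
    const double low_w  = 1024.0, low_h  = 768.0;   // "10x7"
    const double high_w = 1600.0, high_h = 1200.0;  // "16x12"

    double scale = (high_w * high_h) / (low_w * low_h);
    std::printf("Pixel count scale factor: %.2fx\n", scale);          // ~2.44x
    std::printf("A 10-pixel distant object becomes ~%.0f pixels\n",
                10.0 * scale);                                        // ~24
    return 0;
}
```

So a 10-pixel blob becomes roughly a 24-pixel one, which can indeed be the difference between a smudge and a recognizable shape.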

Man ...

C'mon guys, it's not like I'm here to show off or anything.
Personally, I wasn't thinking that. It looked more to me like you were trying to attack nVidia for innovating, and to justify not buying a current-gen card yourself.

And whenever people post things here, automatically some long-time members, vets or not, counter everything they say. I didn't want to start a debate here.
Unfortunately, if you're going to call us current-gen nVidia purchasers "preyed-upon ignorants", we may take exception to that.

One thing is for sure: I am not interested in the current generation of GPUs. I'm happy with my 9800 Pro.
Like I said, if you can live with the compromises, you'll definitely be richer.

And later this summer or this fall, I will be the one here saying things like "I can play my games with a more realistic-looking picture and overall visuals than most of you here with your mere 6800 or X800 or X850 GPUs", just for the hell of it and to get it off my chest. Heh...
And you think the people here won't buy the next-gen cards because...? I have news for you: this is an enthusiast board, and a lot of us buy all the new cards. All you'll be able to post to us is "I play the same as you".

 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: ZimZum
TWIMTBP has nothing to do with engineering innovation. It's a marketing campaign. If you threw enough money at the game devs, they would plaster your logo on splash screens as well.

Hardware TnL: the Rendition V2200 was the first.

SLI was the product of 3dfx. Not to mention we have yet to see AMR in action, so at this point you can hardly say one is superior to the other. And ATI has had an SLI solution for professional graphics workstations for years; they're just now making it available for the consumer market.


TWIMTBP is developer relations in addition to marketing: support with fixing driver bugs, education on how to work with NV cards, etc.

The Rendition V2200 is nice, but I meant TnL that actually worked.

Yes, we all know the idea of linking consumer graphics boards together was done by 3dfx originally (I had a Voodoo2 SLI setup back in '98), but the point is it died, NV brought it back, and now ATi is following suit.

What I'm getting at is that between the two major players right now, one takes substantially more risks than the other, and that's what's needed.

 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Once again, it's pointed out that NV cards can do effects in a few select games that ATi's cards can't. And again it's not mentioned that you lose AA in some of them, and that you lose a huge chunk (often 75%) of your total frames.

But hey, let's just point out the positives and not the negatives.

edit,

Originally posted by: Rollo

Nope, but if you post things that are questionable here, they'll get questioned.

Kinda like this?

Originally posted by: Rollo

I bought my SLI setup to have my games look more realistic than you can imagine using a 9800 Pro.

Um... that's so exaggerated and "questionable" it is not even funny. The image quality difference is only in a few games, and "more realistic" is not totally accurate.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Avalon
Originally posted by: nitromullet
Originally posted by: Zenoth
Better engineering or not, the fact is... within the latest generation of GPUs from nVidia and ATi, the winner in terms of overall power consumption is clearly ATi. It's not an opinion; it's proven in black and white in reviews on the web. Especially among the very high end GPUs.

My guess, and my hope, is that ATi keeps the pace.

Personally, overall power consumption is important to me. Maybe it's not to others, but in my book, it is. So, so far, it's ATi for me.

http://www.anandtech.com/video/showdoc.aspx?i=2290&p=2

In terms of power consumption the X850 XT PE breaks new records for single card power consumption as it drew more power than its predecessor as well as NVIDIA's GeForce 6800 Ultra

http://www.gamepc.com/labs/view_content.asp?id=x850xtpt&page=12&cookie%5Ftest=1

...the Radeon X850 XT Platinum consumes quite a lot of power, in fact, it consumes the most power of any single consumer level graphics card to date...

Harrrr!


I believe the term is "mowed down".

He took that argument outside for a little talk with his AK-47.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
The problem with those power consumption results is that not ONE other test backs them up. There are several of these articles, from different hardware sites, and they ALL have different results, and not just by a little.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: Ackmed
The problem with those power consumption results is that not ONE other test backs them up. There are several of these articles, from different hardware sites, and they ALL have different results, and not just by a little.

Might be better if you posted some of those contradictory results.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Once again, it's pointed out that NV cards can do effects in a few select games that ATi's cards can't. And again it's not mentioned that you lose AA in some of them, and that you lose a huge chunk (often 75%) of your total frames.

But hey, let's just point out the positives and not the negatives.

edit,

Originally posted by: Rollo

Nope, but if you post things that are questionable here, they'll get questioned.

Kinda like this?

Originally posted by: Rollo

I bought my SLI setup to have my games look more realistic than you can imagine using a 9800 Pro.

Um... that's so exaggerated and "questionable" it is not even funny. The image quality difference is only in a few games, and "more realistic" is not totally accurate.


Must be in the mood to be p3wned some more.....

The fact of the matter, Ackie, is that he can't run any modern games (Doom 3, HL2, Far Cry, Riddick, etc.) at 16x12 with 4xAA/8xAF, and I can run them all at that setting. Having more pixels results in more realistic rendering, whether you like it or not.

Maybe you should go back to finding one ATI card that has lower power consumption than a nVidia card and pointing out how "we can't say ALL, because here's one ATI card that uses less power, and that's IMPORTANT!" :roll:
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Rollo
Originally posted by: Ackmed
Once again, it's pointed out that NV cards can do effects in a few select games that ATi's cards can't. And again it's not mentioned that you lose AA in some of them, and that you lose a huge chunk (often 75%) of your total frames.

But hey, let's just point out the positives and not the negatives.

edit,

Originally posted by: Rollo

Nope, but if you post things that are questionable here, they'll get questioned.

Kinda like this?

Originally posted by: Rollo

I bought my SLI setup to have my games look more realistic than you can imagine using a 9800 Pro.

Um... that's so exaggerated and "questionable" it is not even funny. The image quality difference is only in a few games, and "more realistic" is not totally accurate.


Must be in the mood to be p3wned some more.....

The fact of the matter, Ackie, is that he can't run any modern games (Doom 3, HL2, Far Cry, Riddick, etc.) at 16x12 with 4xAA/8xAF, and I can run them all at that setting. Having more pixels results in more realistic rendering, whether you like it or not.

Maybe you should go back to finding one ATI card that has lower power consumption than a nVidia card and pointing out how "we can't say ALL, because here's one ATI card that uses less power, and that's IMPORTANT!" :roll:

I wonder if Ackmed can point us to a post of his that gives Nvidia its props. That would be interesting. I know I can find a few billion that defend ATI. I defend Nvidia not because I favor them, and not because I use them (I use both), but because good ol' folks like Acky here enjoy digging at them so much.

Can you point us to a post of yours, Ackmed, that actually shows you not defending ATI or deterring anyone from an nVidia card? I can show you plenty of posts where I recommend an ATI card over an NV card for whatever reason.

 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: Rollo

Must be in the mood to be p3wned some more.....

The fact of the matter, Ackie, is that he can't run any modern games (Doom 3, HL2, Far Cry, Riddick, etc.) at 16x12 with 4xAA/8xAF, and I can run them all at that setting. Having more pixels results in more realistic rendering, whether you like it or not.

Maybe you should go back to finding one ATI card that has lower power consumption than a nVidia card and pointing out how "we can't say ALL, because here's one ATI card that uses less power, and that's IMPORTANT!" :roll:

You must be in the mood to act 12 again?

Yes, you can run games at higher settings than he can, but the difference is far from "more realistic than you can imagine", like you claimed. That's just silly talk, and you know it. And you sure can't run them at those settings with the touted features you like to talk about; they are not very playable, and you can't always even use AA with them.

And I didn't say anything about the power consumption, except that there are several tests that all have different results.

Here are a few:
http://www.spodesabode.com/content/article/6800upower
http://graphics.tomshardware.com/graphic/20040414/geforce_6800-19.html
http://www.xbitlabs.com/articles/video/display/ati-vs-nv-power_10.html
http://www.xbitlabs.com/articles/video/display/ati-vs-nv-power_2.html

As you can see, everyone has a different result, and they don't all use the same testing method. Who's to say who is right?

Originally posted by: keysplayr2003

I wonder if Ackmed can point us to a post of his that gives Nvidia its props. That would be interesting.

I have done it several times. I have said that I am very glad NV made an SM3.0 card, or else we wouldn't have any games that use it. I just don't think today's cards are fast enough to use those features with playable frames.

I have also said several times that, to me, SLI is one of the two biggest things to hit PC gaming.

What is funny is that you claim I deter people away from NV cards, which is not true. What I do is point out (since Rollo never does) that these new features he always talks about tank your frames and don't always allow AA. He never says anything bad about these features, so who is deterring whom?
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Ackmed
Originally posted by: Rollo

Must be in the mood to be p3wned some more.....

The fact of the matter, Ackie, is that he can't run any modern games (Doom 3, HL2, Far Cry, Riddick, etc.) at 16x12 with 4xAA/8xAF, and I can run them all at that setting. Having more pixels results in more realistic rendering, whether you like it or not.

Maybe you should go back to finding one ATI card that has lower power consumption than a nVidia card and pointing out how "we can't say ALL, because here's one ATI card that uses less power, and that's IMPORTANT!" :roll:

You must be in the mood to act 12 again?

Yes, you can run games at higher settings than he can, but the difference is far from "more realistic than you can imagine", like you claimed. That's just silly talk, and you know it. And you sure can't run them at those settings with the touted features you like to talk about; they are not very playable, and you can't always even use AA with them.

And I didn't say anything about the power consumption, except that there are several tests that all have different results.

Here are a few:
http://www.spodesabode.com/content/article/6800upower
http://graphics.tomshardware.com/graphic/20040414/geforce_6800-19.html
http://www.xbitlabs.com/articles/video/display/ati-vs-nv-power_10.html
http://www.xbitlabs.com/articles/video/display/ati-vs-nv-power_2.html

As you can see, everyone has a different result, and they don't all use the same testing method. Who's to say who is right?

Originally posted by: keysplayr2003

I wonder if Ackmed can point us to a post of his that gives Nvidia its props. That would be interesting.

I have done it several times. I have said that I am very glad NV made an SM3.0 card, or else we wouldn't have any games that use it. I just don't think today's cards are fast enough to use those features with playable frames.

I have also said several times that, to me, SLI is one of the two biggest things to hit PC gaming.

What is funny is that you claim I deter people away from NV cards, which is not true. What I do is point out (since Rollo never does) that these new features he always talks about tank your frames and don't always allow AA. He never says anything bad about these features, so who is deterring whom?

Then maybe you can share with the rest of us what exactly your issues are? That would be nice. It would allow us to better understand you. Maybe you wouldn't get flamed so much. Are you from a country with very different social customs than here (the US)?
I have no idea.

 