Geforce 7800GTX Confirmed Specs + Picture


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
The problem with those power consumption results, is there is not ONE other that backs it up. There are several of these articles, from different hardware sites. They ALL have different results, and not just by a little.

Sigh - here's your "superior ATI low power engineering" at work:

http://www20.graphics.tomshardware.com/graphic/20040414/geforce_6800-19.html
Gee, a 9800XT sucking more juice than 3 generations of nVidia's best! The one before it, the one after, and the comparable. Go figure.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: Rollo
Originally posted by: Ackmed
The problem with those power consumption results, is there is not ONE other that backs it up. There are several of these articles, from different hardware sites. They ALL have different results, and not just by a little.

Sigh - here's your "superior ATI low power engineering" at work:

http://www20.graphics.tomshardware.com/graphic/20040414/geforce_6800-19.html
Gee, a 9800XT sucking more juice than 3 generations of nVidia's best! The one before it, the one after, and the comparable. Go figure.

Please show me where I said "superior ATI low power engineering", or else don't misquote me, thanks.

If you look at X-bit, it shows the 9800XT well under the 6800U and the 5950U. Now who's right?
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Rollo
Originally posted by: Ackmed
The problem with those power consumption results, is there is not ONE other that backs it up. There are several of these articles, from different hardware sites. They ALL have different results, and not just by a little.

Sigh - here's your "superior ATI low power engineering" at work:

http://www20.graphics.tomshardware.com/graphic/20040414/geforce_6800-19.html
Gee, a 9800XT sucking more juice than 3 generations of nVidia's best! The one before it, the one after, and the comparable. Go figure.



One thing I would like to point out is how funny I think it is that Nvidia always catches hell for releasing power specifications.

NV's power specs are always higher than ATi's, but ATi's cards are obviously sucking more juice. What I want to know is what does this mean: Is NV being overly cautious, or is ATi being overly risky?

The main reason I stick with NV is because of the "It just works" principle - I've never had a single compatibility or driver problem with them - but the other list of features they're sporting right now doesn't hurt either.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: keysplayr2003

Then maybe you can share with the rest of us, what exactly your issues are? That would be nice. It would allow us to better understand you. Maybe you wouldn't get flamed so much. Are you from a country that has very different social cultures than here? (US).
I have no idea.

I don't know if I have issues myself, but some things I do take issue with are people misleading, and plain lying, to further their agenda.

Take for example certain people lying about how ATi's cards have a 2003 feature set, over and over again. This is not true; the X800 series can do more than the 9800 series.

An example of misleading, time and time again, is certain people touting that the current NV cards can do extra features in a select few games when trying to persuade people looking for help on which card to buy. Never once telling them that their frames tank to unplayable levels most of the time, and that they usually lose AA in the game. Making a claim that you will be able to play next-gen games with these new features enabled. Why should we think they can play next-gen games with SM3 if they can't do it in current games very well? Especially on the lower cards? Showing only the good, and not the bad, is misleading, and doesn't help the person looking for help.

1. I think the 6800 series rocks. I traded my X800 Pro for a 6800GT because I thought it was faster. I think until the X800XL came out, NV had the best lineup, especially since the X800XT/PE was hard to get anywhere near MSRP most of the time. Despite my opinion that the XT/PE is the fastest card, ATi didn't have anything to compete with the 6800GT, and still doesn't for the 6600GT, in my opinion. At $200 I wouldn't buy anything but a 6600GT, currently.

2. I am glad NV put SM3 into their cards, or else we wouldn't even have the few games that support it, even in its rudimentary form. Hopefully next-gen cards do it much faster, and with better support.

3. I am glad NV gave us SLI. Despite its several early problems, when working right it's the fastest out there. Most of the problems are all worked out now, and the prices for cards are where they should be. If I were going to buy a new setup right now, it would be two 6800GTs in SLI. That was my plan to start with; I have an SLI-capable mobo and PSU. However, the prices were $500+ per card at the time, with too many problems that I didn't want to put up with. As I said, most are ironed out, prices are down, and it's attractive to me.

4. I think you can make a very good case for the 6800GT or the X800XL, and for the 6800U or the XT or XT/PE, as they are all pretty close in speed, save for a few games, with pricing being the biggest factor in the decision for me. Claiming there is no good reason to buy any ATi card right now is just ignorance, plain and simple. Do you agree, or disagree?

I could go on, but I don't feel like it. Do I favor ATi? Yes I do. Everyone has a bias, small or big; nobody can say they like both the same. Even if it's 51% to 49%, one is your favorite. It just annoys me when certain people go out of their way to lie and mislead to further their agenda. In the end, I hope the new cards are even closer than they were this gen, because that only helps the consumer: it makes both companies try harder, which gives us more frames and features.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: Insomniak




One thing I would like to point out is how funny I think it is that Nvidia always catches hell for releasing power specifications.

NV's power specs are always higher than ATi's, but ATi's cards are obviously sucking more juice. What I want to know is what does this mean: Is NV being overly cautious, or is ATi being overly risky?

How can you say, "obviously"? They all have different numbers.

Here is another I found; http://techreport.com/reviews/2004q4/radeon-x850xt/index.x?pg=12

It shows the X850XT/PE at 289, and the 6800U at 271. A lot different than the others in other tests. It also shows the 6800GT at 257, and the X800 Pro at 193, with the most power being consumed by SLI, at 365. If we can compare SLI numbers to single cards when benchmarking games, why not compare their watts as well?
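The variance is easier to see laid out. A throwaway sketch using just the TechReport load figures quoted above (total system draw in watts; the dictionary labels are mine):

```python
# Total system power draw under load (watts), per TechReport's X850 XT review.
load_watts = {
    "X850 XT PE": 289,
    "6800 Ultra": 271,
    "6800 GT": 257,
    "X800 Pro": 193,
    "6800 GT SLI": 365,
}

# In this test ATI's flagship draws MORE than NV's -- the opposite of what
# some other sites measured, which is the whole point about inconsistency.
flagship_delta = load_watts["X850 XT PE"] - load_watts["6800 Ultra"]
print(flagship_delta)  # 18

# Marginal cost of the second card in a GT SLI setup:
sli_premium = load_watts["6800 GT SLI"] - load_watts["6800 GT"]
print(sli_premium)  # 108
```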

 

zendari

Banned
May 27, 2005
6,558
0
0
Originally posted by: ddogg
Originally posted by: zendari
50% increase is about all I expected.

Summer '02: 9700 Pro came out
Summer '04: 6800/x800 (2x performance of the previous)

It seems a bit early for a new batch of cards, right now we should be getting a "refresh" like the 9800 was to the 9700.

Since it's only been 1 year, I guess you don't get the 2x again.

The competition between the two companies has increased tremendously... it's not like the days of the GeForce 3 and 4, where a 2X increase took at least 2-3 years... we should definitely see close to, or more than, a 2X performance increase.

SM3 was a crappy feature this generation, IMO. I bought a 6800 GT last June because I got a $300 deal on it.

People fuss about SM3, but the truth is that Nvidia had a better offering with the 6800 GT than ATI did at the time anyway.

As for this gen, I'll wait and see ATI's offering; perhaps they can do something closer to a 2x increase.

The pricing of the 7800 will be out of this world though, and it's not like the 6800s are slouching on anything thus far anyway.
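Side note on the doubling cadence in the quote above: a quick back-of-the-envelope, assuming the Summer '02 to Summer '04 2x jump, shows what one year of that trend is worth:

```python
# 2x performance over 2 years implies a compound annual growth rate of:
annual_growth = 2 ** (1 / 2) - 1
print(round(annual_growth * 100, 1))  # 41.4 (percent per year)

# So a ~50% gain after a single year is roughly in line with the historical
# doubling pace, rather than evidence of a slowdown.
```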
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Ackmed
Originally posted by: keysplayr2003

Then maybe you can share with the rest of us, what exactly your issues are? That would be nice. It would allow us to better understand you. Maybe you wouldn't get flamed so much. Are you from a country that has very different social cultures than here? (US).
I have no idea.

I don't know if I have issues myself, but some things I do take issue with are people misleading, and plain lying, to further their agenda.

Take for example certain people lying about how ATi's cards have a 2003 feature set, over and over again. This is not true; the X800 series can do more than the 9800 series.

An example of misleading, time and time again, is certain people touting that the current NV cards can do extra features in a select few games when trying to persuade people looking for help on which card to buy. Never once telling them that their frames tank to unplayable levels most of the time, and that they usually lose AA in the game. Making a claim that you will be able to play next-gen games with these new features enabled. Why should we think they can play next-gen games with SM3 if they can't do it in current games very well? Especially on the lower cards? Showing only the good, and not the bad, is misleading, and doesn't help the person looking for help.

1. I think the 6800 series rocks. I traded my X800 Pro for a 6800GT because I thought it was faster. I think until the X800XL came out, NV had the best lineup, especially since the X800XT/PE was hard to get anywhere near MSRP most of the time. Despite my opinion that the XT/PE is the fastest card, ATi didn't have anything to compete with the 6800GT, and still doesn't for the 6600GT, in my opinion. At $200 I wouldn't buy anything but a 6600GT, currently.

2. I am glad NV put SM3 into their cards, or else we wouldn't even have the few games that support it, even in its rudimentary form. Hopefully next-gen cards do it much faster, and with better support.

3. I am glad NV gave us SLI. Despite its several early problems, when working right it's the fastest out there. Most of the problems are all worked out now, and the prices for cards are where they should be. If I were going to buy a new setup right now, it would be two 6800GTs in SLI. That was my plan to start with; I have an SLI-capable mobo and PSU. However, the prices were $500+ per card at the time, with too many problems that I didn't want to put up with. As I said, most are ironed out, prices are down, and it's attractive to me.

4. I think you can make a very good case for the 6800GT or the X800XL, and for the 6800U or the XT or XT/PE, as they are all pretty close in speed, save for a few games, with pricing being the biggest factor in the decision for me. Claiming there is no good reason to buy any ATi card right now is just ignorance, plain and simple. Do you agree, or disagree?

I could go on, but I don't feel like it. Do I favor ATi? Yes I do. Everyone has a bias, small or big; nobody can say they like both the same. Even if it's 51% to 49%, one is your favorite. It just annoys me when certain people go out of their way to lie and mislead to further their agenda. In the end, I hope the new cards are even closer than they were this gen, because that only helps the consumer: it makes both companies try harder, which gives us more frames and features.

Well said. But you can't go around calling people liars for expressing their opinions.
For example, saying the X800s have a 2003 feature set is pretty accurate to a lot of people. They believe that 3Dc and the like are just software tweaks. And as far as I have heard, an R300 or better can perform 3Dc with the latest/proper drivers.

Please give some examples of things an X800 can do that a 9800 can't.

Agree about the 6600GT being the best $200 buy right now.
Agree the X800XL is the best $300 buy right now for PCI-e.
Not certain that Nvidia features make the card tank. I have not experienced this yet on my 6800GT.

Thanks

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Ackmed
Originally posted by: Insomniak




One thing I would like to point out is how funny I think it is that Nvidia always catches hell for releasing power specifications.

NV's power specs are always higher than ATi's, but ATi's cards are obviously sucking more juice. What I want to know is what does this mean: Is NV being overly cautious, or is ATi being overly risky?

How can you say, "obviously"? They all have different numbers.

Here is another I found; http://techreport.com/reviews/2004q4/radeon-x850xt/index.x?pg=12

It shows the X850XT/PE at 289, and the 6800U at 271. A lot different than the others in other tests. It also shows the 6800GT at 257, and the X800 Pro at 193, with the most power being consumed by SLI, at 365. If we can compare SLI numbers to single cards when benchmarking games, why not compare their watts as well?

Well, the answer to that is pretty simple. People care about framerates more than they care about the power they are consuming. If someone goes SLI, chances are they don't give a flying (expletive) how much juice they are using. They care about playability and framerates. Now, people will say, "Bullsh!t, I wouldn't buy that power-hungry setup if it cost 2 bucks!" You know they are full of it. Absolutely full of it, no matter how much they claim they are monitoring their electric bills. That is a HUGE cop-out. Nobody watches their electric meter spinning as they fire up their PC. But you can be damn sure they scour the internet for fps benches to see who won.

 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: keysplayr2003


Well said. But you can't go around calling people liars for expressing their opinions.
For example, saying the X800s have a 2003 feature set is pretty accurate to a lot of people. They believe that 3Dc and the like are just software tweaks. And as far as I have heard, an R300 or better can perform 3Dc with the latest/proper drivers.

Please give some examples of things an X800 can do that a 9800 can't.

Agree about the 6600GT being the best $200 buy right now.
Agree the X800XL is the best $300 buy right now for PCI-e.
Not certain that Nvidia features make the card tank. I have not experienced this yet on my 6800GT.

Thanks

It's not an opinion. It's not true. Not only does the X series have 3Dc, but also PS2.0b support.

Yes, they tank.

Riddick++: http://www.firingsquad.com/hardware/chronicles_of_riddick_perf_2/
Farcry with HDR: http://www.firingsquad.com/hardware/chronicles_of_riddick_performance/
SC: CT with HDR: http://www.firingsquad.com/hardware/splinter_cell_chaos_theory_1/

As you can easily see, frames tank, hard. And not telling someone that, when touting these features, is misleading.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Originally posted by: keysplayr2003

Then maybe you can share with the rest of us, what exactly your issues are? That would be nice. It would allow us to better understand you. Maybe you wouldn't get flamed so much. Are you from a country that has very different social cultures than here? (US).
I have no idea.

I dont know if I have issues myself, but some things that I do take issue with are people misleading, and plain lying to further their agenda.
I never lie Ackmed.

Take for example certain people lying about how ATi's cards have a 2003 feature set, over and over again. This is not true; the X800 series can do more than the 9800 series.
Very little? An unused compression routine? A DX9 spec unused even by MS - DX9b? Woot-
http://www.anandtech.com/video/showdoc.aspx?i=2044&p=3
Looking inside an individual vertex pipeline, not much has changed from R3xx. The vertex pipeline is laid out exactly the same, including a 128bit vector math unit, and a 32bit scalar math unit. The major upgrade R420 has had from R3xx is that it is now able to compute a SINCOS instruction in one clock cycle
R420's feature set support can be described as an extended version of Shader Model 2.0, offering a few more features above and beyond the R3xx line (including more support of longer shader programs, and more registers).
Wow- and we only had to wait a year and a half for these leaps forward! :roll:


An example of misleading, time and time again, is certain people touting that the current NV cards can do extra features in a select few games when trying to persuade people looking for help on which card to buy. Never once telling them that their frames tank to unplayable levels most of the time, and that they usually lose AA in the game.
Not everyone uses AA. It's better to be able to check out the features than not. New features usually put the hurt on first-gen hardware. I remember well the "ATI's 9800 Pro only takes a 50% hit running DX9, 5900Us take a 75% hit!" arguments. New features imposing huge performance tradeoffs is nothing new.
That being said, any 6800U nVidia owner should be able to run Riddick with soft shadows or Far Cry with HDR at 10X7 and see what's possible at least.

Making a claim that you will be able to play next-gen games with these new features enabled. Why should we think they can play next-gen games with SM3 if they can't do it in current games very well?
What are you talking about? SM3 only helps performance?

1. I think the 6800 series rocks. I traded my X800 Pro for a 6800GT because I thought it was faster.
It is.
I think until the X800XL came out, NV had the best lineup.
The X800XL looks like a desperation move from ATI to me: it launched at less than the foretold MSRP, and the price plummeted from there.

Especially since the X800XT/PE was hard to get anywhere near MSRP most of the time. Despite my opinion that the XT/PE is the fastest card, ATi didn't have anything to compete with the 6800GT, and still doesn't for the 6600GT, in my opinion. At $200 I wouldn't buy anything but a 6600GT, currently.
Agree.

2. I am glad NV put SM3 into their cards, or else we wouldn't even have the few games that support it, even in its rudimentary form. Hopefully next-gen cards do it much faster, and with better support.
Agree.

3. I am glad NV gave us SLI. Despite its several early problems, when working right it's the fastest out there. Most of the problems are all worked out now, and the prices for cards are where they should be.
Agree with exception of "several" problems.

4. I think you can make a very good case for the 6800GT or the X800XL, and for the 6800U or the XT or XT/PE, as they are all pretty close in speed, save for a few games, with pricing being the biggest factor in the decision for me.
Agree.

Claiming there is no good reason to buy any ATi card right now is just ignorance, plain and simple. Do you agree, or disagree?
Disagree. Because while they're indeed close, ATI is missing too much to disprove the statement.

I could go on, but I don't feel like it. Do I favor ATi? Yes I do. Everyone has a bias, small or big; nobody can say they like both the same. Even if it's 51% to 49%, one is your favorite. It just annoys me when certain people go out of their way to lie and mislead to further their agenda. In the end, I hope the new cards are even closer than they were this gen, because that only helps the consumer: it makes both companies try harder, which gives us more frames and features.
What agenda? If you're talking about me, I'm just discussing one of my hobbies.

 

Regs

Lifer
Aug 9, 2002
16,665
21
81
They're all good cards, but I'm very happy I bought the 6800GT when I did. When I bought it, SM3.0 was honestly the last thing on my mind. Everybody here is full of crap if SM3.0 influenced their purchase over a year ago. SM3.0 didn't become a big deal until its capabilities were introduced 6 months after the cards hit the shelves.

I bought the 6800GT because Nvidia and their vendors actually had them out in large enough quantities. Where was the X800? I don't know, but it wasn't out, and when it did come out it was the same price as the GT. The GT was clearly the better performer to me. So while ATi was playing catch-up, I was enjoying games at 1600x1200 at max quality settings. Nvidia offered me what I wanted when I needed it. That still counts for something, doesn't it? In the long run the 6800GT really showed its value, and I'm very satisfied with my purchase. The XT, PE, and X800XL are great cards, but they came out too late and in too small quantities. The 6800GT was 400 dollars, while you were lucky to get an XT or PE on eBay for less than 700-800 dollars.

You guys are right about the next generation of cards. With the new generation coming out in a few months, who cares? However, it's exactly that line of thought that makes it hypocritical for any ATi owner right now to base a purchasing decision on what's coming out next generation. How long did you have to wait for ATi to push out the X800s and XT/PEs at a competitive price and in actual quantity while Nvidia was pushing out its 6800GTs? Did you actually wait while the 6800GTs were cleaning house? You waited all those months, and now it seems the next generation is already here. What a waste. SM3.0 should have been the last thing on your mind.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Originally posted by: keysplayr2003


Well said. But you can't go around calling people liars for expressing their opinions.
For example, saying the X800s have a 2003 feature set is pretty accurate to a lot of people. They believe that 3Dc and the like are just software tweaks. And as far as I have heard, an R300 or better can perform 3Dc with the latest/proper drivers.

Please give some examples of things an X800 can do that a 9800 can't.

Agree about the 6600GT being the best $200 buy right now.
Agree the X800XL is the best $300 buy right now for PCI-e.
Not certain that Nvidia features make the card tank. I have not experienced this yet on my 6800GT.

Thanks

It's not an opinion. It's not true. Not only does the X series have 3Dc, but also PS2.0b support.

Yes, they tank.

Riddick++: http://www.firingsquad.com/hardware/chronicles_of_riddick_perf_2/
Farcry with HDR: http://www.firingsquad.com/hardware/chronicles_of_riddick_performance/
SC: CT with HDR: http://www.firingsquad.com/hardware/splinter_cell_chaos_theory_1/

As you can easily see, frames tank, hard. And not telling someone that, when touting these features, is misleading.


The point is that you can run HDR at 10X7 or 12X10, get your 30-40fps, and at least see what it looks like.
 

Steelski

Senior member
Feb 16, 2005
700
0
0
This is the biggest piece-of-"!£!£" thread I have seen for a while.


I like the fact that all the Nvidia boosters are raving on about SM3.0... It's very funny.

Yes, the ATI X series does not support SM3.0.
Big deal. The fact of the matter is that what SM3.0 offers is only slight optimization, not real innovation... to this point, that is.
ATI were right when they said there would be no real difference in performance or quality so far. Granted, there are some things that look nice, like HDR in Far Cry, but check out the performance hit. I don't know of any others with such a large difference in visuals.
The point is that you have a real hard time running Chronicles of Riddick with full features. HA HA... Too bad.
Accept it: this generation really was not cut out for SM3.0, as much as you had hoped.
I have no doubt that the G70 will be far more optimized for it than the last generation... That would just be too cool for words, seeing one card demolish your SLI performance lead in really complex SM3.0.

We all know that will happen.
There is no way the next gen is not twice as fast. The R520 looks to be a 32-pipe solution with faster pipes and as rich a feature list as the G70.
As the R500 for the Xbox 2 has been compared to a 32-pipe conventional part, it would seem to be a sure thing.
The G70 will also probably be 32 pipes, because the PS3 solution has been compared to SLI'd Ultras.
I don't know how anyone can bash AMR, as it is not even out and looks to be very promising, with 4 ways of rendering,
one of which is likely to really increase realism: the ultra multisample anti-aliasing modes that go to 12x and 14x AA.
Even though there will be master cards, that is still more choice than SLI ever gives.
TWO CARDS EXACTLY THE SAME for SLI, different cards for AMR. If that is true, and Nvidia is stopping production of the 6800 series, then where is the futureproofing in that? Basically this means that EVERY PCI Express card that ATI has made is AMR-ready with a master card.
What I really don't get is how some people bought two 6600GT/Ultra cards to use in SLI.
If there is anyone here that has done that, WHAT A GEEK you really are... similar performance to single top-end cards, yet crappy performance with AA and anisotropic filtering, not to mention the cost of a nice PSU to power it. Correct me if I am wrong.
I am not dissing its future potential use, but why now?


By the way, I have a 9800 Pro and am also very happy with the performance in any game... including Doom 3, which was really crap (content-wise).
HL2 is a non-issue, as I was even using AA with hardly any hit.
Doom 3 is the biggest game to take advantage of SLI, right? And it does not look any better on a higher-end rig, just occasionally slightly smoother.

Yes, Nvidia has had better technology in the past, but the two are equally innovative now. The free anti-aliasing on the Xbox 2, with 10 MB of embedded RAM?
The unified shader architecture.
3Dc, not useful yet, but it will be very useful in future hardware. (I know what you're gonna think: SM3.0 is futureproofing the 6800s, and you would be right to a certain extent, but again that was not Nvidia but Microsoft.)
Temporal anti-aliasing is decent.

I am bantering on about something I can't remember anymore.
Ah yes.
My 9800 Pro has been good enough for two generations after it.
In the future, the most noticeable thing holding my card back will be its 128MB of RAM. The X800 XL 512 has proven there is a case to be made, with much higher minimum frame rates in almost everything, especially RTS games. I even noticed a change going from a 9800 XT to a 9800 Pro, a 128MB difference, in Far Cry, but not enough to justify the greater cost.
I think the biggest problem in the industry is the LCD panel. It's a fundamentally flawed design:
low image quality for a high price.
Its replacement, the OLED, is a far better design.

Anyway, back to cards.
ATI and Nvidia are good companies with great products, if you can decipher which is good and which is bad.

Let me make a list of good and bad products that are currently available at their respective price ranges.

Dual-6600 single boards: why not just spend slightly more and get a decent single-GPU card?
Dual 6800: will probably rule, but will ultimately be out too late, as the next-gen cards arrive.
SLI 6800 GT: great if you can put up with the limitations and the extra cost.
SLI 6800 Ultra: not much better than the GT setup; no reason to go for this instead unless you are a very high earner or a professional of some sort.
6800 Ultra: bad for price, not the best for performance.
X850 XT: bad price, slightly better performance.
X850 XT PE: really bad price, only slightly better performance.
X800 XT: reasonable price, as good as the 6800 Ultra.
X800 Pro VIVO: good price, as good as the XT if modded (very likely possible).
6800 GT: good price, reasonable performance.
X800 XL: very good price, performance around the 6800 GT.
6600 (any): very good price, but bad performance compared to the X800 vanilla.
X800 vanilla: very good price, very good performance considering the price.
9800 Pro: good price, slightly worse than the 6600; cannot recommend.
9500 Pro: great price, though its age is slightly starting to show in performance.

FX range of cards: utter pantaloons at DX9; reasonable price for what you get.


By the way, to anyone that says power consumption does not matter:
I bet your computer raises the temp in that room at least a few more degrees than one with a single card would.

Anyway, I am bored now...
Ciao
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: Rollo

1. I never lie Ackmed.

2. Very little? An unused compression routine? A DX9 spec unused even by MS - DX9b? Woot-
http://www.anandtech.com/video/showdoc.aspx?i=2044&p=3

3. That being said, any 6800U nVidia owner should be able to run Riddick with soft shadows or Far Cry with HDR at 10X7 and see what's possible at least.

4. Disagree. Because while they're indeed close, ATI is missing too much to disprove the statement.

1. Saying the X800 cards have the same feature set as a 2003 card is a lie. They are close, but not the same.

2. An unused DX9 spec? That's pretty funny. I seem to remember you making a pretty big argument for the 6800s with the Far Cry patch, when it gave them an increase in speed with PS3.0. Guess what: the X800s got the same boost with 2.0b. So why doesn't it matter for ATi, but it does for NV?

3. And what about people with LCDs? We can't just lower our settings without distortion. There are A LOT of gamers with LCDs whose native res is well above 1024x768. Here is what I think you would say if you were to argue for it: "I don't think using a 2000 resolution with a 2005 card is how it was meant to be played." Even at that low res, the jaggies would DRIVE ME CRAZY. Yes, it is nice to at least see it. I was pretty excited to try HDR in Far Cry. However, that was soon gone, since my frames were in the teens, and I didn't care how it looked all of the time. I used this "feature" all of about 30 mins in the 6 months I had my 6800GT.

4. It doesn't matter if you disagree or not; your opinion is not everyone else's. Many people find the XL's lower price, and virtually the same performance as the 6800GT, a better choice. Saying there is no reason to get an ATi card is plain ignorant.

edit: numbering mine.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
1. Saying the X800 cards have the same feature set as a 2003 card is a lie. They are close, but not the same.
It's technically a "lie" but not by much. The differences aren't worth noting as they have no official support in games, and it's unlikely developers will start retro-coding for them.

2. An unused DX9 spec? That's pretty funny. I seem to remember you making a pretty big argument for the 6800s with the Far Cry patch, when it gave them an increase in speed with PS3.0. Guess what: the X800s got the same boost with 2.0b. So why doesn't it matter for ATi, but it does for NV?
Because 2b has no official support in any game AFAIK.

3. And what about people with LCDs? We can't just lower our settings without distortion. There are A LOT of gamers with LCDs whose native res is well above 1024x768. Here is what I think you would say if you were to argue for it: "I don't think using a 2000 resolution with a 2005 card is how it was meant to be played." Even at that low res, the jaggies would DRIVE ME CRAZY. Yes, it is nice to at least see it. I was pretty excited to try HDR in Far Cry. However, that was soon gone, since my frames were in the teens, and I didn't care how it looked all of the time. I used this "feature" all of about 30 mins in the 6 months I had my 6800GT.
I've been playing Riddick with Soft Shadows, I've played Far Cry with HDR. It's cool to check out, not preferred way to run the game yet, to be sure.

4. It doesn't matter if you disagree or not; your opinion is not everyone else's. Many people find the XL's lower price, and virtually the same performance as the 6800GT, a better choice. Saying there is no reason to get an ATi card is plain ignorant.

edit: numbering mine.

I can think of one instance where it MIGHT make sense to buy an ATI card now: everyone's favorite freebie budget card, the PCIe X800XL.

 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: Rollo

It's technically a "lie" but not by much. The differences aren't worth noting as they have no official support in games, and it's unlikely developers will start retro-coding for them.

Because 2b has no official support in any game AFAIK.

Far Cry's HDR is not officially supported either; you have to enable it in the console. Why does it count for you? With 2.0b and 3.0, you don't need to do anything special to turn it "on". How is that not official?


 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
HDR = a lot of techniques. Bloom is one of them, but tone-mapping is the big one that nV can do and ATI can't, ATM.

I guess it makes jaggies more apparent b/c the higher dynamic range allows for greater contrast. The reason you can't use AA with HDR on nV's cards is probably b/c they figured it wouldn't be worth the extra cost in transistors and, thus, yields.

Rollo, Far Cry makes use of PS2b, and the X800XL actually debuted above its $300 MSRP--as do most new cards, whether it's for a few weeks (in the case of the XL) or a few months (in the cases of the initial X800s and 6800s).
 
Jun 14, 2003
10,442
0
0
Originally posted by: FinalFantasy
Originally posted by: otispunkmeyer
Originally posted by: Sentential
430MHz Clock Speed
24 pixel pipelines
8 vertex shaders
Shader Model 3.0
1.4ghz DDR3
256MB of RAM

http://www.tweakers.net/ext/i.dsp/1117060919.jpg


hmmm i dunno, i still think them specs are a little too modest

If these specs are right, the R520 is going to destroy the G70.


there's more to things than just raw numbers

the PS2 is one example
the 5800 Ultra is another

nvidia have had an SM3 part out for a year, so it is reasonable to assume they have the upper hand when it comes to dealing and working with it.

i believe NV had the most efficient gpu this time round, and i believe they'll have it again. if they can nearly double performance while only increasing pipe count by half (ie 8 more pipes), that will say a lot about the efficiency of the gpu they have designed
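That efficiency point can be made concrete. A rough sketch assuming the leaked 7800GTX numbers above (430MHz, 24 pipes) against the 6800 Ultra's 400MHz and 16 pipes; theoretical fillrate is a crude proxy, and real performance depends on much more than this:

```python
# Theoretical pixel fillrate = core clock (Hz) x pixel pipelines.
g70_fill = 430e6 * 24   # rumored 7800GTX
nv40_fill = 400e6 * 16  # 6800 Ultra

raw_gain = g70_fill / nv40_fill
print(round(raw_gain, 4))  # 1.6125 -- clocks plus pipes alone give ~61% more

# To "nearly double" performance on top of that, per-pipe, per-clock
# throughput would have to improve by roughly the remaining factor:
needed_efficiency = 2.0 / raw_gain
print(round(needed_efficiency, 2))  # 1.24
```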
 

nRollo

Banned
Jan 11, 2002
10,460
0
0