G92 9800GTX and GTS


coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: keysplayr2003
Sorry Coldpower, I hit the edit button instead of the quote button. Your post was not altered.

what I wanted to say was, isn't that just a CPU limitation or no?

No, because the game I am referring to is F.E.A.R., where 144 FPS is possible at lower resolutions, so the game is completely GPU bound.

http://anandtech.com/video/showdoc.aspx?i=2870&p=22

Oblivion is the exception rather than the norm; I don't see anything close to 3x anywhere else. It is generally 1.6x-2.0x compared to the 7900 GTX overall.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Originally posted by: coldpower27
Originally posted by: keysplayr2003
Sorry Coldpower, I hit the edit button instead of the quote button. Your post was not altered.

what I wanted to say was, isn't that just a CPU limitation or no?

No, because the game I am referring to is F.E.A.R., where 144 FPS is possible at lower resolutions, so the game is completely GPU bound.

http://anandtech.com/video/showdoc.aspx?i=2870&p=22

Oblivion is the exception rather than the norm; I don't see anything close to 3x anywhere else. It is generally 1.6x-2.0x compared to the 7900 GTX overall.

Well, taking into account minimum fps numbers and other details, I'd say the 8800 is truly double the performance of the 7900 taken as an average, so long as your CPU doesn't suck. Maybe a little more or less in certain situations, but I think 8800 = 2x 7900 is a fair assessment.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Man, you guys must have a broken BS meter; this one is off the charts.

The only thing that might be accurate is the transition to 65nm.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: coldpower27
Originally posted by: keysplayr2003
Sorry Coldpower, I hit the edit button instead of the quote button. Your post was not altered.

what I wanted to say was, isn't that just a CPU limitation or no?

No, because the game I am referring to is F.E.A.R., where 144 FPS is possible at lower resolutions, so the game is completely GPU bound.

http://anandtech.com/video/showdoc.aspx?i=2870&p=22

Oblivion is the exception rather than the norm; I don't see anything close to 3x anywhere else. It is generally 1.6x-2.0x compared to the 7900 GTX overall.



If you check out Anandtech's 8800 Ultra review, in which the GTX is compared to the X1950 XTX (which most of us can agree, I suspect, is faster than the 7900 GTX), the GTX more than doubles the performance of the X1950 XTX at all resolutions. Sometimes it's a tad less, but generally it's more than 2x.

To be truly accurate, the 8800 GTX is on average 2x as fast, sometimes less, sometimes more.
 

f4phantom2500

Platinum Member
Dec 3, 2006
2,284
1
0
Originally posted by: Extelleron
Originally posted by: manimal
Maybe they intended all along to let the 8800 series to be the midrange ports. Would sort of explain why the 8600 line suck so bad for gaming. I hope they offer the 8800s for cheap since it would offer alot of people a nice upgrade since 400 bucks is NOT midrange.

The 8800GTS 320MB is midrange, at around the $250-300 price point. Prices for cards range from about $60 to $600, and the GTS 320 is right in the middle of that. It is the "upper" range of midrange, but it is definitely midrange.

As for the 8800 becoming a budget-midrange card... on a 90nm process as it is now, not a hope of that becoming true. The G80 die is estimated to be 480mm^2 in size, which is absolutely huge and it would simply not be possible to offer a chip that size at any price lower than where the GTS 320 is.

On a 65nm process, it is possible, but unlikely. 240mm^2 is still pretty big for a low-midrange card in the $100-200 price point. I can't see nVidia offering a card with a bigger die size than a 7900GTX at that low of a price... it's simply not economical, especially since nVidia can easily make a chip based on the (surely) more efficient G90 architecture, with higher clockspeeds but less shaders, and make a chip just as powerful as the 8800's, but at a much lower cost. nVidia has always made small, cheap cards for the 6600-7600-8600 series, and I can't see why a "9600" series would be any different.

I agree 100%. I think the 8800 cards will drop in price, but they won't really be economical compared to the new cards. Eventually they'll stop selling and the leftovers will just be ridiculously expensive because the retailers don't want to sell it for that much cheaper than nVidia sold it to them.
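The die-size arithmetic in the quoted post can be sanity-checked with a back-of-the-envelope shrink calculation. This is only an idealized sketch: it assumes a perfect linear shrink from 90nm to 65nm, which real processes never quite deliver, and the 480mm^2 figure is the estimate from the post above.

```python
# Idealized die-area scaling for a 90nm -> 65nm shrink.
# Die area scales with the square of the linear feature size.
g80_area_90nm = 480.0  # mm^2, the G80 estimate quoted above

shrink_factor = (65.0 / 90.0) ** 2  # ~0.52
g80_area_65nm = g80_area_90nm * shrink_factor

print(f"Ideal 65nm die area: {g80_area_65nm:.0f} mm^2")  # ~250 mm^2
```

That lands close to the ~240mm^2 figure in the post; the small gap is exactly the "real shrinks recover less than ideal" caveat.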
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Acanthus
Man, you guys must have a broken BS meter; this one is off the charts.

The only thing that might be accurate is the transition to 65nm.

And double precision. This has already been confirmed (see the CUDA documentation).
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
It's a fake spec!
eDRAM die for "FREE 4xAA"? Yeah right... Go look up why ATI never put this feature on the 2900 XT.
It's a fake spec!

This is basically a spec assembled from all the good features of current products!

The dude took bits of the spec from the new Tesla server, the 2900 XT, the Xbox 360, and Fusion.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0


From the Xtremesystems G92 thread. Supposedly the guy who posted the specs you see in the OP posted the following a full month before the HD 2900 XT release:



"The specs were "leaked" by an Erik Larsson; the same name appears on a comment on a TGDaily news piece back on April 20th (AMD's R600 event – We're packing our bags for Tunisia):

Quote:
AMD's ATI R600 will NOT be any better than GeForce 8800 GTX - NO NEED TO WAIT
Apr 22, 2007 13:12

My cusin is an nVIDIA engineer (and I'll be too) and he told me that nVIDIA got hold of an R600 sample.they tested it,benchmarked it and played with it and found that R600 HD 2900 XT is no better than nVIDIA's GeForce 8800 GTX GPU..Infact the 8800 GTX proved superior to R600 XT in shader intensive apps and pc games due to the G80 better implemintation of the unified shader architecture and also due to the fact the G80 is 100% efficient scalar & streaming design..performance wise, 8800 GTX is 15-20% faster than R600 XT in shader intensive apps and games...so for all ATI fanboys the wait for R600 was for nothing..G80 is much more powerful than R600 from architectural perspective..G80 is scalar vs R600 vector etc...
the only edge R600 XT has over 8800 GTX is its price standing at MSRP of 399 USD...another news is that R600 XTX 1GB will not make it to the public..The retail version of R600,the XT model will have only 512MB GDDR memory (1600DDR) yeilding in at 102.4 GB/s bamdwith..compared to 8800 GTX 768MB GDDR3 memory..so in summary :
GeForce 8800 GTX 768MB vs Radeon HD 2900 XT 512MB is a lost fight for AMD/ATI..

Erik Larsson

_____________"
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The thing is that he's wrong on a couple of points. R600 in fact has higher theoretical shader performance than G80 (much higher, I believe). This is FACT. The reason it underperforms in shader-intensive apps is because a) it is a VLIW architecture, i.e. each individual app needs to be optimised to see the real potential of the R600 (well, there are other things, e.g. the scheduler and such, that can also be optimized for further performance), and b) weak AA performance that makes the R600 fall behind the pack in most games.

It's a rather bold, if not foolish, statement to say which has the better unified architecture. I mean, how can we judge which implementation is better? (Maybe if they created a demo with a scene that uses a LOT of PS, then VS, then GS, etc., making the GPU do much more work to sort out which shaders are being used for that particular scene.)

And the fact that he says he will be an engineer, with that kind of post, makes me think there's some BS involved. I also don't believe nVIDIA has an R600 sample, because a) no drivers, and b) why did they release the 8800 Ultra just before the R600?
 

lopri

Elite Member
Jul 27, 2002
13,212
597
126
I predict this time NV will definitely try to up the prices of ultra-high-end video cards. Since NV is almost a generation ahead of AMD (not talking about technology, but about product cycles), they have arguably the best chance to try it in a much more convincing way. Not like 8800 Ultra vs 8800 GTX, mind you, but more like 7800 GTX vs 7800 GTX 512. Say, they release a 9800 GTX for $600, then a couple of months later a 9800 Ultra which is significantly faster than the 9800 GTX for $800~900. Again, think of 7800 GTX vs 7800 GTX 512, but this time without any threat from AMD. If the performance difference is enough to justify the price, people will buy - and NV will effectively create a new high-end market. (They have done it in the past already with SLI.) If successful, this will bring many positive effects for NV (not just the extra margins). It will further differentiate NV from AMD and Intel in the high-end discrete GPU market (like the *Porsche* of GPUs, as they elegantly put it), and it can help them against the lawsuit regarding price fixing, etc.

It's just one of the possible scenarios without the presence of competition. Only time will tell but I'd be glad if at least we see the performance jump we used to see, regardless of the prices.. Too pessimistic?
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: lopri
I predict this time NV will definitely try to up the prices of ultra-high-end video cards. Since NV is almost a generation ahead of AMD (not talking about technology, but about product cycles), they have arguably the best chance to try it in a much more convincing way. Not like 8800 Ultra vs 8800 GTX, mind you, but more like 7800 GTX vs 7800 GTX 512. Say, they release a 9800 GTX for $600, then a couple of months later a 9800 Ultra which is significantly faster than the 9800 GTX for $800~900. Again, think of 7800 GTX vs 7800 GTX 512, but this time without any threat from AMD. If the performance difference is enough to justify the price, people will buy - and NV will effectively create a new high-end market. (They have done it in the past already with SLI.) If successful, this will bring many positive effects for NV (not just the extra margins). It will further differentiate NV from AMD and Intel in the high-end discrete GPU market (like the *Porsche* of GPUs, as they elegantly put it), and it can help them against the lawsuit regarding price fixing, etc.

It's just one of the possible scenarios without the presence of competition. Only time will tell but I'd be glad if at least we see the performance jump we used to see, regardless of the prices.. Too pessimistic?

Actually, they aren't a generation ahead of AMD! It's that AMD's engineers screwed up with some of the decisions they made on the architecture. The problem is that all the major developers are in the TWIMTBP program, which means games get tweaked for SLI and Nvidia GPUs. Nvidia also sends a couple of engineers to the development team to tweak the code so it uses more texturing instead of being shader heavy. A major problem for ATI is that engines like Unreal Engine 3 rely on texture fill rate for their effects.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Actually, the R600 finally made it to retail about the time Nvidia was due for a refresh, but Nvidia didn't need to bother this time around.
AMD IS a product cycle behind, but not a whole generation.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
* 2x performance of G80? Hmm.. Doubtful.. Not every day is Sunday as history has taught us..
* eDRAM ? We hear it every time since Xenos, who knows maybe this time it will be true.. After all isn't 4xAA a requirement for 10.1 or am I mistaken?
* 512bit/1024? Who knows? As stated, though, nVIDIA's G80 is certainly not bandwidth limited..
For me the most important rumour is the built-in core tessellation unit

Who knows? Let the rumours spread.. It's still too early..
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Frackal
Originally posted by: coldpower27
Originally posted by: keysplayr2003
Sorry Coldpower, I hit the edit button instead of the quote button. Your post was not altered.

what I wanted to say was, isn't that just a CPU limitation or no?

No, because the game I am referring to is F.E.A.R., where 144 FPS is possible at lower resolutions, so the game is completely GPU bound.

http://anandtech.com/video/showdoc.aspx?i=2870&p=22

Oblivion is the exception rather than the norm; I don't see anything close to 3x anywhere else. It is generally 1.6x-2.0x compared to the 7900 GTX overall.



If you check out Anandtech's 8800 Ultra review, in which the GTX is compared to the X1950 XTX (which most of us can agree, I suspect, is faster than the 7900 GTX), the GTX more than doubles the performance of the X1950 XTX at all resolutions. Sometimes it's a tad less, but generally it's more than 2x.

To be truly accurate, the 8800 GTX is on average 2x as fast, sometimes less, sometimes more.

No, that is incorrect. Again, in Battlefield 2, Prey, and Supreme Commander the 8800 Ultra is not double the X1950 XTX, and that is the 8800 Ultra; I am talking about the 8800 GTX. You cannot extrapolate performance like that; you need the actual card in the test. You start introducing errors that way.

http://www.techarp.com/showart....aspx?artno=403&pgno=7

1.6x-2.0x on average remains accurate, and I am only taking 16x12 and 19x12 results with 4xAA in the comparison of the 7900 GTX to the 8800 GTX.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: jim1976
* 2x performance of G80? Hmm.. Doubtful.. Not every day is Sunday as history has taught us..
Who knows? Let the rumours spread.. It's still too early..
Well, if we believe Nvidia, G92 will be a more-than-1-teraflop monster. Wasn't G80 two times faster than G71?
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Rusin
Originally posted by: jim1976
* 2x performance of G80? Hmm.. Doubtful.. Not every day is Sunday as history has taught us..
Who knows? Let the rumours spread.. It's still too early..
Well, if we believe Nvidia, G92 will be a more-than-1-teraflop monster. Wasn't G80 two times faster than G71?

Theoretical outputs are far from real gaming performance, as we have seen many, many times.. There are many things in a GPU architecture that can go wrong and that shape the final picture and performance.. Granularity ftw..
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: coldpower27
Originally posted by: Frackal
Originally posted by: coldpower27
Originally posted by: keysplayr2003
Sorry Coldpower, I hit the edit button instead of the quote button. Your post was not altered.

what I wanted to say was, isn't that just a CPU limitation or no?

No, because the game I am referring to is F.E.A.R., where 144 FPS is possible at lower resolutions, so the game is completely GPU bound.

http://anandtech.com/video/showdoc.aspx?i=2870&p=22

Oblivion is the exception rather than the norm; I don't see anything close to 3x anywhere else. It is generally 1.6x-2.0x compared to the 7900 GTX overall.



If you check out Anandtech's 8800 Ultra review, in which the GTX is compared to the X1950 XTX (which most of us can agree, I suspect, is faster than the 7900 GTX), the GTX more than doubles the performance of the X1950 XTX at all resolutions. Sometimes it's a tad less, but generally it's more than 2x.

To be truly accurate, the 8800 GTX is on average 2x as fast, sometimes less, sometimes more.

No, that is incorrect. Again, in Battlefield 2, Prey, and Supreme Commander the 8800 Ultra is not double the X1950 XTX, and that is the 8800 Ultra; I am talking about the 8800 GTX. You cannot extrapolate performance like that; you need the actual card in the test. You start introducing errors that way.

http://www.techarp.com/showart....aspx?artno=403&pgno=7

1.6x-2.0x on average remains accurate, and I am only taking 16x12 and 19x12 results with 4xAA in the comparison of the 7900 GTX to the 8800 GTX.



No, you are still incorrect even with those somewhat arbitrary resolution and AA limitations. (I don't know that site you linked to.)


According to anand's latest review, with the latest drivers:

http://anandtech.com/video/showdoc.aspx?i=2988&p=24


The 8800GTX/Ultra vs the X1950 XTX at 1600x1200 and 1920x1200 is:

BF2 - Irrelevant, clear CPU limitation since it gets 144fps in both resolutions, though at 2560x1600 4xaa the Ultra is around 190% the performance of the X1950xtx


Oblivion - It is more than 200% above the x1950xtx (2x+ perf)


Prey - Ultra is ~160% perf of x1950xtx


Rainbow Six Vegas - Ultra is more than 200% perf of x1950xtx (2x+)


Stalker - Ultra is between 2x and greater than 2x as fast as X1950xtx


Supreme Commander - Ultra is 180% the speed of the X1950 XTX (1.8x as fast).


So to summarize:


BF2 - 1.9x as fast
Oblivion - More than 2x as fast
Prey - 1.6x as fast (the only title where it actually is just 1.6x)
Vegas - More than 2x as fast
Stalker - 2x as fast or more
Supreme Commander - 1.8x as fast (and some concern of CPU limitation here too)


To say 1.6-2.0x as fast isn't really representative. It's generally 2x as fast, sometimes a tad less, sometimes a tad more.
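One way to boil the per-game list above down to a single figure is a geometric mean of the ratios, the conventional average for speedups. This is just a sketch: the "more than 2x" entries are conservatively capped at 2.0x, so the true mean is a bit higher.

```python
import statistics

# Per-game 8800 Ultra vs X1950 XTX speedups from the summary above;
# "more than 2x" entries are capped at 2.0, so this understates the mean.
speedups = {
    "BF2": 1.9,
    "Oblivion": 2.0,
    "Prey": 1.6,
    "Vegas": 2.0,
    "Stalker": 2.0,
    "Supreme Commander": 1.8,
}

geo_mean = statistics.geometric_mean(speedups.values())
print(f"Geometric mean speedup: ~{geo_mean:.2f}x")  # ~1.88x
```

Even with the conservative caps, the mean comes out just under 2x, which matches the "generally 2x, sometimes a tad less" reading.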




 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
The only way I can see them gaining this kind of increase is the massive die shrink that is likely to coincide with it.

Keeping the same architecture, tossing on some logic, and literally doubling the number of everything....

I'm expecting G92 to be nothing more than a refresh with a die shrink and some extra logic; with the amount of R&D behind G80, I seriously doubt they will abandon the architecture quickly.

What was the figure? 5 years and $480M? I remember hearing about NV50 (which later became G80) all the way back when the GeForce 2 GTS was out.
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
Originally posted by: Acanthus
I'm expecting G92 to be nothing more than a refresh with a die shrink and some extra logic; with the amount of R&D behind G80, I seriously doubt they will abandon the architecture quickly.

Of course they're not dumping the architecture, why should they? G80 is a great chip. It'll be like the move to G70, which isn't necessarily a bad thing.

 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
People have got to remember one thing when considering claims like "2x the performance!". These info leaks are usually from the marketing department and are controlled and artificially made to look like clandestine leaks. So "2x the performance!" in marketing speak really means "Up to 2x the performance in a few handpicked games that show the product in its best light." The real performance gain is probably ~1.6x on average or something like that.

A new architecture generation like the G80 probably has a lot of room for clockspeed gains from circuit optimization. This, plus the clockspeed gains from going to 65nm, plus the increase in functional units afforded by 65nm, could reasonably net the G92 a big performance advance over the G80.
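Zephyrprime's compounding argument can be made concrete with a toy multiplicative model. Every factor below is a made-up illustrative assumption, not a leaked figure; the point is only that several modest gains multiply into a large one.

```python
# Toy model: independent gains from a new process multiply together.
# All three factors are illustrative assumptions, not real G92 numbers.
clock_gain_from_tuning = 1.15  # circuit optimization on a maturing design
clock_gain_from_65nm = 1.20    # clock headroom from the 90nm -> 65nm shrink
extra_units_from_65nm = 1.50   # more functional units in the same die budget

total_gain = clock_gain_from_tuning * clock_gain_from_65nm * extra_units_from_65nm
print(f"Aggregate gain under these assumptions: ~{total_gain:.2f}x")  # ~2.07x
```

Under these made-up factors, three modest improvements already compound to roughly 2x, which is why a "2x G80" refresh is at least arithmetically plausible.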
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: coldpower27

1.6x-2.0x on average remains accurate, and I am only taking 16x12 and 19x12 results with 4xAA in the comparison of the 7900 GTX to the 8800 GTX.

In these games:

Anno 1701
Call of Duty 2
Call of Juarez
Company of Heroes
Doom 3
F.E.A.R.
Gothic 3
HL2: Lost Coast
Oblivion
Prey
Rainbow Six Vegas
The Chronicles of Riddick
Serious Sam 2
Splinter Cell 3
Splinter Cell 4
Stalker
Tomb Raider: Legend
Direct3D-10-Benchmarks
Call of Juarez D3D10

8800GTX vs. 7900GTX

1280x1024 4AA/16AF = +246%
1600x1200 4AA/16AF = +254%
2560x1600 4AA/16AF = +299%

The 8800GTX is AT LEAST 2x faster than the 7900GTX; it's the 8800GTS 320MB that is more like 1.6x faster.
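A caveat on reading the percentage figures above: "+246%" is ambiguous between "performs at 246% of the baseline" (2.46x) and "246% faster than the baseline" (3.46x). A tiny helper makes the two readings explicit; either way, every figure quoted clears 2x, which is all the conclusion needs.

```python
def as_relative(percent: float) -> float:
    """Reading 1: '246%' means performance relative to baseline -> 2.46x."""
    return percent / 100.0

def as_gain(percent: float) -> float:
    """Reading 2: '+246%' means a gain on top of baseline -> 3.46x."""
    return 1.0 + percent / 100.0

for pct in (246, 254, 299):  # the 4AA/16AF figures quoted above
    print(f"+{pct}%: relative = {as_relative(pct):.2f}x, gain = {as_gain(pct):.2f}x")
```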
 