*** Unofficial 8800 GTS review thread ***

Page 2 - AnandTech Forums

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: MTDEW
ArchAngel777,
I'd have to disagree. If we had an 8800GTX killer, I'd say it's pretty safe you could purchase a used 8800GTX now for $350-$380 (which is what the new GTS is going for).

But that's all speculation, so we could argue it all day.

My point is: since ATI/AMD aren't "pushing" NVIDIA, we aren't getting much of a performance increase this year.

As all the benchmarks show, we just have a bunch of cards all bunched together with similar performance.

I guess I wasn't disagreeing with the fact that new high-end cards serve to bring us GTX performance at half the price. I was just arguing that the GTX itself does not drop in price. They would introduce a new card on par with the 8800GTX for half the price, but the G80 8800GTX would still be an overpriced pig.

Anyhow, nothing worth arguing about... I am just waiting for the 9800GTX. I can't wait two months to purchase the card, so I broke down and bought a GTS... Then, if all goes well, maybe I will try this 'step-up' program... Otherwise, my dad will get this card and I will sell his 8800GTS 640MB.

I also tend to agree with people here... It seems they should have gone with GDDR4 memory... :-/
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: MTDEW
So for anyone who's buying an 8800GTS and thinks all 8800GT owners think you're crazy....
Don't worry, it's not true...we all want your dual slot cooler! :thumbsup:

The dual slot cooler is the only reason I purchased a GTS over the GT. I even had the Dell 8800GT on order at $207 (who knows when it would have shipped, though) and cancelled it to order a $350 GTS. The GTS should have had GDDR4 in the range of 2400MHz... It looks like they just put a new core on the same PCB; even though that isn't exactly what happened, it sure seems like it.

The real question is... is it safe to overclock the memory on these? Jonnyguru did make a post about the memory controller crapping out on 8800GTs, and even though I am skeptical (still am), I can't discount it totally. Curious that so far no 8800GTS has gone higher than 1940MHz memory... Even the 8800GT hits 1950, and 2000 for the SSC variant...
 

stapuft

Member
Nov 8, 2007
30
0
66
So for anyone who's buying an 8800GTS and thinks all 8800GT owners think you're crazy....
Don't worry, it's not true...we all want your dual slot cooler! :thumbsup:

Cooler envy...


However, for many of us, an 8800GTS is NOT $30-50 more; it's more like $80-120 on top of what we paid (at launch the 8800GT was $229-240, and the deals since then have been lower still [BB with coupons: ~$210, Dell: $208]) for a better cooler and slightly higher clocks.

You represent a lucky minority; those prices are an exception to the current market rule. In your situation, the idea of paying these prices for a GTS seems insane. To others, it seems logical to pay $60 more for the better cooler, power delivery, and SPs. I envy you, as I hesitated on GT launch day.


Please. Clock-for-clock, the GTS is the same thing as the GT.

Yes, as the benchies indicate. Still, a comparison of max overclocks would be in order; that will play an important role in this debate. We need a new overclocking thread :evil:
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
I feared there'd be a GT vs. GTS clash. The smartest guys in the room at NVIDIA are laughing at you right now!! Single vs. dual slot coolers, saving $50-75, clock speeds, benchmarks.

Imagine this conversation taking place in a room filled with hot women. How many would be falling over you, Mr. Best Bang for the Buck Guy, or you, Mr. Dual Slot Cooler Guy? :lips:

Paying for last year's performance, is that the new catchphrase? Last year's performance was $500+ last year. Any way you slice it, both G92 offerings are a pretty damn good deal.

Now quit bickering and put those cards to use! Play some games will ya??
 

Nanobaud

Member
Dec 9, 2004
144
0
0
Originally posted by: SteelSix
I feared there'd be a GT vs. GTS clash. The smartest guys in the room at NVIDIA are laughing at you right now!! Single vs. dual slot coolers, saving $50-75, clock speeds, benchmarks.

Imagine this conversation taking place in a room filled with hot women. How many would be falling over you, Mr. Best Bang for the Buck Guy, or you, Mr. Dual Slot Cooler Guy? :lips:

Paying for last year's performance, is that the new catchphrase? Last year's performance was $500+ last year. Any way you slice it, both G92 offerings are a pretty damn good deal.

Now quit bickering and put those cards to use! Play some games will ya??

That'll get 'em swooning for sure....

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I reckon the GTS is the better overclocker. HotHardware managed to get the G92 core (with all 128 SPs enabled, of course) to ~800MHz on the core and 2.2GHz on the memory. Kinda surprised to see some sites hitting 2.2GHz on memory, since that's about 70GB/s of bandwidth, which is a lot, up from 62GB/s at stock.
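Those bandwidth figures check out. A quick sketch of the math, assuming the GTS 512's 256-bit bus and treating the quoted clocks as effective (DDR) rates of 1940MHz stock and 2200MHz overclocked:

```python
# Memory bandwidth = effective memory clock x bus width in bytes.

def bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Bandwidth in GB/s: bytes moved per clock times clocks per second."""
    return effective_clock_mhz * 1e6 * (bus_width_bits // 8) / 1e9

stock = bandwidth_gb_s(1940, 256)  # ~62.1 GB/s, the "62GB/s at stock"
oc = bandwidth_gb_s(2200, 256)     # ~70.4 GB/s, the ~71GB/s quoted above
```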

Here are some things that I think make the GTS worth buying:
-Overclocking headroom, both on the core (800MHz+) and the memory (2200MHz+)
-The dual slot cooler, of course
-More stability at higher OCs thanks to its 3-phase power design
-Since it has all its clusters enabled, i.e. 128 SPs compared to the GT's 112, OCing will have a slightly bigger effect on the GTS than on the GT.

But if you're running it at stock, I don't see much point, because the prices seem pretty gouged right now. An 8800GT plus a Zalman VF1000 seems like the better deal, and cheaper too.


You know what everybody really wants for Christmas?
An 8800GTX v2.

What is this?
A full G92 + higher vGPU + ~700MHz core (or higher) + 1GB of GDDR4 memory clocked at 2400MHz (or higher) + dual slot cooler + dual SLI connectors for tri-SLI + a $499 price tag to replace the old GTX/Ultra.

Ahh, the glories of wishful thinking.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
The new cooler looks interesting but if the card is plagued by availability issues like the GT it won't really help anyone.
 

R3MF

Senior member
Oct 19, 2004
656
0
0
Why don't any of the vendors overclock the memory on the Factory OC versions of the GTS 512?

It's more memory bandwidth that's really needed!
 

AzN

Banned
Nov 26, 2001
4,112
2
0
It still trails the GTX, which was expected. The only constraint on these G92 cards relative to the GTX is memory bandwidth, which can't keep the G92's massive fillrate fed.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Well, I read over a few more reviews, and I think this is the first time I can say that an AT review, and a review by Anand, isn't even worth reading. Really getting tired of reviewers falling into the "stock clock testing" crap and then touting one part as better than the other.

A MUCH better comparison done at FiringSquad (maybe my new favorite bench site) shows:


Firing Squad Review

When looking over this review, pay special attention to:

Leadtek GeForce 8800 GT Extreme (680MHz core/1.0GHz memory)

vs.

XFX GeForce 8800 GTS 512MB XXX (678MHz core/986MHz memory)

Almost no difference at all in performance.......

Those numbers are ever so slight. You won't see any kind of REAL difference even when benchmarked.
 

aldamon

Diamond Member
Aug 2, 2000
3,280
0
76
Originally posted by: R3MF
Why don't any of the vendors overclock the memory on the Factory OC versions of the GTS 512?

It's more memory bandwidth that's really needed!

Because they still have GTXs to sell.
 

stapuft

Member
Nov 8, 2007
30
0
66
Guru3D review.

Not the best review: it compares the 3850 directly to the 8800s in COD4 (no 3870?), then leaves the GT out of Crysis. And it's nice to see people still using BF2 to benchmark. :roll:

The overclocking report just plain sucks; one game? C'mon.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
It still trails the GTX which was expected
Not always; for example there are cases where it's faster even at 2560x1600.

The results are generally very close, which tells me memory bandwidth isn't the main issue, given the GTX has 39% more of it.
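The 39% figure can be sanity-checked, assuming the usual stock clocks (GTX: 384-bit bus at 900MHz GDDR3, 1800MHz effective; GTS 512: 256-bit bus at 970MHz, 1940MHz effective):

```python
# GTX vs. G92 GTS memory bandwidth ratio.

def gb_s(effective_mhz: float, bus_bits: int) -> float:
    """Bandwidth in GB/s from effective clock and bus width."""
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

gtx = gb_s(1800, 384)      # 86.4 GB/s
gts = gb_s(1940, 256)      # ~62.1 GB/s
advantage = gtx / gts - 1  # ~0.39, matching the "39% more" above
```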
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
It still trails the GTX which was expected
Not always; for example there are cases where it's faster even at 2560x1600.

The results are generally very close which tells me memory bandwidth isn't the main issue given the GTX has 39% more of it.

1. The G92 GTS has faster shader clocks with the same number of shaders.
2. More fillrate.
3. Lower memory bandwidth.

Yet the G92 GTS is still slower than a GTX with lower fillrate and slower shaders. I thought shaders were what mattered, like you implied, but in the real world they don't matter so much, do they?

If the G92 had faster memory, you don't think it would beat a GTX, right? You don't have the slightest clue, and I TOLD YOU SO!!!
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Azn
Originally posted by: BFG10K
It still trails the GTX which was expected
Not always; for example there are cases where it's faster even at 2560x1600.

The results are generally very close which tells me memory bandwidth isn't the main issue given the GTX has 39% more of it.

1. G92GTS has faster shader clocks with same number of shaders.
2. More fillrate
3. lower memory bandwidth

Yet G92GTS is still slower than a gtx with lower fillrate and slower shaders. I thought shader is what matters like you applied but in the real world it doesn't matter so much does it?

If the G92 had faster memory you don't think it will beat a GTX right? You don't have the slightest clue and I TOLD YOU SO!!!

1. G92 does show higher gains in shader-intensive games. Crysis is an excellent example, where the 20-30% increase in shader ops/sec does shine on the G92 and tends to give it the edge vs. the G80, especially since pixel fillrate matters less when shader ops/sec are the bigger bottleneck.
2. G92 only has higher texel fillrate; G80 still leads significantly in pixel fillrate at 24x575 (13,800 Mpixels/s) vs. 16x650 (10,400 Mpixels/s).
3. Memory bandwidth *still* shows no significant gains in performance on a G80 or G92. Honestly, this is the part that you don't seem to understand. If you had a G80 or G92 you'd see this is the case, not to mention it would be common sense for NV to simply use GDDR4 if they (or anyone) saw any benefit from increased memory bandwidth with this generation of parts.

It's pretty obvious NV made the move to G92 for cost-cutting purposes and designed G92 as such. They cut ROPs and memory controllers, as those are among the most transistor-expensive parts of a GPU. With the smaller process they were able to increase performance in other areas, like increasing Texture Mapping Units to match Texture Address Units and raising core/shader clocks.

If there were a need for more bandwidth, as you seem to think, they would've simply used faster RAM: either the faster GDDR3 found on the Ultra, or GDDR4. There is of course the possibility of a faster G92-based part with faster RAM, but I think NV knows what everyone else who has a G80/G92 knows: increasing memory speed alone yields little to no gain in performance.
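The pixel-fillrate comparison in point 2 is just ROPs times core clock (in Mpixels/s, not Gflops). A minimal sketch using the post's own figures (24 ROPs at 575MHz for the GTX, 16 at 650MHz for the G92 GTS):

```python
# Pixel fillrate = ROPs x core clock, in Mpixels/s.

def pixel_fillrate_mps(rops: int, core_mhz: int) -> int:
    """Pixels the render back-ends can write per second, in millions."""
    return rops * core_mhz

gtx = pixel_fillrate_mps(24, 575)  # 13,800 Mpixels/s
gts = pixel_fillrate_mps(16, 650)  # 10,400 Mpixels/s
# The GTX keeps a ~33% pixel-fillrate lead despite the GTS's higher core clock.
```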
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
Yet G92GTS is still slower than a gtx with lower fillrate and slower shaders.
Again it is faster in some situations:

In FEAR it's faster at 1280x1024, 1600x1200, 1600x1200 8xAA/16xAF (faster than the Ultra there too) and at 2560x1600.

It also ties with the GTX at 2560x1600 8xAA/16xAF (you can't get much more memory bandwidth limited than that) and it's pretty close the rest of the time.

If your texturing + memory bandwidth theories were correct the GTS should be absolutely blowing away the GTX at 1280x1024 with no AA since memory bandwidth isn't an issue there and Fear doesn't use HDR rendering.

Yet that isn't happening.

I thought shader is what matters like you applied but in the real world it doesn't matter so much does it?
It does matter, which is why the GTS is so close to the GTX despite having fewer ROPs, less texturing power and less memory bandwidth.

If memory bandwidth mattered the GTX would be 39% faster pretty much across the board but that isn't the case. The fact is the GTS wins in some cases and is extremely close in most others.

Again I never said memory bandwidth never mattered, just that it's not as important as shader performance.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
Yet G92GTS is still slower than a gtx with lower fillrate and slower shaders.
Again it is faster in some situations:

In FEAR it's faster at 1280x1024, 1600x1200, 1600x1200 8xAA/16xAF (faster than the Ultra there too) and at 2560x1600.

It also ties with the GTX at 2560x1600 8xAA/16xAF (you can't get much more memory bandwidth limited than that) and it's pretty close the rest of the time.

If your texturing + memory bandwidth theories were correct the GTS should be absolutely blowing away the GTX at 1280x1024 with no AA since memory bandwidth isn't an issue there and Fear doesn't use HDR rendering.

Yet that isn't happening.

I thought shader is what matters like you applied but in the real world it doesn't matter so much does it?
It does matter which is why the GTS is so close to the GTX despite having less ROPs, texturing and memory bandwidth.

If memory bandwidth mattered the GTX would be 39% faster pretty much across the board but that isn't the case. The fact is the GTS wins in some cases and is extremely close in most others.

Again I never said memory bandwidth never mattered, just that it's not as important as shader performance.

Don't you think that has something to do with its massive fillrate advantage over the GTX as well? Raw fillrate gives you better raw performance, especially when memory bandwidth isn't constrained by AA, post-processing, etc. FEAR isn't really a shader-intensive game like modern titles such as Crysis or Unreal Tournament; even a 7900GTX does really well in it, and as you know, the whole 7-series had weak shaders.

Shaders are important, but not as important as fillrate. Memory acts as a carrier for the GPU; it determines how fast that information gets relayed. That is why the 8800GTX wins, not because the 8800GTX is a more powerful GPU. The G92 has more power, but it is being held back, much like the 8600GT is held back from its full potential.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Originally posted by: Azn
Originally posted by: BFG10K
It still trails the GTX which was expected
Not always; for example there are cases where it's faster even at 2560x1600.

The results are generally very close which tells me memory bandwidth isn't the main issue given the GTX has 39% more of it.

1. G92GTS has faster shader clocks with same number of shaders.
2. More fillrate
3. lower memory bandwidth

Yet G92GTS is still slower than a gtx with lower fillrate and slower shaders. I thought shader is what matters like you applied but in the real world it doesn't matter so much does it?

If the G92 had faster memory you don't think it will beat a GTX right? You don't have the slightest clue and I TOLD YOU SO!!!

  • 1. G92 does show higher gains in shader intensive games. Crysis is an excellent example where the 20-30% increase in shader ops/sec do shine on the G92 and tend to give it the edge vs. the G80, especially since pixel fillrate isn't as important since shader ops/sec seem to be more of a bottleneck.
    2. G92 only has higher texel fillrate; G80 still leads significantly in pixel fillrate at 24x575 (13,800 Mpixels/s) vs. 16x650 (10,400 Mpixels/s).
    3. Memory bandwidth *still* shows no significant gains in performance on a G80 or G92. Honestly this is the part that you don't seem to understand. If you had a G80 or G92 you'd see this is the case, not to mention it would be common sense for NV to simply use GDDR4 if they (or anyone) saw any benefit from increased memory bandwidth with this generation parts.

Its pretty obvious NV made the move to G92 for cost-cutting purposes and designed G92 as such. They cut ROPs and memory controllers as those are the most transistor-expensive features of the GPU. With the smaller process they were able to increase performance in other areas, like increasing Texture Mapping Units to match Texture Address Units and increasing core/shader clocks.

If there was a need for more bandwidth, as you seem to think, they would've simply used faster RAM; either faster GDDR3 found on the Ultra or GDDR4. There is of course the possibility for a faster part with faster RAM based on G92 but I think NV knows what everyone else who has a G80/G92 knows, that increasing memory speed alone yields little to no gains in performance.

I never said shaders don't do anything, but without fillrate and memory bandwidth to feed them they're as useless as an 8600GT with 128 SPs. FLOPS alone don't tell you how powerful a GPU is; that only holds when dealing with single textures. When you use multiple textures, like every game since the days of 3dfx, the G92 has a huge advantage over the G80.

The G92 could easily handle an Ultra or GTX if it had that extra bandwidth, but the 8800GT and GTS are really supposed to be midrange cards anyhow.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Azn
I never said shader doesn't do anything but without Fillrate and memory bandwidth saturation it is useless as a 8600gt with 128SP. The whole flop process doesn't tell you how powerful a GPU is. That is only when dealing with Single textures. When you use multiple textures like any game since the days of 3dfx G92 has a huge advantage over G80.
Huh? The whole point of measuring in gigapixels and gigatexels per second is to give you an idea of how many pixels/texels can be rendered per second, since multitextured output is simply a composite of rendered pixels/texels. Since the 3dfx days, fillrate and pixel pipelines have always been the main measuring stick of a GPU's performance. Only with G80 was there a divergence from the in-line pipeline, with pixel/vertex shaders moving to a unified architecture, separated from the render back-ends and running at independent clock speeds.

And again, bandwidth "saturation" means absolutely nothing, since bandwidth is only important when there isn't enough of it. "Saturating" it vs. not saturating it yields no benefit in performance; if anything, saturating your memory bandwidth will result in worse performance.

There's no doubt that all aspects of a GPU need to be proportionate to maximize performance and eliminate potential bottlenecks, but again, I've yet to see a single published benchmark, or anything from personal experience with a G80 or G92, that shows any significant increase in performance from raising memory bandwidth alone. The same cannot be said for raising the core clock (render back-ends) or the shader clock (shader ops/sec). Can't really put it in simpler terms than that.

G92 can easily handle an ultra or gtx if it did have that extra bandwidth but 8800gt or GTS is really supposed to be midrange card anywho.
Once again, show me a single benchmark or user test that shows a benefit from increased memory bandwidth alone. There's a relatively new G92 GTS OC'ing thread that's just started up, and many users are expecting their G92 GTSes in the next few days. It's really simple: ask a few people to run some tests at increased memory clock speeds vs. stock and see if there is any difference in performance.
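That proposed test (same benchmark at stock vs. raised memory clocks) boils down to comparing fps scaling against clock scaling. A hypothetical sketch; the function name, the 0.5 threshold, and the sample numbers are illustrative, not measured:

```python
# If fps gains track memory-clock gains, the card is likely bandwidth-limited;
# near-zero fps scaling from a memory overclock means bandwidth isn't the bottleneck.

def looks_bandwidth_limited(fps_stock: float, fps_oc: float,
                            clk_stock_mhz: float, clk_oc_mhz: float,
                            threshold: float = 0.5) -> bool:
    clock_gain = clk_oc_mhz / clk_stock_mhz - 1
    fps_gain = fps_oc / fps_stock - 1
    # Call it bandwidth-limited if fps scales at least half as fast as the clock.
    return fps_gain >= threshold * clock_gain

# Hypothetical readings: +10% memory clock (970 -> 1067MHz) yielding only
# +1% fps would support the claim that bandwidth isn't the constraint.
```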
 

ManWithNoName

Senior member
Oct 19, 2007
396
0
0
Originally posted by: ArchAngel777

The real question is... Is it safe to overclock the memory on these? Jonnyguru did make a post about the memory controller crapping out on the 8800GT's, even though I am skeptical (still am) it doesn't mean I can discount that totally. Curious that so far no 8800GTS has went higher than 1940Mhz memory... Even the 8800GT has 1950 and 2000 for the SSC variant...

You seemed just a bit more than skeptical the last time I made a thread about what he said......

Originally posted by: ManWithNoName
Originally posted by: ArchAngel777
Then why don't you question him before spreading the word? Sure, he knows a lot about power supplies, but I am not sure that qualifies him as an expert in graphics cards and/or overclocking.

I think we can be sure the interface can handle it; otherwise you'd see RMAs through the roof. Also, keep in mind that damage to parts isn't an on/off type of thing, it's gradual. Thus, if 2000MHz memory causes these things to fail in 2 weeks, you would expect 1950 to fail after 4 weeks, maybe 6, and so on. It may not be linear, but you can be sure damage does in fact occur. So who knows... Maybe he is right, and these cards are going to die on you after 2 weeks. But I doubt it.

So sorry, I thought I might be doing a few of you a favor, or at least giving you a heads-up on a "potential" issue, and I figured I'd leave you to make up your own minds about the validity of his comments, since most everyone here on the forum appears to be intelligent. As to why I didn't question him myself, see the post above where I already stated the reason. I provided a link to the thread; please feel free to question him at your leisure.......

 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: ManWithNoName

**useless banter**

Do you even know what skeptical is? Moreover, do people remain static in their views? I guess you seem to think so...
 

ManWithNoName

Senior member
Oct 19, 2007
396
0
0
Originally posted by: ArchAngel777
Originally posted by: ManWithNoName

**useless banter**

Do you even know what skeptical is? Moreover, do people remain static in their views? I guess you seem to think so...

Sorry, I'm not playing this time, and I won't be suckered into saying anything that will get me Mod-ified either. Hope you get your 8800GTS real soon, and I hope you enjoy the card. Mine arrives next week, and I picked mine up for basically the same reasons as you. Oh, and that "useless banter" thing, very creative. Take care.
 