nVidia GT200 Series Thread

Page 16

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: bryanW1995
if I ever open up a video card company, remind me to invite charley to my media days...

Video card company? Like eVGA or BFG? Or are you talking about a semiconductor company?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ghost recon88
Didn't see this posted anywhere in the past couple pages I skimmed, so sorry if its been posted.

Nvidia changes GT200 NDA date

One of the more interesting parts reads:

From: Philip Scholz [mailto:Scholz@nvidia.com]

Sent: Wednesday, May 21, 2008 1:33 PM

Subject: IMPORTANT - Revised launch date: D10U (GeForce GTX 280 & 260)

Importance: High

All,

Please note the new launch date (pulled in by 1 day to align with additional Corporate Marketing initiatives): June 17th.

There are no other changes to this launch - just the 1-day date change.

Please note there are ***2 embargo dates*** for this part:

1) Channel Customer Embargo: June 3rd, 2008 (06:00 PST |14:00 GMT)*

2) Consumer/Press Embargo: June 17th, 2008 (06:00 PST |14:00 GMT)

* Channel Customer Embargo (June 3rd, 2008): To ensure there is no disruption of your current products in the channel, we are imposing an embargo for discussing these new products with your other etail, distributor and retail Channel Partners; this is the date NVIDIA will begin discussing with these Channel Partners.

PLEASE MAKE A NOTE OF THIS NEW DATE AND LET ME KNOW IF YOU HAVE ANY QUESTIONS.

Thank you

Phil

So.... "pulled in by 1 day to align with additional Corporate Marketing initiatives"
Means... moved up by one day to be released on the same day as AMD....

Actually I like that change. Releasing one day earlier will do very little (just slightly less initial stock), but it means both cards are released on the same day.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Hopefully AMD will go nuclear on them and release on May 31... then nvidia will retaliate and release on May 29...forcing amd to release on May 24...oops, never mind...

@superbooga: sorry, I didn't realize that the semiconductor slang police were on patrol. What I meant to say for the nuance-challenged is: "If I ever start a company with the intent of competing with nvidia and/or amd in the gpu market, I'll try to be extra nice to Charlie at the inq. He's just a mean person and I fear ever getting him mad at me." Was that better?
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: n7
I'm nicely asking that this thread derailment cease.

Make a new topic if so desired.

Just as I had to mention in the HD 4000 series thread, please don't take things way off topic, as this thread (like the AMD one) is for discussing the upcoming cards, not arguing over DX10 vs. 10.1, etc.

Thank you in advance.

n7
Video Mod

Not sure why dx 10.1 is off topic in a thread speculating about a new gpu? Could be an important feature and it would be nice to know if it will be supported and if not, why not.

Has to beat 80 pages of nvidia will be faster than amd and if not they will release a faster version........ but amd may be able to compete in the value market....... I have inside information but I can't divulge
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Reminds me of those good old days when the first Radeon card was released. ATI took the crown with it since it performed very well against the GeForce 256... but less than 48 hours later, Nvidia launched the GeForce 2 cards.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: ronnn


Not sure why dx 10.1 is off topic in a thread speculating about a new gpu? Could be an important feature and it would be nice to know if it will be supported and if not, why not.

Has to beat 80 pages of nvidia will be faster than amd and if not they will release a faster version........ but amd may be able to compete in the value market....... I have inside information but I can't divulge

Did you read the thread? It wasn't really about DX10.1 in the GT200, it was a conspiracy theory that Microsoft removed DX10.1 features from DX10 to benefit Nvidia... It touched briefly on whether DX10.1 would be included in GT200 or not, but mostly it was just conspiracy theory and how it was wrong for MS to remove DX10.1 features from the shipping DX10.

That was the off topic part.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
http://www.nordichardware.com/news,7809.html
Performance numbers... don't know about these

Vantage-scores on Xtreme-profile (1920x1200 4xAA)
GeForce GTX 280 41xx
GeForce GTX 260 38xx
GeForce 9800GX2 36xx
GeForce 8800 Ultra 24xx
Radeon HD 4870 XT 26xx
Radeon HD 3870X2 25xx
Radeon HD 4850 Pro 20xx
Radeon HD 3870 14xx

Only 7% difference between GTX 280 and 260? Well I know this Vantage doesn't test gaming performance, but still
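A quick sanity check on that gap: since the leaked scores mask the last two digits ("41xx"), the real difference can only be bounded, not pinned down. A small sketch of the arithmetic (pure number-crunching on the leaked figures, not a claim about real performance):

```python
# Vantage Xtreme scores leaked with the last two digits masked ("41xx", "38xx"),
# so the GTX 280 vs GTX 260 gap can only be bounded.
lo_280, hi_280 = 4100, 4199
lo_260, hi_260 = 3800, 3899

min_gap = (lo_280 / hi_260 - 1) * 100   # best case for the 260, ~5%
max_gap = (hi_280 / lo_260 - 1) * 100   # best case for the 280, ~10%

print(f"GTX 280 over GTX 260: {min_gap:.1f}% to {max_gap:.1f}%")
```

The quoted ~7% sits roughly in the middle of what the masked scores allow.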
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
Originally posted by: Rusin
http://www.nordichardware.com/news,7809.html
Performance numbers... don't know about these

Vantage-scores on Xtreme-profile (1920x1200 4xAA)
GeForce GTX 280 41xx
GeForce GTX 260 38xx
GeForce 9800GX2 36xx
GeForce 8800 Ultra 24xx
Radeon HD 4870 XT 26xx
Radeon HD 3870X2 25xx
Radeon HD 4850 Pro 20xx
Radeon HD 3870 14xx

Only 7% difference between GTX 280 and 260? Well I know this Vantage doesn't test gaming performance, but still

Interesting if real. I hope real benchies surface soon.
Most Wanted: 4870 CF vs. GTX 280
 

Rusin

Senior member
Jun 25, 2007
573
0
0
If those are real then the GTX 280 shouldn't exist. I mean, if it has 30% higher power consumption and an over 30% higher price... and only a 7% performance difference compared to the GTX 260?
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
Originally posted by: Rusin
If those are real then the GTX 280 shouldn't exist. I mean, if it has 30% higher power consumption and an over 30% higher price... and only a 7% performance difference compared to the GTX 260?

Good point. Probably why the 55nm respin was planned, with an Ultra widening the gap down the road. If the numbers hold, the 260 should do well.

 

Rusin

Senior member
Jun 25, 2007
573
0
0
It would be nice to know where Andreas G got this "HD4K's low scores are due to driver problems" claim. The HD 4000 scores are pretty much what was expected if you compare against HD 3870 scores. And why do they think Nvidia has Vantage-ready drivers? There is no official support for GT200 cards in their drivers; of these new cards, the beta drivers have some support for the upcoming 9800 GT, but not GT200.

From a consumer point of view the GTX 260 would be better than the 9800 GX2 (even if these scores are right): cheaper, slightly better performance, and lower power consumption [the GTX 260's TDP is lower than the 9800 GX2's realistic max power consumption, never mind the GX2's TDP, which is close to GTX 280 levels].
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: SteelSix
Originally posted by: Rusin
http://www.nordichardware.com/news,7809.html
Performance numbers... don't know about these

Vantage-scores on Xtreme-profile (1920x1200 4xAA)
GeForce GTX 280 41xx
GeForce GTX 260 38xx
GeForce 9800GX2 36xx
GeForce 8800 Ultra 24xx
Radeon HD 4870 XT 26xx
Radeon HD 3870X2 25xx
Radeon HD 4850 Pro 20xx
Radeon HD 3870 14xx

Only 7% difference between GTX 280 and 260? Well I know this Vantage doesn't test gaming performance, but still

Interesting if real. I hope real benchies surface soon.
Most Wanted: 4870 CF vs. GTX 280


If 4870s scale at that app like 3870s did, the score would be 46xx.

At that point your question would be:

"Is 12% higher performance at this app worth the pitfalls of a multicard rig?"

Most notably that not all apps scale 78%, or even at all, and in those apps you'd be significantly slower at reputedly similar cost.

 

uribag

Member
Nov 15, 2007
41
0
61
Are these numbers for the HD4870 with GDDR5?
If they were with GDDR4 would there be any significant differences using GDDR5?
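For what it's worth, the bandwidth side of that question is simple arithmetic: peak bandwidth is bus width times effective per-pin data rate. A sketch assuming the rumored 256-bit bus and commonly cited data rates (the exact clocks here are assumptions, not confirmed specs):

```python
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bits / 8, times effective Gbps per pin."""
    return bus_bits / 8 * data_rate_gbps

# Assumed effective data rates: GDDR4 around 2.25 Gbps, GDDR5 around 3.6 Gbps.
gddr4 = bandwidth_gbs(256, 2.25)   # 72 GB/s
gddr5 = bandwidth_gbs(256, 3.6)    # ~115 GB/s
print(f"GDDR4: {gddr4:.0f} GB/s, GDDR5: {gddr5:.0f} GB/s, "
      f"gain: {(gddr5 / gddr4 - 1) * 100:.0f}%")
```

Under those assumed clocks, GDDR5 would mean roughly 60% more bandwidth on the same bus, so the memory type could matter a lot at high resolutions with AA.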
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: golem
Originally posted by: ronnn


Not sure why dx 10.1 is off topic in a thread speculating about a new gpu? Could be an important feature and it would be nice to know if it will be supported and if not, why not.

Has to beat 80 pages of nvidia will be faster than amd and if not they will release a faster version........ but amd may be able to compete in the value market....... I have inside information but I can't divulge

Did you read the thread? It wasn't really about DX10.1 in the GT200, it was a conspiracy theory that Microsoft removed DX10.1 features from DX10 to benefit Nvidia... It touched briefly on whether DX10.1 would be included in GT200 or not, but mostly it was just conspiracy theory and how it was wrong for MS to remove DX10.1 features from the shipping DX10.

That was the off topic part.

not off topic so much as off his rocker...
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Rusin
If those are real then GTX 280 shouldn't exists. I mean if it has 30% higher power consumption and over 30% higher price..and 7% difference in performance compared to GTX 260?

More like they didn't trim enough off and underclock the GTX 260 as much as they have in the past to create a largely superficial gap between parts. Keep in mind the G80 GTS was severely underclocked relative to the G80 GTX (513 vs 576), but NV and the review community successfully tricked everyone into thinking fewer SP was the main reason for performance differences. The GTX 260 does look to be very interesting not only in terms of pure performance, but to see just how much core architecture improvements have come since G80 and G92.

I don't know if 3DMark Vantage is any more accurate at gauging actual game performance than its predecessor, but those results do seem to be in-line with what I was expecting based on leaked specs and info. My only real concerns at this point are if rumors of a 55nm refresh of GT200 in the next 6 mos has any validity and to a lesser degree, how overclockable GT200 is (and how quickly EVGA can get OC versions out).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
1. PhysX for CUDA should release at the same time, so SP count should matter.
2. The GTS has a smaller bus, fewer SPs, lower clocks, and a few other shortcomings; it was not a "trick" by Nvidia and reviewers to convince anyone of anything.
3. The GTX 260 has a smaller bus, so it will suffer more at very high resolutions, in multi-GPU setups, and when AA/AF is enabled... in other words, the performance gap widens the more "high end" things you try to do with it.
As long as you are doing mid-range stuff with it, it is indeed very close to the 280. A really ingenious design principle.
 

Avalon

Diamond Member
Jul 16, 2001
7,567
152
106
Originally posted by: Rusin
http://www.nordichardware.com/news,7809.html
Performance numbers... don't know about these

Vantage-scores on Xtreme-profile (1920x1200 4xAA)
GeForce GTX 280 41xx
GeForce GTX 260 38xx
GeForce 9800GX2 36xx
GeForce 8800 Ultra 24xx
Radeon HD 4870 XT 26xx
Radeon HD 3870X2 25xx
Radeon HD 4850 Pro 20xx
Radeon HD 3870 14xx

Only 7% difference between GTX 280 and 260? Well I know this Vantage doesn't test gaming performance, but still

I believe this is as real as it can possibly look.

That puts the 4800 series where I have been expecting, and like Rollo said, if you assume previous generation 78% CF scaling, the 4870X2 will be about 4600.

Continuing to assume that this is real, that means we can say that the 4870 is 85% faster than the 3870, and the GTX 280 is 71% faster than the 8800Ultra.

All in Vantage only, of course. And this is with AA applied, which seems to still be a problem for the 4800 series. (guessing)
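The generation-over-generation and CF-scaling arithmetic above, spelled out (using the leading digits of the masked scores, so everything is approximate):

```python
# Leaked Vantage Xtreme scores, taking the masked "xx" digits as 00.
scores = {
    "GTX 280": 4100, "GTX 260": 3800, "8800 Ultra": 2400,
    "HD 4870": 2600, "HD 3870": 1400,
}

def gain(new: str, old: str) -> float:
    """Percentage gain of one card over another."""
    return (scores[new] / scores[old] - 1) * 100

print(f"HD 4870 over HD 3870:    ~{gain('HD 4870', 'HD 3870'):.0f}%")    # ~86%
print(f"GTX 280 over 8800 Ultra: ~{gain('GTX 280', '8800 Ultra'):.0f}%")  # ~71%

# Projected 4870 CF score, assuming the ~78% CrossFire scaling seen last gen.
cf = scores["HD 4870"] * 1.78
print(f"Projected 4870 CF: ~{cf:.0f}, "
      f"~{(cf / scores['GTX 280'] - 1) * 100:.0f}% over GTX 280")         # ~4628, ~13%
```

That lands within rounding of the ~85% and 71% figures quoted above, and the projected ~4600 CF score matches Rollo's 46xx estimate.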
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: taltamir
1. PhysX for CUDA should release at the same time, so SP count should matter.
2. The GTS has a smaller bus, fewer SPs, lower clocks, and a few other shortcomings; it was not a "trick" by Nvidia and reviewers to convince anyone of anything.
3. The GTX 260 has a smaller bus, so it will suffer more at very high resolutions, in multi-GPU setups, and when AA/AF is enabled... in other words, the performance gap widens the more "high end" things you try to do with it.
As long as you are doing mid-range stuff with it, it is indeed very close to the 280. A really ingenious design principle.

1. It may be years before we see a game that supports PhysX via CUDA. There's what? 5-6 games that support PhysX via hardware so I wouldn't hold your breath on this one.

2. If you read older reviews you will clearly see they emphasize SP over all else with regards to the performance difference between the GTS and GTX. It's still a commonly held misconception on these forums as a result, even after reviews and countless parts have proven otherwise (G92 and G94 most notably, with fewer SP and much higher shader clocks than G80). Realistically, it would've been as simple as testing the GTS and GTX at the same clock speeds to see any real difference between actual core and SP differences, yet few if any reviewers actually did that. Only later, when OC versions of the GTS and 320MB versions (widely available OC'd) were released, did reviews show the GTS wasn't as far behind the GTX as initial reviews made it out to be.

3. Again, the GTX 260 isn't nearly as neutered from either a core or clock perspective compared to the GTX 280 as the GTS G80 was compared to the GTX G80 and the smaller difference is already showing in at least one leaked benchmark. Sure the difference may become more obvious at higher resolutions and settings, but it still won't be as pronounced as with G80 making the GTX 260 a very interesting part at $449.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
1. No, any PhysX game out today, and any in development, will all work with PhysX-on-CUDA as if it were a PhysX card, with no changes needed to the game. And the drivers enabling it are supposed to be released on the same day as the GT200.
2. Much less of a conspiracy, and more of an oversight... in Crysis the SPs DO matter a whole lot, and it was just assumed that more and more games would be extremely shader heavy. But most aren't.
3. Yeah, that is the point. It is neutered in such a way that it is very close in low-end tasks, but falls behind on high-end tasks. Making it a much more desirable part, while maintaining the 280's supremacy (most high-end users would get a GTX instead of an Ultra; I doubt any high-end user will get a 260).
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I will withhold judgement until actual games are benchmarked, but if 260 is legitimately going to hang that closely to 280 at 19x12 with 4xAA in such a demanding situation then it's going to be tough to make a case for spending the extra $150+ imho. The 260 at $450 could very well convince many potential ati customers to spend the extra $80 if 4870 is actually released at $369...
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
Originally posted by: nRollo
Originally posted by: SteelSix
Originally posted by: Rusin
http://www.nordichardware.com/news,7809.html
Performance numbers... don't know about these

Vantage-scores on Xtreme-profile (1920x1200 4xAA)
GeForce GTX 280 41xx
GeForce GTX 260 38xx
GeForce 9800GX2 36xx
GeForce 8800 Ultra 24xx
Radeon HD 4870 XT 26xx
Radeon HD 3870X2 25xx
Radeon HD 4850 Pro 20xx
Radeon HD 3870 14xx

Only 7% difference between GTX 280 and 260? Well I know this Vantage doesn't test gaming performance, but still

Interesting if real. I hope real benchies surface soon.
Most Wanted: 4870 CF vs. GTX 280


If 4870s scale at that app like 3870s did, the score would be 46xx.

At that point your question would be:

"Is 12% higher performance at this app worth the pitfalls of a multicard rig?"

Most notably that not all apps scale 78%, or even at all, and in those apps you'd be significantly slower at reputedly similar cost.

Solid point indeed. Despite having an X48 board, I'm fairly certain GTX 280 is my next choice.

 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
There's a little tidbit buried in an Inquirer article today... the article itself is junk, but to summarize, TSMC will produce a GT200 die shrink from 65nm to 55nm in the fall. Area should drop from 576mm2 to ~400mm2, and the lower power and heat may come with a higher-clocked refresh.

Of course, all of this should fall under the category of "well, duh. Didn't need an inside source to guess that."
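The ~400mm2 figure is consistent with a pure optical shrink, where die area scales with the square of the node ratio. A quick sketch of that idealized estimate (real shrinks rarely scale perfectly, so this is only a ballpark):

```python
# Ideal area scaling for a 65nm -> 55nm optical shrink: area ~ (node ratio)^2.
die_65nm = 576.0                 # mm^2, reported GT200 die size
shrink = (55 / 65) ** 2          # ~0.72 of the original area
die_55nm = die_65nm * shrink
print(f"Estimated 55nm die: ~{die_55nm:.0f} mm^2")   # ~412 mm^2
```

So "~400mm2" is exactly what the naive scaling predicts; the real part could easily come in larger.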
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: taltamir
1. No, any phyx game out today, and any in development, will all work with phyx cuda as if it was a physX card with no changes needed to the game. And the drivers enabling it are supposed to be released on the same day as the G200
2. Much less of a compiracy, and more of an oversight... In crysis the SP DO matter a whole lot, and it was just assumed that more and more games will be extremely shader heavy. But most aren't.
3. Yea, that is the point. It is neutered in such a way that it is very close in low end tasks, but falls behind on high end tasks. Making it a much more desirable part, while maintaing the 280s supremacy (most high end users would get a GTX instead of an Ultra, I doubt any high end user will get a 260)

1. Again, emphasizing CUDA PhysX support as a significant reason for more SP is about as relevant to gamers as news of a NV Folding@Home client. We'll see about the driver support, not that it really matters, as it'll probably be limited to GT200 parts and the 5 games that support PhysX.

2. Well I disagree here. I think it is a corporate initiative of Nvidia's to pressure reviewers not to disclose and publish information beyond the specs and default settings of a given part. I've read enough reviews that show how high a part overclocks or a brief comparison of stock vs. overclock performance, but rarely will you find in-depth comparison between parts with detailed clock speed comparisons. There's also been rumors of NV pressuring AIB partners to cease production of OC parts (which also typically launch later to avoid direct comparison to stock parts), most notably lately with the GTS 512MB. Not surprisingly, NV launched their own OC'd GTS 512MB a few months later but named it the 9800GTX.

I don't fault NV for doing it, they're obviously in the business of selling video cards so the more perceived differences the better for their bottom-line, I just don't like how reviewers and hardware sites that used to revel in making such comparisons suddenly seem indifferent or even intimidated from doing such comparisons. Just as bad as it is for NV or ATI to miss a product cycle, it'd be equally damaging for a review site to miss a product launch because they got cut off for breaking NDA or corporate guidance.

3. Again, I disagree here. GTX 260 sees much smaller reductions in clock speed and core components than G80 GTS. Without getting into all the details, you can see G80 GTS saw closer to 20% reduction across the core with ~1/5th fewer ROP, bandwidth, TMU and shaders along with ~12% difference in clock speed. GTX 260 is closer to 1/8th fewer ROP, bandwidth, TMU and shaders but only ~4% difference in clock speed. Also, I'm pretty sure GTX 260 has a lower shader clock than GTX 280 which again hints that GT200 will be limited by other factors before SP become the main bottleneck.
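Multiplying out the rough fractions quoted above gives a feel for the aggregate cut. This is a crude model (units kept times clock kept), and the ratios are the thread's rumored figures, not confirmed specs:

```python
def relative_throughput(unit_cut: float, clock_cut: float) -> float:
    """Crude aggregate throughput: fraction of units kept times fraction of clock kept."""
    return (1 - unit_cut) * (1 - clock_cut)

# G80 GTS vs GTX: ~1/5 fewer units, ~12% lower clocks (per the post above).
gts = relative_throughput(1 / 5, 0.12)
# GTX 260 vs 280: ~1/8 fewer units, ~4% lower clocks (rumored).
gtx260 = relative_throughput(1 / 8, 0.04)

print(f"G80 GTS: ~{gts * 100:.0f}% of a GTX")      # ~70%
print(f"GTX 260: ~{gtx260 * 100:.0f}% of a 280")   # ~84%
```

By this rough measure the 260 keeps noticeably more of the flagship's throughput than the old GTS did, which is consistent with the small gap in the leaked Vantage numbers.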
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I don't think they'll do a die shrink to 55nm; they'll go straight to 45nm, but it will be more like a year or more before that happens for the high end.
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81
I love the Inquirer...

R700 will crush the GT280 in just about every conceivable benchmark and likely cost less.

Going back a year or so from a same writer...


G80 is not going to be a converged shader unit like the ATI R500/XBox360 or R600, it will do things the 'old' way. In any case, NV is heavily downplaying the DX10 performance, instead shouting about DX9 to anyone who will listen. If the G80 is more or less two 7900s like we hear, it should do pretty well at DX9, but how it stacks up to R600 in DX9 is not known. R600 should annihilate it in DX10 benches though. We hear G80 has about 1/3 of its die dedicated to DX10 functionality.

In any case, the G80 is shaping up to be a patchy part. In some things, it may absolutely scream, but fall flat in others. The architecture is looking to be quite different from anything else out there, but again, that may not be a compliment. In either case, before you spend $1000 on two of these beasts, it may be prudent to wait for R600 numbers.
 