nVidia GT200 Series Thread


Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: taltamir
That Inquirer article reads more like a "Madam Cleo" prediction... and is probably just as accurate. GT200's SUCCESSOR taped out? Firstly, it's G200, there is no T. And secondly, the G200 isn't even being sold yet, so how exactly are we to believe that they finished its successor? And the... what the hell am I doing... There is no need for me to go into a point-by-point analysis; it is the INQ. That says it all.

No T when they missed the teraflop mark.

Well, the Inq is presenting NV's NDA info. It's just presented in a special way to show their "love".
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: taltamir
also... so what if it costs them $150 to make each one? Those buggers are set to sell (and perform at a level of) $600 apiece... that is 300% MARGIN (aka, selling at 400%).

Not too terrible if you ask me.

$150 is only the chip; then there's the board, the memory (16x), ... It's much less than 300%.
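For what it's worth, here is a rough back-of-the-envelope sketch of the markup-versus-margin arithmetic being argued above; the $150 chip cost and $600 price come from the quoted post, while the board/memory figure is a purely illustrative guess:

```python
# Back-of-the-envelope markup vs. gross margin check.
# $150 chip cost and $600 price are the thread's numbers;
# the board/memory/cooler figure is an invented placeholder.

chip_cost = 150.0      # rumored GPU die cost (from the thread)
other_bom = 100.0      # hypothetical board + 16 memory chips + cooler
sale_price = 600.0     # expected retail price (from the thread)

total_cost = chip_cost + other_bom
markup = (sale_price - total_cost) / total_cost * 100        # "sell at X% over cost"
gross_margin = (sale_price - total_cost) / sale_price * 100  # the figure financial reports quote

print(f"markup: {markup:.0f}%")             # ~140% with these guesses, not 300%
print(f"gross margin: {gross_margin:.0f}%") # ~58% with these guesses
```

And the retail price also includes the board partner's and retailer's cut, so Nvidia's own take on the part would be lower still.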
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Still more than AMD makes per chip.
Looking at the sales figures, AMD makes about 100% profit, Nvidia about 200% profit.
 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
Satan Clara? Was that a typo or an intentional misspelling? Either way, it's extremely unprofessional. I hope no one ever gives these guys press credentials.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: taltamir
Still more than AMD makes per chip.
Looking at the sales figures, AMD makes about 100% profit, Nvidia about 200% profit.

So it was 300%. Now it's 200%. What's the next number?
 

ddarko

Senior member
Jun 18, 2006
264
3
81
Originally posted by: taltamir
also... so what if it costs them $150 to make each one? Those buggers are set to sell (and perform at a level of) $600 apiece... that is 300% MARGIN (aka, selling at 400%).

Not too terrible if you ask me.

You've completely missed the forest for the trees. Whatever the markup on this particular chip is - 100%, 200%, 300% - how much it costs to manufacture is incredibly important, because the cost of manufacture determines how many chips can ultimately be sold. Or are you under the delusion that the market for a $600+ chip is vast? I mean, seriously, you cannot be so naive as to think that manufacturing cost is immaterial just because the markup is high.

Oh, and the gross margins for Nvidia as a company are around 40-50%. That is the only figure that matters, not the margin on a single line of chips:

http://seekingalpha.com/articl...plains-margin-problems

Insofar as the general point of the Inquirer piece was that yields will be low on a chip this size, there is nothing controversial about it. It is an axiom of chip manufacturing: bigger parts mean fewer chips per wafer, more opportunities for defects, lower yields, and higher costs. This is reality. And it is not sustainable. Nvidia will simply price themselves out of business. Intel went through this with their Pentium architecture a couple of years ago. Nvidia will have to make the "right turn" as well.
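To make the yield axiom concrete, here is a minimal sketch using the standard dies-per-wafer approximation and a simple Poisson defect model; the die sizes and defect density below are illustrative placeholders, not actual GT200 or G92 figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Common approximation for gross die count on a round wafer."""
    r = wafer_diameter_mm / 2.0
    return ((math.pi * r ** 2) / die_area_mm2
            - (math.pi * wafer_diameter_mm) / math.sqrt(2.0 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Simple Poisson yield model: probability a die has zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)  # mm^2 -> cm^2

# Illustrative comparison of a huge monolithic die vs. a mid-size die.
for area in (576.0, 256.0):  # mm^2, made-up example sizes
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area, defects_per_cm2=0.5)
    print(f"{area:.0f} mm^2: ~{gross:.0f} gross dies, ~{good:.0f} good dies per wafer")
```

Bigger dies lose twice: fewer candidates per wafer and a smaller fraction of them defect-free, so the cost per good die climbs quickly.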
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If the performance it gives is so far ahead of anything smaller...
I am sure they are getting awesome yields on the G92 chips... those can continue to serve the low end while they sell GTX 280s at the high end.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ddarko
Oh, and the gross margins for Nvidia as a company is around 40-50%. This is the only figure that matters, not the margin on a single line of chips:

http://seekingalpha.com/articl...plains-margin-problems

That linked article actually deflects blame away from G80, the last monolithic GPU from NV, and onto G92's yield problems (in explaining Q1 FY09 margin problems). It also emphasizes the point that NV's monstrous earnings in FYE '08 were largely built on the success of G80, which again shows an expensive chip can be profitable as long as its performance justifies its hefty price tag and the product sells well. Despite similar rumors about G80 being unprofitable, NV managed to package one at an affordable sub-$300 price point only a few months after release, in the form of the GTS 320MB.

There are certainly going to be those who argue NV isn't focused on the high end for their profits, but I think the last few years have proven otherwise. We can see from their stockholder letter that ~75% of their business is focused on discrete desktop GPUs (61% desktop, 14% professional), with at least 25% of that being mid to high-end parts (AMD Gamer's Alliance slide from AT). That's ~65 million PCs classified as "Mainstream" or better and ~13 million of those classified as "Enthusiast". To put this into perspective, next-gen consoles have sold ~12m for the PS3 and ~18m for the Xbox 360 to date.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: HOOfan 1
Satan Clara? Was that a typo or an intentional misspelling? Either way, it's extremely unprofessional. I hope no one ever gives these guys press credentials.

Judging by the tone of the article, I think we can assume it was intentional. The Inq is complete trash and doesn't deserve press credentials, imho.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Well... there is always dual and tri-SLI... If the GTX 280 is really all that, then they don't need a GX2 version. Anyone who spends $600 on one would spend a little more to get an nForce mobo (and pay Nvidia MORE money) to SLI it...
 

ddarko

Senior member
Jun 18, 2006
264
3
81
Originally posted by: chizow
We can see from their stockholder letter that ~75% of their business is focused on discrete desktop GPUs (61% desktop, 14% professional), with at least 25% of that being mid to high-end parts (AMD Gamer's Alliance slide from AT). That's ~65 million PCs classified as "Mainstream" or better and ~13 million of those classified as "Enthusiast". To put this into perspective, next-gen consoles have sold ~12m for the PS3 and ~18m for the Xbox 360 to date.

I don't really understand what you're trying to say. You seem to be mixing apples and oranges. The AMD charts purport to show the estimated number of PC gamers worldwide: 263 million. 52.6 million of those gamers are classified as mainstream and an additional 13.15 million as enthusiast. There are no explicit definitions provided for "mainstream" and "enthusiast", but the Anandtech writeup suggests that it isn't solely technical ("The Enthusiast market is dominated by those who are already investing in good gaming PCs and have some of the highest requirements for performance/visual quality. The mainstream gaming market, however, is composed of those users who want to play more demanding games on their PCs but aren't always aware of what they need to do so."). In other words, "mainstream" users include people with integrated graphics parts; in fact, I'd wager that the vast majority of these 65 million "mainstream" and "enthusiast" PC gamers have integrated graphics. That means most of these people aren't addressed by Nvidia's focus on "discrete desktop GPUs", so if you were trying to demonstrate that Nvidia's potential market is 75% of 65 million, that conclusion is not supported by the data. The 65 million figure has to be cut way down to eliminate all the people with integrated graphics.

My point isn't that the G80, G92 and now the GT200 were or will be unprofitable. They weren't and won't be. It's just that these big chips aren't profitable enough on their own to fuel the company because they're only suitable for a relatively small market. Airlines love the margins from their ten first-class customers, but it's the 90 huddled masses in steerage who pay for the fuel. The same is true for chip companies; Intel isn't paying for its fabs with its Extreme processors. taltamir's suggestion that Nvidia will use the G92 chips as its mainstream part sounds right; that does appear to be their strategy. The question is whether this strategy will work if AMD offers equivalent or better performance at a cheaper price (its new chips are manufactured on a smaller process than the G92, so they should be cheaper). If Nvidia can't move its G92 chips, then it gets cut off at the knees. Only time will tell what happens. But what is disconcerting is that the GT200 appears to be another revision of the G80 architecture, its performance achieved principally by adding more and more shaders. It is eerily reminiscent of the dead-end approach pursued by Intel with the Pentium: make it faster, bigger, hotter, more expensive, etc. I have full confidence that Nvidia knows it can't continue down that path and that the next-gen chip will be faster, smaller, more efficient, cheaper, etc.

 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Certainly all the signs are there that Nvidia is worried about future profits. Enthusiast computing and gaming has to be a shrinking market. Most gamers seem to prefer consoles, with only a trickle of games that take advantage of enthusiast hardware. ATI and Nvidia might do best to work together to grow this market rather than spreading all the negative FUD that just helps kill the category.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ddarko
Originally posted by: chizow
We can see from their stockholder letter that ~75% of their business is focused on discrete desktop GPUs (61% desktop, 14% professional), with at least 25% of that being mid to high-end parts (AMD Gamer's Alliance slide from AT). That's ~65 million PCs classified as "Mainstream" or better and ~13 million of those classified as "Enthusiast". To put this into perspective, next-gen consoles have sold ~12m for the PS3 and ~18m for the Xbox 360 to date.

I don't really understand what you're trying to say. You seem to be mixing apples and oranges. The AMD charts purport to show the estimated number of PC gamers worldwide: 263 million. 52.6 million of those gamers are classified as mainstream and an additional 13.15 million as enthusiast. There are no explicit definitions provided for "mainstream" and "enthusiast", but the Anandtech writeup suggests that it isn't solely technical ("The Enthusiast market is dominated by those who are already investing in good gaming PCs and have some of the highest requirements for performance/visual quality. The mainstream gaming market, however, is composed of those users who want to play more demanding games on their PCs but aren't always aware of what they need to do so."). In other words, "mainstream" users include people with integrated graphics parts; in fact, I'd wager that the vast majority of these 65 million "mainstream" and "enthusiast" PC gamers have integrated graphics. That means most of these people aren't addressed by Nvidia's focus on "discrete desktop GPUs", so if you were trying to demonstrate that Nvidia's potential market is 75% of 65 million, that conclusion is not supported by the data. The 65 million figure has to be cut way down to eliminate all the people with integrated graphics.
Huh? The point of highlighting that 75% discrete desktop GPU figure was to illustrate that very little of NV's business is directed at integrated graphics, whether it's desktop or mobile. There's very little chance any of that 65 million includes integrated graphics given the breakdown of GPU share among the big 3 (Intel 38%, NV 36%, AMD 20%). Knowing Intel makes no discrete GPU allows you to account for ~40% of that 263 million (105m). With even a conservative 60% discrete GPU estimate for NV and AMD (it's probably closer to 70-75%) from the remainder (158m x 0.6 = 95m), that still leaves you with far more than the ~65 million mainstream and enthusiast gamers. All the numbers I've quoted are readily available in various reports over the last 6 months, but the Steam Survey is still one of the best samples of gaming hardware available for free. It's up to 1.7m discrete samples and shows very clearly that people who consider themselves gamers (enthusiast and mainstream) do not use integrated graphics.
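For readers following along, the arithmetic above works out roughly like this (every input is an estimate quoted in the thread, not verified data):

```python
# Rough reproduction of the market-size arithmetic from the post above.
# Every input is an estimate quoted in the thread, not verified data.

total_pc_gamers = 263e6     # AMD "Gamer's Alliance" worldwide estimate
intel_gpu_share = 0.40      # Intel's ~38-40% of GPUs, all integrated
discrete_fraction = 0.60    # conservative discrete share of NV + AMD GPUs

intel_integrated = total_pc_gamers * intel_gpu_share       # ~105M
nv_amd_gpus = total_pc_gamers - intel_integrated           # ~158M
nv_amd_discrete = nv_amd_gpus * discrete_fraction          # ~95M

mainstream_plus_enthusiast = 52.6e6 + 13.15e6              # ~65.75M

print(f"Estimated NV/AMD discrete GPUs: {nv_amd_discrete / 1e6:.0f}M")
print(f"Mainstream + enthusiast gamers: {mainstream_plus_enthusiast / 1e6:.1f}M")
# ~95M discrete parts vs. ~66M mainstream/enthusiast gamers is the post's point:
# the discrete installed base alone can cover that whole segment.
```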

My point isn't that the G80, G92 and now the GT200 were or will be unprofitable. They weren't and won't be. It's just that these big chips aren't profitable enough on their own to fuel the company because they're only suitable for a relatively small market.
Problem is that every shred of data, including the article you linked, indicates G80 was in fact profitable, so much so as NV's flagship that it led the firm to record earnings and a dominant share of the desktop market. G80 didn't need to be profitable on its own; that's the beauty of high margins, you can sell low volume and still make substantial profit instead of relying on low-margin, high-volume parts. The article you linked supports this, as an inventory issue caused G80 to become unprofitable only when they were forced to drop prices due to G92's release. With G92, the yields may not necessarily have been bad, just bad relative to their sales price. Anyone would've recognized this as 8800 GT and GTS cards that launched at $300+ were going for $200 and less as inventory improved.

Before you insist G80 didn't sell well, again, take a look at the Steam survey, which shows ~10% of the gaming market purchasing an 8800-class GPU over the survey's life. That doesn't say much on its own, but it is compelling evidence when compared to competitors and residual high-end parts from the previous generation (X1900s and G70/71). There is some G92 penetration there, but again, the overwhelming majority of that 8800 figure is comprised of G80 (just add up the 320/640/768MB GPUs). Couple this with the fact that G84 was late and underwhelming and the fact that NV enjoyed record-breaking profits during G80's lifespan, and it's pretty obvious that success was largely attributable to G80.
 

emilyek

Senior member
Mar 1, 2005
511
0
0
Originally posted by: ronnn
Certainly all the signs are there that Nvidia is worried about future profits. Enthusiast computing and gaming has to be a shrinking market. Most gamers seem to prefer consoles, with only a trickle of games that take advantage of enthusiast hardware. ATI and Nvidia might do best to work together to grow this market rather than spreading all the negative FUD that just helps kill the category.

Enthusiasts are still there (perhaps in smaller numbers), and computer owners are still there (in much larger numbers).

There are fewer PC games being made, one of the main reasons of course being consoles, which have a standardized hardware set that eliminates tons of problems; they also don't suffer from the same piracy rates that PC games do.

On top of that, contrary to what you've said about games not being made for folks with enthusiast hardware, it's the opposite: most of the games made for the PC have absurdly high hardware requirements.

They're so high in fact that the average computer is totally inadequate.

And it's preposterous that gaming boxes with $1000+ of graphics-dedicated hardware still can't run modern games at absolute maximum settings with all the frames per second one might want.

Perhaps developers think that simply because SLI exists, they should code their games on the bleeding edge, just for bragging rights or something.

But it is killing PC games at least as fast as the console wars.

Perhaps SLI matched with super-expensive GPUs is a kind of end-game strategy for discrete GPU makers who see a console-only future as inevitable; making all they can while the getting is still good.

But game developers especially need to rethink their strategies, that is, if they want to keep PC games viable.

Oh, and this INQ article looks bogus to me.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Janooo
HD 3870 X2 against the GeForce GTX 280 and GTX 260

Mixed impression...

Actually, it looks very good for AMD. The 4870 X2 should be ~2x the 3870 X2 in performance, so it should beat the GTX 260 just about everywhere (except World in Conflict), and it will beat the GTX 280 in a lot of cases as well. Add in that those were tests done by Nvidia using old AMD drivers, and it's actually looking good for the 4000 series.

If R700 rumors prove true as well, the HD 4870 X2 should beat the GTX 280 in Crysis.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Consoles are very easy to pirate... if good enough games become console exclusives, then pirates will just buy a console and pirate games for the console...

PC gaming is not dying; it is on the rise, growing like never before. But failing companies blame their failures on piracy and yell that the market is dying. As if.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: Extelleron
Originally posted by: Janooo
HD 3870 X2 against the GeForce GTX 280 and GTX 260

Mixed impression...

Actually, it looks very good for AMD. The 4870 X2 should be ~2x the 3870 X2 in performance, so it should beat the GTX 260 just about everywhere (except World in Conflict), and it will beat the GTX 280 in a lot of cases as well. Add in that those were tests done by Nvidia using old AMD drivers, and it's actually looking good for the 4000 series.

If R700 rumors prove true as well, the HD 4870 X2 should beat the GTX 280 in Crysis.
Well, if the HD 4870 is around HD 3870 X2 level in performance as expected, then the HD 4870 X2 would sit pretty close to GTX 260 level, depending on:
1. How efficient CrossFire is with RV770 (the 4870 X2 is a traditional CrossFire setup, bridge chip and all). The HD 3870 X2 was around 65% faster than the HD 3870.
2. What clocks they use for the HD 4870 X2. I mean, the HD 4870's TDP is already 157W...
----
With those numbers the GTX 260 is (on average) 65% faster in games than the HD 3870 X2, and the GTX 280 90% faster. The smallest differences are in games which the HD 3870 X2 can run decently at 2560x1600 with 4xAA/16xAF.
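Normalizing everything to the HD 3870 X2 makes the comparison above easy to follow; the scaling factor and GTX ratios below are the ones asserted in this post, not measurements:

```python
# Relative-performance extrapolation using the ratios quoted above
# (HD 3870 X2 = 1.0). Purely illustrative; none of these are benchmark results.

hd3870_x2 = 1.0
hd4870 = 1.0 * hd3870_x2      # assumption from the post: HD 4870 ~= HD 3870 X2
crossfire_scaling = 1.65      # HD 3870 X2 was ~65% faster than a single HD 3870
hd4870_x2 = hd4870 * crossfire_scaling

gtx260 = 1.65                 # "GTX 260 is on average 65% faster than HD 3870 X2"
gtx280 = 1.90                 # "GTX 280 90% faster"

print(f"HD 4870 X2 ~ {hd4870_x2:.2f}x, GTX 260 ~ {gtx260:.2f}x, GTX 280 ~ {gtx280:.2f}x")
# With these assumptions the HD 4870 X2 lands right at GTX 260 level
# and well short of the GTX 280, which is the conclusion above.
```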
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Janooo
It just could be that 770 can equal 280 in some games.

Well, that's pretty unlikely; an OC'd HD 4870 might equal or surpass a GTX 260 in some games like UT3.

Note that Catalyst 8.5 might change some of the numbers here.

Catalyst 8.5 Release Notes:

* Call of Juarez DX10: Performance increases up to 12% on systems containing an ATI Radeon™ HD 3xx0 series product
* Halo: Performance increases by 10-30% across all supported ATI Radeon™ series products
* Lost Planet DX10: Performance increases from 5 to 35% on systems containing an ATI Radeon™ HD 3xx0 series product
* Stalker DX9: Performance increases by 20-50% when HDR is enabled in the game, across all ATI Radeon™ HD 38x0 series products
* World in Conflict DX10: Performance increases up to 25% on systems containing an ATI Radeon™ HD 36x0 and/or an ATI Radeon™ 38x0 series product. Higher performance gains are noticed on systems containing an ATI Radeon™ 3870 X2 series product

So Call of Juarez sees ~12% improvement and World in Conflict can see 25%+ on HD 3870 X2. That would make a big difference in the WiC results.
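As a rough illustration of how far a driver uplift like that can move a comparison, here is a tiny sketch applying the quoted gains to a hypothetical baseline; the 40 fps figure is invented for illustration, only the percentages come from the release notes:

```python
# Hypothetical effect of the Catalyst 8.5 gains on an HD 3870 X2 result.
# The 40 fps baseline is made up; the percentage gains are from the notes above.

baseline_fps = 40.0
gains = {"Call of Juarez DX10": 0.12, "World in Conflict DX10": 0.25}

for game, gain in gains.items():
    print(f"{game}: {baseline_fps:.0f} fps -> {baseline_fps * (1 + gain):.0f} fps")
# A 25% swing is easily enough to reorder cards that were within ~20%
# of each other in the NV-supplied comparison.
```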
 