nVidia GT200 Series Thread


bryanW1995

Lifer
May 22, 2007
11,144
32
91
3. Yeah, that is the point. It is neutered in such a way that it is very close in low-end tasks but falls behind on high-end tasks, making it a much more desirable part while maintaining the 280's supremacy (most high-end users would get a GTX instead of an Ultra; I doubt any high-end user will get a 260).

Interesting point, though I think that with such a huge price delta you'll see quite a few high-end users settling for the 260.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
That makes them mid-range users, mid-high at best.

If you have a 30-inch display and tri-SLI, I doubt you would go with the 260.
But then, many people did tri-SLI with the 8800 GTX instead of the 8800 Ultra.

1. Again, emphasizing CUDA PhysX support as a significant reason for more SPs is about as relevant to gamers as news of an NV Folding@Home client. We'll see about the driver support, not that it really matters, as it'll probably be limited to GT200 parts and the 5 games that support PhysX.

CUDA = C code on the SPs of ANY DX10-capable Nvidia card. They work with ALL of them... the G80, G92, and any newer card. It will NOT be limited to the G200... go to Nvidia's site, download the beta CUDA drivers / SDK, and see for yourself.
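To illustrate (a minimal sketch, not an official SDK sample; the kernel name and numbers here are made up for the example): a CUDA kernel is plain C code that the driver compiles for the streaming processors of whatever CUDA-capable card is installed, G80, G92, or GT200 alike.

    #include <cstdio>
    #include <cuda_runtime.h>

    // A CUDA kernel: C code executed on the card's streaming processors.
    // The same source runs on any CUDA-capable GPU; the driver targets
    // whatever part is installed.
    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;
    }

    int main()
    {
        const int n = 1024;
        float host[n];
        for (int i = 0; i < n; ++i)
            host[i] = (float)i;

        float *dev;
        cudaMalloc(&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

        // Launch 4 blocks of 256 threads to cover all 1024 elements.
        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev);

        printf("host[10] = %.1f\n", host[10]);   // expect 20.0
        return 0;
    }

Compile it with nvcc and it runs on anything from a G80 up; nothing in it is GT200-specific.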

Anyway, there are quite a few more than 5 games by now, but the key is how many games are coming. When Nvidia bought Ageia, opened the standard, and started porting it to CUDA, it meant there would finally be a base of hardware capable of doing PhysX, which means there will finally be a reason to code for it. Supposedly a lot of games are now being developed with PhysX.

While it is indeed not a HUGE issue, it is an issue.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Nvidia GT200 successor tapes out



THE LAST TIME Nvidia's CPU mouthed off about Intel, the firm followed it up with the stunning NV5800 'Dustbuster'. This time, he mouthed off, and the successor to the GT200 had already taped out. NV is in deep trouble once again.

You heard that right: the successor to the GT200 chip has already taped out, and it too will be nothing special. Documents seen by the INQ indicate that this one is called, wait for it, the GT200b; it is nothing more than a 55nm shrink of the GT200. Don't expect miracles, but do expect the name to change.

There are several problems with the GT200, most of which are near fatal. The first is the die size, 576mm^2, bigger than most Itanics. One might trust Intel to make a chip that big with decent yields, especially if it is basically an island of logic in the middle of a sea of cache. Nvidia using a foundry process doesn't have a chance of pulling this off.

Word has come out of Satan Clara that the yields are laughable. No, make that abysmal, they are 40 per cent. To add insult to injury, that 40 per cent includes both the 280 and the 260 yield salvage parts. With about 100 die candidates per wafer, that means 40 good dice per wafer. Doing the maths, a TSMC 300mm 65nm wafer runs about $5000, so that means each good die costs $125 before packaging, testing and the like. If they can get away with sub-$150 costs per GPU, we will be impressed.
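(To make the article's arithmetic explicit, here is the same back-of-the-envelope calculation as a tiny host-side program. The $5,000 wafer price, 100 die candidates, and 40 per cent yield are the INQ's unverified figures, not confirmed numbers.)

    #include <stdio.h>

    // Back-of-the-envelope die cost from the article's rumored figures.
    int main(void)
    {
        const double wafer_cost = 5000.0;  // TSMC 300mm 65nm wafer, per the article
        const double candidates = 100.0;   // die candidates per wafer
        const double yield      = 0.40;    // claimed combined 280/260 yield

        double good_dice    = candidates * yield;      // 40 good dice per wafer
        double cost_per_die = wafer_cost / good_dice;  // $125 per good die

        printf("good dice per wafer: %.0f\n", good_dice);
        printf("cost per good die:   $%.2f (before packaging and test)\n",
               cost_per_die);
        return 0;
    }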

So, these parts cost about $150, and the boards will sell for $449 and $649 for the 260 and 280 respectively, so there is enough play room there to make money, right? Actually, most likely yes. There are costs, though, but not enough to kill profit for anyone touching these.

The biggest cost is memory. The 512b memory width means that they will have to use at least 16 chips (GDDR3 devices are 32 bits wide, so 512/32 = 16). This ends up making the board very complex when you have to route all those high-speed signals, and that means more layers, more cost, and more defect fallout from the added steps. You also have to solder on eight more memory chips than a 256b board needs, which costs yet more.

To add insult to injury, the TDPs of the 260 and 280 are 182W and 236W respectively. This means big copper heatsinks, possibly heatpipes, and high-end fans. Those parts cost a lot of money to buy, assemble and ship. Not fatal, but not a good situation either. It also precludes a dual GPU board without losing a lot of speed.

Basically, these boards are going to cost a lot of money to make, not just to buy. The $449 price is justified by the cost. The last round of GX2 boards cost OEMs about $425, meaning that NV charges OEMs about 70 per cent of retail for high-end parts. After packaging, shipping and add-ins, there is almost nothing left for the OEMs, quite possibly explaining why one of their biggest ones is teetering on the verge of bankruptcy, kept alive because NV won't call their debt while still shipping to them. Watch for this to melt down once NV loses the high end.

So, you end up with an expensive chip on an expensive board that makes few if any people money. Fair enough, bleeding-edge parts mean bleeding-edge prices. The problem is that ATI is going to make a chip that competes with GT200, and lines up with it point for point. NV wins big Z Fill, ATI crushes them on Shader Flops. What this translates to in the real world is still up in the air, but it looks like the 770 and the 260 will be about equal for most things.

The GT200 is about six months late, blew out their die size estimates and missed clock targets by a lot. ATI didn't. This means that buying a GT260 board will cost about 50 per cent more than an R770 for equivalent performance. The GT280 will be about 25 per cent faster but cost more than twice as much. A month or so after the 770 comes the 700, basically two 770s on a slab. This will crush the GT280 in just about every conceivable benchmark and likely cost less. Why? Because.

So, what is a company to do when it promised the financial world that ATI was lost, and GT200 would raise their margins by 100 basis points or so? Surely they knew what was coming a few weeks ago during their financial call, right? I mean, if word was leaking days later, the hierarchy surely was aware at the time, right?

The answer to that is to tape out the GT200b yesterday. It has taped out, and it is a little more than 400mm^2 on a TSMC 55nm process. Given that TSMC tends to price things so that on an equivalent area basis the new process is marginally cheaper than the old, don't look for much cost saving there. Any decrease in defectivity due to the smaller area is almost assuredly going to be balanced out by the learning curve on the new process. Being overly generous, it is still hard to see how the GT200b will cost less than $100 per chip.

The new shrink will be a much better chip though, mainly because they might fix the crippling clock rate problems of the older part. This is most likely not a speed path problem but a heat/power issue. If you get a better perf/watt number through better process tech, you can either keep performance the same and lower net power use, or keep power use the same and raise performance.
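(Illustrating the trade-off: performance = perf/W x watts, so a process-driven efficiency gain can be spent either way. A quick sketch; the 1.3x gain below is a made-up illustrative number, not a figure from the article.)

    #include <stdio.h>

    // A better process improves perf/W; you can cash that in as lower power
    // at the same performance, or more performance at the same power.
    int main(void)
    {
        const double perf     = 933.0;  // GFLOPS, the article's GT200 figure
        const double power    = 236.0;  // W, the article's GTX 280 TDP
        const double eff_gain = 1.30;   // hypothetical 55nm perf/W improvement

        printf("same perf, lower power:  %.0f GFLOPS at %.0f W\n",
               perf, power / eff_gain);
        printf("same power, higher perf: %.0f GFLOPS at %.0f W\n",
               perf * eff_gain, power);
        return 0;
    }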

Given NV's woeful 933GFLOPS number, you can guess which way they are going to go. This means no saving on heatsinks, no savings on components, and a slightly cheaper die. For consumers, it will likely mean a $50 cheaper board, but no final prices have come my way yet. It will also mean a cheaper and faster board in a few months.

The GT200b will be out in late summer or early fall, instantly obsoleting the GT200. Anyone buying the 65nm version will end up with a lemon, a slow, hot and expensive lemon. Kind of like the 5800. It would suck for NV if word of this got out. Ooops, sorry.

What are they going to do? Emails seen by the INQ indicate they are going to play the usual PR games to take advantage of sites that don't bother checking up on the 'facts' fed to them. They plan to release the GT200 in absurdly limited quantities, and only four AIBs are going to initially get parts.

There is also serious talk of announcing a price drop to put them head to head with the R770 and giving that number to reviewers. When the boards come out, the reviews are already posted with the lower numbers, and no reviewer ever updates their pricing or points out that the price performance ratio was just blown out of the water. There is also internal debate about giving a few etailers a real price cut for a short time to 'back up' the 'MSRP'.

We would hope the reviewers are able to look at the numbers they were given on editors' day, $449 and $649, and take any $100+ last minute price drop with the appropriate chunk of NaCl. Just given the component cost, there is no way NV can hit the same price point as the 770 boards. "We lose money on each one, but we make it up in volume" is not a good long term strategy, nor is it a way to improve margins by 100 basis points.

In the end, NV is facing a tough summer in the GPU business. They are effectively out of the Montevina market, and they are going to lose the high end in a spectacular way. Nvidia has no effective countermeasure to the R770, the GT200 was quite simply botched, and now they are going to pay for it. When all you have is a hammer, everything looks like a 5800. µ
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Alright, 55nm G100 (GT200b).
Who would've guessed it?


Crysis-defeating monster comes in August! Woot.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
There were rumours of a 55nm version for a long time... and that this 65nm version would be just a "safe bet" in case 55nm didn't work out.
Hope they get those cheaper GT200-gen cards out on 55nm soon.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Charlie is not very good at this game. If he were really hoping to hurt Nvidia, he would have been better served by a slightly different strategy. I guess they REALLY REALLY hurt his feelings. Jensen, if you're reading this, quit picking on poor INQ!!
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: PC Surgeon
I will never understand why people link or go to sites such as the inquirer or fudzilla. Too much BS.

I think it's just for lack of any real info to talk about.

What I really don't get is how the INQ views ATI's strategy of essentially only competing in the mid-range as better. I don't doubt that building lesser cards incurs less cost. Hell, I could sell you shit on a shingle for next to nothing, but would you buy it?

The one thing they mention that does sort of concern me, though (I guess it's not called FUD for nothing), is the planned die shrink to 55nm relatively soon... Now, I've never run away from spending a few $$$ on a high-end card, but when your recently purchased card all of a sudden holds very little resale value, it does get a bit annoying. I can deal with it every now and then, but situations like this, where the GX2 had a lifespan of about 3-4 months and is getting replaced by another card that is (apparently) going to be obsolete in another 3-4 months, are a little ridiculous. I'd at least like the fan on my card to get dusty before it's worth about half of what I paid for it. This alone makes a good argument for the GTX 260 over the GTX 280.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I typically read Charlie and Fuad and believe the opposite. If they're predicting 55nm in 3 months, then Nvidia will probably NEVER go 55nm. If they're predicting the 4870 at $229 and soon after under $200, then figure $350+... you get the idea.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
That Inquirer article reads more like a "Madam Cleo" prediction... and is probably just as accurate. GT200's SUCCESSOR taped out? Firstly, it's the G200, there is no T. And secondly, the G200 isn't even being sold yet, so how exactly are we to believe that they finished its successor? And the... what the hell am I doing... There is no need for me to go into a point-by-point analysis, it is the INQ. That says it all.
 

shabby

Diamond Member
Oct 9, 1999
5,781
42
91
Originally posted by: taltamir
That Inquirer article reads more like a "Madam Cleo" prediction... and is probably just as accurate.

Caaaaaaaaall me naw for a freeeeeeeeeeeee readin!
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
Originally posted by: mharr7
Originally posted by: PC Surgeon
I will never understand why people link or go to sites such as the inquirer or fudzilla. Too much BS.


+2

I've never cared to comment either way regarding the INQ, but my god, what a load of crap. How could someone go so far with speculation unless they have some shred of evidence to back it up? Damn man, give something, anything, that would support such an article.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: bryanW1995
Charlie is not very good at this game. If he were really hoping to hurt Nvidia, he would have been better served by a slightly different strategy. I guess they REALLY REALLY hurt his feelings. Jensen, if you're reading this, quit picking on poor INQ!!

From his writing, I don't think Charlie has what it takes upstairs to be "good at the game".

He doesn't have any real info. He has his usual rumors, dated info, and half-truths that he pieces his FUD "articles" together with.

I can understand pretty easily why he's acting like a petulant child about NVIDIA, though: without press status, his business is damaged.

Why would anyone go to The Inquirer when the reputable journalists get the accurate info and he has to talk to "some guy whose wife's cousin mops the warehouse at EVGA" and the like?

If you're an advertiser, why would you pay to post ads on a site that doesn't have enough credibility to be in a standard press conference? So your product can be associated with rumor-mongering?

I'd note this after reading Charlie's little tirade:

The 8800 GTX, over a year and a half later, is still the second-best video card ever made. A die shrink won't make a GTX 280 obsolete, and if there are no competing products, it's doubtful a die shrink will make NVIDIA cut the price in half.


I know more than a bit about this "nothing special" GPU, and Charlie could replace every ad in magazines, newspapers, radio, and TV with his FUD and they'd still sell all they can make and more, based on the reviews.

I have real press status; I know.



 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
When I read the INQ saying "we were not invited to Nvidia's press release," I thought to myself, "GOOD! Finally those hacks are treated like what they are, and not given the status of legitimate reporters."
 

geoffry

Senior member
Sep 3, 2007
599
0
76
Originally posted by: JPB


THE LAST TIME Nvidia's CPU mouthed off about Intel µ

You guys missed the most entertaining part of his trash: I didn't know a CPU could talk, let alone mouth off. I know he meant CEO, but any "news site" with that poor editing has little to no credibility to me. Even the average blog is at least glanced over before being published, I reckon.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
Originally posted by: nitromullet
Originally posted by: PC Surgeon
I will never understand why people link or go to sites such as the inquirer or fudzilla. Too much BS.

I think it's just for lack of any real info to talk about.

What I really don't get is how the INQ views ATI's strategy of essentially only competing in the mid-range as better. I don't doubt that building lesser cards incurs less cost. Hell, I could sell you shit on a shingle for next to nothing, but would you buy it?

The one thing they mention that does sort of concern me, though (I guess it's not called FUD for nothing), is the planned die shrink to 55nm relatively soon... Now, I've never run away from spending a few $$$ on a high-end card, but when your recently purchased card all of a sudden holds very little resale value, it does get a bit annoying. I can deal with it every now and then, but situations like this, where the GX2 had a lifespan of about 3-4 months and is getting replaced by another card that is (apparently) going to be obsolete in another 3-4 months, are a little ridiculous. I'd at least like the fan on my card to get dusty before it's worth about half of what I paid for it. This alone makes a good argument for the GTX 260 over the GTX 280.

ATI expected their customers to put up with it...

I think the INQ is still p8ssed at not being invited to editors' day... LOL
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: SolMiester


I think the INQ is still p8ssed at not being invited to editors' day... LOL

Just a scooch...

He would have done better to say nothing at all. He is damaging his site more than doing any good. If he really didn't care that he wasn't invited to Editors' Day, he wouldn't have mentioned it at all.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I would have said something sarcastic about "how can you make crap crappier simply by adding more of it"... but those last tirades have really sunk the INQ to a new low... they were the worst before, and now they are still the worst, only even worse than they were before.

They went from "village idiot" to "insane/incoherent village idiot".
 

Tabby

Member
Mar 14, 2007
44
0
0
Originally posted by: JPB
Nvidia GT200 successor tapes out

[snip: full article quoted above]

Didn't they say something similar when the G80 first came out?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Also... so what if it costs them $150 to make each one? Those buggers are set to sell (and perform at a level of) $600 a piece... that is a 300% markup (i.e., selling at 400% of cost).

Not too terrible if you ask me.
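For what it's worth, markup and margin are different things: $150 cost against a $600 price is a 300% markup but a 75% gross margin. A quick sketch of the arithmetic, using the rumored figures above:

    #include <stdio.h>

    // Markup vs. gross margin on the rumored numbers: $150 cost, $600 price.
    int main(void)
    {
        const double cost  = 150.0;
        const double price = 600.0;

        double markup = (price - cost) / cost;   // 3.00 -> 300% markup
        double margin = (price - cost) / price;  // 0.75 -> 75% gross margin

        printf("markup: %.0f%%, gross margin: %.0f%%\n",
               markup * 100.0, margin * 100.0);
        return 0;
    }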
 