Dual GT200 samples in December


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: AdamK47
Originally posted by: apoppin
Dual GTX260+

it is impossible to do gtx280 on 55nm

Why is it impossible? Did I miss something?

It's just impossible, just like lowering the cost of the GTX260/280 was, due to yields and the large die size!

Pay attention!

LOL- Apoppin is speculating that the thermals and power consumption of such a beast would prevent it; he has a 50% chance of being right. The forum "engineers" have speculated it could not be done for a while.

yes, thermals and power consumption forbid using GTX280

theoretically, they *could* do it but it would make absolutely no sense when the GTX260+ is so relatively close in performance and has MUCH less of an engineering/design/manufacture challenge

Pragmatically, Nvidia *only* needs to beat the heck out of 4870x2 ... and 2 x GTX260++ at 55nm will do it easily in a sandwich

My prediction
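
As a rough sanity check on that claim (a back-of-the-envelope sketch, not official numbers): the 236 W and 182 W figures are NVIDIA's published 65nm board powers, the 300 W ceiling is the PCIe limit for a slot plus one 6-pin and one 8-pin connector, and the 0.8 shrink factor is purely an assumed illustration of a 65nm-to-55nm power reduction.

PCIE_SLOT_W = 75    # power available from the x16 slot
SIX_PIN_W   = 75    # per 6-pin PCIe connector
EIGHT_PIN_W = 150   # per 8-pin PCIe connector
CEILING_W   = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W board limit

SHRINK = 0.8        # assumed 65nm -> 55nm power reduction (speculative)

for name, tdp_65nm in (("GTX 280", 236), ("GTX 260", 182)):
    dual_55nm = 2 * tdp_65nm * SHRINK
    verdict = "fits" if dual_55nm <= CEILING_W else "over the limit"
    print(f"2 x {name} @ 55nm: ~{dual_55nm:.0f} W vs {CEILING_W} W -> {verdict}")

Under those assumptions a 280 pair lands near 378 W, far over the ceiling, while a 260 pair squeaks in under 300 W, which is the gist of the argument above.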


 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: chizow
If it's anything like the 9800GX2 and 7950GX2, it's better to wait for the next-gen single-GPU or just go native SLI. These sandwich cards, meant only to recapture performance crowns, are the first to be forgotten when it comes to driver updates. Driver support may never be as bad as the 4870X2's, since you can always create your own profiles, but if you look around, the 9800GX2 is also very prone to driver issues for newly launched games. Just read over some of the threads on EVGA around the time the GTX 280 came out. 9800GX2 owners couldn't Step-Up fast enough (maybe Step-Away is more appropriate).

I'm going to agree
 

Zap

Elite Member
Oct 13, 1999
22,377
2
81
Originally posted by: chizow
There's certainly going to be some decrease in both heat and power consumption going to 55nm, but powering the full 10/10 clusters might just push it over the edge in terms of power consumption.

Who knows?

I have a bit of interesting information... the original 9800 GTX was 65nm. The 9800 GTX+ was 55nm, but used the PCB of the original GTX.

There is a "new" PCB design specifically for the 55nm GTX+. Guess what? It only uses one 6 pin PCIe power plug while the original GTX PCB used two 6 pin plugs.

(Also, for those interested, the card will be shorter and will fit into cases like the Antec Solo which won't fit the older card.)

Originally posted by: apoppin
Pragmatically, Nvidia *only* needs to beat the heck out of 4870x2 ... and 2 x GTX260++ at 55nm will do it easily in a sandwich

My prediction

If NVIDIA comes out with a "GTX 280 GX2 55nm" part...*

My prediction is that it will be underclocked GTX 280s. I base this on historical evidence: the previous two GX2 cards were feature-complete versions of the top-end single-GPU cards of their generation, but clocked lower.

Of course we are all being "forum engineers" and speculating at this time, myself included. And no, even if I knew for sure, I wouldn't tell y'all due to NDA.

*I always consider stuff speculation and vaporware until it is available for purchase.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Originally posted by: Zap
Originally posted by: chizow
There's certainly going to be some decrease in both heat and power consumption going to 55nm, but powering the full 10/10 clusters might just push it over the edge in terms of power consumption.

Who knows?

I have a bit of interesting information... the original 9800 GTX was 65nm. The 9800 GTX+ was 55nm, but used the PCB of the original GTX.

There is a "new" PCB design specifically for the 55nm GTX+. Guess what? It only uses one 6 pin PCIe power plug while the original GTX PCB used two 6 pin plugs.

(Also, for those interested, the card will be shorter and will fit into cases like the Antec Solo which won't fit the older card.)

Is this one of those cards you're talking about??
GIGABYTE 9800 GTX+
I saw this today and was like wtf. It's shorter, has one PCIe power connector, one SLI connector so no 3-way SLI, and no SPDIF connector.

I also found this EVGA 9800 GTX+ which has 2 SLI connectors but no SPDIF and one PCIe connector.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: Zap

There is a "new" PCB design specifically for the 55nm GTX+. Guess what? It only uses one 6 pin PCIe power plug while the original GTX PCB used two 6 pin plugs.
Very interesting; even the GTX260 needs two connectors so they must've reduced consumption quite a bit.

Still, I'm glad I picked up a GTX260+ as having to wait until January would be just nasty.
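
The connector count by itself bounds the board power, which is why Zap's detail is telling. A minimal sketch of the PCIe budgets (75 W from the slot, 75 W per 6-pin plug, 150 W per 8-pin); the function name here is illustrative, not official.

def pcie_power_budget(six_pin=0, eight_pin=0):
    """Maximum board power the PCIe spec allows for a given plug count."""
    return 75 + six_pin * 75 + eight_pin * 150  # watts: slot + connectors

print(pcie_power_budget(six_pin=2))  # original 65nm 9800 GTX board: 225 W cap
print(pcie_power_budget(six_pin=1))  # new 55nm GTX+ board: 150 W cap

Going from two plugs to one drops the allowed draw from 225 W to 150 W, so the 55nm part has to sit comfortably under 150 W.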
 

rjc

Member
Sep 27, 2007
99
0
0
Originally posted by: BFG10K
Originally posted by: Zap

There is a "new" PCB design specifically for the 55nm GTX+. Guess what? It only uses one 6 pin PCIe power plug while the original GTX PCB used two 6 pin plugs.
Very interesting; even the GTX260 needs two connectors so they must've reduced consumption quite a bit.

I might be mistaken, but I think Zap was referring to a new board design for the 55nm 9800GTX+.

His statement is kind of ambiguous though... if the 55nm GTX260 board now only has one power connector, that implies quite dramatic power savings.

Apparently the 9800GTX had a 12-layer board and the current GTX260 14 layers. Guessing the new boards have been redesigned to use fewer layers for future life in sub-$200 land.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I wonder if they are going GDDR5 with a 256 bit bus. Or keeping the 448 bit wide with GDDR3. LOL or 448 bit wide with GDDR5!!
Fun fun fun!!!
 

rjc

Member
Sep 27, 2007
99
0
0
Originally posted by: apoppin
yes, thermals and power consumption forbid using GTX280

Also, for the 280 they would have to design a board with 2x512-bit memory interfaces, compared to 2x448-bit memory for the 260.

All the dual cards so far (7950, 3870, 9800, 4870) appear to have used 256-bit memory interfaces; jumping to 448-bit in roughly the same board area would be hard enough, let alone 512-bit.

The first GT206 Tesla card, the CX, also had a 384-bit interface, so I suppose they could even cut it down that far if the more complicated board design didn't work out.

Originally posted by: keysplayr2003
I wonder if they are going GDDR5 with a 256 bit bus. Or keeping the 448 bit wide with GDDR3. LOL or 448 bit wide with GDDR5!!
Fun fun fun!!!

The Tesla GT206 FX card seemed to have a 512-bit interface. No idea about GDDR5 support, but I don't think they would disable 50% of that (i.e. 256-bit) for the new GTX260 part - 448-bit would be the simplest path (they could just drop the new chip into the current board).

[Edit: From a previously posted link, Expreview says the GTX295 is based on 2 x 55nm GTX260 216.]
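
The bus-width trade-offs in this post come down to simple bandwidth arithmetic: bytes per transfer times effective data rate. A quick sketch; the data rates below are typical retail clocks of the era, assumed for illustration rather than confirmed specs for any of these parts.

def bandwidth_gb_s(bus_bits, data_rate_mt_s):
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits / 8 * data_rate_mt_s / 1000

print(bandwidth_gb_s(448, 1998))  # GTX 260: 448-bit GDDR3 -> ~112 GB/s
print(bandwidth_gb_s(256, 3600))  # 4870-style 256-bit GDDR5 -> ~115 GB/s
print(bandwidth_gb_s(448, 3600))  # keysplayr's 448-bit + GDDR5 -> ~202 GB/s

The first two lines show why 256-bit GDDR5 is a plausible stand-in for 448-bit GDDR3; the third shows why pairing GDDR5 with the wide bus would be bandwidth overkill.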
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: cmdrdredd
Originally posted by: chizow
If it's anything like the 9800GX2 and 7950GX2, it's better to wait for the next-gen single-GPU or just go native SLI. These sandwich cards, meant only to recapture performance crowns, are the first to be forgotten when it comes to driver updates. Driver support may never be as bad as the 4870X2's, since you can always create your own profiles, but if you look around, the 9800GX2 is also very prone to driver issues for newly launched games. Just read over some of the threads on EVGA around the time the GTX 280 came out. 9800GX2 owners couldn't Step-Up fast enough (maybe Step-Away is more appropriate).

I'm going to agree

Based on any experience with GX2s? Or because you want to?

I had two 7950GX2s and two 9800GX2s, and like I said, I had no issues with anything other than the first gen quad drivers (and those were more due to DX9 render-ahead issues than driver problems; when ATi produces an "SFR of AFR" driver I'll believe it's easy).

I saw the review in which the 9800GX2 had scaling issues in a game or two, but given that with CF you have to wait on profiles, I think we can probably assume that some instances of scaling issues are part of the deal with multi-GPU solutions.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: nRollo

Based on any experience with GX2s? Or because you want to?
You left out option #3: because it's the truth. I agree with both of them.

I had two 7950GX2s and two 9800GX2s, and like I said, I had no issues with anything other than the first gen quad drivers.
That's quite interesting since Quad SLI didn't even function on Vista until about six months after said OS launched. Didn't you move to Vista shortly after launch?

And now, which would you rather have? A sandwich 7950 GX2 based on obsolete tech with inferior AF and AA modes that can't even do DX10, or a single 8800 GTX free from multi-GPU issues with markedly superior image quality?

I mean sure, someone that gets video cards handed to them for free probably doesn't care about such facts, but it's a major issue for those that actually spend money on them.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
Originally posted by: nRollo

Based on any experience with GX2s? Or because you want to?
You left out option #3: because it's the truth. I agree with both of them.

I had two 7950GX2s and two 9800GX2s, and like I said, I had no issues with anything other than the first gen quad drivers.
That's quite interesting since Quad SLI didn't even function on Vista until about six months after said OS launched. Didn't you move to Vista shortly after launch?

And now, which would you rather have? A sandwich 7950 GX2 based on obsolete tech with inferior AF and AA modes that can't even do DX10, or a single 8800 GTX free from multi-GPU issues with markedly superior image quality?

I mean sure, someone that gets video cards handed to them for free probably doesn't care about such facts, but it's a major issue for those that actually spend money on them.

1. I was using 8800GTX SLI when Vista launched; my remaining 7950GX2 was in an XP machine.

2. Your point about 8800GTXs is invalid. I've said many times here that it's preferable to achieve a level of performance with one GPU, but you can't achieve the level of performance this card will provide with one GPU. Asking "What would you rather have - a GPU that doesn't exist or two that do?" leaves me with the answer "two that do".

3. As a reviewer for a website presumably hoping to get "free cards" himself, you have an interesting take on the situation. Like it or not, I'm "press" and have been for years.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
You are press? Dude, if you are press, then the Inq is the Wall Street Journal.

Also, I highly doubt it's BFG's wish to get free stuff handed to him just for the sake of it being free. He has decent hardware already, and so does apoppin, for example. What he wants is to review it because he takes an interest in it, and maybe it will put a smile on his face if he's under NDA and can play with toys other enthusiasts have to wait for till launch date.

I think you're borderline offensive here, assuming BFG is nothing more than a greedy shill...
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: nRollo

1. I was using 8800GTX SLI when Vista launched; my remaining 7950GX2 was in an XP machine.
So why didn't you upgrade your XP box to Vista? Oh that's right, you couldn't or else Quad SLI would stop working, so you really didn't have a choice.

If someone wanted to move to Vista with Quad SLI, that would pose a significant problem wouldn't you say?

Do you feel this level of support is acceptable for such a high end config?

2. Your point about 8800GTXs is invalid. I've said many times here that it's preferable to achieve a level of performance with one GPU, but you can't achieve the level of performance this card will provide with one GPU. Asking "What would you rather have - a GPU that doesn't exist or two that do?" leaves me with the answer "two that do".
That's very interesting considering in the 4870 X2 thread you're arguing against the 4870X2. Does the 4870X2 not provide more performance than that of one GTX280?

I'm arguing against multi-GPU in both threads, but your stance changes depending on which thread you're in.

3. As a reviewer for a website presumably hoping to get "free cards" himself, you have an interesting take on the situation. Like it or not, I'm "press" and have been for years.
Interesting? Hardly. If I didn't pay for something then I'm less likely to be annoyed if it doesn't work properly than if I paid for it. Likewise, if card A stops getting support, why would I care if someone hands me card B for free to replace it?
 

AdamK47

Lifer
Oct 9, 1999
15,548
3,250
136
Originally posted by: apoppin
Originally posted by: nRollo
Originally posted by: AdamK47
Originally posted by: apoppin
Dual GTX260+

it is impossible to do gtx280 on 55nm

Why is it impossible? Did I miss something?

It's just impossible, just like lowering the cost of the GTX260/280 was, due to yields and the large die size!

Pay attention!

LOL- Apoppin is speculating that the thermals and power consumption of such a beast would prevent it; he has a 50% chance of being right. The forum "engineers" have speculated it could not be done for a while.

yes, thermals and power consumption forbid using GTX280

theoretically, they *could* do it but it would make absolutely no sense when the GTX260+ is so relatively close in performance and has MUCH less of an engineering/design/manufacture challenge

Pragmatically, Nvidia *only* needs to beat the heck out of 4870x2 ... and 2 x GTX260++ at 55nm will do it easily in a sandwich

My prediction

So then you were referring to a 55nm dual GTX 280 one-slot sandwich solution being "impossible"? I'd really like to see a single-card, one-slot 55nm GTX 280. Two of those in SLI would be nice. Then again I don't... I wouldn't want my current SLI setup to be dethroned.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
I wonder if they are going GDDR5 with a 256 bit bus. Or keeping the 448 bit wide with GDDR3. LOL or 448 bit wide with GDDR5!!
Fun fun fun!!!

again .. a prediction

*all 3*
[the Ultra has GDDR5 with a 448-bit bus; the sandwich is GDDR5 with a 256-bit bus, as are most of the 55nm parts; a few of the top 280/290s will also be 448-bit with GDDR5]



no way we will see a gtx280 in a sandwich; no need

3. As a reviewer for a website presumably hoping to get "free cards" himself, you have an interesting take on the situation. Like it or not, I'm "press" and have been for years.

What "free" cards do we need?

- and some of us already do get evaluation video cards

yes .. we know all about "press"; I am press also - so big whoop
- so is the Inq's Charley and their janitor too

there is a name for your kind of lop-sided 'journalism'

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: nRollo
Originally posted by: chizow
If it's anything like the 9800GX2 and 7950GX2, it's better to wait for the next-gen single-GPU or just go native SLI. These sandwich cards, meant only to recapture performance crowns, are the first to be forgotten when it comes to driver updates. Driver support may never be as bad as the 4870X2's, since you can always create your own profiles, but if you look around, the 9800GX2 is also very prone to driver issues for newly launched games. Just read over some of the threads on EVGA around the time the GTX 280 came out. 9800GX2 owners couldn't Step-Up fast enough (maybe Step-Away is more appropriate).

I have a 9800GX2, and one of my buddies is still using my 7950GX2 to this day. I don't recall any issues personally, beyond the initial 7950GX2 quad drivers.

I came across a few reviews specifically citing 9800GX2 scaling issues - I'm talking teens when single G92s were more than doubling the score. It was probably one of the 180.48 driver reviews; I'll link it when I come across it again.

As for the 9800GX2 and other single-slot multi-GPU cards, the problem is they're a band-aid by design slapped on top of cut-down implementations of dated tech. Given how quickly things change in the industry, the obvious conclusion is that something that offers similar performance sans multi-GPU band-aid is just around the corner. This is historical fact and cannot be reasonably argued against.

As someone who gets parts for free every 6-9 months, you have to acknowledge there is going to be less attachment to any new part, but for someone who has to weigh and live with their buying decisions the story is very different. Luckily the 9800GX2 step-up period fell within the GTX 280's launch, otherwise there may have been many more who felt cheated by their 9800GX2 purchase. Here's one example of how much happier GX2 owners were with a GTX 280, even though the GX2 often produced more FPS on paper.
EVGA Forum Re: 9800GX2 Step-Up

There are TONS more threads on exactly the same topic in both the GT200 and Step-Up forums on EVGA's site; just look at the period from around 6/18 to 7/18 and you'll get lots of relevant hits.

X58 has given us even less reason to consider G/X2 solutions even if multi-GPU is something you're interested in. Again, greater-than-3-GPU setups have historically shown poor scaling, whether due to drivers, OS limitations or CPU bottlenecks. Most X58 boards can handle up to 3 dual-slot GPUs and also offer the flexibility to scale performance one card at a time.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Zap
Originally posted by: chizow
There's certainly going to be some decrease in both heat and power consumption going to 55nm, but powering the full 10/10 clusters might just push it over the edge in terms of power consumption.

Who knows?

I have a bit of interesting information... the original 9800 GTX was 65nm. The 9800 GTX+ was 55nm, but used the PCB of the original GTX.

There is a "new" PCB design specifically for the 55nm GTX+. Guess what? It only uses one 6 pin PCIe power plug while the original GTX PCB used two 6 pin plugs.

(Also, for those interested, the card will be shorter and will fit into cases like the Antec Solo which won't fit the older card.)
Yeah I did see a link to that 9800GTX+, but it's key to note it's a non-reference board design and probably has far fewer PWM components. The high-end 14-layer board was probably overkill for the 9800GTX and used only because of the 2x SLI connectors, so non-reference boards probably scaled down power requirements accordingly, similar to G92 GTS draw. I'd be interested to see the power consumption characteristics between the parts for sure.

I could certainly see a single 55nm GTX 280 running off 2x6-pin given a 65nm C216 can manage it, but given how much that extra 1/10th seems to push up power draw on the 280, I'd be surprised if they could manage 2 intact GT200 cores on a GX2.
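
The 2x6-pin scenario is easy to check against the published numbers (a sketch under the same PCIe allowances as above: 75 W from the slot plus 75 W per 6-pin plug; the 236 W TDP is NVIDIA's 65nm GTX 280 board power).

budget_w = 75 + 2 * 75        # 225 W: slot plus two 6-pin connectors
gtx280_65nm_tdp_w = 236       # published 65nm board power
cut = 1 - budget_w / gtx280_65nm_tdp_w
print(f"Needs a ~{cut:.0%} power reduction at 55nm to fit the {budget_w} W budget")

A cut of roughly 5% is modest for a full node shrink, which supports the "could certainly see it" reading above.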

 

solofly

Banned
May 25, 2003
1,421
0
0
BTW I forgot to ask, is GDA-nv introducing anything new besides more speed, perhaps DX10.1 or DX11 standards or is it the same old outdated tech from 2006?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Given the complexity of the GT200, I can't see anything other than a die-shrink or minor tweaking. Anything major such as adding new features would require a respin. Given the low yield/high cost of the GT200, Nvidia is probably going to wait for their new GPU to arrive rather than try to add features to their existing die.
 

Broken

Platinum Member
Apr 16, 2000
2,458
1
81
Yeah, after owning both a 7950GX2 and a 9800GX2, I will never, ever buy another dual card made by Nvidia again. Because of those cards I was very hesitant to get a 4870X2, but it's night and day better. I'm not an ATI fanboi either; I have owned way, way more Nvidia cards. Nvidia's dual-card driver support is a joke, and two months later, when they release a better card, you can kiss decent, timely driver updates that help the dual cards goodbye. They are like the stepchildren you never really wanted, but you liked the mom enough that you married her anyway.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: solofly
BTW I forgot to ask, is GDA-nv introducing anything new besides more speed, perhaps DX10.1 or DX11 standards or is it the same old outdated tech from 2006?

There's no need for NV to move to DX10.1, they've already shown their DX10 parts are capable of using the only worthwhile and implemented feature of the API: reading back data from the multisample depth buffer. I suspect implementing this feature in the 180 drivers helps explain some of the big gains with AA, especially in recent titles.

GT206/208 were supposed to be the 55nm die-shrinks. GT216 was supposed to come later in Q1 with perhaps some tweaked architecture and coincide with 40nm transition rumors, but I'm not sure if that's going to happen with 55nm just hitting the channel. GT300 is rumored for Q3 next year as a DX11 part and a whole new architecture.
 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: Broken
I have owned way, way more Nvidia cards.

Me too, by far. This year was a little bit different though: 7 ATI cards and only 3 NVIDIA. Boy, how things change...
 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: chizow
GT300 is rumored for Q3 next year as a DX11 part and a whole new architecture.

I have a feeling lil dragon will be out before that... (providing it supports DX11, which we don't know yet)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: MarcVenice
You are press ? Dude, if you are press, then the inq is the wallstreet journal.
I am invited to every NVIDIA launch event, supplied with hardware and software before release date, attend many of the press web conferences, get all the product review kits, post unique information direct from NVIDIA, and am under NDA. The Inquirer gets none of the above (at least from NVIDIA; I don't know if ATi degrades themselves with that site, although you'd guess they owned it lately). While I may not do this for a living, I'm definitely not the average consumer either.

Originally posted by: MarcVenice
Also, I highly doubt it's BFG's wish to get free stuff handed to him just for the sake of it being free. He has decent hardware already, and so does apoppin, for example. What he wants is to review it because he takes an interest in it, and maybe it will put a smile on his face if he's under NDA and can play with toys other enthusiasts have to wait for till launch date.

I think you're borderline offensive here, assuming BFG is nothing more than a greedy shill...

I think you're putting words in my mouth, and guessing my motives erroneously.

I never said BFG wanted to be a shill, or assumed it. Why do you say that?

I assumed that as a person trying to get into the product review business BFG would be pursuing the press access I've enjoyed for years, so I was surprised he'd make an issue of me getting free hardware.

Does this help, Marc?


In any case, all this "free hardware" talk you ABT guys are putting forth seems Off Topic to me; I'd prefer to discuss the Dual GT200. You guys can PM me if you'd like to chat about other issues.


 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: nRollo
In any case, all this "free hardware" talk you ABT guys are putting forth seems Off Topic to me; I'd prefer to discuss the Dual GT200.

Plenty of us would like to hear what you have to say about the Dual GT200, so go ahead and discuss; no one's stopping you.

Sorry, I forgot there's an NDA stopping you.

 