Nvidia Reveals Specifications of GT300


evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: MarcVenice
The point being that GT200 ended up so big because, by your reasoning, its GPGPU capabilities are taking up extra die space. And I'm saying that's incorrect. You forget easily, it seems.

Don't stress yourself, he's like Wreckage: nVidia = the gods of video cards,
ATi = teh suck. No reasoning, no understanding, no impartiality; I bet their neurons have the nVidia logo inside.

Since CUDA and OpenCL are tailored to the GPU's stream processors for computing, the only thing really needed at the GPU level, besides flexibility in register management, is cache to keep data flowing through the stream processors, especially when the code isn't very parallel, like branchy code with jumps, hops and subroutines, or general-purpose code, and that cache takes up very little die space (256K in total). ATi did the same: the GPU's stream processors are massively parallel, and sacrificing that for multi-purpose GPGPU performance would also hurt performance in games, so leave that kind of work to the CPU. So, like you stated, I find it doubtful that nVidia's huge die size is because of the GPGPU capabilities; look at the GT200 diagram and you will see that the stream processors are essentially identical in complexity and size to those in the G92 GPU.
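
To make the branchy-code point concrete, here is a minimal CUDA sketch; the kernel and numbers are hypothetical, not from any shipping code. When threads in the same 32-wide warp take different sides of a branch, the hardware has to run the two paths one after the other, which is why divergent, "jumpy" code maps poorly onto stream processors and why register flexibility and a bit of cache matter more than raw ALU count for that kind of work.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical kernel: odd-indexed threads take a different path than
// even-indexed ones. Threads in the same 32-wide warp cannot execute both
// paths at once, so the hardware serializes them (warp divergence),
// roughly halving throughput through this section of the kernel.
__global__ void divergent_kernel(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n)
        return;

    if (i & 1)
        out[i] = in[i] * 2.0f;   // "odd" path
    else
        out[i] = in[i] + 1.0f;   // "even" path
}

int main()
{
    const int n = 1 << 20;
    float *in, *out;
    cudaMalloc(&in,  n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));
    cudaMemset(in, 0, n * sizeof(float));

    divergent_kernel<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();
    printf("kernel finished: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```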

Both companies are necessary to avoid a monopoly and to get better pricing, more innovative technologies and more choices. The 8800/9800 series of cards is what made ATi wake up and release the HD 4800 series, which is the reason why the GTX series dropped to much more competitive prices; heck, now you can get a nice GTX 260+ for less than $190 when it was released for more than $400. Now, with the next generation of cards like the GT300, we just have to wait and see how this is gonna end.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Without reading 17 pages of PR FUD, are the posted specifications accurate?
 

jandlecack

Senior member
Apr 25, 2009
244
0
0
Originally posted by: ronnn
Without reading 17 pages of PR FUD, are the posted specifications accurate?

Of course they are. Would you like to place a pre-order on the new GTX367.4 graphics card straight away?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: dreddfunk
IDC - it's really not like your comparisons of memory or CPUs. What he's saying is more like this: if we design a truck to haul lumber, which has a large, flat bed, behind the cab, it would be no surprise that it would also be good at hauling bricks. After all, it's good at hauling things that can fit into large, flat beds.

That would be a general purpose trailer then, right?

What I want to know though is if they designed them shorter than usual, but daisy-chained two of them together, would it get better "tons hauled per mpg" than those CSX trains? :laugh:
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: evolucion8
Originally posted by: MarcVenice
The point being that GT200 ended up so big because, by your reasoning, its GPGPU capabilities are taking up extra die space. And I'm saying that's incorrect. You forget easily, it seems.

Don't stress yourself, he's like Wreckage: nVidia = the gods of video cards,
ATi = teh suck. No reasoning, no understanding, no impartiality; I bet their neurons have the nVidia logo inside.

Since CUDA and OpenCL are tailored to the GPU's stream processors for computing, the only thing really needed at the GPU level, besides flexibility in register management, is cache to keep data flowing through the stream processors, especially when the code isn't very parallel, like branchy code with jumps, hops and subroutines, or general-purpose code, and that cache takes up very little die space (256K in total). ATi did the same: the GPU's stream processors are massively parallel, and sacrificing that for multi-purpose GPGPU performance would also hurt performance in games, so leave that kind of work to the CPU. So, like you stated, I find it doubtful that nVidia's huge die size is because of the GPGPU capabilities; look at the GT200 diagram and you will see that the stream processors are essentially identical in complexity and size to those in the G92 GPU.

Both companies are necessary to avoid a monopoly and to get better pricing, more innovative technologies and more choices. The 8800/9800 series of cards is what made ATi wake up and release the HD 4800 series, which is the reason why the GTX series dropped to much more competitive prices; heck, now you can get a nice GTX 260+ for less than $190 when it was released for more than $400. Now, with the next generation of cards like the GT300, we just have to wait and see how this is gonna end.

Now that you've gotten your daily personal digs in, something you can't seem to have a conversation without, can we continue with the discussion in a respectful and civil manner?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
What purpose does DP support provide for gaming accelerators?

What purpose does IEEE 754 compliance serve for gaming accelerators?

Is denormalized double precision useful in games?

There is absolutely no doubt whatsoever that both ATi and nVidia were taking GPGPU functionality into consideration in the design phase of these parts. We don't have to guess at this or use a crystal ball; simple functionality that is available on the chips and very well documented tells us rather explicitly that GPGPU functionality was most certainly a consideration during the design phase. Precisely how much die space it ends up taking is something we could speculate on quite a bit; that it was an intended design goal, without a doubt, isn't.
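
For anyone wondering what "DP support" actually buys outside of games, here is a minimal CUDA sketch, with hypothetical names and sizes, assuming a double-precision-capable part like GT200 or later. It shows the kind of IEEE 754 FP64 arithmetic that scientific GPGPU code leans on and that game shaders, which live almost entirely in single precision, never need.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical FP64 kernel: y = a*x + y in double precision.
// Games get by with 32-bit floats; HPC-style workloads want 64-bit
// IEEE 754 math, which is why DP units on a GPU signal GPGPU intent.
__global__ void daxpy(int n, double a, const double *x, double *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 16;
    const size_t bytes = n * sizeof(double);

    double *hx = new double[n];
    double *hy = new double[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0; hy[i] = 2.0; }

    double *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    daxpy<<<(n + 255) / 256, 256>>>(n, 3.0, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expect 5.0)\n", hy[0]);

    cudaFree(dx);
    cudaFree(dy);
    delete[] hx;
    delete[] hy;
    return 0;
}
```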
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Keysplayr
Now that you've gotten your daily personal digs in, something you can't seem to have a conversation without, can we continue with the discussion in a respectful and civil manner?

Meh, tell that to yourself and your business partner Wreckage. That "Am I asking you" reply wasn't really civil at all. So can we continue with the discussion in a civilized way, without nVidia marketing propaganda? This is a forum, not a TV/ad broadcast.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I'm really looking forward to what this card will do with PhysX and other CUDA applications. There probably won't be any DirectX 11 games for a while, or even games that could stress the GT300. But physics, Folding@home and video transcoding could hit a level miles above where they are now.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
I'm very interested in the new nvidia part.

If the specs quoted are true, it'll be a monster. I'm really hoping for a repeat of the GTX 280/260 and 4870/4850 situation, with nvidia having the high end locked up and ATi bringing the competition, forcing nvidia to lower their prices. I'll probably be purchasing a card around then and wouldn't mind having some options.
 

lopri

Elite Member
Jul 27, 2002
13,221
608
126
My wild guess:

If GT300 is going to have 512-bit memory interface,

1) NV will stick to GDDR3, or
2) GT300 may be an external unit. (like QuadroPlex)

There is no factual basis for this guess.
 

Blazer7

Golden Member
Jun 26, 2007
1,136
12
81
I can't see how nV will go for any of this. If they really want to make life miserable for ATI they will most certainly opt for something faster than GDDR3. Of course this will raise the price but I don't think that this will stop them. As for the external unit, nV will probably release an external GT300-based QuadroPlex but this won't be targeted towards us "normal" users.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I am looking forward to this chip. But of course it will require me to play something other than WoW or Call of Duty W@W to take advantage of it
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: evolucion8
Originally posted by: Keysplayr
Now that you've gotten your daily personal digs in, something you can't seem to have a conversation without, can we continue with the discussion in a respectful and civil manner?

Meh, tell that to yourself and your business partner Wreckage. That "Am I asking you" reply wasn't really civil at all. So can we continue with the discussion in a civilized way, without nVidia marketing propaganda? This is a forum, not a TV/ad broadcast.

Ok, I've got to laugh at this one. My "Why am I asking you" comment toward DF was because I was asking DF what Marc thinks, when I could have been just asking Marc!
Get it? How in the name of all that's holy wasn't that civilized? Oh I know, you took it out of context and made it mean something else. Really, my friend, if you are just here to sling insults and slurs, just go away. We don't want any. We gave at the office. The check is in the mail, etc. etc. All this stuff happens when you have nothing left to argue with. Well, find something if you feel so strongly about your point of view. If you can't find anything, then maybe there wasn't much to back up your point of view in the first place.

Though, I am telling you right now: any further insults, slurs, whatever, coming from anybody at all, are going to be forwarded to the mods. No ifs, ands or buts. I suggest you do the same. It will clean up this bullshit that is present in the forum and deter others from following suit.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Genx87
I am looking forward to this chip. But of course it will require me to play something other than WoW or Call of Duty W@W to take advantage of it

Yeah this is where I am at. I will wait for a $200 GTX360.

Hey then I can tell everyone that I play WoW on my 360.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: lopri
My wild guess:

If GT300 is going to have 512-bit memory interface,

1) NV will stick to GDDR3, or
2) GT300 may be an external unit. (like QuadroPlex)

There is no factual basis for this guess.

I'm not certain I agree with NV still using GDDR3 for higher-end cards. GDDR5 has been out for a good while now, and I think its pricing isn't as cost-prohibitive as it once was, which may be a contributing reason why the 4870 and up have been coming down in price. Also look at the 4770 utilizing GDDR5 now.

I'm thinking GDDR5 will be the standard for high-end cards. Whether or not Nvidia will continue to use a 512-bit memory controller is a mystery, although if history repeats itself, as G80 to G92 went from 384-bit to 256-bit, we might be seeing a return of the 256-bit bus. But then again, this is a new architecture and not a core revision like G80 to G92 was. Very tough to speculate on what's going to go down this time around.

If GT300 is GDDR5 on a 512-bit memory controller, the bandwidth would be off the chain. GDDR5 prices coming down means using it isn't that big of a deal anymore, but the PCB design for 512-bit may be just as pricey as current GT200 boards. Yeeeaarrrgghh.. Brain.... hurts.....
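
As a rough sanity check on that "off the chain" claim, here is a tiny back-of-the-envelope sketch; the clock figures are illustrative assumptions (GTX 280-class GDDR3 and 4870-class GDDR5), not leaked GT300 numbers. It just shows how bus width and effective data rate combine into peak memory bandwidth.

```cuda
#include <cstdio>

// Peak theoretical bandwidth in GB/s:
//   (bus width in bits / 8 bytes) * effective transfer rate in GT/s
// Host-only code; builds with nvcc or any C++ compiler.
static double peak_bandwidth_gbs(int bus_bits, double effective_gtps)
{
    return (bus_bits / 8.0) * effective_gtps;
}

int main()
{
    // Illustrative numbers only: GTX 280-class GDDR3 versus a
    // hypothetical 512-bit GDDR5 part at a 4870-like 3.6 GT/s rate.
    printf("512-bit GDDR3 @ 2.2 GT/s: %.0f GB/s\n", peak_bandwidth_gbs(512, 2.2)); // ~141 GB/s
    printf("256-bit GDDR5 @ 3.6 GT/s: %.0f GB/s\n", peak_bandwidth_gbs(256, 3.6)); // ~115 GB/s
    printf("512-bit GDDR5 @ 3.6 GT/s: %.0f GB/s\n", peak_bandwidth_gbs(512, 3.6)); // ~230 GB/s
    return 0;
}
```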
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Keysplayr


I'm not certain I agree with NV still using GDDR3 for higher-end cards. GDDR5 has been out for a good while now, and I think its pricing isn't as cost-prohibitive as it once was, which may be a contributing reason why the 4870 and up have been coming down in price. Also look at the 4770 utilizing GDDR5 now.

Thank god for ATi, because if it wasn't for them, we would have still seen ddr3 on Nvidia high end cards, even in 2012. :laugh:
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: Keysplayr


Though, I am telling you right now: any further insults, slurs, whatever, coming from anybody at all, are going to be forwarded to the mods. No ifs, ands or buts. I suggest you do the same. It will clean up this bullshit that is present in the forum and deter others from following suit.

/in

Oh no it won't, not until they perma-ban a certain member who has been temp-banned several times already for baiting, misquoting, and being a general troll

/out
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ShadowOfMyself
Originally posted by: Keysplayr


Though, I am telling you right now: any further insults, slurs, whatever, coming from anybody at all, are going to be forwarded to the mods. No ifs, ands or buts. I suggest you do the same. It will clean up this bullshit that is present in the forum and deter others from following suit.

/in

Oh no it won't, not until they perma-ban a certain member who has been temp-banned several times already for baiting, misquoting, and being a general troll

/out

dadach was just given a vacation as far as I know. And there are quite a few people here whose bliss is to provoke one another. If you see something you don't like that breaks the TOS or guidelines, report it. If it's just a heated conversation without getting personal, then there's no point. You should be able to tell the difference.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: error8
Originally posted by: Keysplayr


I'm not certain I agree with NV still using GDDR3 for higher-end cards. GDDR5 has been out for a good while now, and I think its pricing isn't as cost-prohibitive as it once was, which may be a contributing reason why the 4870 and up have been coming down in price. Also look at the 4770 utilizing GDDR5 now.

Thank god for ATi, because if it wasn't for them, we would have still seen ddr3 on Nvidia high end cards, even in 2012. :laugh:

AFAICT, GDDR3 seems to be doing fine. Time to move on? For sure.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Keysplayr
Originally posted by: error8
Originally posted by: Keysplayr


I'm not certain I agree with NV still using GDDR3 for higher-end cards. GDDR5 has been out for a good while now, and I think its pricing isn't as cost-prohibitive as it once was, which may be a contributing reason why the 4870 and up have been coming down in price. Also look at the 4770 utilizing GDDR5 now.

Thank god for ATi, because if it wasn't for them, we would have still seen ddr3 on Nvidia high end cards, even in 2012. :laugh:

AFAICT, GDDR3 seems to be doing fine. Time to move on? For sure.

Yeah, Nvidia squeezed everything out of GDDR3. To further improve memory bandwidth they would probably need to use a 1024-bit interface, which I don't think will happen this year. GDDR5 is the only way to go.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: BenSkywalker
What purpose does DP support provide for gaming accelerators?

What purpose does IEEE 754 compliance serve for gaming accelerators?

Is denormalized double precision useful in games?

There is absolutely no doubt whatsoever that both ATi and nVidia were taking GPGPU functionality into consideration in the design phase of these parts. We don't have to guess at this or use a crystal ball; simple functionality that is available on the chips and very well documented tells us rather explicitly that GPGPU functionality was most certainly a consideration during the design phase. Precisely how much die space it ends up taking is something we could speculate on quite a bit; that it was an intended design goal, without a doubt, isn't.

Beat me to it. Nvidia has dedicated HW in the G200 architecture for the purpose of GPGPU applications, HW which takes up space and serves no purpose in gaming. Along with Nvidia's marketing pimping the GPU as a faster alternative to a CPU, it doesn't take much genius to see where they're going with that strategy.
 

roid450

Senior member
Sep 4, 2008
858
0
0
Originally posted by: ibex333
Good thing I didn't get a new video card. Thanks to the GT300 I should be able to buy a 295gtx for less than $100 less than a year from now.


You'll prolly be able to get a GTX 260 192 stream processor for $100; maybe wait 2 or 3 yrs to get a 295 for less than $100... tard :roll: lol, what makes you think in 1 yr a GTX 295 will be $100... -_-
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: roid450
Originally posted by: ibex333
Good thing I didn't get a new video card. Thanks to the GT300 I should be able to buy a 295gtx for less than $100 less than a year from now.


You'll prolly be able to get a GTX 260 192 stream processor for $100; maybe wait 2 or 3 yrs to get a 295 for less than $100... tard :roll: lol, what makes you think in 1 yr a GTX 295 will be $100... -_-

Sounds about right. In 2-3 years there will probably be some mid-level card (built on a smaller process) that runs cooler and uses less power than the 295 GTX.

What is today's equivalent of the 7950 GX2? <---- Doesn't the 9600 GT come close to this?
 

Blazer7

Golden Member
Jun 26, 2007
1,136
12
81
Originally posted by: roid450
Originally posted by: ibex333
Good thing I didn't get a new video card. Thanks to the GT300 I should be able to buy a 295gtx for less than $100 less than a year from now.


You'll prolly be able to get a GTX 260 192 stream processor for $100; maybe wait 2 or 3 yrs to get a 295 for less than $100... tard :roll: lol, what makes you think in 1 yr a GTX 295 will be $100... -_-

There are so few 295s out there that a year from now the price may reach a new high due to their antique nature.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: error8
Originally posted by: Keysplayr
Originally posted by: error8
Originally posted by: Keysplayr


I'm not certain I agree with NV still using GDDR3 for higher-end cards. GDDR5 has been out for a good while now, and I think its pricing isn't as cost-prohibitive as it once was, which may be a contributing reason why the 4870 and up have been coming down in price. Also look at the 4770 utilizing GDDR5 now.

Thank god for ATi, because if it wasn't for them, we would have still seen ddr3 on Nvidia high end cards, even in 2012. :laugh:

AFAICT, GDDR3 seems to be doing fine. Time to move on? For sure.

Yeah, Nvidia squeezed everything out of GDDR3. To further improve memory bandwidth they would probably need to use a 1024-bit interface, which I don't think will happen this year. GDDR5 is the only way to go.

It's just a shame ATi went the wrong way and narrowed the memory bus on their 4770, cranking up the clock speed to make up for it, instead of running a cooler, lower-wattage part with a wider bus and lower clock speeds. 256-bit GDDR5 would have been a winner, at least according to my understanding of how that would perform compared to 128-bit with higher clocks.
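
For what it's worth, the same back-of-the-envelope bandwidth math as earlier suggests this is mostly a power-and-cost question rather than a raw-bandwidth one; the "wide and slow" configuration below is hypothetical, not a real SKU.

```cuda
#include <cstdio>

// Host-only sketch; builds with nvcc or any C++ compiler.
int main()
{
    // Peak bandwidth = (bus bits / 8) * effective GT/s.
    // HD 4770: 128-bit GDDR5 at 800 MHz memory clock = 3.2 GT/s effective.
    double narrow_fast = (128 / 8.0) * 3.2;   // 51.2 GB/s, the real 4770
    // Hypothetical "wide and slow" variant: 256-bit at half the data rate.
    double wide_slow   = (256 / 8.0) * 1.6;   // also 51.2 GB/s

    printf("128-bit @ 3.2 GT/s: %.1f GB/s\n", narrow_fast);
    printf("256-bit @ 1.6 GT/s: %.1f GB/s\n", wide_slow);
    // Same peak bandwidth either way; the tradeoff is memory power, heat
    // and clock headroom versus PCB routing cost for the wider bus.
    return 0;
}
```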
 