G80 Stuff


Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Creig
Originally posted by: jiffylube1024
I don't get why you think that GDDR3 will be such a constraint. Bandwidth is bandwidth, and as long as the GPU can take full advantage of the 384-bit bus, G80 should be golden. 900 MHz on a 384-bit bus is just like 1350 MHz (2700 actual) on a 256-bit bus, which is a big improvement over the previous gen. 1920x1200 should be a breeze for G80.

Exactly. Why spend big $ on the fastest memory chips when you can go with cheaper, slower chips yet still improve overall bandwidth? The only question is how much of a cost saving there is between twelve GDDR3 chips and eight GDDR4 chips.

Don't get me wrong, G80 will be great, but it's only the first chip in a new architecture and I believe that for me at least G85 (or whatever the refresh will be known as) will be the time to buy. G80 is all about time to market.

There won't be any savings to be had by using GDDR3 over GDDR4 - what you gain in cheaper chips you lose through an increased BOM and PCB complexity (unless G85 uses GDDR4 in the same configuration as G80 uses GDDR3, in which case it truly will be a performance monster).

Hopefully G85 will also support DX10.1, something the initial DX10 chips will lack.
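
For what it's worth, the bus-width equivalence quoted above checks out on paper. A rough back-of-the-envelope sketch (Python, with a throwaway helper; the clocks are the rumored figures tossed around in this thread, not confirmed specs):

def peak_bandwidth_gb_s(effective_mhz, bus_width_bits):
    # Peak bandwidth in GB/s = effective data rate (MHz) x bus width in bytes
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Rumored G80 GTX: 900 MHz GDDR3 (1800 MHz effective) on a 384-bit bus
print(peak_bandwidth_gb_s(1800, 384))   # ~86.4 GB/s

# The equivalence cited above: 1350 MHz (2700 effective) on a 256-bit bus
print(peak_bandwidth_gb_s(2700, 256))   # ~86.4 GB/s

# Previous gen for scale: 800 MHz (1600 effective) GDDR3 on a 256-bit bus (7900 GTX-class)
print(peak_bandwidth_gb_s(1600, 256))   # ~51.2 GB/s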
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: thilan29
Originally posted by: Creig
Originally posted by: Gstanfor
On the first point, those are my thoughts precisely. The fanatics hate being picked on though - they don't take the medicine they were so keen to dish out in the nv3x era happily at all (not that that will prevent me rubbing their faces in it at every opportunity I get...)

On the second point, fanatics are like vermin - impossible to eradicate. 97% of all ATi fanatics were previously 3dfx fanatics, and if ATi goes under they'll all just abandon ship and board whatever new one comes along, and keep right on attacking nvidia as they do so...

If they were to shut their cakeholes, they might find I'd leave them alone.


Greg, I don't know how to break this to you, but YOU are a fanatic. So the comments about vermin and shutting cakeholes apply to you as well.

QFT. As you pointed out earlier...how ironic. I really don't understand how he takes it so personally.

No, Creig - I never had a bad word to say about any video card vendor - UNTIL the fanATics started their bullshit. I just return fire.
 

Justarius

Member
Jul 25, 2001
35
0
0
Originally posted by: jiffylube1024
Originally posted by: lopri
Can we all stick to the topic and take off-topic discussion to PMs? It is quite annoying to scroll through the useless posts to find something meaningful.

Now on topic - actually, if those leaked 3DMark scores bear any truth, it is not very impressive as far as DX9 performance is concerned. SLI'ed 7900GTOs w/ an E6600 @ 3.6GHz will give more than 10K in 3DMark06. So we're looking at a 2x single-GPU performance increase. And 7950GX2 users aren't missing a lot. Now, DX10 support and enhanced image quality is a different subject, and again my opinion is based on the assumption that the said 3DMark scores are true.

Some people will never be happy. Faster than double the performance of dual 7900GTX's (or dual 7900 GTO's on a seriously overclocked Conroe) is very impressive. It's a bit better than going from the 6800U to the 7800GTX.
Since when is this thread about people's happiness and well-being? Also, 'not impressive' and 'not happy' are not mutually exclusive. My comment is mostly based on the leaked specs, which boast 128 shader processors (?) et al., as well as the benchmark. We do not know how much transistor budget has been allocated for DX10 / SM 4.0 / VCAA, etc. and the efficiency of the core per silicon budget.

Some people will always twist others' posts and post off-topics. I guess that's a necessary evil of internet forums.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: keysplayr2003
Originally posted by: Creig
Exactly. Why spend big $ on the fastest memory chips when you can go with cheaper, slower chips yet still improve overall bandwidth? The only question is how much of a cost saving there is between twelve GDDR3 chips and eight GDDR4 chips.


Isn't GDDR4 still 32-bit chips? I think it would still be 12 chips no? Honestly don't know.

The implication is that with GDDR4 they would get a faster grade of memory, thus using 8 chips instead of 12.

But I think Creig is right, availability was the big issue with GDDR4 - it just isn't available in the quantities NVidia needs. Especially > 1000 MHz.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Justarius
Originally posted by: jiffylube1024
Originally posted by: lopri
Can we all stick to the topic and take off-topic discussion to PMs? It is quite annoying to scroll through the useless posts to find something meaningful.

Now on topic - actually, if those leaked 3DMark scores bear any truth, it is not very impressive as far as DX9 performance is concerned. SLI'ed 7900GTOs w/ an E6600 @ 3.6GHz will give more than 10K in 3DMark06. So we're looking at a 2x single-GPU performance increase. And 7950GX2 users aren't missing a lot. Now, DX10 support and enhanced image quality is a different subject, and again my opinion is based on the assumption that the said 3DMark scores are true.

Some people will never be happy. Faster than double the performance of dual 7900GTX's (or dual 7900 GTO's on a seriously overclocked Conroe) is very impressive. It's a bit better than going from the 6800U to the 7800GTX.
Since when is this thread about people's happiness and well-being? Also, 'not impressive' and 'not happy' are not mutually exclusive.

Ah, a logic issue! Also: where did I mention well-being?

My comment is mostly based on the leaked specs, which boast 128 shader processors (?) et al., as well as the benchmark. We do not know how much transistor budget has been allocated for DX10 / SM 4.0 / VCAA, etc. and the efficiency of the core per silicon budget.

Some people will always twist others' posts and post off-topics. I guess that's a necessary evil of internet forums.


Well, it's 128 unified shaders - they can't all be pixel shaders.

Back to my point, I can't see how a >2X performance improvement over the previous gen, or being faster than the previous gen in SLI, is unimpressive.

The card will support DX10, yet still be over twice as fast in DX9 as the previous gen. Seems to be doing the job. I guess people were expecting miracles this generation.
 

lopri

Elite Member
Jul 27, 2002
13,221
612
126
Oh I'm not expecting miracles. And I see what you can't see and vice versa. I'll probably buy the card when it comes out and will be quite happy. 2x performance should be impressive in theory but the reality is not. Also we should note that I am talking under the assumption that the performance increase in DX9 is x2. I personally expect it'll be more than x2.

So far in this thread:

- 128 unified shaders (thanks for the correction)
- 700M transistors
- 150W~200W estimated peak power consumption along with the request on PCI-SIG regarding PCI-E spec modification
- Proposed MSRP $650

Depending on how DX10 is implemented, and knowing that there will not be any meaningful applications that'll take advantage of DX10 during its shelf life, this part can be a very expensive and inefficient DX9 performer compared to current G71 variants, if the performance increase in DX9 is only x2. The same level of gaming experience can be achieved with NV's current crop of offerings at less cost (technically and financially).

My comment came from this context. I perhaps should have elaborated it more clearly for you. Oh, before you go on and talk about DX10 and other features, I'd like to make it clear that my comment was made with those out of the equation. Furthermore, I still think the performance increase in DX9 will be bigger than the factor of 2 mentioned above.

I kindly request that you refrain from personal attacks.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003
Originally posted by: Creig
Exactly. Why spend big $ on the fastest memory chips when you can go with cheaper, slower chips yet still improve overall bandwidth? The only question is how much of a cost saving there is between twelve GDDR3 chips and eight GDDR4 chips.


Isn't GDDR4 still 32-bit chips? I think it would still be 12 chips no? Honestly don't know.

The implication is that with GDDR4 they would get a faster grade of memory, thus using 8 chips instead of 12.

But I think Creig is right, availability was the big issue with GDDR4 - it just isn't available in the quantities NVidia needs. Especially > 1000 MHz.

I know GDDR4 will be faster, use less power and have some features over GDDR3, that's a given. What I don't understand is, for the same amount of memory for GDDR3 & GDDR4 (768MB for the GTX), why would only 8 chips be utilized on a 384-bit wide bus? That is the reason for my GDDR4 32-bit per chip question. On a 384-bit bus, using eight GDDR4 chips, would each chip have to be 48 bits wide? 8x48=384. GDDR3 is currently 12x32=384.
What am I missing? I read the specs on GDDR4 in a few reviews, mainly the differences between GDDR3 and 4, and they seem to indicate that GDDR4 is 32-bit memory chips.
This is why I think a G80 going from GDDR3 to GDDR4 will still utilize 12 chips for 768MB of memory. This is not really mind-blowingly important, I'm just curious now. Tell me where I messed up here.

Keys
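
The arithmetic behind the question above can be sanity-checked in a few lines. A minimal sketch, assuming 32-bit-wide chips for both GDDR3 and GDDR4 (as the reviews mentioned indicate) and the rumored 384-bit / 768MB GTX configuration; the names are just for illustration:

BUS_WIDTH_BITS = 384     # rumored G80 GTX memory bus
CHIP_WIDTH_BITS = 32     # width of a single GDDR3/GDDR4 DRAM chip
TOTAL_MEMORY_MB = 768    # rumored GTX frame buffer

chips_needed = BUS_WIDTH_BITS // CHIP_WIDTH_BITS    # 384 / 32 = 12 chips
mb_per_chip = TOTAL_MEMORY_MB // chips_needed       # 768 / 12 = 64 MB (512 Mbit) per chip

# Eight chips filling the same 384-bit bus would each have to be 48 bits wide,
# which is not a width GDDR3/GDDR4 parts come in - exactly the point being made.
width_for_eight_chips = BUS_WIDTH_BITS / 8          # 48.0 bits

print(chips_needed, mb_per_chip, width_for_eight_chips)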

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
No, I think you've got it right, Keys. I keep forgetting about that odd 768MB memory configuration.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Jugernot
Originally posted by: Crusader

No, I don't want ATI out. I just like to see them hurting and their fanboys on the run. Don't want an NV monopoly though.


I'm guessing someone else would step up to Nvidia anyway, even if AMD backs down.
There's definitely room for at least 2 in this market.
So I haven't been fretting the possibility like the ATI crones have been. If ATI goes, who cares... this is capitalism... someone else will step up, and ATI and their fanboys will have been vanquished.

On the first point, those are my thoughts precisely. The fanatics hate being picked on though - they don't take the medicine they were so keen to dish out in the nv3x era happily at all (not that that will prevent me rubbing their faces in it at every opportunity I get...)

On the second point, fanatics are like vermin - impossible to eradicate. 97% of all ATi fanatics were previously 3dfx fanatics, and if ATi goes under they'll all just abandon ship and board whatever new one comes along, and keep right on attacking nvidia as they do so...

If they were to shut their cakeholes, they might find I'd leave them alone.
Hah, I love how the fanatics always agree with whatever the other fanatics (for their company, of course) have said, no matter how dumb it is.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Personally, the thing that will really sway my decision to buy into G80 or not is going to be whether it addresses the AA issues DX9 suffered (Splinter Cell, MOH: PA, etc.), as well as whether it addresses shader-based aliasing (hopefully without requiring shader rewrites).

Anyone else find they were logged out just now after refreshing the forum?
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: lopri
Oh I'm not expecting miracles. And I see what you can't see and vice versa. I'll probably buy the card when it comes out and will be quite happy. 2x performance should be impressive in theory but the reality is not.
I know what you mean. I'm kind of a little disappointed if all the rumors are true. If the G80 is really dual die, then it is the successor to the GX2, not the GTX. And if the rumored performance is true, then the G80 only has about a 20% lead over the GX2, which isn't so great. If the rumors about the number of shaders and the shader clock speeds are true, I don't see how the G80 could perform so poorly.

Of course, it's possible that all these rumors are false. But then again, I think it's pretty safe to say that a lot of these rumors are intentional information leaks from nVidia.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Oh, you can bet your bottom dollar nvidia has deliberately led some leakers up the garden path - it's been a favorite tactic of theirs since before the nv40 launch, and the aim appears to be to keep competitors from knowing what to expect, making it hard for them to deliver a product that matches or roughly approaches the specs at the same time nvidia does.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Personally, the thing that will really sway my decision to buy into G80 or not is going to be whether it addresses the AA issues DX9 suffered (Splinter Cell, MOH: PA, etc.), as well as whether it addresses shader-based aliasing (hopefully without requiring shader rewrites).

Anyone else find they were logged out just now after refreshing the forum?
yes

do you never go ANYwhere other than video ?
[try P&N . . .you're a 'natural' and would fit in perfectly]

and it's been posted in this thread SEVERAL times:

Login changes

Forum Problems

looks like i may get a g80/81 about the same time as you do
:Q

--or a r640


==============
Originally posted by: Gstanfor
Oh, you can bet your bottom dollar nvidia has deliberately led some leakers up the garden path - it's been a favorite tactic of theirs since before the nv40 launch, and the aim appears to be to keep competitors from knowing what to expect, making it hard for them to deliver a product that matches or roughly approaches the specs at the same time nvidia does.
long before nv40 . . . Nv30 was a "real surprise" . . . ATi was shocked to see they had no competition for a couple of years.
:shocked:

that 'trash' company from Canada doesn't need to know what nvidia is doing . . . they work on their OWN agenda . . . and very strangely they have managed to keep a completely competitive product in the GPU marketplace - for years.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
For a company that "doesn't need to know" what their competition is up to (that's a new one on me - all companies with an interest in survival keep tabs on their competition, especially if they are commodity-based, like nvidia & ati are), they sure like to spend a lot of time and effort slagging said competition - kind of hard to do if they don't attempt to find out what's going on, don't you think?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
For a company that "doesn't need to know" what their competition is up to (that's a new one on me - all companies with an interest in survival keep tabs on their competition, especially if they are commodity-based, like nvidia & ati are), they sure like to spend a lot of time and effort slagging said competition - kind of hard to do if they don't attempt to find out what's going on, don't you think?

everyone else will admit . . . ati has been very successful in keeping a competitive product in the marketplace

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Ever hear the phrase "time to market", apoppin? It bit ATi in the ass with high-end R4xx, and ATi were lucky not to have a limb amputated with R5xx. Competitive products are useless if they are months late to market.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Creig
Originally posted by: jiffylube1024
I don't get why you think that GDDR3 will be such a constraint. Bandwidth is bandwidth, and as long as the GPU can take full advantage of the 384-bit bus, G80 should be golden. 900 MHz on a 384-bit bus is just like 1350 MHz (2700 actual) on a 256-bit bus, which is a big improvement over the previous gen. 1920x1200 should be a breeze for G80.

Exactly. Why spend big $ on the fastest memory chips when you can go with cheaper, slower chips yet still improve overall bandwidth? The only question is how much of a cost saving there is between twelve GDDR3 chips and eight GDDR4 chips.

Nearly nothing, if not an actual increased cost. They also have to add layers to the card and it complicates the design process.

However, if we are going to see GDDR4 on the refresh (which I would expect, but it's pure speculation by me and don't quote me on it), I can completely understand the decision to stick with GDDR3 for now.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Ever hear the phrase "time to market", apoppin? It bit ATi in the ass with high-end R4xx, and ATi were lucky not to have a limb amputated with R5xx. Competitive products are useless if they are months late to market.

why yes i have . . . and the market share has moved back-and-forth between ATI and nvidia . . . . 3-5% often at the expense of Intel.

strangely - according to your fantasy - 'luck' is on ATi's side

they have not only survived, they have prospered and right now - the x1900 series is a worthy competitor to the 7800 series . . . the xtx being the single GPU performance champ.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: keysplayr2003
Originally posted by: jiffylube1024
Originally posted by: keysplayr2003
Originally posted by: Creig
Exactly. Why spend big $ on the fastest memory chips when you can go with cheaper, slower chips yet still improve overall bandwidth? The only question is how much of a cost saving there is between twelve GDDR3 chips and eight GDDR4 chips.


Isn't GDDR4 still 32-bit chips? I think it would still be 12 chips no? Honestly don't know.

The implication is that with GDDR4 they would get a faster grade of memory, thus using 8 chips instead of 12.

But I think Creig is right, availability was the big issue with GDDR4 - it just isn't available in the quantities NVidia needs. Especially > 1000 MHz.

I know GDDR4 will be faster, use less power and have some features over GDDR3, that's a given. What I don't understand is, for the same amount of memory for GDDR3 & GDDR4 (768MB for the GTX), why would only 8 chips be utilized on a 384-bit wide bus? That is the reason for my GDDR4 32-bit per chip question. On a 384-bit bus, using eight GDDR4 chips, would each chip have to be 48 bits wide? 8x48=384. GDDR3 is currently 12x32=384.
What am I missing? I read the specs on GDDR4 in a few reviews, mainly the differences between GDDR3 and 4, and they seem to indicate that GDDR4 is 32-bit memory chips.
This is why I think a G80 going from GDDR3 to GDDR4 will still utilize 12 chips for 768MB of memory. This is not really mind-blowingly important, I'm just curious now. Tell me where I messed up here.

Keys

You're right, both GDDR3 and GDDR4 are 32-bit chips.

Furthermore, it isn't really a single 128-, 256-, or 384-bit memory controller. They are multiple 64-bit controllers in parallel. The "old" (heh) Geforce 7000 series had four 64-bit memory controllers; the new 8000 series will have six. It should, unless there is an actual bus-width increase on GDDR4 I haven't heard about, require 12 chips for the 384-bit bus.
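
A tiny sketch of the partitioned layout described above, taking the thread's controller counts (four 64-bit partitions for GeForce 7, six rumored for G80) at face value rather than as confirmed specs:

PARTITION_WIDTH_BITS = 64   # one memory-controller partition
CHIP_WIDTH_BITS = 32        # one GDDR3/GDDR4 chip

def memory_layout(num_partitions):
    # Total bus width and chip count: two 32-bit chips feed each 64-bit partition.
    bus_width = num_partitions * PARTITION_WIDTH_BITS
    chips = bus_width // CHIP_WIDTH_BITS
    return bus_width, chips

print(memory_layout(4))   # GeForce 7900 GTX-class: (256, 8)
print(memory_layout(6))   # rumored G80 GTX: (384, 12)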
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
Originally posted by: Gstanfor
Ever hear the phrase "time to market", apoppin? It bit ATi in the ass with high-end R4xx, and ATi were lucky not to have a limb amputated with R5xx. Competitive products are useless if they are months late to market.

why yes i have . . . and the market share has moved back-and-forth between ATI and nvidia . . . . 3-5% often at the expense of Intel.

strangely - according to your fantasy - 'luck' is on ATi's side

they have not only survived, they have prospered and right now - the x1900 series is a worthy competitor to the 7800 series . . . the xtx being the single GPU performance champ.

You idiot, market share means squat-diddly. It's profit margins and earnings that matter.

We can clearly see who's superior there, with ATi being bought for less than 5 billion and nvidia worth 10 billion.

That's what time to market gets you - free money, 'cause the other guy can't directly compete. That will be G80's largest advantage.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Gstanfor

You idiot, market share means squat-diddly. It's profit margins and earnings that matter.

We can clearly see who's superior there, with ATi being bought for less than 5 billion and nvidia worth 10 billion.

That's what time to market gets you - free money, 'cause the other guy can't directly compete. That will be G80's largest advantage.

Don't forget that Nvidia has been in the chipset business a lot longer and has had several successful generations of Nforce under their belt.

But I do agree, Nvidia has gotten in when the margins are fattest for most of the last 2 years. X1900XTX was a flash in the pan in terms of being able to skim the market -- they made good money on the X1900 series, but weren't able to charge outlandish prices for very long. Heck, Nvidia was/is charging more for the 7900GTX!
 

lopri

Elite Member
Jul 27, 2002
13,221
612
126
I just saw a post by a person (who is supposedly in the know) @ B3D that says G80 will be comparable to 2x 7950GX2. Now that's what I wanted to see!
 