nVidia GT200 Series Thread

Page 12

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: Rusin
So basically:
If Nvidia keeps its trend, the GeForce GTX 280 could be at the HD2900XT level in terms of real power consumption and the GTX 260 would be very close to the 8800 GTX [don't know if this asks too much optimism... even from me]

Who really gives a damn about power? LOL... if a user were economical and responsible he'd have settled for an 8800 GT. Why do people buy tri- and quad-SLI setups? If they spent that much money on their girlfriends or other meaningful pursuits, they'd rarely have Upgradematism.
 

dv8silencer

Member
May 7, 2008
142
0
0
Originally posted by: Aberforth
Originally posted by: Rusin
So basically:
If Nvidia keeps its trend, the GeForce GTX 280 could be at the HD2900XT level in terms of real power consumption and the GTX 260 would be very close to the 8800 GTX [don't know if this asks too much optimism... even from me]

Who really gives a damn about power? LOL... if a user were economical and responsible he'd have settled for an 8800 GT. Why do people buy tri- and quad-SLI setups? If they spent that much money on their girlfriends or other meaningful pursuits, they'd rarely have Upgradematism.

There is more to power consumption than just economics...
Heat?
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: KhadgarTWN
I believe the TDP-to-real-3D-load ratio of the RV770 is much like the pattern of the RV670,
but if it is like the ratio of the R600XT, then I won't be surprised if the 4870 exceeds the GTX 260 in power consumption.

What I cannot figure out is this:
despite the 4870 possibly having a much lower perf/power ratio than the GTX 260, some still claim that the 4870 is superior in terms of the best perf/power balance.

The R600XT was an 80nm part (420mm² die size), with a 512-bit bus using GDDR3.
The RV770XT is a 55nm part (256mm² die size), with a 256-bit bus using GDDR5.

There is no comparison, as you can see. ATI seems to have learned something from the R600XT. The RV770 is smaller, cooler, faster, cheaper, and just better in every way.

I think the TDP of the GTX260 will be around 175W, and the 4870 will be around 140W. And the 4870 will be a little slower in performance.
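
As a rough back-of-envelope check of that die-size comparison, here is a small Python sketch using only the die sizes quoted above and assuming ideal area scaling between the 80nm and 55nm nodes (which real designs never quite hit):

# Back-of-envelope sketch (my own arithmetic, not vendor data): compare an
# ideal optical shrink of R600 from 80nm to 55nm with the reported RV770 die.
r600_area_mm2 = 420.0                            # R600XT die size at 80nm (quoted above)
rv770_area_mm2 = 256.0                           # RV770 die size at 55nm (quoted above)
ideal_scale = (55.0 / 80.0) ** 2                 # ideal area scaling between the two nodes
ideal_shrink_mm2 = r600_area_mm2 * ideal_scale   # ~199 mm^2 for a pure shrink
extra = rv770_area_mm2 / ideal_shrink_mm2        # ~1.29x the area of a pure shrink
print(f"pure shrink: {ideal_shrink_mm2:.0f} mm^2, RV770 is ~{extra:.2f}x that")

In other words, RV770 spends noticeably more silicon than a straight shrink of R600 would, which fits the "more units, not just smaller" reading.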
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Rusin
So basically:
If Nvidia keeps its trend, the GeForce GTX 280 could be at the HD2900XT level in terms of real power consumption and the GTX 260 would be very close to the 8800 GTX [don't know if this asks too much optimism... even from me]

I think you are on the mark about the GTX260 being at the 8800GTX level in power consumption.

But the GTX280 has a 576mm² die, while the HD2900XT was 420mm². Both use GDDR3 and a 512-bit memory bus, so the GTX280 can easily have a TDP above 200W.
 

superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: Aberforth
Who really gives a damn about power? LOL... if a user were economical and responsible he'd have settled for an 8800 GT. Why do people buy tri- and quad-SLI setups? If they spent that much money on their girlfriends or other meaningful pursuits, they'd rarely have Upgradematism.

ATI and nVIDIA do. Performance per watt is considered the best measure of how good an architecture is.

That being said, I expect performance/watt for these new GPUs to be on par with the previous generation.
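
For what it's worth, perf/watt is just frame rate divided by board power. A minimal Python sketch, using the TDP guesses floated earlier in this thread (GTX 260 ~175W, HD 4870 ~140W) and made-up frame rates purely to illustrate the metric:

# Perf/watt sketch: the wattages are the speculated TDPs from earlier posts,
# and the frame rates are invented placeholders, not benchmark results.
def perf_per_watt(avg_fps, board_power_w):
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

gtx260 = perf_per_watt(avg_fps=60.0, board_power_w=175.0)
hd4870 = perf_per_watt(avg_fps=55.0, board_power_w=140.0)
print(f"GTX 260: {gtx260:.3f} fps/W   HD 4870: {hd4870:.3f} fps/W")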
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Rusin
You said in that other thread that Nv fanboys should stay on this thread. Why don't you stay on that ATi 4xxx thread?

So 157W TDP is your limit and the GTX 260's 182W would be too much? There's a 16% difference, and it would take a miracle for the GTX 260 not to have a better performance/wattage ratio. As for those performance ratios: that "20-30% faster than 9800 GTX" was speculation, and the difference between the 9800 GTX and the 9800 GX2 was from this test http://plaza.fi/muropaketti/ar...dia-geforce-9800-gtx,2 . They test game performance, not timedemo performance... and it's in my native language and I trust them. I used minimum frame rates in my calculation; on average frame rates the difference between the 9800 GTX and GX2 would have been larger. I sometimes use ComputerBase's tests if there's nothing better available.

Yes, I did say something to that effect. I said I would try to leave the fanbois alone, not NV.

But I do have a problem with your speculation. First, look at the GTX 280 specs. Fanbois are saying 2x faster than last gen. Now, I love speculation as much as or more than most.
The problem I am having is that the specs of the ATI 4870 look like a 100% increase in performance too, if not more, because of the shader clock increase and the extra shaders along with the TMUs, even though ATI didn't increase the ROP count. One has to remember that ATI cards are true DX10 parts (DX10.1). The specs for DX10 were changed because NV couldn't have a DX10.1 card ready, so NV cried that ATI had inside info way before NV got the same info. Which is a fact. In fact, an old banned poster said this before the R600 was released.
He knew, just like all the ATI engineers knew.

This is the only time I have ever agreed with NV on anything. They had every right to be pissed about ATI having inside info. But ATI was doing the Xbox, so they had legal access to that info. That still didn't make it right, so MS left the DX10.1 part out to level the playing field.


Now, NV. It's just plain BS that NV is trying to stop DX10.1 patches that improve ATI performance by a lot. So don't expect a lot of games with DX10.1 until NV has a card that can do DX10.1.

Remember when ATI/NV were being investigated for price fixing? ATI dropped their prices a lot; NV didn't. The best thing for all of us is to hope ATI wins this round and then see what happens with pricing. ATI may keep theirs below $500 (top end). Wouldn't that be refreshing for all of us?

By the way, I tried doing Folding@home without a CPU, but the damn PC wouldn't start! LOL!

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Since when were ATi cards "true" DX10 cards? What has DX10.1 got to do with it being "true"? You do realise that when a GPU supports the required set of features under the DX10 spec, it can be classified as a DX10 card?

And erm, not to be rude or anything, but where are you getting all these "facts" from? (the rest of your post)

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
What the hell is going on?

The 4870 & GTX 280 are badass; the GTX 280 costs more & is faster. The end.

No, just kidding. Keep going, I need more inside info.
 

semisonic9

Member
Apr 17, 2008
138
0
0
From what I've heard, the 4870 is coming in at $350, while the GTX 260 is expected around $400-450? And the GTX 280 at $500-600?

If the nVidia chips don't drastically outperform the ATI parts, I can't see NV holding their prices at those levels for long. I'm pretty sure they'll lower their prices to maintain market share. This could force them into a loss situation since, as far as I know, they're still having trouble with yields.


~Semi
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: semisonic9
From what I've heard, the 4870 is coming in at $350, while the GTX 260 is expected around $400-450? And the GTX 280 at $500-600?

If the nVidia chips don't drastically outperform the ATI parts, I can't see NV holding their prices at those levels for long. I'm pretty sure they'll lower their prices to maintain market share. This could force them into a loss situation since, as far as I know, they're still having trouble with yields.


~Semi

Where did you hear they were having trouble with yields? Just curious.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
Where did you hear they were having trouble with yields?
He probably means that because the die size is bigger than G8x/G9x, they'll get fewer chips per wafer.
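
A rough way to put numbers on that is the usual gross-dies-per-wafer approximation, sketched in Python below with the die sizes quoted in this thread and a 300mm wafer assumed; it ignores scribe lines, edge exclusion and defects, so treat it as an upper bound, not a real figure.

import math

# Gross dies per wafer, standard approximation:
#   DPW ~= pi*r^2/A - pi*d/sqrt(2*A)
# Die areas are the in-thread figures (~330 mm^2 G92-class, ~576 mm^2 GT200-class).
def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    r = wafer_diameter_mm / 2.0
    return math.floor(math.pi * r ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

print(gross_dies_per_wafer(330.0))   # ~177 candidate dies per wafer
print(gross_dies_per_wafer(576.0))   # ~94 candidate dies per wafer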
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: keysplayr2003
Originally posted by: semisonic9
From what I've heard, the 4870 is coming in at $350, while the GTX 260 is expected around $400-450? And the GTX 280 at $500-600?

If the nVidia chips don't drastically outperform the ATI parts, I can't see NV holding their prices at those levels for long. I'm pretty sure they'll lower their prices to maintain market share. This could force them into a loss situation since, as far as I know, they're still having trouble with yields.


~Semi

Where did you hear they were having trouble with yields? Just curious.

Well, just common sense, and the very low clocks.

nVidia has said previously that yields on G92 were not very good, and that's with a 330mm² chip at 65nm. Now we are talking about a 576mm² chip on the same process... certainly yields are not going to be good. Why do you think nVidia launches chips like the 8800GS or the GTX 260? Yield is poor and it makes sense for them to take semi-defective chips and sell them at a reduced price.

A 576mm² chip is unheard of in the consumer space... that's Itanium-level die size, for chips that sell for thousands of dollars each. Selling a chip that big for under $400, along with the PCB, memory, cooling, etc., is not going to be ideal at all.
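
To put rough numbers on the yield worry: under a simple Poisson defect model, the share of fully working dies drops exponentially with die area. The defect density in the Python sketch below is an assumed, illustrative figure, not anything nVidia or TSMC have published.

import math

# Poisson yield model: Y = exp(-D0 * A). D0 is an assumed defect density,
# chosen only to contrast a 576 mm^2 die with a 330 mm^2 one.
def poisson_yield(die_area_mm2, defects_per_cm2=0.5):
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

print(f"~330 mm^2 (G92-class):     {poisson_yield(330.0):.1%} fully working dies")
print(f"~576 mm^2 (GTX 280-class): {poisson_yield(576.0):.1%} fully working dies")
# Dies with a single bad cluster can still be salvaged as a cut-down part
# (e.g. GTX 260), which is the point about semi-defective chips above.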

 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
Originally posted by: Extelleron
A 576mm² chip is unheard of in the consumer space... that's Itanium-level die size, for chips that sell for thousands of dollars each. Selling a chip that big for under $400, along with the PCB, memory, cooling, etc., is not going to be ideal at all.

People said the same thing about the G80 on the 90nm process...
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Cookie Monster
Since when were ATi cards "true" DX10 cards? What has DX10.1 got to do with it being "true"? You do realise that when a GPU supports the required set of features under the DX10 spec, it can be classified as a DX10 card?

And erm, not to be rude or anything, but where are you getting all these "facts" from? (the rest of your post)


If you would please look up the original spec of DX10, you will find that it was changed at the 11th hour. The DX10.1 part was left out. If you don't understand what that means for GPU development, read about it and learn. NV couldn't have a part ready for the original DX10 spec, so they cried to the game developers about it, who in turn put pressure on MS, so MS gave them a year. Talk about a bad smell: when it comes to pressuring the industry and slowing progress, NV is the king. There are a lot of people out there on the net saying DX10.1 brings a lot to the table. Only NV and fanbois are arguing about it. NV is saying to the world that GPUs rule PCs, yet it is NV who is trying to hold back progress, because ATI's tech is more advanced. It's not me making it up; it's just the facts, and people are in denial over the facts. Before the R600 was released I remember the debates here about DX10 and NV not having a DX10 product. Now most of us know it's a fact, and NV still doesn't have a DX10 card and won't have one after they release the 200 series. What NV has is a revised DX10 card. NV can't do DX10.1 and that's why MS changed the original spec to fit NV's needs. FACT!
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: Nemesis 1
Originally posted by: Cookie Monster
Since when were ATi cards "true" DX10 cards? What has DX10.1 got to do with it being "true"? You do realise that when a GPU supports the required set of features under the DX10 spec, it can be classified as a DX10 card?

And erm, not to be rude or anything, but where are you getting all these "facts" from? (the rest of your post)


If you would please look up the original spec of DX10, you will find that it was changed at the 11th hour. The DX10.1 part was left out. If you don't understand what that means for GPU development, read about it and learn. NV couldn't have a part ready for the original DX10 spec, so they cried to the game developers about it, who in turn put pressure on MS, so MS gave them a year. Talk about a bad smell: when it comes to pressuring the industry and slowing progress, NV is the king. There are a lot of people out there on the net saying DX10.1 brings a lot to the table. Only NV and fanbois are arguing about it. NV is saying to the world that GPUs rule PCs, yet it is NV who is trying to hold back progress, because ATI's tech is more advanced. It's not me making it up; it's just the facts, and people are in denial over the facts. Before the R600 was released I remember the debates here about DX10 and NV not having a DX10 product. Now most of us know it's a fact, and NV still doesn't have a DX10 card and won't have one after they release the 200 series. What NV has is a revised DX10 card. NV can't do DX10.1 and that's why MS changed the original spec to fit NV's needs. FACT!

Any GPU that has shader processors or supports SM 4.0 is a DirectX 10 card, where the API is processed on the GPU instead of the CPU; this is based on the new Windows Display Driver Model. I suggest you read this: http://developer.download.nvid...ffects-siggraph-06.pdf and this: http://developer.nvidia.com/ob...10-instancing-gdc-2006

There is no need for a major architecture change for DX10.1; a simple firmware upgrade would let an existing card run 10.1 apps, but they don't do that for business reasons. No one is trying to hold back progress; it's just that companies are becoming incompetent due to the complexity of design, the investment, and the time frame. There can never be a card with 1000 shader processors due to technical limitations, and DX10 itself is designed to limit hardware upgrade cycles so gamers don't whine about PCs vs. consoles, etc.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
SAY WHAT??? NV can do DX10.1 with a firmware update? Pure BS. I believe NV's shaders aren't up to the task of DX10.1. Give me an NV propaganda article to read. TRUE DX10 has unified shaders. FACT!

"Ya, we can do DX10.1 but we choose not to for business reasons"? Pure, right-out-and-out BS. It's worse than BS because it's a lie to hurt the competition and hold back progress. People all remember AEG and the lengths NV is willing to go to keep you in their pocket. With most of the game developers in NV's pocket, tell me they aren't trying to stall progress. The good lord gave you a good brain. Use it.

Here is a good read. I had better links, but I chose this one.

http://www.theinquirer.net/en/...ith-dis-unified-shader


Say whatever you please. The truth is NV can't do DX10.1 because they don't have true unified shaders. FACT! That's not saying NV can't play DX10.1; it can, but not like ATI can, not even close. The one game out there that supports DX10.1 and gives ATI a performance advantage does nothing for NV tech. NOTHING / NOT / ZERO, and it is being castrated for NV for online play. THE WAY IT'S MEANT TO BE PLAYED!!! Makes me smile every time I read that.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: keysplayr2003


Where did you hear they were having trouble with yields? Just curious.

I assume that he is extrapolating from this.

For the first time, a specific reason was given for the lower than expected G92 yields: Michael Hara claimed it was testing procedures, rather than manufacturing itself, which were the problem. If really true, that would imply they were likely too conservative.



 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: Nemesis 1
SAY WHAT??? NV can do DX10.1 with a firmware update? Pure BS. I believe NV's shaders aren't up to the task of DX10.1. Give me an NV propaganda article to read. TRUE DX10 has unified shaders. FACT!

"Ya, we can do DX10.1 but we choose not to for business reasons"? Pure, right-out-and-out BS. It's worse than BS because it's a lie to hurt the competition and hold back progress. People all remember AEG and the lengths NV is willing to go to keep you in their pocket. With most of the game developers in NV's pocket, tell me they aren't trying to stall progress. The good lord gave you a good brain. Use it.

Well, the Lord gave me a good brain to design and program games so couch potatoes can play them. It is you who ought to do some reading before yelling crap. DX10 is just software; the GPU firmware has the instructions to process it. If the base architecture is the same, then it is capable of handling subsequent versions with a firmware upgrade, but they don't do it. No one does it; they make new GPUs instead.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Rusin
...if Nvidia keeps up with their trend, the GTX 280 would be around the 8800 GTX level in realistic power consumption.

I'd say the GTX280 would have 50-60W higher realistic power consumption compared to the 8800GTX, so 130+60=190W.
 