nVidia GT200 Series Thread

Page 33

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: taltamir
AMD doesn't have a great platform; AMD has a CHEAP but balanced platform... and Intel doesn't have a great platform; Intel has an awesome CPU with craptacular video.

If Intel puts the hurt on nVidia by refusing to license Nehalem chipset-making rights to them, and nVidia gives in and trades SLI for it, it would be very bad for AMD.

Although nVidia could always go with VIA and their x86 license.

As I understand it, Intel doesn't want SLI on Nehalem, period. Why should they? Intel/ATI/AMD are getting really cuddly, which is kind of scary. But maybe AMD has seen the light, thanks to the close working relationship Intel and ATI have developed. Intel and ATI have a few tech license agreements in place. NV isn't getting a QPI link license.

Look at it like this: if AMD's 4000 series is >= NV's 280/260, Intel doesn't need SLI. Intel just plain flat doesn't need NV.

NV turning to VIA has to have pissed AMD off. It allows another player into the game, a good thing for us, but not for AMD.

So ya have AMD buying ATI, ATI resuming its relationship with Intel (Google it), and AMD and Intel kissing and making up. Where does this leave NV in '09?

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
It would hurt intel sales quite a bit to alienate the enthusiast market that they are allegedly trying to get with nehalem. A high percentage of people who want the latest/greatest are hardcore enthusiast gamers. These people understand that acceptable cpu performance with outstanding video performance is the ideal formula in most cases. If intel completely blocks sli from nehalem then that will be a huge boon to amd as MANY enthusiasts will go with shanghai/sli. Some of those might settle for keeping their ageing system longer.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: nRollo
Originally posted by: JPB
Originally posted by: nRollo
Originally posted by: Aberforth
Originally posted by: JPB
GT200 scores revealed

THANKS TO NVIDIA'S shutting us out, we are not handcuffed about the GT200 numbers, so here they are. Prepare to be underwhelmed; Nvidia botched this one badly.

Since you probably care only about the numbers, let's start out with them. All 3DMark scores are rounded to the nearest 250, frame rates to the nearest .25FPS. The drivers are a bit old, but not that old, and the CPU is an Intel QX9650 @ 3.0GHz on the broken OS, 32-bit, SP1.


Numbers

Ouch, are these numbers real? I've never seen such an incompetent company, really... they've been pushing their SLI agenda just to prove they are out of brains.

When I see real reviews showing NVIDIA has been surpassed in performance or image quality, I'll believe they're second best in the world at what they do.

Until then, I'll keep thinking they're the best in the world at what they do, rather than "incompetent".

No refuting, heh?

I think the *keyword* in your post is *until* :thumbsup:

You are under NDA though, right nRollo?

I can't comment on the performance of these parts due to NDAs I've signed.

You're reading too much into my "until" comment. When I posted it my thought was:
"NVIDIA has been, and is, the world leader in graphics performance and image quality. We haven't seen any benchmarks or slides showing that has changed, so until we do, I'll have to assume they are still."

It had nothing to do with upcoming reviews; I know nothing about the RV770 or R700 that I can compare to what I know of the GT200 series.

You mean since the 8800GTX, right (1.5 years)? Before that nVidia consistently got beaten in both performance and IQ.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,759
1,455
136
Originally posted by: Foxery
GPU architecture (and the nature of its work) may simply not lend itself well to being broken apart in this manner, either.

Of course it does. Unlike CPUs, GPUs are very parallel creatures. Make a link that is quick enough with low enough latency and you can get ~100% scaling.
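As a rough illustration (a minimal back-of-the-envelope model with made-up numbers, not measurements of any real part), here is how a fixed per-frame cost on that inter-die link eats into ideal 4-way scaling:

Code:
# Hypothetical model: fraction of ideal N-die speedup retained when every
# frame pays a fixed synchronization cost over the inter-die link.
def scaling_efficiency(n_dies, frame_work_ms, link_overhead_ms):
    single_die_time = frame_work_ms                    # one die does all the work
    multi_die_time = frame_work_ms / n_dies + link_overhead_ms
    speedup = single_die_time / multi_die_time
    return speedup / n_dies                            # 1.0 == perfect scaling

for overhead_ms in (0.01, 0.1, 1.0):
    eff = scaling_efficiency(4, 16.7, overhead_ms)     # 16.7 ms = one 60 FPS frame
    print(f"{overhead_ms:5.2f} ms link overhead -> {eff:.1%} of ideal 4-way scaling")

With a microsecond-class link the model stays near 100%; at a full millisecond per frame it drops to roughly 80%. That is the "quick enough with low enough latency" condition in numbers.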
 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
Originally posted by: allies

You mean since the 8800GTX, right (1.5 years)? Before that nVidia consistently got beaten in both performance and IQ.

The 9700 and 9800 over the FX series were the only time ATI held a clear advantage over nVidia... and before the 9700, ATI's drivers were an absolute joke.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: bryanW1995
It would hurt intel sales quite a bit to alienate the enthusiast market that they are allegedly trying to get with nehalem. A high percentage of people who want the latest/greatest are hardcore enthusiast gamers. These people understand that acceptable cpu performance with outstanding video performance is the ideal formula in most cases. If intel completely blocks sli from nehalem then that will be a huge boon to amd as MANY enthusiasts will go with shanghai/sli. Some of those might settle for keeping their ageing system longer.

If AMD's R700 cards end up being faster than the GTX 200 cards, which it looks like they will be, then there will be no need to worry about SLI compatibility. People will just go with 4870 X2 Quad-CF on X58... AMD quad-CrossFire drivers will likely improve, and it will become a competitive platform.

Nemesis has a point in that Intel has the power to shut nVidia out of the enthusiast market. If nVidia can't make a chipset for Nehalem and Intel boards do not have SLI, then the top end of the enthusiast market belongs to ATI graphics cards. Either that, or you could go with an AMD CPU + nVidia mobo for SLI. Which one will people choose? It seems both options benefit AMD, and neither really hurts Intel.

Originally posted by: HOOfan 1
Originally posted by: allies

You mean since the 8800GTX, right (1.5 years)? Before that nVidia consistently got beaten in both performance and IQ.

The 9700 and 9800 over the FX series were the only time ATI held a clear advantage over nVidia... and before the 9700, ATI's drivers were an absolute joke.

I wouldn't say that. The X800XT PE was clearly > the 6800 Ultra, and the X850XT PE > the 6800 Ultra by an even wider margin. The 6850 Ultra or 6800 Ultra Extreme was never really released, so you can't count that (even though I don't think it beat the X800/X850s anyway).

The X1800XT clearly beat its competitor, the 7800GTX 256MB, in everything that involved AA/AF. The 7800GTX 512MB was never widely available, so you can't count it, and even if you do, it only held the lead for 2-3 months. X1900XTX > 7900GTX, in performance a bit and in IQ by far. Certainly if you look at games from 2007-2008 on X1900 & 7900 cards, the X1900XTX can be ~2x the 7900GTX in some situations (e.g. Crysis).

And to BFG, as Martimus said, I am not talking about multi-GPU via software (drivers). I am talking about a multi-die GPU that appears to the driver as a single GPU. While I don't think software-based multi-GPU is bad, and it's a good interim solution, in the long run I feel a hardware connection between the GPUs is needed.

And Foxery, actually it will translate into lower power/heat, because it could have been built on a 55nm process if you cut GT200 into 4 dies. 55nm isn't yet mature enough for a 400-500mm^2 GPU, which is what GT200 would have been on 55nm. But for a chip ~144mm^2 in size? It would have been fine, reducing the die size of each chip and decreasing heat output, allowing for higher clocks/better performance.
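For what it's worth, a quick sanity check on that arithmetic (the ~576mm^2 figure for GT200 at 65nm is the commonly reported one, and the ideal-shrink assumption is just that, an assumption):

Code:
# Rough die-size arithmetic behind the "cut GT200 into 4 dies" idea.
GT200_65NM_MM2 = 576.0                # commonly reported GT200 die size at 65nm
shrink = (55.0 / 65.0) ** 2           # ideal area scaling from 65nm to 55nm

quarter_65nm = GT200_65NM_MM2 / 4     # one of four dies at 65nm: 144 mm^2
mono_55nm = GT200_65NM_MM2 * shrink   # monolithic GT200 at 55nm: ~412 mm^2
quarter_55nm = mono_55nm / 4          # one of four dies at 55nm: ~103 mm^2

print(f"{quarter_65nm:.0f} / {mono_55nm:.0f} / {quarter_55nm:.0f} mm^2")

So the ~144mm^2 figure is a quarter of the 65nm die, the monolithic 55nm chip lands right in that 400-500mm^2 range, and each of four 55nm dies would come in nearer ~103mm^2.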

 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: bryanW1995
It would hurt intel sales quite a bit to alienate the enthusiast market that they are allegedly trying to get with nehalem. A high percentage of people who want the latest/greatest are hardcore enthusiast gamers. These people understand that acceptable cpu performance with outstanding video performance is the ideal formula in most cases. If intel completely blocks sli from nehalem then that will be a huge boon to amd as MANY enthusiasts will go with shanghai/sli. Some of those might settle for keeping their ageing system longer.


OK. Now how many times have I heard the enthusiasts here and on other forums say this: buying a $1200 CPU is stupid. Because that's what we're talking about here, the high-dollar high end with the QPI link. The midrange, without a QPI link to the GPU, is where NV may have to play on an Intel system.

I like this. Why? Because it gives ATI the high end, and in the middle it helps AMD compete against Intel's middle. I like that for AMD. Intel is playing really well with AMD right now, and in the future I hope these two can work more closely together.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Well Extelleron, GPUs are already multi-core chips, if you haven't noticed. There's absolutely no need for several GPU dies combined on one big package; actually, that makes no sense whatsoever. Those individual GPU dies would each have their own memory controller and other unnecessarily duplicated units (PureVideo). The biggest problem is inter-GPU communication; this has always been the pitfall for multi-chip solutions. Just how do these GPUs communicate with one another? You keep mentioning a hardware connection. Well, if you're talking about an MCM like the Core 2 Quads, that would be a very inefficient system (something Intel knows, and a reason why Nehalem is a native quad design). In a Core 2 Quad, one die talks to the other through the FSB, which IMO is a very bad design choice because you can actually get jitter due to the latencies of inefficient inter-die communication, especially in a multi-threaded environment. For an MCM-type GPU, how are you going to do this?

I can point to other things like the memory interface (you simply don't get double the width if you have 2 GPUs that each have 128-bit buses), not to mention multiple framebuffers per GPU die, a lot more complexity in PCB design, and the list goes on. You're not seeing the bigger overall picture here. Die sizes are important in some sense, but ignoring the other, more important parameters as a consequence of reducing die size is quite foolish.
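To make the framebuffer point concrete (a minimal sketch; the numbers and the shared_pool flag are hypothetical, and current AFR-style multi-GPU behaves like the duplicated case):

Code:
# With AFR-style multi-GPU, each die mirrors the working set, so neither
# memory capacity nor bus width simply adds up across GPUs.
def effective_memory_mb(per_gpu_mb, n_gpus, shared_pool):
    if shared_pool:                  # hypothetical truly unified memory subsystem
        return per_gpu_mb * n_gpus
    return per_gpu_mb                # duplicated framebuffers: no gain for the app

print(effective_memory_mb(512, 2, shared_pool=False))   # 512: two 512MB GPUs today
print(effective_memory_mb(512, 2, shared_pool=True))    # 1024: only with real sharing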

What you're forgetting is that a GPU is already made up of tiny cores that communicate with each other natively (through a "hardware connection", as you've said) and share the same memory subsystem and other resources. Breaking these into several dies just makes everything inefficient, especially the communication between the chips.

And people keep claiming this chip is a hot, power-hungry beast. Well, that's half true. This chip will probably draw less power at idle than a 9800GTX, and that's been confirmed already. Idle temps should be pretty low too. The only time you'll think of the card as hot and power hungry is under load, i.e. running an intensive 3D app.
 

Wreckem

Diamond Member
Sep 23, 2006
9,461
996
126
Originally posted by: Extelleron
Originally posted by: bryanW1995
It would hurt intel sales quite a bit to alienate the enthusiast market that they are allegedly trying to get with nehalem. A high percentage of people who want the latest/greatest are hardcore enthusiast gamers. These people understand that acceptable cpu performance with outstanding video performance is the ideal formula in most cases. If intel completely blocks sli from nehalem then that will be a huge boon to amd as MANY enthusiasts will go with shanghai/sli. Some of those might settle for keeping their ageing system longer.

If AMD's R700 cards end up being faster than the GTX 200 cards, which it looks like they will be, then there will be no need to worry about SLI compatibility. People will just go with 4870 X2 Quad-CF on X58... AMD quad-CrossFire drivers will likely improve, and it will become a competitive platform.

Nemesis has a point in that Intel has the power to shut nVidia out of the enthusiast market. If nVidia can't make a chipset for Nehalem and Intel boards do not have SLI, then the top end of the enthusiast market belongs to ATI graphics cards. Either that, or you could go with an AMD CPU + nVidia mobo for SLI. Which one will people choose? It seems both options benefit AMD, and neither really hurts Intel.

Originally posted by: HOOfan 1
Originally posted by: allies

You mean since the 8800GTX, right (1.5 years)? Before that nVidia consistently got beaten in both performance and IQ.

The 9700 and 9800 over the FX series were the only time ATI held a clear advantage over nVidia... and before the 9700, ATI's drivers were an absolute joke.

I wouldn't say that. The X800XT PE was clearly > the 6800 Ultra, and the X850XT PE > the 6800 Ultra by an even wider margin. The 6850 Ultra or 6800 Ultra Extreme was never really released, so you can't count that (even though I don't think it beat the X800/X850s anyway).

The X1800XT clearly beat its competitor, the 7800GTX 256MB, in everything that involved AA/AF. The 7800GTX 512MB was never widely available, so you can't count it, and even if you do, it only held the lead for 2-3 months. X1900XTX > 7900GTX, in performance a bit and in IQ by far. Certainly if you look at games from 2007-2008 on X1900 & 7900 cards, the X1900XTX can be ~2x the 7900GTX in some situations (e.g. Crysis).

And to BFG, as Martimus said, I am not talking about multi-GPU via software (drivers). I am talking about a multi-die GPU that appears to the driver as a single GPU. While I don't think software-based multi-GPU is bad, and it's a good interim solution, in the long run I feel a hardware connection between the GPUs is needed.

And Foxery, actually it will translate into lower power/heat, because it could have been built on a 55nm process if you cut GT200 into 4 dies. 55nm isn't yet mature enough for a 400-500mm^2 GPU, which is what GT200 would have been on 55nm. But for a chip ~144mm^2 in size? It would have been fine, reducing the die size of each chip and decreasing heat output, allowing for higher clocks/better performance.

See, all these theories go out the door because the Feds are breathing heavily down Intel's neck.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If Intel simply locks out nVidia, ESPECIALLY if Intel collaborates with AMD to do so, then the Feds and the European Commission are gonna come down hard on them.
Heck, the EU is so strong in this regard that MS is actually making IE8 fully standards-compliant... no more of this "MS's broken implementation of standards" as the standard for most websites. And the EU is pressing hard, so we will soon see an MS Office with open document support.

This is the absolute WORST time for intel to lock anyone out of anything.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
It's not just the EU they're worried about; there's also the FTC investigation going on.

USB 3.0 and Intel's changed attitude
At least that's good news for us. Imagine having compatibility issues! USB 1.0 had something very similar to this.

But anyway, stop going OT guys. We're less than a week away from launch!
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Originally posted by: allies
Originally posted by: Piuc2020
Crysis just doesn't play well with the current architecture of cards; something is limiting cards in Crysis heavily, and just scaling up shaders, ROPs, etc. linearly is obviously not going to fix performance until the bottleneck is discovered and fixed.

Poor coding is the problem IMO.

It's easy to say that, but when the engine produces visuals similar to other games out there (a mix of medium and high settings), it runs just as well as, if not better than, those other engines.

I'd like to see the Source engine or Unreal Engine 3 produce similar visuals (Crysis on Very High) without a similar drop in performance.

Perspective, people, please.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
AMD/Intel don't have to collaborate at all. They simply have to stay on the road they are both traveling. NV went a different direction with their tech. ATI and Intel already had agreements in place before AMD bought them; all they have to do is honor those agreements. NV, by turning to VIA, is hurting AMD. AMD is the one that needs the help, and Intel is just extending a helping hand until AMD gets things under control; then the battle resumes. Read the links in the DX11 thread. They offer much insight.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Let's keep the chatter to the GT200, guys. If you want to talk about multi-core stuff or AMD vs. Intel, please spin it off into its own thread.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: nRollo
Originally posted by: tuteja1986
rape fest begins :! the 512MB 7800GTX all over again :!

It may be a little premature to forecast pricing and availability on the GTX260/280 as they haven't launched yet.

Often, parts listed as "for sale" pre-launch command a higher price: supply is short because most vendors honor the NDA, while demand is high.

Should we be concerned?
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: ViRGE
Let's keep the chatter to the GT200, guys. If you want to talk about multi-core stuff or AMD vs. Intel, please spin it off into its own thread.

Good idea. These two threads are bloated...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: SteelSix
Originally posted by: Nightmare225
New from Fudzilla.com, GTX 280 will really be $499 and 260 will be $399

http://www.fudzilla.com/index....=view&id=7853&Itemid=1

Probably responding to intense pressure from ATI's pricing plans....

This would be a good move if it's true. I hope it is...

Not IMO... I was really hoping that the GTX 280 was going to be worthy of a $650 price tag. If this is true, they're competing on price before the launch, and that doesn't say much about performance.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: nitromullet
Originally posted by: SteelSix
Originally posted by: Nightmare225
New from Fudzilla.com, GTX 280 will really be $499 and 260 will be $399

http://www.fudzilla.com/index....=view&id=7853&Itemid=1

Probably responding to intense pressure from ATI's pricing plans....

This would be a good move if it's true. I hope it is...

Not IMO... I was really hoping that the GTX 280 was going to be worthy of a $650 price tag. If this is true, they're competing on price before the launch, and that doesn't say much about performance.

Only 4 days to find out...
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: tuteja1986
Originally posted by: Piuc2020
I think ATI is going to win this generation, those are outrageous prices, crossfire HD4870s would cost less and (according to rumours) might end up being faster than a GTX 280.

This is nice because ATI will finally get back in the game and the fierce competition will make it even better for us customers.

Well, looking at the early leaked benchmarks, I ain't impressed with either of them (R7XX or GT2XX).

Crysis, 1920x1200, AA off, High Quality settings:
GT260: 29.75
GT280: 35.75

$449 for the 260 and $649 for the 280 are nVidia's recommended prices, I think.

Now to wait for the detailed articles on the 16th.

Rumored yield problems may be the reason for the high price, as they can't pump out enough cards to meet demand.


Those numbers were "Charlied"; I'd wait for something a little more reliable.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
nVidia could always release a 1.5GB GDDR5 + 512-bit bus part and completely dominate the 30+ inch display market segment...

Which, BTW, if you can afford to pay over $500 for a video card, you should be owning...
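To put a number on the bandwidth side of that (a hypothetical configuration; the 3.6Gbps effective GDDR5 rate is the figure rumored for the 4870, not anything nVidia has announced):

Code:
# Theoretical memory bandwidth of a hypothetical 512-bit GDDR5 part.
bus_width_bits = 512
gddr5_gbps_per_pin = 3.6                 # effective rate, e.g. 900MHz base x4

bandwidth_gb_s = (bus_width_bits / 8) * gddr5_gbps_per_pin
print(f"{bandwidth_gb_s:.0f} GB/s")      # ~230 GB/s

~230GB/s would be well above the ~140GB/s GT200 is expected to get from 512-bit GDDR3, which is exactly the kind of headroom 2560x1600 with AA wants.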

BTW, WTH is wrong with this thread... I go drink a cup of OJ and there is half a page of new posts. I go to sleep, and there are whole new pages.
I thought the mods were deleting my posts at first (I use FF quick-search to find my last post to locate my place in the thread, because the direct link in the notification always brings me to page one, and the "last viewed post" link doesn't work), but then I saw my posts are just buried under a torrent of new ones.
 

cm123

Senior member
Jul 3, 2003
489
2
76
Originally posted by: cm123
EVGA Corporation GEFORCE GTX 280 PCIE 1GB GDDR3 2PORT DVI-I HDTV
MFG#: 896-P3-1280-AR

It's listed at disty (wholesale) in the $680s.

One online company is selling it now for $733.80:

http://www.shopblt.com/partnum/u086_896p31280arh.html

Seems like way too much money - and I nearly always buy the latest and greatest - maybe it's just me though...

Guess all the talk of it coming in cheap because the 4870 is so fast just isn't looking like it's going to happen - let's hope the 4870 is as fast as the GTX 280 though, in hopes that helps drive pricing down a little.

As an FYI... the disty lists a shipping date of 06-28-2008 (I can actually order it right now) and the company above lists 06-26-2008 - it shows only 5 units coming into each warehouse, which for this disty is a sign that not much is coming in the 1st shipment on 06-28 (20 units total).

Working on getting more info - will post it if I can (and can release that info) - the NDA should lift on Monday from what I understand now.

Sorry on the edit - got GTX 260 info too (very little for now)...

EVGA Corporation GEFORCE GTX 260 PCIE 896MB GDDR3 2PORT DVI-I HDTV
MFG#: 896-P3-1260-AR

No disty price yet - the same online company is selling it for $511.53:

http://www.shopblt.com/partnum/u086_896p31260arh.html


$656 (from another disty) ASUS GEFORCE GTX280 1G DDR3 2*DVI-I 512BIT HDTV HDCP PCIE2.0
part# ENGTX280/HTDP/1G

$654 - PALIT VCX GTX280 1024MB GDDR3 DUAL-DVI HDCP HDMI & CRT TV-OUT VIDEO CARD
part # XNE/TX280+T305

$411 - PALIT VCX GTX260 896MB GDDR3 DUAL-DVI HDCP HDMI & CRT PCI-E
XNE/TX260+T394

Remember, all but the online company's pricing is from disty, which is wholesale - normally that means our price will be about 5% higher than that (not always) at places like NE/ZZ and so on...

Also, drop the part numbers into Google - of course you'll find a few more things about each card...



Merged into main thread.

Video Mod BFG10K.

I may have some more models on Friday, and maybe some specs/pics I can post as well -


Just curious - has anyone here gone ahead and put in an order for one of these yet?

 