8800GTX preview


KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
117
116
Originally posted by: Elfear
Originally posted by: nanaki333

ack! where'd you see that at?

i'm so torn. it's not like there's going to be many games taking advantage of dx10 immediately so i COULD wait.

they better have a damn good card coming out to be 2 months behind nvidia!

Here is where I saw the rumor. Scroll down to post #2282. The other guys seem to respect CJ so I figured the rumor must be at least as credible as the others we've heard.

Also here: Link

ATI R600 GPU is expected to be released in early Q1 on an 80nm (and perhaps 65nm too) process. The R600 is the successor to the R580 core and is expected to be built on a 65nm process and will be fully DirectX 10 compliant, utilising a Unified Shader Model architecture. Current rumours suggest that R600 will feature 64 Shader pipelines (processing both vertices and pixels) with 32 TMUs and 32 ROPs running at a clock speed of around 800MHz. R600 is expected to interface to 512MB of 2GHz+ GDDR4 memory over a 512-bit interface. R600 is reported to consume up to 250W (twice that of R580) and may require two PCI Express power connectors.


OK, they tested on DT using a QX6700, so should I assume my "crappy" X2 3800+ will limit this card's capabilities? I really do not want to do a full upgrade until RD600, but I would love to up my frames in Minesweeper.

 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: nitromullet
Originally posted by: munky
In case nobody noticed, the rumored power consumption specs always turn out to be blown out of proportion compared to the actual power consumption. It's good to see the G80 not using a ridiculous amount of power; even my 430 watt PSU would easily handle it.

I agree, but why the dual 6-pin molex plugs if it only draws 4% more juice than an X1950XT?

I'm wondering the same thing... is it possible the test wasn't done with the GPU at 100% load? Are we hitting some kind of bottleneck?
 

Tanclearas

Senior member
May 10, 2002
345
0
71
I'm going to need a bigger box. Looks like it's about a half inch too long for the Sonata. Crazy.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Those scores look really unflattering for the X1950XTX.

Here AnandTech scored ~20 fps more than DT did with the same card, at a higher resolution with the same amount of AA, in Quake 4.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
117
116
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
"When compared to AMD?s current flagship ATI Radeon X1950 XTX, the GeForce 8800GTX only consumes 24% more power at idle. The power consumption differences under load decreases to around 4%. Considering the performance differences, the GeForce 8800GTX is no worse than AMD?s ATI Radeon X1950 XTX in terms of performance-per-watt."

so relatively speaking, the power consumption is about the same as the ATI's.. where are all those condemning the power consumption of the XTX now (though if rumors hold true regarding R600 requiring 2x the power of R580, it seems there would be good reason to complain about that)?

at any rate, it looks to be a very nice product. i didn't see it in the article referenced here, but some have mentioned it will have "angle independent" AF, which will be very nice. i'll be curious to see where the price/performance comes out before i get too excited though - last generation things got back to sane levels ($299 GTs at release, etc.) and i would hate to see prices reach ridiculous levels again..

and an inch and a half longer than the radeon? sadly, this won't fit in my mid-tower case (Thermaltake Tsunami) without some modifications (or i'll have to remove all the hard drives). why can't they make the pcb a bit taller instead of going longer all the time? it would make fitting it a lot easier


 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be OK because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
117
116
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be OK because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.

Good stuff, thanks! I may need to sell my 3800 and 1900xt and upgrade to a 4400 & 8800gtx now, but only if the 8800 comes in under $700 Canadian :frown:
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: KeithTalent
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be OK because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.

Good stuff, thanks! I may need to sell my 3800 and 1900xt and upgrade to a 4400 & 8800gtx now, but only if the 8800 comes in under $700 Canadian :frown:

You could always overclock! Think of it as a FREE upgrade to that 4400!
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
117
116
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Originally posted by: KeithTalent
Originally posted by: Matt2
Wow. Just wow.

I think DT just talked me into buying one of these bad boys.

>90% performance improvement in HL2 and Quake 4?? Native HDR+AA, angle independent AF, 16xQ single card AA...

Should be a beast in my rig running 1680x1050!

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I'm pretty sure you're going to be CPU limited at only 2GHz, but then again, at 1920x1200, you might be OK because you're mostly GPU limited at such a high resolution. Especially with AA and all the goodies.

Good stuff, thanks! I may need to sell my 3800 and 1900xt and upgrade to a 4400 & 8800gtx now, but only if the 8800 comes in under $700 Canadian :frown:

You could always overclock! Think of it as a FREE upgrade to that 4400!

I would love to, but have absolutely no idea how to do it. I'd probably end up melting something.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: KeithTalent
I would love to, but have absolutely no idea how to do it. I'd probably end up melting something.

It's really a lot less painful than most people think.

You could probably get 2.4-2.5GHz without even increasing the voltages, while keeping temperatures low.

You should read up on the AMD OC threads in the CPU/Overclocking forum. Even if you still don't want to do it, at least you'll know how and have the option.
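
The math behind it is dead simple, too. A quick Python sketch (assuming the stock 10x multiplier and 200MHz reference clock of the X2 3800+; the higher reference clocks are just illustrations, not settings I've verified stable):

# Back-of-envelope overclocking math for an Athlon 64 X2 3800+.
# Stock is a 10x multiplier on a 200 MHz reference clock = 2.0 GHz.
# The higher reference clocks are illustrative targets, not values
# verified stable on any particular board.

MULTIPLIER = 10

for ref_clock_mhz in (200, 240, 250):
    core_ghz = MULTIPLIER * ref_clock_mhz / 1000
    print(f"{ref_clock_mhz} MHz x {MULTIPLIER} = {core_ghz:.1f} GHz")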
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
117
116
Originally posted by: Matt2
Originally posted by: KeithTalent
I would love to, but have absolutely no idea how to do it. I'd probably end up melting something.

It's really a lot less painful than most people think.

You could probably get 2.4-2.5GHz without even increasing the voltages, while keeping temperatures low.

You should read up on the AMD OC threads in the CPU/Overclocking forum. Even if you still don't want to do it, at least you'll know how and have the option.

Will do, thanks man. I'm kind of lazy and I don't want to have to put a new heatsink and everything on it, so if I can get away with a mild OC without doing that stuff I will definitely give it a try.

Then get my 8800 and be off to the races....
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: josh6079
Those scores look really unflattering for the X1950XTX.

Here AnandTech scored ~20 fps more than DT did with the same card, at a higher resolution with the same amount of AA, in Quake 4.

I think everyone's jumping the gun here.. The scores do look suspicious, and although I don't doubt the GTX will be that fast or faster, the X1950XTX is a lot faster than that
 

Elfear

Diamond Member
May 30, 2004
7,126
738
126
Originally posted by: KeithTalent

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I highly doubt the 8800GTX will be able to run games maxed out at 1920x1200. Most games it will fly through at that res, but I imagine some will still bring it to its knees. With the clocks in my sig I still get slowdown in a few areas with Oblivion at 1920x1200 HDR+4xAA, and I doubt the 8800GTX will be much faster.
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
117
116
Originally posted by: Elfear
Originally posted by: KeithTalent

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I highly doubt the 8800GTX will be able to run games maxed out at 1920x1200. Most games it will fly through at that res, but I imagine some will still bring it to its knees. With the clocks in my sig I still get slowdown in a few areas with Oblivion at 1920x1200 HDR+4xAA, and I doubt the 8800GTX will be much faster.

Yeah, but for the majority of games I play at the moment it will.

It would be nice to bump up some of the effects on Oblivion though, that's for sure. That game kills my system big time at my native res. :frown:
 

jonnyGURU

Moderator | Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
The power consumption measured is based on the power that DailyTech's G80 required to run those particular applications. Naturally, the apps aren't quite using the G80 core to its full potential, and power consumption can be upwards of 225W per card.

 
Corporate Thug

Apr 17, 2003
37,622
0
76
Originally posted by: josh6079
Those scores look really unflattering for the X1950XTX.

Here AnandTech scored ~20 fps more than DT did with the same card, at a higher resolution with the same amount of AA, in Quake 4.

i think those scores are very suspicious. a 1950XTX only puts up 34 frames in quake 4 @ 1600x1200 w/4x AA? i don't think so...
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Corporate Thug
Originally posted by: josh6079
Those scores look really unflattering for the X1950XTX.

Here AnandTech scored ~20 fps more than DT did with the same card, at a higher resolution with the same amount of AA, in Quake 4.

i think those scores are very suspicious. a 1950XTX only puts up 34 frames in quake 4 @ 1600x1200 w/4x AA? i don't think so...

DT =/= AT.

Remember, different sites use different methods, settings and timedemos.

For one, they could have used 16x HQ AF along with TrAA/AAA in a different timedemo.

But it looks like G80 is on 80nm.

Link

While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with the rival, according to a Chinese-language Commercial Times report.

The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.

This could explain the lower power consumption.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,358
8,447
126
Originally posted by: CaiNaM
"When compared to AMD?s current flagship ATI Radeon X1950 XTX, the GeForce 8800GTX only consumes 24% more power at idle. The power consumption differences under load decreases to around 4%. Considering the performance differences, the GeForce 8800GTX is no worse than AMD?s ATI Radeon X1950 XTX in terms of performance-per-watt."

so relatively speaking, the power consumption is about the same as the ATI's.. where are all those condemning the power consumption of the XTX now (though if rumors hold true regarding R600 requiring 2x the power of R580, it seems there would be good reason to complain about that)?

actually the power consumption isn't an extra 24% at idle. the power consumption of the system as a whole is 24% higher. a radeon 1950xtx consumes just 33 watts at idle. a gf8800 gtx consumes 45 more watts than that, or 136% more watts at idle (actually a little lower, as power supply inefficiencies enlarge the consumption difference, so more like 110%). most people's computers are idle more often than not, so that is a sizeable increase.

at load the x1950xtx consumes about 125 watts, so the 8800 consumes about 135 watts, or 61% more than the 7900GTX (and only ~8% more than the xtx).
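
to spell the arithmetic out in python (the 33W and 45W figures are the ones above; the ~80% psu efficiency is an assumption, which is why the card-level delta lands nearer 110% than the naive 136%):

# Card-level idle power delta reconstructed from wall measurements.
# The X1950XTX idle draw and the wall-socket delta are the figures
# quoted above; PSU_EFFICIENCY is an assumed ~80%.

PSU_EFFICIENCY = 0.80

x1950xtx_idle_w = 33.0   # watts drawn by the card at idle
wall_delta_w = 45.0      # extra wall watts measured with the 8800 GTX

# A wall-socket delta overstates the card's own delta by the PSU loss.
card_delta_w = wall_delta_w * PSU_EFFICIENCY
naive_pct = wall_delta_w / x1950xtx_idle_w * 100   # ~136%
card_pct = card_delta_w / x1950xtx_idle_w * 100    # ~109%

print(f"naive wall delta: +{naive_pct:.0f}%")
print(f"card-level delta: +{card_pct:.0f}%")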
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Elfear
Originally posted by: KeithTalent

I was just thinking the same thing. I should be able to max everything at 1920x1200 with this beast.

So do you guys think I would be able to utilize this card to its full potential with the rest of the rig in my sig?

I highly doubt the 8800GTX will be able to run games maxed out at 1920x1200. Most games it will fly through at that res, but I imagine some will still bring it to its knees. With the clocks in my sig I still get slowdown in a few areas with Oblivion at 1920x1200 HDR+4xAA, and I doubt the 8800GTX will be much faster.

We all know there will be speed penalties when using dual cards in CrossFire/SLI. What I mean is that you'll never get 2x single-card performance.

Take into consideration that these benches show the G80 >90% faster in HL2 and Quake 4; I doubt that CrossFired X1900XTXs are 90% faster than a single X1900XTX, probably more like 60-70% depending on resolution and settings (if you get CrossFire working properly).

Every generation we also see GPUs become more efficient. While your X1900s might get slowdowns, it is very possible that the G80 can handle higher resolutions without as much overhead.
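
To put rough numbers on it, here's a quick Python sketch (both scaling factors are guesses from this thread, not measured results):

# Toy comparison: two X1900XTXs in CrossFire vs. one G80, with a single
# X1900XTX normalized to 100. Both factors are assumptions from this
# thread, not benchmarks.

single_x1900 = 100.0
crossfire_scaling = 0.65   # assumed 60-70% gain over a single card
g80_uplift = 0.90          # >90% over a single card, per DT's numbers

crossfire = single_x1900 * (1 + crossfire_scaling)   # ~165
g80 = single_x1900 * (1 + g80_uplift)                # ~190

print(f"CrossFire X1900XTX: {crossfire:.0f}")
print(f"single G80:         {g80:.0f}")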
 

HexiumVII

Senior member
Dec 11, 2005
661
7
81
OMG, I think it's finally time to upgrade my 6800GT. In NFS Carbon I only get 10FPS on mediumish settings!
 