G80 Stuff


Creig

Diamond Member
Oct 9, 1999
5,170
13
81
An Nvidia fanatic complaining that others are ATI fanatics. Anyone else see the irony?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003

It's all in the attitude my friend. You would have ZERO complaints and/or problems with others if you just kept things real. Who cares what the others do. If it does bother you so much, at least find a proven fact to back yourself up, and WITHOUT the tude. You come across as insane. Now how is anyone supposed to take you seriously if you come across as mentally imbalanced? And about that innocent until proven guilty thing. Doesn't work in here, because we are all jurors of our peers in this little community.
:Q

QFT and rarely do i agree with Keys . . . THIS is coming from someone [keys] who really likes nvidia, has defended Rollo, and thinks AEG is OK.

IF HE says this, THEN Gstanfor is really "over-the-top".
:shocked:

"filth from Canada" is a little hard to take - for sane people who are fans of ANY company.

 

lopri

Elite Member
Jul 27, 2002
13,221
612
126
Can we all stick to the topic and take off-topics to PMs? It is quite annoying to scroll through the useless posts to find something meaningful.

Now on topic - Actually if those leaked 3DMark scores bear any truth, it is not very impressive as far as DX9 performance is concerned. SLI'ed 7900GTO w/ E6600 @3.6GHz will give more than 10K in 3DMark06. So we're looking at a 2x single-GPU performance increase. And 7950GX2 users aren't missing a lot. Now, DX10 support and enhanced image quality is a different subject, and again my opinion is based on the assumption that the said 3DMark scores are true.
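Taking the numbers in the post above at face value, the arithmetic can be sanity-checked. This is just a rough sketch: the ~10K SLI figure and the "12xxx" leak are forum claims, not verified results.

```python
# Rough check of the claim above. All inputs are forum rumors/claims,
# not verified benchmark results.
sli_7900gto = 10_000    # claimed 3DMark06 score: 7900GTO SLI + E6600 @ 3.6GHz
leaked_g80 = 12_000     # the rumored "12xxx" G80 3DMark06 score

# A single 7900GTO scores well under half the SLI number (SLI scaling
# is never perfect), so a single G80 beating the SLI pair by ~20%
# is consistent with "about 2x single-GPU performance".
vs_sli = leaked_g80 / sli_7900gto
print(f"G80 vs 7900GTO SLI: {vs_sli:.2f}x")  # 1.20x
```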
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: lopri
Can we all stick to the topic and take off-topics to PMs? It is quite annoying to scroll through the useless posts to find something meaningful.

Now on topic - Actually if those leaked 3DMark scores bear any truth, it is not very impressive as far as DX9 performance is concerned. SLI'ed 7900GTO w/ E6600 @3.6GHz will give more than 10K in 3DMark06. So we're looking at a 2x single-GPU performance increase. And 7950GX2 users aren't missing a lot. Now, DX10 support and enhanced image quality is a different subject, and again my opinion is based on the assumption that the said 3DMark scores are true.

Still, beating the performance of SLI'ed 7900GTXs by 20% or so ain't bad. Since this architecture is quite a leap from the previous generation, I'd think the drivers would take a while to mature.


 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I wouldn't be too troubled about 3dmark results, personally, though I don't doubt nvidia will use them to showcase G80 performance (just like they did with the nv40 launch).

Personally, like I've indicated previously I think G80 will be extremely solid, however, I have a feeling that it will be constrained by GDDR3 memory, which is why I'll take more interest in the refresh product. I'm not exactly feeling limited by my current setup at the moment either.
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: Gstanfor
The filth spews forth from Canada because that is where ATi is headquartered.

Then why not say instead, "the filth that spews from ATI."?? If you believe ATI spews filth then say THAT...ATI being Canadian (and they're American now anyway) has nothing to do with what you're saying. Should I go around saying that all Aussies are NVidia fanboys just because YOU are one?

Then again, you're not here to win any "popularity contests" are you? You're just here to be as much of a(n) ________ as possible right? (I think everyone can think of an appropriate word to fill in that blank with.)

Now, on topic, has it actually been confirmed that these cards will be released in quantity in mid-November? Hope it's not delayed so I can step up.
 

Madellga

Senior member
Sep 9, 2004
713
0
0
Originally posted by: Cookie Monster
Rumours from B3D is that G80 scores in

3dmark05 19xxx
3dmark06 12xxx

Using a conroe E6600.

Note that this could be stock. CPU clock, motherboard, ram used are not known. Drivers used are also not known.

How much does 7950GX2/X1950XTX score in 3dmark05/06? I hardly look at those benches anymore.

Cookie, my first rig scores around 9400 on 3dmark06. 7950GX2 clocked at 600/700, E6600 at 3.4GHz.

If that's true, we are looking at 25% and up higher score for a single card. Plus o/c.
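The "25% and up" figure above can be checked against the rumored score. Again, both inputs are forum figures (Madellga's reported rig score and the rumored "12xxx" leak), not verified benchmarks.

```python
# Comparing the rumored G80 score to the 7950GX2 rig score quoted above.
# Both numbers are forum figures, not verified results.
gx2_rig = 9_400        # 7950GX2 @ 600/700 + E6600 @ 3.4GHz, 3DMark06
leaked_g80 = 12_000    # rumored "12xxx" G80 score, taken as a lower bound

gain = (leaked_g80 - gx2_rig) / gx2_rig * 100
print(f"single G80 vs 7950GX2 rig: +{gain:.0f}%")  # about +28%, i.e. "25% and up"
```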
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Didn't both vendors at one point try to inflate scores in 3Dmark applications by corrupting the work-loads given by Futuremark?

That's why I never really trust 3DMark scores to tell me how well my card performs. We need to see some numbers in games! Why tease us with worthless 3DMarks?
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Madellga
Originally posted by: Cookie Monster
Rumours from B3D is that G80 scores in

3dmark05 19xxx
3dmark06 12xxx

Using a conroe E6600.

Note that this could be stock. CPU clock, motherboard, ram used are not known. Drivers used are also not known.

How much does 7950GX2/X1950XTX score in 3dmark05/06? I hardly look at those benches anymore.

Cookie, my first rig scores around 9400 on 3dmark06. 7950GX2 clocked at 600/700, E6600 at 3.4GHz.

If that's true, we are looking at 25% and up higher score for a single card. Plus o/c.

Also, remember that 3dmark measures the rest of your system as well...
 
Oct 19, 2000
17,860
4
81
Originally posted by: josh6079
Why tease us with worthless 3DMarks?
Just look at 3Dmark scores simply as what they are....an apples-to-apples comparison when using a constant (usually the CPU) and a variable (usually the GPU). With the resulting scores, it's easy to tell what kind of leap over other gen cards you're getting.

My point being, 3DMark is far from worthless, as it very well has its place in the benchmarking biz. Its single downfall is that it's only worth that one thing, and that's it.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
You're right, and perhaps I should have stated that I personally think it is worthless since I don't build a system to compare it or bench it, but to get what I want from it: smooth gameplay in my games while using the best of their graphics.
 

lopri

Elite Member
Jul 27, 2002
13,221
612
126
So I guess we'll see a similar pattern as the 'double-the-performance' of the current (previous) generation, with the trade-off being DX10 / new IQ enhancements versus higher power requirements / heat (noise). If that's the case, GTO / GTX SLI doesn't look shabby at all, considering that we won't have decent DX10 games until the 9800GTX shows up and current G70 drivers are as mature as can be.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
...If that's the case, GTO / GTX SLI doesn't look shabby at all...

Yep, the only concern I have with G80 SLI is power draw and possibly the water cooling requirement of the GTX.
 
Oct 19, 2000
17,860
4
81
Originally posted by: josh6079
You're right, and perhaps I should have stated that I personally think it is worthless since I don't build a system to compare it or bench it, but to get what I want from it: smooth gameplay in my games while using the best of their graphics.
You're right too, each person has their own types of benchmarks that they like to see. Whether or not 3DMark is worthless, you still can't beat regular ol' game benchmarks for real-world performance numbers.
 

VooDooAddict

Golden Member
Jun 4, 2004
1,057
0
0
Originally posted by: R3MF
regardless, i will await the 0.65u refresh for a less power hungry version.
and hopefully a return to lower clocked but fully functional GT versions as i hate buying something with crippled hardware.

It could also be a binning of the G80 chips. Once the process is more mature, they would be silly not to give the refresh full functionality, like they did with the
7800GT -> 7900GT
8800GTS -> 8900GTS ?

We can hope.
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
i should be happy so long as the prices are right.... if i can buy a 8800GTS by december at ~400CDN i should be very happy (provided it beats a X1900XT by a fair amount)

price along with overall performance is most important to me... i dont really care about noise/heat/power unless its comparing specifically between two models or if my psu cannot handle it

 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: theprodigalrebel
***crawls into the battle-zone***

***thinks out aloud***
The thing about Crusader & Wreckage is that they don't claim to be objective in the GPU wars - they admit (directly or indirectly) that they hate ATi and love nVidia unconditionally. This makes them cool. I got no problem with that. Loyalists exist everywhere.

But Gstanfor....scares me.

***makes a run for the border***

Holy crap. I actually got props from someone on this forum. Nice. Might print this out and put it on the fridge.

Cant wait to buy G80 though.
Compared to my dual 6800GTs for $850 back in the day, this $650 purchase doesnt make me bat an eye. Sooner I can spill the cash, the better!
 

hans007

Lifer
Feb 1, 2000
20,212
17
81
Originally posted by: Cookie Monster
Originally posted by: YoungGun21
I would assume R600 will be 80nm, seeing as the X1950Pro which is due for release later this month is 80nm.

nVIDIA already have a 80nm product in the form of 7700 mobile series. So with the 7650 series soon.

well one of the x1300 versions already out is 80nm also.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Acanthus
I am quite convinced that an R600 adaptation to the PC is not going to compete with the top end 8800 series.

It'll slaughter the 7900 series, and if they compete on price they can edge into the low and mid range.

Unless ATi pulls something insane, like keeping the Nmos cache for "free" antialiasing, and also increasing their bus width.

Somehow I'm not convinced of that at all. Based on what I've seen, I can expect the g80 to perform at about the same level as a 32-40 pipe g70 would. I dont know if anyone actually thinks the g80 will have the equivalent of 128 g70 pipes, but that number is so far beyond realistic expectations, I'm not even gonna entertain the possibility. I did say there will not be a 32-pipe g71, so I'm also saying there will not be a 128 "pipe" g80, unless their definition of a pipe is completely different from what we've seen before.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: jiffylube1024
Originally posted by: Crusader
8800GTX will be mine.

Time to own some ATI, as their fans wait forever for a chip that will at best equal this beast. Enjoy your X800s boys. I'd gladly take even a repeat of the Geforce6 story all over again. This is appearing to be measuring up to a walloping on ATIs hide though. Be interesting when AMD takes full control of ATI if they continue to allow ATI to lose money on the high end just to remain in the market.
Thank you Nvidia. I salute thee. GIVE THE PEOPLE WHAT THEY WANT!

Viva la Geforce 8!

Why you consistently root for monopoly is beyond me... You're either mental or on the payroll somewhere.

No, I dont want ATI out. I just like to see them hurting and their fanboys on the run. Dont want an NV monopoly though.
I'm guessing someone else would step up to Nvidia anyway, even if AMD backs down.
There's definitely room for at least 2 in this market.
So I havent been fretting the possibility like the ATI cronies have been. If ATI goes, who cares.. this is capitalism.. someone else will step up, and ATI and their fanboys will have been vanquished.
Sounds like a win/win to me. But I like todays situation, with the ATI takeover. Good enough to call a victory to me.

I dont like high prices. And I'm not affiliated with any of these corporations in any way beyond Nvidia is my Chevrolet of video cards. The circle of love really revolves around my driver experiences with both over the past 10 years.. (since 3dfx died I've tried both many times as I was on the fence between the two).
The fact NV is pumping out some great hardware as well, is just a nice bonus.

I buy for software support first and foremost.. as I need a Windows desktop that doesn't flicker.
http://forums.anandtech.com/messageview...atid=31&threadid=1936573&enterthread=y
http://www.rage3d.com/board/showthread.php?t=33853327
Among a multitude of other issues I've run across. Just not willing to keep burning up cash on ATI hardware and feel like I purchased a half-arsed product?
Its a consumers right.. and you'd be a moron to continue to pile your cash into their coffers as you would be encouraging their incompetence.

I dont pay the same money to get basically what is equal performance and IQ and deal with flickering desktops among half the other issues I've run across.
With NV taking a clear performance lead again, ATI means absolutely nothing to me.
I have stated in the past though that the X1900XT has been a phenomenal value regardless of its flaws due to its bargain bin pricing.

Theres a point where bargain basement pricing equalizes things. ATI has resorted to the massive price dips in the past, and it works.. at least on me. Nothing wrong with a ~$200 X1900XT if the NV is far more.

For me though, I need more than that and those cards are old news, Geforce 8 does the trick in both a polished product software-wise and also industry leading performance.
The power increase is very minimal IMO. 400watt vs 450watt? I've been on 550 for two years now. You dont use a measly 400watter and start buying $650 video cards. Price means nothing when I consider this a bargain at $650.. I've paid far more, and gotten far less in the past.

Good luck on those Christmas sales is all I can say. NVs got ATIs number though (where it counts.. sales), again. :thumbsup:
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Crusader
You dont use a measly 400watter and start buying $650 video cards.

Not anymore you dont. But apparently everyone was when they whined about the stratospheric power consumption of the x1900xtx :roll:
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: munky
Originally posted by: Crusader
You dont use a measly 400watter and start buying $650 video cards.

Not anymore you dont. But apparently everyone was when they whined about the stratospheric power consumption of the x1900xtx :roll:

I dont believe I personally ragged on it hard for power consumption, per se.
But add it in to ATIs other issues and you have yourself a perplexing product. It should have been noted for the X1900s life cycle that it has higher heat output/power requirements/louder operation. Thats only being fair.

For me, an even bigger issue is how the 2D/3D clock rates are set, and games that use pseudo fullscreen are far slower on ATI cards than games that use true fullscreen mode.
http://forums.anandtech.com/messageview...atid=31&threadid=1935076&enterthread=y

Add in all these niggling, *ahem* EXTREMELY annoying issues.. when Nvidia has essentially equal IQ and performance (all things considered), quiet operation, less heat/power requirement, and I've always been bewildered at you ATI guys.
I dont get it.

But I do get G80.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Crusader
Originally posted by: munky
Originally posted by: Crusader
You dont use a measly 400watter and start buying $650 video cards.

Not anymore you dont. But apparently everyone was when they whined about the stratospheric power consumption of the x1900xtx :roll:

I dont believe I personally ragged on it hard for power consumption, per se.
But add it in to ATIs other issues and you have yourself a perplexing product. It should have been noted for the X1900s life cycle that it has higher heat output/power requirements/louder operation. Thats only being fair.

For me, an even bigger issue is how the 2D/3D clock rates are set, and games that use pseudo fullscreen are far slower on ATI cards than games that use true fullscreen mode.
http://forums.anandtech.com/messageview...atid=31&threadid=1935076&enterthread=y

Add in all these niggling, *ahem* EXTREMELY annoying issues.. when Nvidia has essentially equal IQ and performance (all things considered), quiet operation, less heat/power requirement, and I've always been bewildered at you ATI guys.
I dont get it.

But I do get G80.

Well said, someone who finally understands that spending less money on similar-performing hardware doesn't mean you're getting the same quality.
 