G80 Stuff

Page 2 - AnandTech Forums

TecHNooB

Diamond Member
Sep 10, 2005
7,460
1
76
128 unified shaders would be so sweet. Can't wait to see the benchmarks on these things.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: TecHNooB
128 unified shaders would be so sweet. Can't wait to see the benchmarks on these things.

Homina, Homina, Homina.
I haven't drooled this much since the original Geforce 256!

2nd week of November means the 5th to the 11th, right? My step-up expires on the 13th! SCORE! SO happy... Plus, it's going to be a hard launch with retail availability, so EVGA probably won't keep the G80 out of its step-up plan! So happy!
 

Sable

Golden Member
Jan 7, 2006
1,127
99
91
Even the GTS looks insanely powerful if you read the specs.

8800GTS
500MHz Core Clock
900MHz Mem Clock
640MB GDDR3 memory
320-bit memory interface (64GB/s)
96 unified shaders clocked at 1200 MHz
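The quoted bandwidth can be sanity-checked with the usual bus-width times effective-clock formula. A back-of-the-envelope sketch (note the leaked numbers don't quite agree with each other):

```python
# Sanity-check of the quoted GTS bandwidth. The clocks here come from the
# leaked spec list above, not confirmed figures.
bus_bits = 320
bytes_per_transfer = bus_bits // 8          # 40 bytes per memory transfer

# GDDR3 is double data rate, so effective transfer rate = 2 x memory clock.
mem_clock_mhz = 900
bandwidth_gbs = bytes_per_transfer * 2 * mem_clock_mhz / 1000
print(bandwidth_gbs)  # 72.0 -- a 900MHz clock would actually give 72GB/s

# The quoted 64GB/s figure instead implies an 800MHz (1600MHz effective) clock:
print(bytes_per_transfer * 2 * 800 / 1000)  # 64.0
```

So either the 900MHz memory clock or the 64GB/s figure in the leak is off.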

If they'd just released that as the GTX everyone would have been :thumbsup::shocked:

I suppose the good thing is they didn't do that, then wait around for ATI's card, and then go BAM -> GTXXX and release the 128-shader version.
 

Sable

Golden Member
Jan 7, 2006
1,127
99
91
Actually the thing I'm looking forward to most is an image comparison against the 7900s.
 

R3MF

Senior member
Oct 19, 2004
656
0
0
i do prefer a unified approach because i approve of GP-GPU usage, and the more generic the processor the better.

i wonder if the variable extra memory - 128MB+64bit on the GTS vs 256MB+128bit on the GTX, which leads to 640MB+320bit vs 768MB+384bit - is a result of separating IQ functions like AA from general resource usage?

with the GTS you get 8x super-duper AA for FREE
with the GTX you get 16x super-duper AA for FREE

doesn't make a lot of sense if it's for something like the Geometry Shader portion of the GPU, because you would basically be saying to the GTS user that they should really only consider their new card a very fast DX9 card, as it's a bit crippled in DX10...

regardless, i will await the 65nm refresh for a less power hungry version.
and hopefully a return to lower clocked but fully functional GT versions as i hate buying something with crippled hardware.
 

potato28

Diamond Member
Jun 27, 2005
8,964
0
0
Originally posted by: theprodigalrebel
Originally posted by: aggressor
Now I want to see benchmarks

It's just 3D Mark scores but knock yourself out: click.

And what's really amazing is that these specs look just like the ones that were leaked on vr-zone and started that 'G80 Demystified' thread.

BTW, the Dailytech article says that nvidia recommends a 400W PSU for the 7950GX2 and 450W PSU for the 8800GTX. The 7950GX2 consumes about 140W, which makes the 8800GTX around 190W, right? Not the monstrous 250-300W number that has been floating around for ages?

ROFL!:laugh:
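The 190W estimate above is just the difference in PSU recommendations added onto the measured GX2 draw. Spelled out (rough reasoning from the post, not measured data):

```python
# Back-of-the-envelope estimate of 8800GTX power draw from PSU recommendations.
gx2_psu_rec = 400      # watts, nvidia's recommended PSU for the 7950GX2
gtx_psu_rec = 450      # watts, nvidia's recommended PSU for the 8800GTX
gx2_draw = 140         # watts, approximate measured 7950GX2 consumption

# If both recommendations leave the same headroom for the rest of the system,
# the 50W bump in the recommendation maps directly onto the card itself.
gtx_draw_estimate = gx2_draw + (gtx_psu_rec - gx2_psu_rec)
print(gtx_draw_estimate)  # 190
```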
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Good god this sounds like a beast. For that kind of power requirement, i am hoping this thing can do everything at max and double the performance of the 7950GX2.

 

shamgar03

Senior member
Jul 13, 2004
289
0
0
Originally posted by: martinez
*Beavis*hehehehe

They said nine inches

*Butthead*huhuhuhuh and dual slot.


One Scottish guy: Twice as long as a man!
Another Scottish guy (after shooting fireballs from his eyes and lightning bolts from his arse): Some men be longer than others...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
A couple of questions for Kristopher (if NDA permits)...

I'm sure you've seen the G80 spy pics from this thread http://forums.anandtech.com/messageview...atid=31&threadid=1935564&enterthread=y
From the information provided in the originating link, it is indicated that the cards pictured are 280mm (11 inches) long, but DT indicates that the cards are slightly less than 9 inches long. This is certainly good news. However, since the cards pictured in the spy pics apparently aren't identical to production cards, how close are they? Specifically, does the GTX version require watercooling, do the GTS and/or the GTX have dual 6-pin PCI-E power plugs, and how similar is the cooler pictured to stock?

Obviously, I understand that you are under NDA, so you may not be able to answer these questions. On the flip side, though, if anyone at NVIDIA reads my post, it's pretty obvious that I'm just trying to get my rig ready so I can buy their product. Hopefully they will give us some more details on power requirements and have some approved PSUs listed on their websites prior to the launch.

Thanks for the info.
 

R3MF

Senior member
Oct 19, 2004
656
0
0
will there be an 8xxx series card that is the same size as a 7900GT?

my Silverstone LC16M case will not accept anything bigger.
 

Ackmed

Diamond Member
Oct 1, 2003
8,478
524
126
Interesting if true.

Will be funny to see some people claim the power req's don't matter when and if this card does use as much as they claim it will. As it is now, that's one of the keys NV fans harp on: the low power req's of current cards. To me, it won't matter, just as it doesn't now.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I still don't see how you need an 800w psu for 2 of these cards when they recommend 450w for one GTX. That is completely stupid; since SLI has existed, it's been shown innumerable times that 2 cards in SLI do not consume 2x the power of one card. It's more like 1.5x, almost like the performance ratio of SLI. So if 450w is enough for one card, then a 675w psu should be plenty to power a GTX SLI configuration.

If you don't believe me, here.
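The argument above is simple scaling: grow the single-card recommendation by the ~1.5x SLI draw factor instead of doubling it. A quick sketch (the 1.5x factor is the poster's rule of thumb, not a measurement):

```python
# The post's SLI power reasoning as arithmetic.
single_psu_rec = 450                 # watts recommended for one 8800GTX
naive_double = 2 * single_psu_rec    # 900W -- assumes SLI doubles total draw
scaled_rec = 1.5 * single_psu_rec    # 675.0W -- scales by the ~1.5x SLI factor

print(naive_double)  # 900
print(scaled_rec)    # 675.0
```

Strictly, only the card's share of the load scales with SLI, not the whole PSU budget, so 675W is if anything conservative by this logic.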
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
If the increase in power consumption is at least in-line with the increase in performance, then I imagine nobody will be complaining.

It looks like they aren't going to provide their own power brick with the card as some had speculated. Gonna need 2 PSUs for this stuff.

 

Budarow

Golden Member
Dec 16, 2001
1,917
0
0
I really love it when a new generation of video cards comes out (i.e., all the HOT "older" cards drop in price).
 

hardwareking

Senior member
May 19, 2006
618
0
0
This thing will cream all of the present-gen cards (according to the specs, anyway). And by the looks of it, R600 had better be good or it'll be in trouble too.

 

Pabster

Lifer
Apr 15, 2001
16,987
1
0
Looks like the wallet has just taken a big hit :laugh:

I predicted the 8800GTX would be faster than 7900GTX SLI. We'll see.
 

Capt Caveman

Lifer
Jan 30, 2005
34,547
651
126
I can't believe I'm thinking about upgrading my video card for the 4th time in less than two years. Ugh!!!

More info from Kris:

"G80" To Feature 128-bit HDR, 16X AA

More G80 features abound

As if we mere mortals needed more reasons to be excited about G80, here are a couple more tidbits: 128-bit high dynamic-range and antialiasing with 16X sampling.

The high dynamic-range (HDR) engine found in GeForce 7950 and Radeon series graphics cards is technically 64-bit rendering. This new HDR approach comes from OpenEXR, a file format developed by Industrial Light & Magic (the LucasFilm guys). In a nutshell, we will have 128-bit floating point HDR as soon as applications adopt code to use it. OpenEXR's features include:

* Higher dynamic range and color precision than existing 8- and 10-bit image file formats.
* Support for 16-bit floating-point, 32-bit floating-point, and 32-bit integer pixels. The 16-bit floating-point format, called "half", is compatible with the half data type in NVIDIA's Cg graphics language and is supported natively on their new GeForce FX and Quadro FX 3D graphics solutions.
* Multiple lossless image compression algorithms. Some of the included codecs can achieve 2:1 lossless compression ratios on images with film grain.
* Extensibility. New compression codecs and image types can easily be added by extending the C++ classes included in the OpenEXR software distribution. New image attributes (strings, vectors, integers, etc.) can be added to OpenEXR image headers without affecting backward compatibility with existing OpenEXR applications.
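The "half" type the list mentions is an IEEE 754 half-precision float, which Python's struct module can decode with the 'e' format code. A minimal sketch of the dynamic range the article is describing (an illustration, not anything from the article itself):

```python
import struct

# OpenEXR's "half" is a 16-bit float: 1 sign bit, 5 exponent bits, 10 mantissa
# bits -- the same IEEE 754 half-precision layout struct exposes as 'e'.

# Largest finite half: exponent 11110, mantissa all ones -> bit pattern 0x7BFF.
half_max = struct.unpack('<e', bytes([0xFF, 0x7B]))[0]
print(half_max)  # 65504.0

# Smallest positive normalized half: exponent 00001, mantissa zero -> 0x0400.
half_min_normal = struct.unpack('<e', bytes([0x00, 0x04]))[0]
print(half_min_normal)  # 6.103515625e-05, i.e. 2**-14

# An 8-bit channel only spans 256 fixed steps from 0..255; the huge ratio
# between half_max and half_min_normal is the "higher dynamic range" the
# OpenEXR feature list refers to.
```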

NVIDIA already has 16X AA available for SLI applications. The GeForce 8800 will be the first card to feature 16X AA on a single GPU. Previous generations of GeForce cards have only been able to support 8X antialiasing in single-card configurations.

This new 16X AA and 128-bit HDR will be part of another new engine, similar in spirit to PureVideo and the Quantum Effects engines also featured on G80.

 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
At high res, I can't tell much difference going above 4XAA, never mind 16XAA. Great for people with 32" monitors, I guess.

Still, I'd rather take an ultra-high res with 4XAA than low res with 16XAA... too many texture/geometry details are lost.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I'm glad to see all the features this core will have, but NV really needs to start being more specific about power and cooling requirements. I imagine that if I don't buy it, it won't be because it lacks the features I want, but rather because I can't feed and cool two of them.
 

Zenoth

Diamond Member
Jan 29, 2005
5,196
197
106
If R600 has very similar specs but requires less power to juice the beast up, I'll go with them. Also, if ATi still maintains better overall image quality, like it does right now, I will go with them, since I always preferred Anisotropic Filtering over Anti-Aliasing, especially if they keep, and even "upgrade", their less/non angle-dependent AF technology (HQ AF) in R600.

Honestly, I am very excited about the G80 specs, but I won't do anything impulsive. I will wait until both G80 and R600 are put on a table and compared against each other, and not just benchmarks, but architecture comparisons, technology advantages, and such.

Patience is always a virtue, especially in the GPU/CPU domain.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
So we will get 128bit floating point HDR with 16xAA available... :Q

Not to mention NV has this quantum physics engine in their G80 core.

This is really getting interesting.
 

CP5670

Diamond Member
Jun 24, 2004
5,527
604
126
16x AA is actually already available on single cards. You just need to use a third-party program to turn it on, although it's only practical for old games (but it looks awesome in those).
 