HOLY CRAP!! NVIDIA GeForce 6800 Ultra PCI-E.........S L I!!!!!!!!

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nebor
Originally posted by: apoppin


(don't worry, ATI fanboys; soon you'll be touting the virtues of ATI's new X900 XT-PE MAXX cards)

That reminds me of that episode of Top Gear w/ the Evo and the WRX where they're talking about the full names. The Subaru Impreza WRX STI Mark 4 or something like that...
While we're completely OT, i musta missed it.

i didn't name these cards.

:roll:
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,426
8,388
126
maybe i just don't get it, but what is the point of taking up two slots per card and then not exhausting the hot air?

oh, and in the article, you put 'weary' where you meant to say 'wary'
 

0roo0roo

No Lifer
Sep 21, 2002
64,862
84
91
eh, don't scoff. buy one card when it's expensive. in a year when you're no longer top of the line, buy a 2nd card and receive a boost! no? that's what Voodoo2 SLI owners did much of the time. once their V2 got a tad sluggish, they got the SLI and lasted a bit longer.
 

biostud

Lifer
Feb 27, 2003
18,407
4,968
136
Originally posted by: ElFenix
maybe i just don't get it, but what is the point of taking up two slots per card and then not exhausting the hot air?

True, it seems like they should try getting the hot air out of the case, not just circulating it inside. But I guess they think people who buy these cards have 5+ case fans.

Personally I think it would be very strange if the nForce4 didn't include support for two 16x PCIe ports. Nvidia might have one division for video cards and one for chipsets, but I doubt that they would launch a 'new' technology and then sit tight until someone else decides to make a chipset that supports it. Especially if you take the Intel/ATi vs. AMD/nVidia chipset alliances into account. Nvidia could have a strong comeback in the chipset market if they were the first to bring not only PCIe to the A64 but also two 16x PCIe ports.

But that's all just speculation
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: 0roo0roo
eh, don't scoff. buy one card when it's expensive. in a year when you're no longer top of the line, buy a 2nd card and receive a boost! no? that's what Voodoo2 SLI owners did much of the time. once their V2 got a tad sluggish, they got the SLI and lasted a bit longer.

in a year there are OTHER, better cards around

I really think SLI is retarded (my $0.02)....twice the power usage, twice the noise, twice the money you spend just for performance and NO new features... welcome to 15 years ago, nvidia
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Based on that, ATI will have to have their own dual card solution, if only to save face, since not a lot of people will be buying dual cards. (I think I might though. )

so....if (according to your words) "not a lot of people will buy...."...why on earth would they "have" to come out with it? Of course they do NOT if there is no market - otherwise they would be insane...and (excuse me)...there is much more exciting stuff around (and to come in the future) than SLI <--
 

0roo0roo

No Lifer
Sep 21, 2002
64,862
84
91
Originally posted by: flexy
Originally posted by: 0roo0roo
eh, don't scoff. buy one card when it's expensive. in a year when you're no longer top of the line, buy a 2nd card and receive a boost! no? that's what Voodoo2 SLI owners did much of the time. once their V2 got a tad sluggish, they got the SLI and lasted a bit longer.

in a year there are OTHER, better cards around

I really think SLI is retarded (my $0.02)....twice the power usage, twice the noise, twice the money you spend just for performance and NO new features... welcome to 15 years ago, nvidia

that much better? then again, V2's weren't 400 dollar cards, i believe they were 250? and then kept dropping.

it still could be good for a mid range user. buy a mid range card, wait a year or slightly more...and that mid range card will be really cheap. add that and you're going for a while longer. course if you always spend 400 dollars on a card each year you might be able to scoff at this.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Well, at least nVidia gained SOMETHING from acquiring 3DFX aside from missed deadlines and lousy GPU designs.

In all seriousness though, this looks like a killer feature. Awesome. It's total overkill at the moment, but in a year or so it will be an amazing boost to say the least.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I am really on the fence about this one.

On one hand, I like the idea of being able to increase performance by simply getting 2 of the same component. It offers the advantage of a modular upgrade and the cost breakup that goes along with that.

On the other hand, it doesn't solve the real problem. Chips get bigger, hotter, and more expensive. We need a new tech to fix this, not half-assed solutions that simply combine two already existing parts. To put it into perspective, how many of you would have ever taken 3dfx's SLI solution over a GeForce 2 GTS? The GeForce 2 GTS is clearly superior technology in that it is much smaller, consumes less power, and performs better.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Originally posted by: flexy
Originally posted by: 0roo0roo
eh, don't scoff. buy one card when it's expensive. in a year when you're no longer top of the line, buy a 2nd card and receive a boost! no? that's what Voodoo2 SLI owners did much of the time. once their V2 got a tad sluggish, they got the SLI and lasted a bit longer.

in a year there are OTHER, better cards around

I really think SLI is retarded (my $0.02)....twice the power usage, twice the noise, twice the money you spend just for performance and NO new features... welcome to 15 years ago, nvidia

I would agree that going out and buying 2 top of the line cards for SLI would lean towards the retarded side. But there are places where SLI could pay good dividends. One being the upgrade path. In a year these cards will be cheaper, so adding a second one would work great. The quoted numbers are around a 77% performance boost, which is well beyond the boost of going from a 9700Pro to a 9800XT, a 2 generation jump in releases. So if, for the same price or less and with better performance, I can buy one top of the line card and a second one later at a lower price, instead of buying one top of the line card and then another top of the line card to replace it a year from now, that would seem like a no brainer. New features with the new generation? Who cares? When are new features ever implemented during the lifetime of the first generation of cards that introduce them?

And again, there are some interesting things that can be done with SLI, that cannot be done as well with one really fast card. Whether or not NVidia decides to go those routes and broaden the feature package of their SLI implementation only time will tell. It could be a great new feature for gamers, or it could stink. I wouldn't declare it either one at this point.
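Pariah's upgrade-path arithmetic is easy to check with a quick sketch. All the numbers below are illustrative assumptions: the prices are hypothetical street prices, the ~77% SLI scaling is the figure quoted above, and the ~30% single-generation gain is a rough stand-in.

```python
# Compare two 2-year upgrade strategies. All prices and scaling
# figures are assumptions for illustration, not benchmarks.

def sli_path(card_now, card_later, sli_scaling=1.77):
    """Buy one card at launch, add a discounted second card later."""
    return card_now + card_later, sli_scaling

def replace_path(card_now, new_card, gen_gain=1.30):
    """Buy one card at launch, replace it with next year's top card."""
    return card_now + new_card, gen_gain

sli_cost, sli_perf = sli_path(400, 200)       # hypothetical prices
rep_cost, rep_perf = replace_path(400, 400)

print(f"SLI path:     ${sli_cost} for {sli_perf:.0%} of launch perf")
print(f"Replace path: ${rep_cost} for {rep_perf:.0%} of launch perf")
```

Under these made-up numbers the SLI path comes out both cheaper and faster after the second purchase, which is the "no brainer" being described; the real answer obviously depends on actual street prices a year out.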
 

biostud

Lifer
Feb 27, 2003
18,407
4,968
136
Originally posted by: nitromullet
I am really on the fence about this one.

On one hand, I like the idea of being able to increase performance by simply getting 2 of the same component. It offers the advantage of a modular upgrade and the cost breakup that goes along with that.

On the other hand, it doesn't solve the real problem. Chips get bigger, hotter, and more expensive. We need a new tech to fix this, not half-assed solutions that simply combine two already existing parts. To put it into perspective, how many of you would have ever taken 3dfx's SLI solution over a GeForce 2 GTS? The GeForce 2 GTS is clearly superior technology in that it is much smaller, consumes less power, and performs better.

I doubt they will stop improving their GPUs. It's easy to demand lots of new technology and a short production cycle at the same time. You simply can't get double the performance, new features, and less heat and power consumption all at once. CPUs went from .13 to .09 micron while the top-line GPUs are still being manufactured at .13 due to production difficulties. It sounds like you don't think they're aware of the major problems of making video cards. Maybe you should take out a patent on your revolutionary idea for making chips that are much smaller, consume less power, and perform better.
 

TGHI

Senior member
Jan 13, 2004
227
0
0
I can't even afford 1 6800 ultra - let alone 2...and a board with PCI express...and probably a new processor. It's great news for the rich....but I don't care....*sniff*....leave me alone....
 

Wahsapa

Diamond Member
Jul 2, 2001
3,004
0
0
so the tab nobody knew about for nv45 cards was really for sli?

could i buy one nv45 now and another later and then the SLI piece? and just download the drivers? does it have to be two of the exact same cards? could it be a 6800u and a gt in sli?

is ati gonna release a dual core r423..haha
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Originally posted by: flexy
Originally posted by: 0roo0roo
eh, don't scoff. buy one card when it's expensive. in a year when you're no longer top of the line, buy a 2nd card and receive a boost! no? that's what Voodoo2 SLI owners did much of the time. once their V2 got a tad sluggish, they got the SLI and lasted a bit longer.

in a year there are OTHER, better cards around

I really think SLI is retarded (my $0.02)....twice the power usage, twice the noise, twice the money you spend just for performance and NO new features... welcome to 15 years ago, nvidia

/bookmarks post for reference when attitude change occurs with x800 MaXX
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Schadenfroh
Originally posted by: flexy
Originally posted by: 0roo0roo
eh, don't scoff. buy one card when it's expensive. in a year when you're no longer top of the line, buy a 2nd card and receive a boost! no? that's what Voodoo2 SLI owners did much of the time. once their V2 got a tad sluggish, they got the SLI and lasted a bit longer.

in a year there are OTHER, better cards around

I really think SLI is retarded (my $0.02)....twice the power usage, twice the noise, twice the money you spend just for performance and NO new features... welcome to 15 years ago, nvidia

/bookmarks post for reference when attitude change occurs with x800 MaXX

Oh come on, you're both equally biased, only in different directions. :roll:
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Well as everyone mentioned, it will have its pros and cons:

1. High cost
2. Power consumption
3. High temperatures (supposedly)
4. Availability

On the other hand, imagine the benefits of a powerful 6800U x2.
Imagine what high res and AA/AF you could use if you have a 22'' monitor. Most of these ppl pay for a high end card and they can't even play new games at 16x12 with some AA/AF. Can you imagine going higher? :shocked:

I think if you can afford it, it will be a killer card. I hope ATI has an answer for this.
And don't believe for a sec that "who cares? Only a small market percentage will have a system like this."
These things play a significant role in the marketing biz and will land a serious hit on ATI if they don't respond.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,453
10,121
126
Originally posted by: beatle
Interesting how Nvidia is borrowing the acronym from the defunct 3dfx technology. They're just calling it "Scalable Link Interface" instead of "Scan Line Interleaving." It's a familiar term to any hardcore gamer.

To the average (non-technical) gamer who was familiar with the results of 3Dfx's SLI technology, all they know is that "SLI makes games faster!". So I'm certain the term will be useful for marketing purposes if NVidia continues to use it.
 

KpocAlypse

Golden Member
Jan 10, 2001
1,798
0
0
Originally posted by: Nebor
ATI will absolutely have to come out with a similar technology, otherwise they will never regain the performance crown. Think about it, Nvidia has never made a product that was half as fast as its ATI counterpart; even the 'failure' cards, the gf4 (vs 9700) and the 5900 (vs 9800), weren't that slow. So Nvidia wouldn't even have to come out with the best chips anymore. Their cards could all perform 20% slower than ATI's, but when you stack two of them together, they'd still win.

Based on that, ATI will have to have their own dual card solution, if only to save face, since not a lot of people will be buying dual cards. (I think I might though. )

I wouldn't say "have to". This will give Nvidia some bragging rights as to who is the fastest. But well, I'm not about to spend 900 bucks for it. Hell, I had a hard time spending 400 on my SLI rig back in the day..

And yeah, I see a MAXX solution in the future, or similar tech by ATI. But it's going to be bragging rights only; I don't see them making too much $$$$ on it.

EDIT:: not to mention, in 12 months this kind of setup might get owned by some X900 or Nvidia 7900 setup or something. It was sad to see my old Rage Pro/Diamond Voodoo II setup get owned by the GF or even my G400 MAX at times.. But it will be uber for longer; this kind of power would be good in games for like 3 years.. Maybe.. Did I also say that you will never need more than 640k of RAM?

2nd EDIT: Spelling is leet
 

VirtualLarry

No Lifer
Aug 25, 2001
56,453
10,121
126
Originally posted by: Steg55
Oh and a message for GTaudiophile
Fanboyism is more or less accepted these days - everyone is biased towards either nVidia or ATi in some way. Obvious fanboy posts will be shunned by the other side and accepted by the side you support. However, insulting the genius of 3DFX will get you nowhere, and the community will happily stand together and support 3DFX, nVidia and ATi fanboys alike. 3DFX will forever remain the best EVER producer of graphics processing units, holding the most 'firsts' in the graphics industry. Hell, every time your machine renders a textured polygon it's using techniques first developed by 3DFX with the first ever 3D processor. This may well be a fanboy post - but the difference is that EVERYONE can respect the might of 3DFX.
Steg

It's always fun to read a fanboy's post directed to another alleged fanboy from the "other camp".

Anyways, if you were to study your computer-graphics history, you might realise that SGI's work in 3D rendering predates 3Dfx's, probably by more than a decade, actually. As far as advanced hardware 3D ASICs go, 3DLabs (not 3Dfx) was there first too, with full hardware-acceleration of the OpenGL pipeline. (SGI's solutions, afaik, were not single dedicated ASICs, but rather whole dedicated subsystems.)

Oh yeah, SGI invented GL. (This was before it was renamed "OpenGL", and became multi-platform.)

What 3Dfx did though, and I give them full credit for this - is jump-start the consumer PC market for advanced 3D hardware acceleration on the desktop. They were in fact the first true PC 3D "accelerator", especially in that they were dedicated to 3D rendering only, and had to piggy-back onto a regular 2D graphics card as well.

So while 3Dfx may have been pioneers in their own right, and I give them the respect that they are/were due, they did not invent 3D graphics themselves.

What brought about their downfall, ironically, was their own arrogance and their belief that they could do no wrong, which is unfortunately a belief shared by many of their supporters. They refused to *really* innovate, sticking with 16-bit color instead of introducing 32-bit color support like their competitors, and sticking with PCI and a PCI-like AGP implementation instead of embracing the AGP standard's full feature-set like their competitors. Indeed, I see NV's resurgent interest in "SLI" technology as a hallmark of true innovation here. Even though the technology was pioneered on the PC by 3Dfx in this case (although this new version is actually different technology, load-balancing instead of true SLI), the fact that NV is willing to go the extra mile to support it, ahead of their competitors that don't (yet) have an equivalent feature, is IMHO "innovative", and sets them ahead of the pack, for now.

I really hope that the concept of "load-balancing" in general gets a lot more interest on the PC hardware platform. It's one of the features SGI's graphics subsystems were IMHO famous for: they could dynamically vary the 3D rendering load of their graphics systems in order to maintain a steady frame-rate, dropping lesser details from the scene during periods of heavy load.

No doubt that if companies like ATI tried this, they would be blasted by claims of "cheating" in their driver-set, even if doing so would lead to an advantage during gameplay, where smooth framerates are critical. The way things stand today, games place varying "loads" on the PC's graphics subsystem, and in order to maintain a smooth minimum framerate under the maximum loads, users have to over-specify their graphics power simply to handle these brief peak loads. It would be much better if the load could be dynamically reduced in order to maintain a smooth framerate instead; then you wouldn't have to break the bank on the fastest accelerator out there. Then again, that might cut into profit margins on the high-end cards, so that is another reason why current PC graphics companies don't add that feature to their drivers.

If some company ever does though, I nominate to call it "SmoothFrame" support.

In fact, depending on how NV and/or AW is doing their load-balancing between cards in their driver set, they could apply similar techniques to reduce loads on a single graphics card.
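The "SmoothFrame" idea described above — shedding detail under load to hold a steady framerate — is at heart a simple feedback loop. Here is a toy sketch; the frame budget, step size, and per-frame cost model are all invented for illustration, not anything a real driver uses:

```python
# Toy feedback loop: lower scene detail when the last frame ran over
# budget, restore it when there is headroom. All numbers are invented.

TARGET_MS = 16.7  # ~60 fps frame budget

def adjust_detail(detail, last_frame_ms, step=0.1, lo=0.1, hi=1.0):
    """Return a new detail level in [lo, hi] based on last frame's time."""
    if last_frame_ms > TARGET_MS:
        detail -= step            # over budget: shed detail
    elif last_frame_ms < 0.8 * TARGET_MS:
        detail += step            # lots of headroom: restore detail
    return max(lo, min(hi, detail))

# Simulate a heavy scene whose frame time scales with detail level.
detail = 1.0
for _ in range(10):
    frame_ms = 25.0 * detail      # hypothetical cost model
    detail = adjust_detail(detail, frame_ms)
print(f"settled detail level: {detail:.1f}")
```

The loop sheds detail until the simulated scene fits the frame budget and then holds there — the same principle, minus all the hard parts, of what a driver-level "SmoothFrame" feature would have to do every frame.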
 

VirtualLarry

No Lifer
Aug 25, 2001
56,453
10,121
126
Originally posted by: nitromullet
I am really on the fence about this one.

On one hand, I like the idea of being able to increase performance by simply getting 2 of the same component. It offers the advantage of a modular upgrade and the cost breakup that goes along with that.

On the other hand, it doesn't solve the real problem. Chips get bigger, hotter, and more expensive. We need a new tech to fix this, not half-assed solutions that simply combine two already existing parts. To put it into perspective, how many of you would have ever taken 3dfx's SLI solution over a GeForce 2 GTS? The GeForce 2 GTS is clearly superior technology in that it is much smaller, consumes less power, and performs better.

What about the trends in CPU design that are leaning towards combining two CPU cores onto one die, instead of continually creating more complex, hotter, more expensive chips?

Intel, AMD, and Sun are all doing this, with Sun having already done it successfully in the past. Intel and AMD are just getting into this technology.

AMD's dual-core Opterons are said to be a drop-in replacement for the original single-core Opterons, assuming that the mobo was designed properly for the power-requirement standards that AMD released.

I think that SLI makes perfect sense, even more so than regular CPUs, because the task of 3D graphics is easily and directly parallelizable.
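That parallelizability is easy to illustrate with split-frame rendering: each card owns one horizontal slice of the screen, and a simple feedback rule nudges the split line toward the slower card's half until both finish together. This is a toy sketch under invented timings, not NVidia's actual load-balancing algorithm:

```python
# Toy split-frame load balancer: two GPUs each render one horizontal
# slice of the frame; the split point drifts toward equal frame times.
# The per-slice cost model and rebalance rate are invented.

def rebalance(split, top_ms, bottom_ms, rate=0.05):
    """Shift the split line to give the slower GPU fewer scanlines."""
    if top_ms > bottom_ms:
        split -= rate             # top GPU slower: shrink its slice
    elif bottom_ms > top_ms:
        split += rate             # bottom GPU slower: shrink its slice
    return min(0.9, max(0.1, split))

# Suppose the bottom half of the scene costs twice as much per line.
split = 0.5                       # fraction of the screen on the top GPU
for _ in range(20):
    top_ms = 10.0 * split                 # simulated render times
    bottom_ms = 20.0 * (1.0 - split)
    split = rebalance(split, top_ms, bottom_ms)
print(f"split settles near {split:.2f}")
```

The split converges toward the point where both cards finish at the same time (about 2/3 of the screen to the top GPU here), which is why a load-balanced split can beat fixed scan-line interleaving when scene complexity isn't uniform across the screen.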
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Maybe you should take out a patent on your revolutionary idea for making chips that are much smaller, consume less power, and perform better.

Funny, obviously I understand that this is the holy grail of all GPUs and CPUs. My point was not that nVidia doesn't realize this, since I'm sure they do.

You simply can't get double the performance, new features, and less heat and power consumption all at once.

but, apparently, SLI was beaten before by a single-card solution that was smaller and consumed less power...

... in 12 months this kind of setup might get owned by some X900 or Nvidia 7900 setup or something. It was sad to see my old Rage Pro/Diamond Voodoo II setup get owned by the GF or even my G400 MAX at times..
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I think that SLI makes perfect sense, even more so than regular CPUs, because the task of 3D graphics is easily and directly parallelizable.
It certainly does have some merit for the enthusiast. However, I would be very surprised if dual graphics cards in every PC were the next great solution for improving performance. Dual core may be the way to go, but you also have to consider the fact that NV40 has 220 million transistors, and there are supposed yield issues as it is. I don't think they are ready to jump to 440 million yet. Again, there will be a new tech that solves this problem. It may come from nVidia, it may come from elsewhere. Dual video cards seem to me to be an interim solution, IMHO.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Nvidia knows how to milk their cash cows.

This sucker is a beast and highly impractical. 450 dollars just for an x800 or 6800u, and they expect people to pay over a grand for this thing? Plus the motherboard and PSU? Forget it. They'll make 5 of these and call it a night. Don't expect to be able to buy them.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Regs
Nvidia knows how to milk their cash cows.

This sucker is a beast and highly impractical. 450 dollars just for an x800 or 6800u, and they expect people to pay over a grand for this thing? Plus the motherboard and PSU? Forget it. They'll make 5 of these and call it a night. Don't expect to be able to buy them.
You mean ati does . . . they were the ones who "cheaped out" with the extendo version of the r300.

:roll:

and get used to it. it is the wave of the future (if you look beyond your games) for 3D workstation rendering - they will buy TONS of them.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Well apoppin, if this is the future then I think it's going in reverse. The first computer ever made could fill a full-size gym or even a football field. So by 2010 I'm hoping to see graphics cards do the same, just like their ancestors did?
 