Originally posted by: MegaWorks
The sexiest card ever made :laugh:
Originally posted by: bigshooter
Too bad X1950Pro isn't coming out until October. I don't want to wait 2+ months for a card that may not be much faster than a 7900GT.
Originally posted by: tanishalfelven
the x1900xt already kills the 7900gt.
but no, it will probably be dual slot.
Originally posted by: tuteja1986
The X1950 series is the first to use GDDR4, and it also has the second version of CrossFire, which doesn't require a dongle; it's now internal, like SLI. It also has a much better, quieter cooler, and the Pro edition is single slot. It also takes full advantage of the 512-bit ring bus with GDDR4 memory bandwidth. The X1950XTX in theory should totally destroy the 7900GTX.
Also, it comes out in mid-August.
Originally posted by: Cookie Monster
Originally posted by: tuteja1986
The X1950 series is the first to use GDDR4, and it also has the second version of CrossFire, which doesn't require a dongle; it's now internal, like SLI. It also has a much better, quieter cooler, and the Pro edition is single slot. It also takes full advantage of the 512-bit ring bus with GDDR4 memory bandwidth. The X1950XTX in theory should totally destroy the 7900GTX.
Also, it comes out in mid-August.
Dude, know what you're talking about.
The card uses a 256-bit GDDR4 interface, NOT 512-bit, which would be VERY expensive. The ring bus doesn't have much to do with this; it's mostly about reducing memory latency. The GPU is 512-bit internally, then it maps out to a 256-bit, 8-channel memory interface. I guess all X1-series cards support up to a 256-bit memory interface; for 512-bit it would need a 1024-bit ring bus. Quote me if I'm wrong.
And I guess ATi decided to copy NV with the internal connector. I was hoping for dongle-less/no-bridge CrossFire; they just straight-up copied NV.
The G80 is set to hit around September/October, even before Vista is released. It's going to be on 90nm, I believe.
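For anyone curious about the bandwidth side of the 256-bit argument, here's a back-of-the-envelope check. The 2.0 GHz effective GDDR4 data rate is an assumed figure for the X1950XTX, not something stated in this thread:

```python
# Rough memory bandwidth estimate: bandwidth = bus width (bytes) * effective data rate
bus_width_bits = 256         # external memory interface, per the post above
effective_clock_ghz = 2.0    # assumed GDDR4 effective data rate (hypothetical figure)

bandwidth_gbs = (bus_width_bits / 8) * effective_clock_ghz
print(bandwidth_gbs)  # 64.0 (GB/s)
```

Point being, the external interface width times the data rate is what sets bandwidth; the 512-bit ring bus is internal routing, not a wider path to the memory chips.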
Originally posted by: StrangerGuy
I don't understand this single-slot vs. dual-slot thing.
Why the fvck would anyone want a hot single-slot card that builds up heat in the case? Isn't exhausting hot air out of a dual-slot bracket far better? And as if anyone smart would install an expansion card that blocks airflow right beside a single-slot card. Geez.
Yes, but even dual-slot cards have intake fans, so you still can't install anything directly below them. A single-slot card uses two slots in essence; a dual-slot card uses three. By using a dual-slot card I'm sacrificing TWO slots below my video card. No thanks.