Originally posted by: Ylurien
Isn't anyone else worried that overclocking their PCI-E bus is going to cause system instability? It seems like everyone is overjoyed by this news...?
If the PCI-E bus has to be overclocked for a 9600GT to achieve the same (actually slightly worse) results as an 8800 at stock PCI-E speeds, doesn't that mean the product (the 9600GT) is inferior? I realize that the number of shaders and ROPs and everything is quite different, but for the price, why would anyone still be considering the 9600GT?
I don't want to have to overclock my PCI-E bus just to get close to what I could get if I were to spend a few more dollars on an 8800GT. This new information certainly explains why the 9600GT runs even hotter than the 8800GT.
It seems to me like Nvidia is doing these kinds of things so that they can skip an entire generation of video card technology - and avoid the costs that would have gone into the research and development for that generation - and still maintain a lead over ATi.
Same goes for 780i. The 650i/750i never supported this in the first place. ***Note: NVIDIA LinkBoost™ technology has been removed as a feature from NVIDIA nForce® 680i SLI***
Yes, but then RivaTuner isn't in question, given it was the one that found the discrepancy in the first place. Can't this be put to rest by testing it on Intel chipsets to see whether the observed (as in benchmarked) performance changes versus an 8800GT on the same board under the same conditions?
Originally posted by: Ylurien
Alright - brain lapse on my part. For some reason I thought the PCI bus was connected to the PCI-E bus, but of course that's not true.
So ok we can overclock the bus to get higher speeds with the 9600GT. But doesn't that effectively mean that we are overclocking how fast the card runs, which will increase temperatures? How can we then have much room to overclock the core or the memory of the card itself?
Isn't there a set amount that any card can be overclocked, whether the overclock comes from the PCI-E bus or from the card itself? If we overclock an 8800GT as far as it can go on the card itself, and then overclock a 9600GT using a combination of a PCI-E bus overclock and an on-board overclock, won't they end up with pretty much the same sort of MHz boost? Or is the 9600GT just inherently better with heat, etc., and therefore more overclockable?
I probably am too clueless about this, so if anyone could enlighten me...
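To make the question above concrete, here is a minimal sketch of the relationship, assuming (as later posts in this thread suggest) that the 9600GT derives its core clock from the 100MHz PCI-E reference clock; the function name and the linear-scaling assumption are mine, not anything NVIDIA has documented:

```python
def effective_core_mhz(stock_core_mhz, pcie_ref_mhz, base_ref_mhz=100.0):
    """If the card's clock generator uses the PCI-E reference clock,
    the real core speed scales linearly with the bus frequency."""
    return stock_core_mhz * pcie_ref_mhz / base_ref_mhz

# A 9600GT with a stock 650 MHz core, with the bus raised from 100 to 110 MHz:
print(effective_core_mhz(650, 110))  # 715.0
```

Under that assumption, a 110MHz bus is simply a 10% core overclock by another route, which is why it would eat into the headroom (and thermal margin) left for overclocking the card directly.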
Originally posted by: krnmastersgt
That actually makes a lot of sense. Someone will probably say that isn't entirely fair, but I like the way the 9600 is shaping up, and its big brothers are soon to come out; if they follow the same design, they will be monstrous cards. Maybe we can finally play Crysis at 19x12 with max settings at over 60 fps.
Originally posted by: Demoth
Don't care how they did it; the power draw is still low relative to its performance.
...
Good riddance to 8400GTs and other garbage cards like it.
Originally posted by: nullpointerus
To find out exactly what is running off the PCI-E bus on your computer, you can open Device Manager and select View --> Devices by connection. Look for devices under the PCI Express ports, not directly off the PCI bus itself.
Originally posted by: Piuc2020
Originally posted by: krnmastersgt
That actually makes a lot of sense. Someone will probably say that isn't entirely fair, but I like the way the 9600 is shaping up, and its big brothers are soon to come out; if they follow the same design, they will be monstrous cards. Maybe we can finally play Crysis at 19x12 with max settings at over 60 fps.
Do all the wishful thinking you can, and even if quantum mechanics says otherwise, it won't change the fact that your claims are incoherent, unfounded and quite simply, unreal.
Originally posted by: Dadofamunky
Originally posted by: krnmastersgt
That actually makes a lot of sense. Someone will probably say that isn't entirely fair, but I like the way the 9600 is shaping up, and its big brothers are soon to come out; if they follow the same design, they will be monstrous cards. Maybe we can finally play Crysis at 19x12 with max settings at over 60 fps.
Don't hold your breath
heh
Not an ATI fanboy, I just know that DX10 makes video cards cry
Originally posted by: krnmastersgt
Originally posted by: Dadofamunky
Originally posted by: krnmastersgt
That actually makes a lot of sense. Someone will probably say that isn't entirely fair, but I like the way the 9600 is shaping up, and its big brothers are soon to come out; if they follow the same design, they will be monstrous cards. Maybe we can finally play Crysis at 19x12 with max settings at over 60 fps.
Don't hold your breath
heh
Not an ATI fanboy, I just know that DX10 makes video cards cry
Hopefully tri-SLI will get us there. I don't even play Crysis, nor do I want to; I just want to know that the most demanding game out there can actually be run at a high res with max settings and still remain beautiful and seamless.
Originally posted by: SniperDaws
OK, I've just done two benchmark runs of 3DMark06. Here's the setup:
Q6600 @ 3.2GHz
2048MB DDR2 XMS2
9600GT @ stock 650/1625/900
P35C-DS3R Rev 1.1 (Intel P35 chipset)
Scores with PCI-E @ 100MHz
3dMark score 11527
SM2.0 4683
SM3.0 4387
CPU 5091
Scores with PCI-E @ 110MHz
3dMark Score 12176
SM2.0 5003
SM3.0 4667
CPU 5081
So, a little jump in performance there.
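To put numbers on that jump, the percentage change in each score follows from plain arithmetic on the figures posted above:

```python
# Scores as posted: PCI-E reference at 100 MHz vs. 110 MHz
stock  = {"3DMark": 11527, "SM2.0": 4683, "SM3.0": 4387, "CPU": 5091}
bus_oc = {"3DMark": 12176, "SM2.0": 5003, "SM3.0": 4667, "CPU": 5081}

for name in stock:
    gain = (bus_oc[name] - stock[name]) / stock[name] * 100
    print(f"{name}: {gain:+.1f}%")
```

The graphics scores rise roughly 5-7% while the CPU score is essentially flat, which fits the idea that only the card sped up, not the rest of the system.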
Originally posted by: jaredpace
Nah, he went into his BIOS with the card clocked at stock speeds, changed the PCI-E frequency from 100 to 110, booted Windows, and ran 3DMark06.
Then he went back into the BIOS, set the PCI-E frequency back to 100MHz (the default), ran 3DMark again, and got a lower score.
Originally posted by: jaredpace
...
Nice: it works on P35 boards as well as NVIDIA ones, so it's not a 780i/650i/680i-exclusive feature and has nothing to do with LinkBoost.
And the 9600GT is the only NVIDIA card that does this.
Interestingly, it reports the same core speed at 100 or 110, yet notice the obvious performance increase in 3DMark. I'd say that's a tad shady.
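One hypothetical explanation for the unchanged readout (my assumption, not something confirmed in the thread): if a monitoring tool computes the core clock as a PLL multiplier times an assumed 100MHz reference, a raised bus clock is invisible to it. The multiplier value here is made up for illustration:

```python
ASSUMED_REF_MHZ = 100.0   # what the monitoring tool believes the reference is
actual_ref_mhz = 110.0    # what the BIOS actually set
multiplier = 6.5          # hypothetical PLL multiplier for a 650 MHz stock core

reported_mhz = multiplier * ASSUMED_REF_MHZ  # tool still shows 650.0
actual_mhz = multiplier * actual_ref_mhz     # silicon actually runs at 715.0
print(reported_mhz, actual_mhz)
```

If something like this is what's happening, the tool isn't lying so much as computing the clock from a reference it never re-reads.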